Elon Musk has outlined his plans for the future of his xAI project, including a new open source roadmap for the company's Grok models.
Over the weekend, Musk reiterated that his plan is for xAI to be running the equivalent of 50 million H100 chips within the next five years.
Nvidia's H100 chips have become the industry standard for AI projects, due to their superior processing capacity and performance. A single H100 can move around 900 gigabytes of data per second, making it well suited to large language models, which rely on processing billions, even trillions, of parameters quickly in order to provide fast, accurate AI responses.
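To put that bandwidth figure in context, here's a rough, illustrative calculation. The parameter count and precision below are my own assumptions for the sake of the example, not figures from xAI or Nvidia.

```python
# Illustrative only: why per-chip bandwidth matters for large language models.
# Assumptions: a 1-trillion-parameter model stored at 2 bytes per parameter
# (16-bit weights), and the ~900 GB/s per-chip figure cited above.

PARAMS = 1_000_000_000_000    # assumed model size: 1 trillion parameters
BYTES_PER_PARAM = 2           # assumed 16-bit (FP16/BF16) weights
BANDWIDTH_GBPS = 900          # GB/s figure cited in the article

model_bytes = PARAMS * BYTES_PER_PARAM
seconds_per_pass = model_bytes / (BANDWIDTH_GBPS * 1e9)

print(f"Weights alone: ~{model_bytes / 1e12:.0f} TB")
print(f"Time to stream them once at {BANDWIDTH_GBPS} GB/s: ~{seconds_per_pass:.1f} s")
# Roughly two seconds just to move the weights once through a single chip,
# which is why these models are sharded across many GPUs working in parallel.
```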
No other chips have been found to perform as well for this task, which is why H100s are in high demand, higher than Nvidia itself can keep up with. All of the major AI projects want to stockpile as many as they can, but at a price of around $30,000+ per unit, it's a major outlay for large-scale projects.
At present, xAI is running around 200k H100s within its Colossus data center in Memphis, while it's also working on Colossus II to expand on that capacity. OpenAI is also reportedly running around 200k H100s, while Meta is on track to blow them both out of the water with its latest AI investments, which will take it up to around 600k H100 capacity.
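For a sense of scale, here's a naive back-of-envelope sketch using only the figures cited above (the 50 million H100-equivalent target, the roughly $30,000 per-unit price, and the current cluster sizes). It's purely illustrative: real buildouts use newer chips, bulk pricing, and "H100-equivalent" accounting, so actual spend would look very different.

```python
# Naive back-of-envelope math using the figures cited in this article.
# Not a claim about any company's actual hardware mix or spending.

H100_UNIT_COST = 30_000          # approximate per-unit price cited above (USD)
TARGET_H100_EQUIV = 50_000_000   # Musk's stated five-year target for xAI

current_clusters = {
    "xAI Colossus (Memphis)": 200_000,
    "OpenAI (reported)": 200_000,
    "Meta (planned)": 600_000,
}

for name, count in current_clusters.items():
    multiple = TARGET_H100_EQUIV / count
    print(f"{name}: {count:,} H100s -> target is ~{multiple:,.0f}x that scale")

# Hardware-only cost if the target were somehow met with H100s at list price
naive_cost = TARGET_H100_EQUIV * H100_UNIT_COST
print(f"Naive hardware cost at ${H100_UNIT_COST:,} per unit: ~${naive_cost / 1e12:.1f} trillion")
```

Which is also why the goal is framed as "H100 equivalents": it assumes future generations of chips, not literal H100 purchases at today's prices.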
Meta's systems aren't fully operational as yet, but in terms of compute, it does seem, at this stage, like Meta is leading the charge, while it's also planning to invest hundreds of billions more into its AI projects.
Though there is another consideration within this, in the custom chips that each company is working on to reduce its reliance on Nvidia and give itself an advantage. All three, as well as Google, are developing their own hardware to power their AI models, and since these are all internal projects, there's no knowing exactly how much they'll increase capacity, and enable greater efficiency on top of their existing H100s.
So we don't know which company is going to win out in the AI compute race over the longer term, but Musk has clearly stated his intention to eventually outmaneuver his rivals with xAI's strategy.
Maybe that'll work, and certainly, xAI already has a reputation for expanding its projects at speed. But that could also lead to setbacks, and on balance, I would assume that Meta is likely best positioned to end up with the most capacity on this front.
Of course, capacity doesn't necessarily define the winner here, as each project will also need to come up with valuable, innovative use cases, and establish partnerships to expand usage. So there's more to it, but from a pure capacity standpoint, this is how Elon hopes to power up his AI project.
We'll see if xAI is able to secure enough funding to stay on this path.
On another front, Elon has also outlined xAI's plans to open source all of xAI's models, with Grok 2.5 being released on Hugging Face over the weekend.
That means that anybody can view the weights and processes powering xAI's models, with Elon vowing to release every Grok model in this way.
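For those who want to inspect the release themselves, open-weights drops on Hugging Face can usually be pulled down with the huggingface_hub library, along the lines of the sketch below. The repo ID here is an assumption for illustration, and the checkpoint is very large, so check the actual model card for the real ID, license terms, and hardware requirements before downloading.

```python
# Minimal sketch of fetching an open-weights release from Hugging Face.
# NOTE: "xai-org/grok-2" is an assumed repo ID for illustration only; check
# the actual model card on huggingface.co for the real ID, license, and size
# (full checkpoints of this class run to hundreds of gigabytes).
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="xai-org/grok-2",     # assumed repo ID, verify before use
    local_dir="./grok-weights",   # where the checkpoint files land on disk
)
print(f"Weights downloaded to: {local_path}")
```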
The idea, then, is to enhance transparency, and build trust in xAI's process. Which is interesting, considering that Grok 4 specifically checks in on what Elon thinks about certain issues, and weighs that into its responses.
I wonder whether that will still be part of its weighting when Grok 4 gets an open source release.
As such, I do wonder whether this is more PR than transparency, with a sanitized version of Grok's models being released.
Either way, it's another step for Elon's AI push, which has now become a key focus of his X Corp empire.