OpenAI is hitting a roadblock with its latest AI model, GPT-4.5. CEO Sam Altman has confirmed that the company is running out of GPUs, delaying a full-scale launch. While some users will gain early access, the company is scrambling to secure more hardware to expand availability.
A Giant Model Stuck in a Hardware Bottleneck
GPT-4.5 is massive—both in capability and cost. Altman described it as “giant” and “expensive,” requiring tens of thousands of additional GPUs to support wider access.
Right now, OpenAI is rolling it out in phases:
- ChatGPT Pro users get first access starting Thursday.
- ChatGPT Plus subscribers will follow next week.
For many, the wait will depend on how fast OpenAI can acquire new hardware. The shortage is a clear sign that even leading AI companies struggle to secure the sheer computing power their models demand.
The Price of Progress: GPT-4.5’s Steep Pricing
The cost of using GPT-4.5 is making waves. OpenAI has set the price at:
- $75 per million tokens for inputs (roughly 750,000 words).
- $150 per million tokens for outputs.
This is a sharp increase: roughly 30 times the input cost and 15 times the output cost of the previous GPT-4o model (priced at $2.50 and $10 per million tokens, respectively). The pricing signals that running such an advanced model is expensive, limiting how widely it can be used.
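To put those rates in concrete terms, here is a minimal sketch that estimates the cost of a single API call at the prices listed above. The `estimate_cost` helper and the token counts are illustrative assumptions, not OpenAI's own tooling, and actual billing depends on OpenAI's token accounting.

```python
# Rough cost estimate for a GPT-4.5 API call at the published rates.
# Prices are in USD per one million tokens, as quoted above.
GPT_45_INPUT_PRICE = 75.00    # $ per 1M input (prompt) tokens
GPT_45_OUTPUT_PRICE = 150.00  # $ per 1M output (completion) tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in dollars for one request."""
    return (input_tokens / 1_000_000) * GPT_45_INPUT_PRICE \
         + (output_tokens / 1_000_000) * GPT_45_OUTPUT_PRICE

# Example: a 2,000-token prompt with a 1,000-token reply
# comes to about $0.15 + $0.15 = $0.30 per call at these rates.
print(f"${estimate_cost(2_000, 1_000):.2f}")
```

Even a modest prompt-and-reply exchange costs tens of cents under this pricing, which is why the per-token figures have drawn so much attention.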
OpenAI’s Strategy to Fix the GPU Crisis
Altman acknowledged the company’s growing pains in a post on X, confirming plans to acquire “tens of thousands of GPUs next week.” This should help expand access, particularly for Plus-tier users.
Long-term, OpenAI is looking beyond short-term fixes. The company is working on:
- Developing its own AI chips to reduce reliance on third-party hardware.
- Building a global network of data centers to handle increasing AI demands.
These efforts could make future rollouts smoother, but they won’t solve the immediate issues facing GPT-4.5 users.
‘Wanted to Launch It to Plus, Pro Simultaneously’
Altman expressed frustration over the staggered release, stating OpenAI originally planned to launch GPT-4.5 for both Plus and Pro users at the same time. The GPU shortage forced a last-minute change.
For now, users willing to pay more will get early access, while others will have to wait—yet another sign that AI’s future is increasingly dictated by hardware constraints.