At a closed-door Morgan Stanley tech conference in San Francisco, Sam Altman shared insights on the trajectory of AI and its broader economic implications. While the event remained off-limits to the public and media, analysts have now provided a glimpse into the discussion—offering key takeaways on AI’s deflationary effect, compute shortages, and the evolving role of synthetic data.
One of the most striking points Altman made was how AI is fundamentally deflationary—a perspective he believes is widely underestimated by investors. As AI-powered automation drives efficiency and productivity, it exerts downward pressure on costs across industries. Morgan Stanley analysts echoed this sentiment, noting that AI’s ability to enhance global efficiency could serve as a counterbalance to inflation.
The cost of generative AI is plummeting as new techniques make model creation faster and cheaper. With an increasing number of high-performing models available, AI is shifting from a premium technology to a commodity, benefiting developers and businesses that can now access powerful AI tools at a fraction of last year’s cost.
The GPU Bottleneck: OpenAI’s Compute Struggles
Despite AI’s deflationary potential, OpenAI faces a massive bottleneck in computing power. The company revealed that its GPU infrastructure is operating at full capacity—both during training, when AI models are built, and inference, when they are deployed in real-world applications.
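To make that distinction concrete, here is a minimal sketch, using a toy PyTorch model rather than anything OpenAI actually runs, of how a training step (gradients and weight updates) differs from an inference call (a forward pass only). The model, sizes, and data here are invented purely for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical toy model and data: this only illustrates the contrast between
# the two GPU workloads Altman described, not OpenAI's actual systems or code.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Linear(512, 512).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Training: forward pass, loss, backward pass, and weight update.
# Storing gradients and optimizer state makes this the heavier per-example workload.
x = torch.randn(32, 512, device=device)
y = torch.randn(32, 512, device=device)
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()

# Inference: a forward pass with no gradients; cheaper per request, but it
# runs continuously for every user query once the model is deployed.
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 512, device=device))
```

Both stages compete for the same pool of GPUs, which is why capacity limits bite at training time and at serving time alike.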
This is a major concern in an industry where AI breakthroughs are directly tied to computing resources. Morgan Stanley analysts noted that the training of large language models (LLMs), which demands unprecedented levels of compute, is concentrated among a small number of dominant players. Amid growing speculation about long-term GPU demand, OpenAI’s admission highlights a critical industry-wide challenge.
Synthetic Data: OpenAI’s Answer to Training Limitations
While compute capacity remains a bottleneck, data supply isn’t an issue for OpenAI. The company emphasized that it can leverage GPUs and existing AI models to generate synthetic data, a practice that is becoming increasingly valuable in training next-generation AI.
Unlike compute power, which remains a scarce resource, synthetic data allows AI systems to continue improving without relying solely on real-world datasets. Morgan Stanley analysts pointed out that while compute shortages and energy constraints may pose long-term challenges, data availability is not among OpenAI’s primary concerns.
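As a rough illustration of the idea, the sketch below uses a small open model (gpt2 via the Hugging Face transformers library) standing in for OpenAI’s own systems; the prompts and file name are invented. It samples completions from seed prompts and saves them as candidate synthetic training examples, which would typically be filtered before any real use.

```python
import json
from transformers import pipeline

# Illustrative sketch only: a small open model (gpt2) stands in for whatever
# models OpenAI actually uses; the prompts and file name are made up.
generator = pipeline("text-generation", model="gpt2")

seed_prompts = [
    "Explain why interest rates affect bond prices:",
    "Summarize the water cycle in two sentences:",
]

# Sample several completions per prompt and store them as prompt/response
# pairs, which could later be filtered and reused as additional training data.
with open("synthetic_data.jsonl", "w") as f:
    for prompt in seed_prompts:
        outputs = generator(
            prompt, max_new_tokens=60, do_sample=True, num_return_sequences=2
        )
        for out in outputs:
            record = {"prompt": prompt, "response": out["generated_text"]}
            f.write(json.dumps(record) + "\n")
```

The appeal of the approach is that it trades a scarce input (human-written data) for one OpenAI already has in quantity: model capacity running on its own GPUs.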
As AI continues to evolve, Altman’s insights provide a crucial look at the challenges and opportunities shaping the industry. The deflationary effects of AI, compute limitations, and the rise of synthetic data will all play a pivotal role in determining the future of AI’s impact on the global economy.