OpenAI announced Wednesday that it would begin using chips from Cerebras, a startup based in Sunnyvale, Calif., to expand its computing capabilities for artificial intelligence technologies. Under the agreement, OpenAI will use a significant number of Cerebras chips, drawing approximately 750 megawatts of electricity, enough to power tens of thousands of homes.
This partnership marks the latest in a series of collaborations for OpenAI as the company seeks to bolster the computational power required to develop and deploy its AI models, including ChatGPT. The company has previously signed deals with Nvidia and AMD for their chips, and is also working with Broadcom to design its own custom chips.
OpenAI's pursuit of enhanced computing infrastructure reflects a broader trend within the tech industry. Companies like OpenAI, Amazon, Google, Meta, and Microsoft are collectively investing hundreds of billions of dollars in new data centers to support the growing demands of AI. These companies are projected to spend over $325 billion on these facilities by the end of this year alone. OpenAI is actively building data centers in Abilene, Texas, and planning additional facilities in other locations across Texas, New Mexico, Ohio, and the Midwest.
The need for such massive computing power stems from the nature of modern AI, particularly large language models like ChatGPT. These models are trained on vast datasets and require immense processing capabilities to learn patterns and generate human-quality text. The more data and the more complex the model, the greater the demand for computing resources. This demand has fueled a surge in innovation in the chip industry, with companies like Cerebras developing specialized hardware designed to accelerate AI workloads.
Cerebras's approach involves building very large chips, known as wafer-scale engines, that can handle massive amounts of data in parallel. This architecture is particularly well-suited for training AI models, which often involve processing large batches of data simultaneously.
OpenAI had previously stated that it would deploy enough Nvidia and AMD chips to consume 16 gigawatts of power, highlighting the scale of its computing ambitions. The addition of Cerebras chips further underscores the company's commitment to securing the resources necessary to remain at the forefront of AI development. The implications of this rapid expansion of AI capabilities are far-reaching, potentially impacting various sectors, from healthcare and education to finance and transportation. As AI models become more powerful and pervasive, questions surrounding their ethical use, potential biases, and societal impact become increasingly important.