Wednesday, April 3, 2024

"How The Stargate Project Could Change AI" (MSFT)

Following on April 1's "OpenAI and Microsoft Plan $100 Billion AI ‘Stargate’ (MSFT)".

From PYMNTS.com April 2:

OpenAI and Microsoft’s reported $100 billion data center project could create advanced, human-like language models that understand context, emotions and nuance, potentially revolutionizing artificial intelligence (AI) and commerce, experts said.

The project underscores the growing importance of AI in driving innovation and shaping the future of commerce. Experts said that as major tech players invest heavily in AI research and infrastructure, the development of sophisticated AI systems could revolutionize areas such as personalized marketing and supply chain optimization.

“It is important to consider the potential impact on jobs and the workforce,” Jiahao Sun, founder and CEO at FLock.io, a platform for decentralized AI models, told PYMNTS. “As AI becomes more capable in multimodal tasks and more integrated into commerce, it may automate industries that currently cannot easily be transferred into a ‘chatbot’ interface, such as manufacturing, healthcare, sports coaching, etc.”

Mega Data Drives Innovation
The Information recently reported on a new initiative involving an AI supercomputer named “Stargate,” set to launch in 2028. The project, sources close to the discussions said, is expected to be funded by Microsoft and would surpass the cost of the biggest data centers to date. Stargate represents the most significant endeavor in a series of supercomputers planned by Microsoft and OpenAI.

The scale of the data center project could accelerate the development and deployment of large AI models, Sun said, potentially yielding more human-like language models and enhanced computer vision systems.

Sun even suggested that such advancements could lead to the creation of intelligent machines that can understand context, emotions and nuance, accurately perceive and interpret complex visual scenes, and interact with the world in ways that closely resemble human cognition.

AI, or more specifically, large language models (LLMs) like OpenAI’s ChatGPT, consume enormous resources in the process of training and frequently retraining models with billions of parameters, Tim Negris, chief marketing officer at MOCA Systems, a provider of data center construction planning software, told PYMNTS.

These resources include thousands of GPU processors, predominantly from Nvidia, which in turn consume massive amounts of electricity and generate a great deal of heat that requires powerful cooling systems. While the Microsoft/OpenAI data center construction project is one of the largest so far, similar mega-projects are being led by the other so-called “hyperscalers,” which include Meta, Amazon Web Services, Google, and more recently, Oracle....
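[The scale Negris describes is easier to appreciate with rough numbers. Below is a minimal back-of-envelope sketch in Python using the widely cited approximation that training compute is roughly 6 × parameters × tokens; the model size, token count, per-accelerator throughput, and power draw are all illustrative assumptions, not figures from the article or from Microsoft/OpenAI.

```python
# Back-of-envelope estimate of LLM training compute and energy, using the
# common approximation: training FLOPs ~ 6 * parameters * tokens.
# Every number here is an illustrative assumption, not a figure from the article.

params = 1e12              # hypothetical 1-trillion-parameter model
tokens = 10e12             # hypothetical 10 trillion training tokens
train_flops = 6 * params * tokens          # ~6e25 FLOPs

gpu_flops_per_sec = 1e15   # assumed ~1 PFLOP/s effective throughput per accelerator
num_gpus = 100_000         # assumed cluster size

seconds = train_flops / (gpu_flops_per_sec * num_gpus)
days = seconds / 86_400

gpu_power_kw = 1.0         # assumed ~1 kW per accelerator, including cooling overhead
energy_mwh = num_gpus * gpu_power_kw * (seconds / 3_600) / 1_000

print(f"Training run: ~{days:.0f} days on {num_gpus:,} accelerators")
print(f"Energy consumed: ~{energy_mwh:,.0f} MWh")
```

Even under these rough assumptions, a single training run of this size consumes on the order of 10–20 GWh of electricity, which illustrates why power and cooling dominate data center planning at this scale. —ed.]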

....“Expect to see more $100 billion data centers popping up soon”....

....MUCH MORE