Global AI Computing to Consume Power of Multiple NYCs by 2026

New York City-sized Data Centers to Support AI Deployment

Thomas Graham, co-founder of chip startup Lightmatter, told Mandeep Singh of Bloomberg Intelligence that by 2026, data centers drawing the equivalent of eight New York Cities' worth of power will be under construction to serve the deployment of AI.

Nvidia and Its Partners Build Larger Data Centers to Meet AI Compute Needs

Nvidia and its partners and customers have steadily built ever-larger computing facilities around the world to handle the compute-intensive work of training giant artificial intelligence (AI) models such as GPT-4. According to one startup serving the tech giants, this buildout will only grow in importance as more AI models are put into production.

The Next Stage of AI’s Compute Appetite: Inference

Graham argued that the next stage of AI’s compute appetite is inference: putting trained neural nets into production. "If you view training as R&D, inferencing is really deployment, and as you’re deploying that, you’re going to need large computers to run your models," said Graham.

Lightmatter’s Optical Computing Technology

Lightmatter, founded in 2018, is developing chip technology that joins multiple processors on a single semiconductor die using optical connections, which can replace the conventional network links between the dozens, hundreds, or even thousands of chips needed to build an AI data center. Optical interconnects can move data faster than copper wires at a fraction of the energy draw.

The Future of AI Data Centers

For a sense of the scale of the deployment, Graham pointed out that at least a dozen new AI data centers now planned or under construction will each require a gigawatt of power to run. "Just for context, New York City pulls five gigawatts of power on an average day. So, multiple NYCs." By 2026, the world’s AI processing is expected to require 40 gigawatts of power "specifically for AI data centers, so eight NYCs."
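The arithmetic behind Graham's comparison can be checked directly. A minimal sketch, using only the figures quoted in this article (a 5 GW average draw for New York City, roughly 1 GW per planned site, and a projected 40 GW of AI data-center demand by 2026):

```python
# Back-of-envelope check of the figures quoted above.
# All numbers come from Graham's remarks as reported here,
# not from independent grid data.
NYC_AVG_DRAW_GW = 5.0        # New York City's average power draw
SITE_DRAW_GW = 1.0           # per-site draw of the planned facilities
PROJECTED_AI_GW_2026 = 40.0  # projected AI data-center demand by 2026

nyc_equivalents = PROJECTED_AI_GW_2026 / NYC_AVG_DRAW_GW
one_gw_sites = PROJECTED_AI_GW_2026 / SITE_DRAW_GW

print(f"{nyc_equivalents:.0f} NYC-equivalents")   # 8 NYC-equivalents
print(f"{one_gw_sites:.0f} one-gigawatt sites")   # 40 one-gigawatt sites
```

So the projected 2026 demand works out to eight NYC-equivalents, or forty of the gigawatt-class sites Graham describes.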

Conclusion

AI’s compute appetite is expected to keep growing, and massive data centers will be needed to run AI models in production. As demand for AI computing infrastructure rises, companies like Lightmatter are developing new technologies, such as optical interconnects, to support that growth.

FAQs

Q: What is the next stage of AI’s compute appetite?
A: Putting trained neural nets into production.

Q: What is Lightmatter’s chip technology?
A: A chip technology that can join multiple processors together on a single semiconductor die using optical connections.

Q: How much power will AI data centers require in 2026?
A: 40 gigawatts of power, equivalent to eight NYCs.

Q: What is the scale of the deployment of AI data centers?
A: At least a dozen new AI data centers are planned or under construction, each requiring a gigawatt of power to run.
