Meta Unveils Plans for AI Training Infrastructure with Nvidia H100 GPUs

By Miguel Ángel Delgado Escobar

Meta has shared new details about its AI training infrastructure, disclosing that it currently relies on nearly 50,000 Nvidia H100 GPUs to train its open-source Llama 3 large language model. By the end of 2024, the company expects to have deployed more than 350,000 Nvidia H100 GPUs, supplemented by additional compute from other sources.

The expansion aligns with Meta's plans to build its own AI accelerators, such as the Artemis chip, and to produce custom RISC-V silicon. Meta is also optimizing storage for AI training, developing a Linux Filesystem in Userspace backed by a version of its 'Tectonic' distributed storage system. Taken together, these disclosures offer a clear picture of how aggressively Meta intends to scale its AI training capacity and hardware development.


