Magic AI Startup Raises $320M Funding for Supercomputers

By Lorenzo Cruz

Magic Secures $320 Million in Funding for AI-Powered Coding Revolution

Magic, an AI startup, has just received a substantial financial boost of $320 million, propelling its total funding to an impressive $465 million. Backers include former Google CEO Eric Schmidt and Alphabet's growth fund CapitalG.

But money is not the only thing on Magic's side. They have joined forces with Google Cloud to construct two supercomputers. The first, Magic-G4, will utilize Nvidia H100 GPUs, while the second, Magic-G5, will incorporate Nvidia’s next-gen Blackwell chips. These supercomputers will be used to train and run AI models efficiently.

Established in 2022 by Eric Steinberger and Sebastian De Ro, Magic is on a mission to revolutionize coding. Their tools function as an automated pair programmer, offering assistance with coding tasks such as writing, reviewing, debugging, and planning changes. Think of it as having an incredibly astute companion who never tires.

What sets Magic apart from the crowd? Their models feature extraordinary long context windows, enabling them to handle enormous volumes of code without losing track. Their most recent model, LTM-2-mini, boasts an impressive 100 million-token context window, equivalent to 10 million lines of code or 750 novels. This is an absolute game-changer!
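
For a rough sense of that scale, here is a back-of-the-envelope conversion (a minimal sketch; the tokens-per-line and tokens-per-novel ratios are illustrative assumptions, not figures published by Magic):

```python
# Rough scale of a 100 million-token context window.
# Assumed ratios, for illustration only: ~10 tokens per line of code,
# ~133,000 tokens per novel (roughly 100k words at ~1.33 tokens/word).
CONTEXT_TOKENS = 100_000_000
TOKENS_PER_LINE = 10
TOKENS_PER_NOVEL = 133_000

lines_of_code = CONTEXT_TOKENS // TOKENS_PER_LINE   # ~10,000,000 lines
novels = CONTEXT_TOKENS // TOKENS_PER_NOVEL         # ~750 novels

print(f"~{lines_of_code:,} lines of code or ~{novels:,} novels fit in one window")
```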

Magic's extensive context capability helps prevent errors and maintains the AI's focus. They have already demonstrated remarkable results, such as autonomously creating a password strength meter and a calculator. Presently, they are training an even larger model to further expand boundaries.
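
For context on the kind of task described above, here is a hypothetical sketch of a simple password strength meter; this is illustrative Python, not Magic's actual generated output:

```python
import re

def password_strength(password: str) -> str:
    """Score a password on simple heuristics and return a strength label."""
    score = 0
    if len(password) >= 12:                                        # length
        score += 1
    if re.search(r"[a-z]", password) and re.search(r"[A-Z]", password):  # mixed case
        score += 1
    if re.search(r"\d", password):                                 # digits
        score += 1
    if re.search(r"[^A-Za-z0-9]", password):                       # symbols
        score += 1
    return ["weak", "fair", "good", "strong", "very strong"][score]

print(password_strength("hunter2"))           # fair
print(password_strength("Tr0ub4dor&3xyzzy"))  # very strong
```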

So, if you find yourself knee-deep in code, Magic is here to provide a helping hand. With their groundbreaking technology and substantial funding, they are on track to make coding a seamless experience. Stay tuned for more magic!

Key Takeaways

  • Magic secures $320 million, bringing total funding to $465 million.
  • Backers include ex-Google CEO Eric Schmidt and Alphabet’s CapitalG.
  • Partnership with Google Cloud to build supercomputers using Nvidia GPUs.
  • Magic’s AI tools aim to automate coding tasks, much like GitHub Copilot.
  • Magic’s LTM-2-mini model boasts a 100 million-token context window, the largest in the industry.

Analysis

Magic's newly raised $320 million, backed by tech heavyweights, is propelling the development of its AI coding tools. The influx also reinforces its partnership with Google Cloud, expanding its supercomputing capabilities. In the short term, competitors such as GitHub Copilot face increased pressure to innovate. In the long term, Magic's larger models could dominate coding automation and reshape software development workflows. Nvidia stands to benefit from the added GPU demand, while investors anticipate substantial returns as Magic scales.

Did You Know?

  • Nvidia H100 GPUs and Nvidia’s next-gen Blackwell chips:
    • Nvidia H100 GPUs: These are high-performance graphics processing units designed by Nvidia, primarily used for accelerating complex computations in areas like artificial intelligence and deep learning. The H100 series is known for its advanced architecture that significantly boosts the speed and efficiency of AI model training and inference.
    • Nvidia’s next-gen Blackwell chips: Blackwell is the codename for Nvidia's upcoming generation of GPUs, expected to succeed the current generation. These chips are anticipated to introduce further advancements in processing power, energy efficiency, and AI capabilities, making them ideal for the demanding tasks of supercomputing and large-scale AI model training.
  • 100 million-token context window:
    • Context window in AI: This refers to the maximum amount of text data that an AI model can consider and process at once during its operations. A larger context window allows the AI to maintain more information and context from the input data, improving its ability to generate coherent and relevant outputs.
    • 100 million-token context window: This is an exceptionally large context window, significantly larger than what is commonly used in current AI models. It allows the AI to handle vast amounts of information, such as millions of lines of code, without losing track of context. This capability is particularly useful in complex tasks like coding, where maintaining a broad understanding of the codebase is crucial (see the short sketch after this list).
  • GitHub Copilot:
    • GitHub Copilot: Developed by GitHub in collaboration with OpenAI, Copilot is an AI-powered tool designed to assist software developers by providing code suggestions and completions directly within their integrated development environments (IDEs). It uses machine learning models trained on a large corpus of code to help programmers write code more efficiently, offering solutions to common coding problems and speeding up the development process.
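
To make the context-window idea concrete, here is a minimal sketch using a naive whitespace "tokenizer"; real models use subword tokenizers, and this is not any vendor's API:

```python
def truncate_to_context(text: str, max_tokens: int) -> str:
    """Keep only the most recent tokens that fit in the model's context window.

    Naive whitespace tokenization for illustration; the windowing idea is
    the same for real subword tokenizers.
    """
    tokens = text.split()
    if len(tokens) <= max_tokens:
        return text  # everything fits, so the model "sees" all of it
    return " ".join(tokens[-max_tokens:])  # older context gets dropped

# A small window forgets the start of a large codebase; a 100 million-token
# window could hold an entire repository's worth of code at once.
codebase = "def handler(event): return process(event)\n" * 5_000
print(len(codebase.split()), "tokens of input")
kept = truncate_to_context(codebase, max_tokens=4_096)
print(len(kept.split()), "tokens kept by a 4,096-token window")
```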
