Nvidia Launches NIM: Streamlining AI Model Deployment
Nvidia has announced NIM, a platform designed to simplify AI model deployment by packaging models as AI-ready containers with curated microservices. NIM supports models from a range of providers and integrates with frameworks such as Deepset, LangChain, and LlamaIndex. Through partnerships with Amazon, Google, and Microsoft, NIM microservices will also be available on those companies' cloud platforms.

Nvidia positions its GPUs as the best place to run inference for these models, and NIM as the optimal software package on top of them, with plans to expand its capabilities over time. Current NIM users include leading companies such as Box, Cloudera, and Dropbox. According to CEO Jensen Huang, NIM's microservices are the building blocks for enterprises to become AI-powered companies.
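NIM containers generally expose an OpenAI-compatible inference API, which is part of why they slot into frameworks like LangChain and LlamaIndex. The sketch below shows what a chat-completion request payload for such a container might look like; the endpoint URL, port, and model name are illustrative assumptions, not details from the announcement.

```python
import json

# Hypothetical sketch: a locally running NIM container serving an
# OpenAI-compatible chat-completions endpoint. The host, port, and
# model identifier are assumptions for illustration.
NIM_ENDPOINT = "http://localhost:8000/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload for a NIM container."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }

payload = build_chat_request(
    "meta/llama3-8b-instruct",  # assumed model id for illustration
    "Summarize NIM in one sentence.",
)
print(json.dumps(payload, indent=2))
# POSTing this payload to NIM_ENDPOINT (e.g. with requests.post) would
# return a standard chat-completion response from the container.
```

Because the interface mirrors the OpenAI API, existing client code and framework integrations can often point at a NIM endpoint with little more than a base-URL change.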