Using NVIDIA NIM Microservices

You can use the NVIDIA NIM Microservices models available on build.nvidia.com with AI Accelerator. The models can run on build.nvidia.com or in your own environment, under your control.

You can learn how to use NVIDIA NIM with AI Accelerator in both scenarios:

Using NVIDIA NIM Microservices (Hosted)

Learn how to use NVIDIA NIM Microservices hosted by NVIDIA on build.nvidia.com.
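As a minimal sketch of the hosted scenario: the NVIDIA-hosted NIM endpoints expose an OpenAI-compatible chat completions API, so a request can be built and sent with the standard library alone. The model name below is illustrative (any model listed on build.nvidia.com can be substituted), and the `NVIDIA_API_KEY` environment variable is an assumption for where you keep the API key generated on build.nvidia.com.

```python
import json
import os
import urllib.request

# Illustrative model name; substitute any model listed on build.nvidia.com.
MODEL = "meta/llama-3.1-8b-instruct"
# NVIDIA-hosted NIM endpoints are OpenAI-compatible.
API_URL = "https://integrate.api.nvidia.com/v1/chat/completions"


def build_request(prompt, model=MODEL):
    """Build an OpenAI-style chat completion payload for a hosted NIM model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }


def chat(prompt, api_key):
    """Send the payload to the hosted endpoint and return the parsed response."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_request(prompt)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Assumes the key from build.nvidia.com is exported as NVIDIA_API_KEY.
    key = os.environ.get("NVIDIA_API_KEY")
    if key:
        print(chat("Hello!", key)["choices"][0]["message"]["content"])
```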

Using NVIDIA NIM Microservices available in your environment

Deploy NVIDIA NIM Microservices from build.nvidia.com using AI Factory's KServe-based Model Serving with Kubernetes YAML. No Docker required.
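For the self-hosted scenario, a KServe `InferenceService` manifest along these lines is the kind of Kubernetes YAML involved. This is a hedged sketch, not the exact manifest AI Factory generates: the image tag, service name, and secret name are illustrative, and the linked page above documents the actual runtime and resources to use.

```yaml
# Illustrative sketch of serving a NIM container via KServe.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: llama-3-1-8b-instruct        # hypothetical service name
spec:
  predictor:
    containers:
      - name: kserve-container
        image: nvcr.io/nim/meta/llama-3.1-8b-instruct:latest  # NIM image from NGC
        env:
          - name: NGC_API_KEY
            valueFrom:
              secretKeyRef:
                name: ngc-api-secret  # assumed Secret holding your NGC key
                key: NGC_API_KEY
        resources:
          limits:
            nvidia.com/gpu: "1"
```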
