Repositories list
541 repositories
- NeMo: A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Automatic Speech Recognition and Text-to-Speech)
- cccl: CUDA Core Compute Libraries
- cloud-native-docs: Documentation repository for NVIDIA Cloud Native Technologies
- cuda-quantum: C++ and Python support for the CUDA Quantum programming model for heterogeneous quantum-classical workflows
- k8s-device-plugin: NVIDIA device plugin for Kubernetes
- DALI: A GPU-accelerated library of highly optimized building blocks and an execution engine for data processing, used to accelerate deep learning training and inference applications.
- TensorRT-Model-Optimizer: A unified library of state-of-the-art model optimization techniques such as quantization, pruning, and distillation; it compresses deep learning models for downstream deployment frameworks like TensorRT-LLM or TensorRT to optimize inference speed on NVIDIA GPUs.
- TransformerEngine: A library for accelerating Transformer models on NVIDIA GPUs, including support for 8-bit floating point (FP8) precision on Hopper and Ada GPUs, for better performance with lower memory utilization in both training and inference (a minimal FP8 sketch follows this list).
- AIStore: scalable storage for AI applications
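As a quick illustration of the FP8 workflow that the TransformerEngine entry refers to, here is a minimal sketch using the library's PyTorch API (`transformer_engine.pytorch`); treat it as an assumption-laden example rather than canonical usage, check the recipe options against the repository's documentation, and note that FP8 execution requires a Hopper- or Ada-class (or newer) GPU.

```python
# Minimal sketch: running a single Transformer Engine layer in FP8.
# Assumes the transformer_engine package is installed and a CUDA GPU
# with FP8 support (Hopper/Ada or newer) is available.
import torch
import transformer_engine.pytorch as te
from transformer_engine.common import recipe

# Transformer Engine drop-in replacement for torch.nn.Linear.
model = te.Linear(768, 3072, bias=True).cuda()
inp = torch.randn(2048, 768, device="cuda")

# Delayed-scaling FP8 recipe (E4M3 format for forward tensors).
fp8_recipe = recipe.DelayedScaling(margin=0, fp8_format=recipe.Format.E4M3)

# Run the forward pass with FP8 autocasting enabled.
with te.fp8_autocast(enabled=True, fp8_recipe=fp8_recipe):
    out = model(inp)

# Backward pass works as usual; FP8 handling happens inside the layer.
out.sum().backward()
```

The design point the description hints at is that FP8 is scoped by the `fp8_autocast` context and confined to Transformer Engine modules, so existing PyTorch training loops can adopt it by swapping in `te.*` layers rather than rewriting the model.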