TLDR
Published on February 28, 2022 by Chuan Li