
Lambda Stack is All the AI Software You Need

Lambda Stack provides a one-line installation and managed upgrade path for PyTorch®, TensorFlow, CUDA, cuDNN, and NVIDIA drivers. It's compatible with Ubuntu 22.04 LTS and 24.04 LTS. No more futzing with your Linux AI software; Lambda Stack is here.

Install Lambda Stack in one command

To install Lambda Stack on your desktop, run this command on a fresh Ubuntu installation (22.04 or 24.04). For servers, see the server installation section below.
wget -nv -O- https://lambdalabs.com/install-lambda-stack.sh | sh -
sudo reboot
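After the reboot, you can optionally sanity-check the installation. These commands aren't part of the installer; the versions they print depend on your Lambda Stack release.

# Confirm the NVIDIA driver can see your GPU(s)
nvidia-smi
# Confirm the system-wide PyTorch® and TensorFlow builds can use the GPU
python3 -c "import torch; print(torch.__version__, torch.cuda.is_available())"
python3 -c "import tensorflow as tf; print(tf.__version__, tf.config.list_physical_devices('GPU'))"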

Install CUDA, PyTorch, and TensorFlow on Ubuntu with a single line


Always updated AI software stack. Usable everywhere.

Lambda Stack can run on your laptop, workstation, server, cluster, inside a container, or on the cloud, and comes pre-installed on every Lambda GPU Cloud instance. It provides up-to-date versions of PyTorch®, TensorFlow, CUDA, cuDNN, NVIDIA drivers, and everything you need to be productive for AI.
Under Desk

On-Premise

Container

Cloud

Keep your AI software up-to-date with one command

Run this command and all of your AI software, from PyTorch® to CUDA, will be updated. Like magic.

sudo apt-get update && sudo apt-get dist-upgrade
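If you'd like to preview what the upgrade will touch first, a dry run using standard apt options (nothing Lambda-specific) looks like this:

# List pending updates, then simulate the upgrade without installing anything
sudo apt-get update
apt list --upgradable
apt-get dist-upgrade -s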


Compatible with your Docker and NGC containers

If you're already using GPU-accelerated Docker images or NGC containers, rest assured that Lambda Stack can run them.

After you've installed Lambda Stack, you can add GPU-accelerated Docker support with this command:

sudo apt-get install docker.io nvidia-container-toolkit
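As a quick check that GPU passthrough works, you can run an NGC container with GPU access. The image tag below is just an example; substitute whichever NGC release you normally use.

# Pull an example NGC PyTorch container and confirm it can see the GPUs
sudo docker run --gpus all --rm nvcr.io/nvidia/pytorch:24.04-py3 nvidia-smi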


NVIDIA NGC Tutorial: Run a PyTorch Docker Container on Ubuntu with Lambda Stack

We've written open source Lambda Stack GPU Dockerfiles

Lambda Stack's open source Dockerfiles let you create Docker images that already have Lambda Stack pre-installed. They're available in our git repository: https://github.com/lambdal/lambda-stack-dockerfiles/.

Lambda Stack supports air-gapped / behind-the-firewall installations

An air-gapped copy of Lambda Stack can be delivered securely for installation behind your firewall.

Everyone loves Lambda Stack: used by the Fortune 500, research labs, and the DoD

Every laptop, workstation, and server that we ship comes pre-installed with Lambda Stack. It's loved by thousands of Lambda customers.

Lambda Stack includes a system-wide package, a Dockerfile, and a Docker image

Lambda Stack is not only a system-wide installation of all of your favorite frameworks and drivers, but also a convenient "everything included" deep learning Docker image. Now you'll have your team up and running with GPU-accelerated Docker images in minutes instead of weeks. To learn more about how to set up Lambda Stack GPU Dockerfiles, check out our tutorial:

https://lambdalabs.com/blog/set-up-a-tensorflow-gpu-docker-container-using-lambda-stack-dockerfile/

Works with Ubuntu 22.04 and 24.04
Docker images of Lambda Stack + Ubuntu: Lambda Stack Dockerfiles
Included Deep Learning frameworks: TensorFlow, Keras, PyTorch®, Triton
Included GPU software: CUDA, cuDNN, NVIDIA drivers
Included dev tools: Git, Vim, Emacs, Valgrind, tmux, screen, htop, build-essential

Create an Ubuntu 22.04 Docker image with PyTorch® & TensorFlow support

# Build a Docker image for Ubuntu 22.04 (jammy). Substitute jammy with focal or noble to change the Ubuntu version.
sudo docker build -t lambda-stack:22.04 -f Dockerfile.jammy https://github.com/lambdal/lambda-stack-dockerfiles.git
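Once the image is built, a quick smoke test (assuming the NVIDIA Container Toolkit is installed as described above) is to run it with GPU access:

# Run the freshly built image and verify that the PyTorch® inside it can reach the GPU
sudo docker run --gpus all --rm lambda-stack:22.04 python3 -c "import torch; print(torch.cuda.is_available())"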

Using Lambda Stack with Python virtual environments

We're often asked how best to use Lambda Stack with a Python virtual environment. You have two choices: use Lambda Stack only to install CUDA, cuDNN, and the NVIDIA drivers, or let it manage TensorFlow and PyTorch® as well. Here's how to create a virtual environment that reuses Lambda Stack's system-wide TensorFlow and PyTorch®:

python3 -m venv lambda-stack-with-tensorflow-pytorch --system-site-packages
source lambda-stack-with-tensorflow-pytorch/bin/activate
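Because of --system-site-packages, the environment reuses Lambda Stack's system-wide frameworks, so an import check inside the activated environment should report the stack's versions:

# These imports resolve to the system-wide Lambda Stack packages
python3 -c "import torch, tensorflow as tf; print(torch.__version__, tf.__version__)"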


And here's how to create a virtual environment that manages its own TensorFlow version instead of using the system-wide one:

python3 -m venv lambda-stack-without-tensorflow
source lambda-stack-without-tensorflow/bin/activate
# Note: we need to download and install libcudnn8 separately.
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/libcudnn8_8.1.1.33-1+cuda11.2_amd64.deb
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/libcudnn8-dev_8.1.1.33-1+cuda11.2_amd64.deb
sudo dpkg -i libcudnn8_8.1.1.33-1+cuda11.2_amd64.deb
sudo dpkg -i libcudnn8-dev_8.1.1.33-1+cuda11.2_amd64.deb
sudo apt-get install -f  # resolve any dependency errors reported by dpkg
pip install tensorflow-gpu
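After the install finishes, you can confirm that the environment-local TensorFlow (rather than the system-wide copy) is the one in use and that it can see the GPU:

# Verify the venv's own TensorFlow is active and GPU-enabled
python -c "import tensorflow as tf; print(tf.__file__); print(tf.config.list_physical_devices('GPU'))"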

Install Lambda Stack on Ubuntu 22.04 and Ubuntu 24.04 servers

This headless installation works on servers running Ubuntu 22.04 or Ubuntu 24.04 without a GUI (i.e., the server editions).

wget -nv -O- https://lambdalabs.com/install-lambda-stack.sh | I_AGREE_TO_THE_CUDNN_LICENSE=1 sh -

Use Lambda Stack in a shell script, Dockerfile, Ansible file, etc.

If you want to integrate the Lambda Stack installation into a script, you'll likely want to avoid all user input prompts. To use Lambda Stack this way, you must have read and agreed to the cuDNN license.

wget -nv -O- https://lambdalabs.com/install-lambda-stack.sh | I_AGREE_TO_THE_CUDNN_LICENSE=1 sh -
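For example, a minimal provisioning script might look like the sketch below. The filename and structure are illustrative only; it assumes a fresh Ubuntu host, root privileges (e.g. via cloud-init), and that you have read and accepted the cuDNN license.

#!/bin/sh
# provision-lambda-stack.sh -- illustrative sketch of an unattended Lambda Stack install
set -e
wget -nv -O- https://lambdalabs.com/install-lambda-stack.sh | I_AGREE_TO_THE_CUDNN_LICENSE=1 sh -
# Reboot so the new NVIDIA driver is loaded
reboot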

How to update / upgrade to the latest Lambda Stack

Do this if a new version of PyTorch®, TensorFlow (or any other framework) is released and you want to upgrade.

sudo apt-get update && sudo apt-get dist-upgrade

This will upgrade all packages, including dependencies such as CUDA, cuDNN, and NVIDIA drivers.
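To see which versions you ended up with after the upgrade, a quick check is:

# Print the framework versions now installed system-wide, and the driver/CUDA versions in use
python3 -c "import torch, tensorflow as tf; print('PyTorch', torch.__version__, '| TensorFlow', tf.__version__)"
nvidia-smi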