The Lambda Deep Learning Blog
Lambda Cloud Clusters now available with NVIDIA GH200 Grace Hopper Superchip
November 13, 2023
DeepChat 3-Step Training At Scale: Lambda’s Instances of NVIDIA H100 SXM5 vs A100 SXM4
October 12, 2023
Lambda Launches New Hyperplane Server with NVIDIA H100 GPUs and AMD EPYC 9004 series CPUs
September 07, 2023
On-demand HGX H100 systems with 8x NVIDIA H100 SXM instances are now available on Lambda Cloud for only $2.59/hr/GPU.
Lambda Demos streamlines the process of hosting your own machine learning demos. Host a Gradio app using your existing repository URL in just a few clicks.
Tired of waiting in a queue to try out Stable Diffusion or another ML app? Lambda GPU Cloud’s Demos feature makes it easy to host your own ML apps.
Lambda Cloud has deployed a fleet of NVIDIA H100 Tensor Core GPUs, making it one of the first providers to offer general-availability, on-demand H100 GPUs. The high-performance GPUs enable faster training times, better model accuracy, and increased productivity.
In early April, NVIDIA H100 Tensor Core GPUs, the fastest GPUs on the market, will be added to Lambda Cloud. NVIDIA H100 80GB PCIe Gen5 instances will go live first, with SXM instances following shortly after.
Learn how to use mpirun to launch a LLaMA inference job across multiple cloud instances if you do not have a multi-GPU workstation or server.
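The multi-node launch the post describes can be sketched roughly as follows. This is a hypothetical example, not the post's exact commands: the hostnames, slot counts, and the `run_llama_inference.py` script name are placeholders for illustration.

```shell
# Hypothetical two-node layout: one GPU slot per cloud instance.
# node-001 and node-002 are placeholder hostnames for your instances.
cat > hostfile <<'EOF'
node-001 slots=1
node-002 slots=1
EOF

# Launch one inference rank per node; run_llama_inference.py stands in
# for whatever MPI-aware inference script the job uses.
mpirun --hostfile hostfile -np 2 \
    python run_llama_inference.py
```

The hostfile maps ranks to instances, so a single `mpirun` on one node can start the job across all of them.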
Lambda's GPU cloud has a new team feature that lets you invite teammates to join your account for easy collaboration.
Lambda and Hugging Face are collaborating on a 2-week sprint to fine-tune OpenAI's Whisper model in as many languages as possible.
This blog walks through how to fine-tune Stable Diffusion to create a text-to-Naruto character model, emphasizing the importance of prompt engineering.
In this blog post, we go over the most recent updates made to Lambda's on-demand GPU cloud in September 2022.
How to fine-tune Stable Diffusion on a Pokémon dataset to create a text-to-Pokémon image model. Use this guide to train your own Stable Diffusion models.
How Lambda Cloud can save a machine learning engineer time and money when training state-of-the-art YOLOv5 object detection models.