Amazon Deep Learning AMIs

Quickly build deep learning applications

The Amazon Deep Learning AMIs provide machine learning practitioners and researchers with the infrastructure and tools to accelerate deep learning in the cloud, at any scale. You can quickly launch Amazon EC2 instances pre-installed with popular deep learning frameworks such as Apache MXNet and Gluon, TensorFlow, Microsoft Cognitive Toolkit, Caffe, Caffe2, Theano, Torch, PyTorch, and Keras to train sophisticated, custom AI models, experiment with new algorithms, or learn new skills and techniques.
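
For example, once you have chosen an AMI, a minimal sketch along these lines launches an instance from it with the Amazon SDK for Python (boto3); the region, AMI ID, key pair name, and instance type below are placeholders you would replace with your own values.

    # Minimal sketch: launch an EC2 instance from a Deep Learning AMI using
    # the Amazon SDK for Python (boto3). All identifiers are placeholders.
    import boto3

    ec2 = boto3.client("ec2", region_name="cn-north-1")  # example: Beijing Region

    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder Deep Learning AMI ID
        InstanceType="p3.2xlarge",        # GPU instance type; P2 and C5 also work
        KeyName="my-key-pair",            # placeholder key pair for SSH access
        MinCount=1,
        MaxCount=1,
    )

    instance_id = response["Instances"][0]["InstanceId"]
    print("Launched instance:", instance_id)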

Whether you need Amazon EC2 GPU or CPU instances, there is no additional charge for the Deep Learning AMIs – you only pay for the Amazon Web Services resources needed to store and run your applications.

Choosing an Amazon Deep Learning AMI

Even for experienced machine learning practitioners, getting started with deep learning can be time consuming and cumbersome. We offer three types of AMIs to support the varying needs of developers. To help you get started, see the AMI selection guide and our other deep learning resources.

Conda AMI

For developers who want pip packages of deep learning frameworks pre-installed in separate virtual environments, the Deep Learning Conda-based AMI is available in Ubuntu and Amazon Linux versions.
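
As a rough illustration, the sketch below, run after activating one of the AMI's framework environments in the shell, reports which Conda environment is active and which framework build it provides; the framework imported here (MXNet) is just one example.

    # Minimal sketch: confirm which Conda environment is active and which
    # framework version it provides. Assumes a framework environment has
    # already been activated in the shell before starting Python.
    import os

    print("Active Conda environment:", os.environ.get("CONDA_DEFAULT_ENV", "none"))

    try:
        import mxnet  # replace with the framework your chosen environment provides
        print("MXNet version:", mxnet.__version__)
    except ImportError:
        print("MXNet is not installed in this environment.")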

Base AMI

For developers who want a clean slate to set up private deep learning engine repositories or custom builds of deep learning engines, the Deep Learning Base AMI is available in Ubuntu and Amazon Linux versions.

AMI with Source Code

For developers who want pre-installed deep learning frameworks and their source code in a shared Python environment, the Deep Learning AMI with Source Code is available in CUDA 9 versions for P3 instances and CUDA 8 versions for P2 instances, each for Ubuntu and Amazon Linux.

Support for deep learning frameworks

The Amazon Deep Learning AMIs support all the popular deep learning frameworks, allowing you to define models and then train them at scale. Built for Amazon Linux and Ubuntu, the AMIs come pre-configured with Apache MXNet and Gluon, TensorFlow, Microsoft Cognitive Toolkit, Caffe, Caffe2, Theano, Torch, PyTorch, and Keras, enabling you to quickly deploy and run any of these frameworks at scale.
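
To make this concrete, here is a minimal, illustrative sketch that defines and trains a one-layer model with MXNet's Gluon API; it assumes only that an environment with MXNet is active, and the data and hyperparameters are arbitrary.

    # Minimal sketch: define and train a tiny model with MXNet's Gluon API,
    # one of the frameworks pre-installed on the AMIs. The data is random
    # and purely illustrative.
    from mxnet import nd, autograd, gluon

    net = gluon.nn.Dense(1)  # single linear layer
    net.initialize()

    trainer = gluon.Trainer(net.collect_params(), "sgd", {"learning_rate": 0.1})
    loss_fn = gluon.loss.L2Loss()

    X = nd.random.normal(shape=(64, 10))  # random inputs
    y = nd.random.normal(shape=(64, 1))   # random targets

    for epoch in range(5):
        with autograd.record():
            loss = loss_fn(net(X), y)
        loss.backward()
        trainer.step(batch_size=X.shape[0])
        print("epoch", epoch, "loss", loss.mean().asscalar())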

Accelerate your model training

To expedite development and model training, the Amazon Deep Learning AMIs include the latest NVIDIA GPU acceleration through pre-configured CUDA and cuDNN drivers, along with the Intel Math Kernel Library (MKL), popular Python packages, and the Anaconda Platform.
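
As a quick sanity check of that GPU stack, a sketch like the following reports whether a framework can see the CUDA devices; PyTorch is used here only as an example, and the same idea applies to the other pre-installed frameworks.

    # Minimal sketch: check whether the pre-configured CUDA stack is visible
    # from a framework. PyTorch is used here as an example.
    import torch

    if torch.cuda.is_available():
        print("CUDA devices:", torch.cuda.device_count())
        print("Device 0:", torch.cuda.get_device_name(0))
    else:
        print("No GPU visible; running on CPU (for example, on a C5 instance).")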

  • The Amazon Deep Learning AMIs run on Amazon EC2 P2 instances, as well as P3 instances that take advantage of NVIDIA's Volta architecture. The AMIs are pre-installed with NVIDIA CUDA and cuDNN drivers to substantially reduce the time needed to complete your computations.
  • The Amazon Deep Learning AMIs run on Amazon EC2 Intel-based C5 instances designed for inference.
  • The AMIs come with Jupyter notebooks installed, loaded with Python 2.7 and Python 3.5 kernels, along with popular Python packages, including the Amazon SDK for Python (see the sketch after this list).
  • To simplify package management and deployment, the Amazon Deep Learning AMIs install the Anaconda2 and Anaconda3 Data Science Platform, for large-scale data processing, predictive analytics, and scientific computing.
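
As a small example of the pre-installed SDK in use, the sketch below saves a trained model artifact to Amazon S3 from a notebook; the file, bucket, and key names are placeholders, not defaults provided by the AMI.

    # Minimal sketch: use the pre-installed Amazon SDK for Python (boto3)
    # to upload a trained model artifact to Amazon S3.
    # The file, bucket, and key names are placeholders.
    import boto3

    s3 = boto3.client("s3")

    s3.upload_file(
        Filename="model.params",              # a locally saved model file
        Bucket="my-training-artifacts",       # placeholder bucket name
        Key="experiments/run-01/model.params",
    )
    print("Model uploaded to S3.")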

Documentation

We have three types of Amazon Deep Learning AMIs available to support the various needs of machine learning practitioners. Visit our AMI selection guide, simple tutorials, and more deep learning resources to get started today.
