General Purpose Instances

Current Generation

M4 instances are based on custom Intel Broadwell or Haswell processors. M4 instances deliver fixed performance, providing a set of resources for a high level of consistent processing performance on a low-cost platform. Instances in this family are ideal for applications that require balanced CPU and memory performance. Examples of applications that benefit from General Purpose instances include encoding, high-traffic content management systems, and other enterprise applications.

  • m4.large: 8 GiB of memory, 2 vCPUs, EBS-only, 64-bit platform *
  • m4.xlarge: 16 GiB of memory, 4 vCPUs, EBS-only, 64-bit platform *
  • m4.2xlarge: 32 GiB of memory, 8 vCPUs, EBS-only, 64-bit platform *
  • m4.4xlarge: 64 GiB of memory, 16 vCPUs, EBS-only, 64-bit platform *
  • m4.10xlarge: 160 GiB of memory, 40 vCPUs, EBS-only, 64-bit platform **
  • m4.16xlarge: 256 GiB of memory, 64 vCPUs, EBS-only, 64-bit platform ***
 
* These instances may launch on either an Intel Xeon E5-2686 v4 (Broadwell) or an Intel Xeon E5-2676 v3 (Haswell) processor
** This instance will launch on an Intel Xeon E5-2676 v3 (Haswell) processor
*** This instance will launch on an Intel Xeon E5-2686 v4 (Broadwell) processor
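
A quick way to tell which of the two processors a given M4 instance actually received is to read the CPU model string from inside the instance. The sketch below is illustrative only and assumes a Linux guest with the usual /proc/cpuinfo layout.

  # Illustrative sketch: print the CPU model on a running Linux instance, which shows
  # whether an M4 landed on the Broadwell (E5-2686 v4) or Haswell (E5-2676 v3) part.
  def cpu_model(path="/proc/cpuinfo"):
      with open(path) as f:
          for line in f:
              if line.startswith("model name"):
                  return line.split(":", 1)[1].strip()
      return None

  print(cpu_model())  # e.g. "Intel(R) Xeon(R) CPU E5-2686 v4 @ 2.30GHz"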

Previous Generations

  • m3.medium: 3.75 GiB of memory, 1 vCPU, 4 GB of SSD-based local instance storage, 64-bit platform
  • m3.large: 7.5 GiB of memory, 2 vCPUs, 32 GB of SSD-based local instance storage, 64-bit platform
  • m3.xlarge: 15 GiB of memory, 4 vCPUs, 80 GB of SSD-based local instance storage, 64-bit platform
  • m3.2xlarge: 30 GiB of memory, 8 vCPUs, 160 GB of SSD-based local instance storage, 64-bit platform
  • m1.small: 1.7 GiB of memory, 1 EC2 Compute Unit (1 virtual core with 1 EC2 Compute Unit), 160 GB of local instance storage, 32-bit or 64-bit platform

Compute Optimized Instances

Instances of this family have proportionally more CPU resources than memory (RAM) and are well suited for compute-intensive applications.

C5 instances deliver the best price/compute performance in the EC2 product family, offering up to a 49% improvement in price/performance compared to C4 instances. C5 instances are ideal for running compute-heavy workloads such as batch processing, distributed analytics, high-performance computing (HPC), machine/deep learning inference, ad serving, highly scalable multiplayer gaming, and video encoding. Amazon EC2 C5d instances are C5 instances equipped with local NVMe-based SSD block-level storage physically connected to the host server.

  • c5.large: 4 GiB of memory, 2 vCPUs, 64-bit platform
  • c5.xlarge: 8 GiB of memory, 4 vCPUs, 64-bit platform
  • c5.2xlarge: 16 GiB of memory, 8 vCPUs, 64-bit platform
  • c5.4xlarge: 32 GiB of memory, 16 vCPUs, 64-bit platform
  • c5.9xlarge: 72 GiB of memory, 36 vCPUs, 64-bit platform
  • c5.18xlarge: 144 GiB of memory, 72 vCPUs, 64-bit platform
  • c5d.large: 4 GiB of memory, 2 vCPUs, 64-bit platform
  • c5d.xlarge: 8 GiB of memory, 4 vCPUs, 64-bit platform
  • c5d.2xlarge: 16 GiB of memory, 8 vCPUs, 64-bit platform
  • c5d.4xlarge: 32 GiB of memory, 16 vCPUs, 64-bit platform
  • c5d.9xlarge: 72 GiB of memory, 36 vCPUs, 64-bit platform
  • c5d.18xlarge: 144 GiB of memory, 72 vCPUs, 64-bit platform
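
As a concrete example of using one of these sizes, the following is a minimal launch sketch with boto3 (assumed installed and configured with AWS credentials); the region and AMI ID are placeholders, not real values.

  import boto3

  ec2 = boto3.client("ec2", region_name="us-east-1")  # example region

  resp = ec2.run_instances(
      ImageId="ami-xxxxxxxxxxxxxxxxx",  # placeholder: substitute an AMI valid in your region
      InstanceType="c5.large",          # any size from the list above works the same way
      MinCount=1,
      MaxCount=1,
  )
  print(resp["Instances"][0]["InstanceId"])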

Previous Generations

  • c4.large: 3.75 GiB of memory, 2 vCPUs, 64-bit platform
  • c4.xlarge: 7.5 GiB of memory, 4 vCPUs, 64-bit platform
  • c4.2xlarge: 15 GiB of memory, 8 vCPUs, 64-bit platform
  • c4.4xlarge: 30 GiB of memory, 16 vCPUs, 64-bit platform
  • c4.8xlarge: 60 GiB of memory, 36 vCPUs, 64-bit platform
  • c3.large: 3.75 GiB of memory, 2 vCPUs, 32 GB of SSD-based local instance storage, 64-bit platform
  • c3.xlarge: 7.5 GiB of memory, 4 vCPUs, 80 GB of SSD-based local instance storage, 64-bit platform
  • c3.2xlarge: 15 GiB of memory, 8 vCPUs, 160 GB of SSD-based local instance storage, 64-bit platform
  • c3.4xlarge: 30 GiB of memory, 16 vCPUs, 320 GB of SSD-based local instance storage, 64-bit platform
  • c3.8xlarge: 60 GiB of memory, 32 vCPUs, 640 GB of SSD-based local instance storage, 64-bit platform

Memory Optimized Instances

X1 instances are optimized for large-scale, enterprise-class, in-memory applications and have the lowest price per GiB of RAM among Amazon EC2 instance types.

  • x1.16xlarge: 976 GiB of memory, 64 vCPUs, 1 x 1,920 GB of SSD-based instance storage, 64-bit platform, 10 Gigabit Ethernet
  • x1.32xlarge: 1,952 GiB of memory, 128 vCPUs, 2 x 1,920 GB of SSD-based instance storage, 64-bit platform, 20 Gigabit Ethernet

R5 instances deliver 5% more memory per vCPU and up to a 50% improvement in price per GiB compared to R4 instances. R5 instances are ideally suited for applications such as high-performance databases, distributed in-memory caches, in-memory databases, and big data analytics. Amazon EC2 R5d instances are R5 instances equipped with local NVMe-based SSD block-level storage physically connected to the host server.

  • r5.large: 16 GiB of memory, 2 vCPUs, 64-bit platform
  • r5.xlarge: 32 GiB of memory, 4 vCPUs, 64-bit platform
  • r5.2xlarge: 64 GiB of memory, 8 vCPUs, 64-bit platform
  • r5.4xlarge: 128 GiB of memory, 16 vCPUs, 64-bit platform
  • r5.12xlarge: 384 GiB of memory, 48 vCPUs, 64-bit platform
  • r5.24xlarge: 768 GiB of memory, 96 vCPUs, 64-bit platform
  • r5d.large: 16 GiB of memory, 2 vCPUs, 64-bit platform
  • r5d.xlarge: 32 GiB of memory, 4 vCPUs, 64-bit platform
  • r5d.2xlarge: 64 GiB of memory, 8 vCPUs, 64-bit platform
  • r5d.4xlarge: 128 GiB of memory, 16 vCPUs, 64-bit platform
  • r5d.12xlarge: 384 GiB of memory, 48 vCPUs, 64-bit platform
  • r5d.24xlarge: 768 GiB of memory, 96 vCPUs, 64-bit platform
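
Note that memory scales uniformly across R5 sizes at 8 GiB per vCPU, so choosing a size is largely a memory question. A purely illustrative helper, using only the figures from the list above (no AWS API calls):

  # Pick the smallest R5 size that satisfies a memory requirement.
  R5_MEMORY_GIB = {
      "r5.large": 16, "r5.xlarge": 32, "r5.2xlarge": 64,
      "r5.4xlarge": 128, "r5.12xlarge": 384, "r5.24xlarge": 768,
  }

  def smallest_r5_for(required_gib):
      for name, mem in sorted(R5_MEMORY_GIB.items(), key=lambda kv: kv[1]):
          if mem >= required_gib:
              return name
      return None  # nothing in the R5 family is large enough

  print(smallest_r5_for(100))  # -> r5.4xlarge (128 GiB)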

Previous Generations

  • r4.large: 15.25 GiB of memory, 2 vCPUs, 64-bit platform
  • r4.xlarge: 30.5 GiB of memory, 4 vCPUs, 64-bit platform
  • r4.2xlarge: 61 GiB of memory, 8 vCPUs, 64-bit platform
  • r4.4xlarge: 122 GiB of memory, 16 vCPUs, 64-bit platform
  • r4.8xlarge: 244 GiB of memory, 32 vCPUs, 64-bit platform
  • r4.16xlarge: 488 GiB of memory, 64 vCPUs, 64-bit platform
  • r3.large: 15.25 GiB of memory, 2 vCPUs, 1 x 32 GB of SSD-based instance storage, 64-bit platform
  • r3.xlarge: 30.5 GiB of memory, 4 vCPUs, 1 x 80 GB of SSD-based instance storage, 64-bit platform
  • r3.2xlarge: 61 GiB of memory, 8 vCPUs, 1 x 160 GB of SSD-based instance storage, 64-bit platform
  • r3.4xlarge: 122 GiB of memory, 16 vCPUs, 1 x 320 GB of SSD-based instance storage, 64-bit platform
  • r3.8xlarge: 244 GiB of memory, 32 vCPUs, 2 x 320 GB of SSD-based instance storage, 64-bit platform, 10 Gigabit Ethernet

Storage Optimized Instances

Instances of this family provide very high disk I/O performance or proportionally higher storage density per instance, and are ideally suited for applications that benefit from high sequential I/O performance across very large data sets. Storage-optimized instances also provide high levels of CPU, memory and network performance.

  • i3.large: 15.25 GiB of memory, 2 vCPUs, 1 x 0.475 TB NVMe SSD, 64-bit platform
  • i3.xlarge: 30.5 GiB of memory, 4 vCPUs, 1 x 0.95 TB NVMe SSD, 64-bit platform
  • i3.2xlarge: 61 GiB of memory, 8 vCPUs, 1 x 1.9 TB NVMe SSD, 64-bit platform
  • i3.4xlarge: 122 GiB of memory, 16 vCPUs, 2 x 1.9 TB NVMe SSD, 64-bit platform
  • i3.8xlarge: 244 GiB of memory, 32 vCPUs, 4 x 1.9 TB NVMe SSD, 64-bit platform
  • i3.16xlarge: 488 GiB of memory, 64 vCPUs, 8 x 1.9 TB NVMe SSD, 64-bit platform

Previous Generations

  • i2.xlarge: 30.5 GiB of memory, 4 vCPUs, 800 GB of SSD-based instance storage, 64-bit platform
  • i2.2xlarge: 61 GiB of memory, 8 vCPUs, 2 x 800 GB of SSD-based instance storage, 64-bit platform
  • i2.4xlarge: 122 GiB of memory, 16 vCPUs, 4 x 800 GB of SSD-based instance storage, 64-bit platform
  • i2.8xlarge: 244 GiB of memory, 32 vCPUs, 8 x 800 GB of SSD-based instance storage, 64-bit platform, 10 Gigabit Ethernet

Instances of this family provide low cost storage and very high disk throughput and are ideally suited for applications that benefit from high sequential I/O performance across very large datasets on local storage.

  • d2.xlarge: 30.5 GiB of memory, 4 vCPUs, 3 x 2000 GB of HDD-based instance storage, 64-bit platform
  • d2.2xlarge: 61 GiB of memory, 8 vCPUs, 6 x 2000 GB of HDD-based instance storage, 64-bit platform
  • d2.4xlarge: 122 GiB of memory, 16 vCPUs, 12 x 2000 GB of HDD-based instance storage, 64-bit platform
  • d2.8xlarge: 244 GiB of memory, 36 vCPUs, 24 x 2000 GB of HDD-based instance storage, 64-bit platform, 10 Gigabit Ethernet
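
For capacity planning, the per-disk figures above multiply out to the total local storage per instance; for example, d2.8xlarge provides 24 x 2000 GB = 48 TB of local HDD storage. A small illustrative calculation:

  # Total local HDD storage per D2 size, from the per-disk figures listed above.
  D2_DISKS = {               # size: (disk count, GB per disk)
      "d2.xlarge": (3, 2000),
      "d2.2xlarge": (6, 2000),
      "d2.4xlarge": (12, 2000),
      "d2.8xlarge": (24, 2000),
  }
  for name, (count, gb) in D2_DISKS.items():
      print(f"{name}: {count * gb} GB total ({count * gb / 1000:.0f} TB)")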

Accelerated Computing Instances

Instances of this family provide access to workload accelerators such as GPU. They are ideal for applications such as machine learning, computational fluid dynamics, computational finance, seismic analysis, molecular modeling, genomics, and other high-performance computing workloads.

  • p3.2xlarge: 1 GPU, 8 vCPUs, 61 GiB of memory, up to 10 Gbps network performance
  • p3.8xlarge: 4 GPUs, 32 vCPUs, 244 GiB of memory, 10 Gbps network performance
  • p3.16xlarge: 8 GPUs, 64 vCPUs, 488 GiB of memory, 25 Gbps network performance
  • p3dn.24xlarge: 8 GPUs, 96 vCPUs, 768 GiB of memory, 100 Gbps network performance

Backed by NVIDIA Tesla M60 GPUs, G3 instances are ideal for graphics workloads such as 3D rendering, 3D visualizations, graphics-intensive remote workstations, video encoding, and virtual reality applications.

  • g3.4xlarge: 1 GPU, 16 vCPUs, 122 GiB of memory, up to 10 Gbps network performance
  • g3.8xlarge: 2 GPUs, 32 vCPUs, 244 GiB of memory, 10 Gbps network performance
  • g3.16xlarge: 4 GPUs, 64 vCPUs, 488 GiB of memory, 20 Gbps network performance
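
To confirm GPU count and model programmatically when choosing among these sizes, the EC2 DescribeInstanceTypes API reports GPU details. A hedged boto3 sketch, assuming credentials and a region where these instance types are offered:

  import boto3

  ec2 = boto3.client("ec2", region_name="us-east-1")  # example region
  resp = ec2.describe_instance_types(
      InstanceTypes=["p3.2xlarge", "p3.16xlarge", "g3.4xlarge", "g3.16xlarge"]
  )
  for it in resp["InstanceTypes"]:
      gpu = it["GpuInfo"]["Gpus"][0]  # GPU model, vendor, and per-instance count
      print(it["InstanceType"], "-", gpu["Count"], "x", gpu["Manufacturer"], gpu["Name"])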

Previous Generations

  • p2.xlarge: 1 GPU, 4 vCPUs, 61 GiB of memory, high network performance
  • p2.8xlarge: 8 GPUs, 32 vCPUs, 488 GiB of memory, 10 Gbps network performance
  • p2.16xlarge: 16 GPUs, 64 vCPUs, 732 GiB of memory, 20 Gbps network performance

Micro Instances

Micro instances (t1.micro) provide a small amount of consistent CPU resources and allow you to increase CPU capacity in short bursts when additional cycles are available. They are well suited for lower-throughput applications and websites that require additional compute cycles periodically. You can learn more about Micro instances and appropriate applications in the Amazon EC2 documentation.

  • t1.micro: (Default) 613 MiB of memory, up to 2 ECUs (for short periodic bursts), EBS storage only, 32-bit or 64-bit platform

EC2 Compute Unit (ECU) – One ECU provides the equivalent CPU capacity of a 1.0-1.2 GHz 2007 Opteron or 2007 Xeon processor.