Specifications for Amazon EC2 accelerated computing instances

Accelerated computing instances use hardware accelerators, or co-processors, to perform functions such as floating-point calculations, graphics processing, and data pattern matching more efficiently than is possible with software running on general-purpose CPUs.

For information on previous generation instance types of this category, such as G3 instances, see Specifications for Amazon EC2 previous generation instances.

Pricing

For pricing information, see Amazon EC2 On-Demand Pricing.

Instance families and instance types

Instance family Available instance types
DL1 dl1.24xlarge
DL2q dl2q.24xlarge
F1 f1.2xlarge | f1.4xlarge | f1.16xlarge
F2 f2.12xlarge | f2.48xlarge
G4ad g4ad.xlarge | g4ad.2xlarge | g4ad.4xlarge | g4ad.8xlarge | g4ad.16xlarge
G4dn g4dn.xlarge | g4dn.2xlarge | g4dn.4xlarge | g4dn.8xlarge | g4dn.12xlarge | g4dn.16xlarge | g4dn.metal
G5 g5.xlarge | g5.2xlarge | g5.4xlarge | g5.8xlarge | g5.12xlarge | g5.16xlarge | g5.24xlarge | g5.48xlarge
G5g g5g.xlarge | g5g.2xlarge | g5g.4xlarge | g5g.8xlarge | g5g.16xlarge | g5g.metal
G6 g6.xlarge | g6.2xlarge | g6.4xlarge | g6.8xlarge | g6.12xlarge | g6.16xlarge | g6.24xlarge | g6.48xlarge
G6e g6e.xlarge | g6e.2xlarge | g6e.4xlarge | g6e.8xlarge | g6e.12xlarge | g6e.16xlarge | g6e.24xlarge | g6e.48xlarge
Gr6 gr6.4xlarge | gr6.8xlarge
Inf1 inf1.xlarge | inf1.2xlarge | inf1.6xlarge | inf1.24xlarge
Inf2 inf2.xlarge | inf2.8xlarge | inf2.24xlarge | inf2.48xlarge
P2 p2.xlarge | p2.8xlarge | p2.16xlarge
P3 p3.2xlarge | p3.8xlarge | p3.16xlarge
P3dn p3dn.24xlarge
P4d p4d.24xlarge
P4de p4de.24xlarge
P5 p5.48xlarge
P5e p5e.48xlarge
P5en p5en.48xlarge
Trn1 trn1.2xlarge | trn1.32xlarge
Trn1n trn1n.32xlarge
Trn2 trn2.48xlarge
Trn2u trn2u.48xlarge
VT1 vt1.3xlarge | vt1.6xlarge | vt1.24xlarge
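
Availability of these instance types varies by Region. The following is a minimal sketch, not an official AWS sample, that lists which accelerated-computing instance types are offered in your default Region using the EC2 DescribeInstanceTypeOfferings API via boto3. It assumes boto3 is installed and credentials plus a default Region are configured; the family prefix list and client-side filtering are illustrative assumptions, not something the API provides.

```python
import boto3

# Families covered on this page; this prefix list is an assumption made for the
# example, not data returned by the API.
ACCELERATED_PREFIXES = ("dl", "f1", "f2", "g4", "g5", "g6", "gr6",
                        "inf", "p2", "p3", "p4", "p5", "trn", "vt1")

ec2 = boto3.client("ec2")
offered = set()

# DescribeInstanceTypeOfferings is paginated; LocationType="region" returns
# every instance type offered in the client's configured Region.
paginator = ec2.get_paginator("describe_instance_type_offerings")
for page in paginator.paginate(LocationType="region"):
    for offering in page["InstanceTypeOfferings"]:
        name = offering["InstanceType"]
        if name.startswith(ACCELERATED_PREFIXES):
            offered.add(name)

for name in sorted(offered):
    print(name)
```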

Instance family summary

Instance family Hypervisor Processor type (architecture) Metal instances available Dedicated Hosts support Spot support Hibernation support Supported operating systems
DL1 Nitro v3 Intel (x86_64) Linux
DL2q Nitro v3 Intel (x86_64) Linux
F1 Xen Intel (x86_64) Linux
F2 Nitro v4 AMD (x86_64) Linux
G4ad Nitro v3 AMD (x86_64) Windows | Linux
G4dn Nitro v3 Intel (x86_64) Windows | Linux
G5 Nitro v3 AMD (x86_64) Windows | Linux
G5g Nitro v2 AWS Graviton (arm64) Linux
G6 Nitro v4 AMD (x86_64) Windows | Linux
G6e Nitro v4 AMD (x86_64) Windows | Linux
Gr6 Nitro v4 AMD (x86_64) Windows | Linux
Inf1 Nitro v3 Intel (x86_64) Linux
Inf2 Nitro v4 AMD (x86_64) Linux
P2 Xen Intel (x86_64) Windows | Linux
P3 Xen Intel (x86_64) Windows | Linux
P3dn Nitro v3 Intel (x86_64) Windows | Linux
P4d Nitro v3 Intel (x86_64) Linux
P4de Nitro v3 Intel (x86_64) Linux
P5 Nitro v4 AMD (x86_64) Linux
P5e Nitro v4 AMD (x86_64) Linux
P5en Nitro v5 Intel (x86_64) Linux
Trn1 Nitro v4 Intel (x86_64) Linux
Trn1n Nitro v4 Intel (x86_64) Linux
Trn2 Nitro v5 Intel (x86_64) Linux
Trn2u Nitro v5 Intel (x86_64) Linux
VT1 Nitro v3 Intel (x86_64) Linux
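
Most of the columns in this summary map directly to fields returned by the EC2 DescribeInstanceTypes API. The boto3 sketch below shows that mapping for a few arbitrary example types; note that the API reports the hypervisor only as nitro or xen and does not expose the Nitro generation (v3, v4, v5) shown above.

```python
import boto3

ec2 = boto3.client("ec2")
resp = ec2.describe_instance_types(
    InstanceTypes=["g5.xlarge", "p5.48xlarge", "trn1.32xlarge"]  # arbitrary examples
)

for it in resp["InstanceTypes"]:
    print(it["InstanceType"])
    print("  hypervisor:     ", it["Hypervisor"])                               # "nitro" or "xen"
    print("  architectures:  ", it["ProcessorInfo"]["SupportedArchitectures"])  # e.g. ["x86_64"]
    print("  bare metal:     ", it["BareMetal"])                                # True for *.metal sizes
    print("  dedicated hosts:", it["DedicatedHostsSupported"])
    print("  spot supported: ", "spot" in it["SupportedUsageClasses"])
    print("  hibernation:    ", it["HibernationSupported"])
```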

Performance specifications

Instance type Burstable Memory (GiB) Processor vCPUs CPU cores Threads per core Accelerators Accelerator memory
DL1
dl1.24xlarge 768.00 Intel Xeon P-8275CL 96 48 2 8 x Habana Gaudi HL-205 accelerator 256 GiB (8 x 32 GiB)
DL2q
dl2q.24xlarge 768.00 Intel Xeon Cascade Lake 96 48 2 8 x Qualcomm AI100 inference accelerator 125 GiB (8 x 15 GiB)
F1
f1.2xlarge 122.00 Intel Xeon E5-2686 v4 8 4 2 1 x Xilinx Virtex UltraScale (VU9P) FPGA 64 GiB (1 x 64 GiB)
f1.4xlarge 244.00 Intel Xeon E5-2686 v4 16 8 2 2 x Xilinx Virtex UltraScale (VU9P) FPGA 128 GiB (2 x 64 GiB)
f1.16xlarge 976.00 Intel Xeon E5-2686 v4 64 32 2 8 x Xilinx Virtex UltraScale (VU9P) FPGA 512 GiB (8 x 64 GiB)
F2
f2.12xlarge 512.00 AMD EPYC 7R13 48 24 2 2 x Xilinx Virtex UltraScale+ (VU47P) FPGA 160 GiB (2 x 80 GiB)
f2.48xlarge 2048.00 AMD EPYC 7R13 192 96 2 8 x Xilinx Virtex UltraScale+ (VU47P) FPGA 640 GiB (8 x 80 GiB)
G4ad
g4ad.xlarge 16.00 2nd Gen AMD EPYC 7R32 4 2 2 1 x AMD Radeon Pro V520 GPU 8 GiB (1 x 8 GiB)
g4ad.2xlarge 32.00 2nd Gen AMD EPYC 7R32 8 4 2 1 x AMD Radeon Pro V520 GPU 8 GiB (1 x 8 GiB)
g4ad.4xlarge 64.00 2nd Gen AMD EPYC 7R32 16 8 2 1 x AMD Radeon Pro V520 GPU 8 GiB (1 x 8 GiB)
g4ad.8xlarge 128.00 2nd Gen AMD EPYC 7R32 32 16 2 2 x AMD Radeon Pro V520 GPU 16 GiB (2 x 8 GiB)
g4ad.16xlarge 256.00 2nd Gen AMD EPYC 7R32 64 32 2 4 x AMD Radeon Pro V520 GPU 32 GiB (4 x 8 GiB)
G4dn
g4dn.xlarge 16.00 Intel Xeon P-8259L 4 2 2 1 x NVIDIA T4 GPU 16 GiB (1 x 16 GiB)
g4dn.2xlarge 32.00 Intel Xeon P-8259L 8 4 2 1 x NVIDIA T4 GPU 16 GiB (1 x 16 GiB)
g4dn.4xlarge 64.00 Intel Xeon P-8259L 16 8 2 1 x NVIDIA T4 GPU 16 GiB (1 x 16 GiB)
g4dn.8xlarge 128.00 Intel Xeon P-8259L 32 16 2 1 x NVIDIA T4 GPU 16 GiB (1 x 16 GiB)
g4dn.12xlarge 192.00 Intel Xeon P-8259L 48 24 2 4 x NVIDIA T4 GPU 64 GiB (4 x 16 GiB)
g4dn.16xlarge 256.00 Intel Xeon P-8259L 64 32 2 1 x NVIDIA T4 GPU 16 GiB (1 x 16 GiB)
g4dn.metal 384.00 Intel Xeon P-8259L 96 48 2 8 x NVIDIA T4 GPU 128 GiB (8 x 16 GiB)
G5
g5.xlarge 16.00 2nd Gen AMD EPYC 7R32 4 2 2 1 x NVIDIA A10G GPU 24 GiB (1 x 24 GiB)
g5.2xlarge 32.00 2nd Gen AMD EPYC 7R32 8 4 2 1 x NVIDIA A10G GPU 24 GiB (1 x 24 GiB)
g5.4xlarge 64.00 2nd Gen AMD EPYC 7R32 16 8 2 1 x NVIDIA A10G GPU 24 GiB (1 x 24 GiB)
g5.8xlarge 128.00 2nd Gen AMD EPYC 7R32 32 16 2 1 x NVIDIA A10G GPU 24 GiB (1 x 24 GiB)
g5.12xlarge 192.00 2nd Gen AMD EPYC 7R32 48 24 2 4 x NVIDIA A10G GPU 96 GiB (4 x 24 GiB)
g5.16xlarge 256.00 2nd Gen AMD EPYC 7R32 64 32 2 1 x NVIDIA A10G GPU 24 GiB (1 x 24 GiB)
g5.24xlarge 384.00 2nd Gen AMD EPYC 7R32 96 48 2 4 x NVIDIA A10G GPU 96 GiB (4 x 24 GiB)
g5.48xlarge 768.00 2nd Gen AMD EPYC 7R32 192 96 2 8 x NVIDIA A10G GPU 192 GiB (8 x 24 GiB)
G5g
g5g.xlarge 8.00 AWS Graviton2 Processor 4 4 1 1 x NVIDIA T4g GPU 16 GiB (1 x 16 GiB)
g5g.2xlarge 16.00 AWS Graviton2 Processor 8 8 1 1 x NVIDIA T4g GPU 16 GiB (1 x 16 GiB)
g5g.4xlarge 32.00 AWS Graviton2 Processor 16 16 1 1 x NVIDIA T4g GPU 16 GiB (1 x 16 GiB)
g5g.8xlarge 64.00 AWS Graviton2 Processor 32 32 1 1 x NVIDIA T4g GPU 16 GiB (1 x 16 GiB)
g5g.16xlarge 128.00 AWS Graviton2 Processor 64 64 1 2 x NVIDIA T4g GPU 32 GiB (2 x 16 GiB)
g5g.metal 128.00 AWS Graviton2 Processor 64 64 1 2 x NVIDIA T4g GPU 32 GiB (2 x 16 GiB)
G6
g6.xlarge 16.00 AMD EPYC 7R13 4 2 2 1 x NVIDIA L4 GPU 22 GiB (1 x 22 GiB)
g6.2xlarge 32.00 AMD EPYC 7R13 8 4 2 1 x NVIDIA L4 GPU 22 GiB (1 x 22 GiB)
g6.4xlarge 64.00 AMD EPYC 7R13 16 8 2 1 x NVIDIA L4 GPU 22 GiB (1 x 22 GiB)
g6.8xlarge 128.00 AMD EPYC 7R13 32 16 2 1 x NVIDIA L4 GPU 22 GiB (1 x 22 GiB)
g6.12xlarge 192.00 AMD EPYC 7R13 48 24 2 4 x NVIDIA L4 GPU 89 GiB (4 x 22 GiB)
g6.16xlarge 256.00 AMD EPYC 7R13 64 32 2 1 x NVIDIA L4 GPU 22 GiB (1 x 22 GiB)
g6.24xlarge 384.00 AMD EPYC 7R13 96 48 2 4 x NVIDIA L4 GPU 89 GiB (4 x 22 GiB)
g6.48xlarge 768.00 AMD EPYC 7R13 192 96 2 8 x NVIDIA L4 GPU 178 GiB (8 x 22 GiB)
G6e
g6e.xlarge 32.00 AMD EPYC 7R13 4 2 2 1 x NVIDIA L40S GPU 44 GiB (1 x 44 GiB)
g6e.2xlarge 64.00 AMD EPYC 7R13 8 4 2 1 x NVIDIA L40S GPU 44 GiB (1 x 44 GiB)
g6e.4xlarge 128.00 AMD EPYC 7R13 16 8 2 1 x NVIDIA L40S GPU 44 GiB (1 x 44 GiB)
g6e.8xlarge 256.00 AMD EPYC 7R13 32 16 2 1 x NVIDIA L40S GPU 44 GiB (1 x 44 GiB)
g6e.12xlarge 384.00 AMD EPYC 7R13 48 24 2 4 x NVIDIA L40S GPU 178 GiB (4 x 44 GiB)
g6e.16xlarge 512.00 AMD EPYC 7R13 64 32 2 1 x NVIDIA L40S GPU 44 GiB (1 x 44 GiB)
g6e.24xlarge 768.00 AMD EPYC 7R13 96 48 2 4 x NVIDIA L40S GPU 178 GiB (4 x 44 GiB)
g6e.48xlarge 1536.00 AMD EPYC 7R13 192 96 2 8 x NVIDIA L40S GPU 357 GiB (8 x 44 GiB)
Gr6
gr6.4xlarge 128.00 AMD EPYC 7R13 16 8 2 1 x NVIDIA L4 GPU 22 GiB (1 x 22 GiB)
gr6.8xlarge 256.00 AMD EPYC 7R13 32 16 2 1 x NVIDIA L4 GPU 22 GiB (1 x 22 GiB)
Inf1
inf1.xlarge 8.00 Intel Xeon P-8259L 4 2 2 1 x AWS Inferentia inference accelerator 8 GiB (1 x 8 GiB)
inf1.2xlarge 16.00 Intel Xeon P-8259L 8 4 2 1 x AWS Inferentia inference accelerator 8 GiB (1 x 8 GiB)
inf1.6xlarge 48.00 Intel Xeon P-8259L 24 12 2 4 x AWS Inferentia inference accelerator 32 GiB (4 x 8 GiB)
inf1.24xlarge 192.00 Intel Xeon P-8259L 96 48 2 16 x AWS Inferentia inference accelerator 128 GiB (16 x 8 GiB)
Inf2
inf2.xlarge 16.00 AMD EPYC 7R13 4 2 2 1 x AWS Inferentia2 inference accelerator 32 GiB (1 x 32 GiB)
inf2.8xlarge 128.00 AMD EPYC 7R13 32 16 2 1 x AWS Inferentia2 inference accelerator 32 GiB (1 x 32 GiB)
inf2.24xlarge 384.00 AMD EPYC 7R13 96 48 2 6 x AWS Inferentia2 inference accelerator 192 GiB (6 x 32 GiB)
inf2.48xlarge 768.00 AMD EPYC 7R13 192 96 2 12 x AWS Inferentia2 inference accelerator 384 GiB (12 x 32 GiB)
P2
p2.xlarge 61.00 Intel Xeon E5-2686 v4 4 2 2 1 x NVIDIA K80 GPU 12 GiB (1 x 12 GiB)
p2.8xlarge 488.00 Intel Xeon E5-2686 v4 32 16 2 8 x NVIDIA K80 GPU 96 GiB (8 x 12 GiB)
p2.16xlarge 732.00 Intel Xeon E5-2686 v4 64 32 2 16 x NVIDIA K80 GPU 192 GiB (16 x 12 GiB)
P3
p3.2xlarge 61.00 Intel Xeon E5-2686 v4 8 4 2 1 x NVIDIA V100 GPU 16 GiB (1 x 16 GiB)
p3.8xlarge 244.00 Intel Xeon E5-2686 v4 32 16 2 4 x NVIDIA V100 GPU 64 GiB (4 x 16 GiB)
p3.16xlarge 488.00 Intel Xeon E5-2686 v4 64 32 2 8 x NVIDIA V100 GPU 128 GiB (8 x 16 GiB)
P3dn
p3dn.24xlarge 768.00 Intel Xeon Platinum 8175 96 48 2 8 x NVIDIA V100 GPU 256 GiB (8 x 32 GiB)
P4d
p4d.24xlarge 1152.00 Intel Xeon Platinum 8175 96 48 2 8 x NVIDIA A100 GPU 320 GiB (8 x 40 GiB)
P4de
p4de.24xlarge 1152.00 Intel Xeon Platinum 8175 96 48 2 8 x NVIDIA A100 GPU 640 GiB (8 x 80 GiB)
P5
p5.48xlarge 2048.00 AMD EPYC 7R13 192 96 2 8 x NVIDIA H100 GPU 640 GiB (8 x 80 GiB)
P5e
p5e.48xlarge 2048.00 AMD EPYC 7R13 192 96 2 8 x NVIDIA H200 GPU 1128 GiB (8 x 141 GiB)
P5en
p5en.48xlarge 2048.00 Intel Xeon Sapphire Rapids 192 96 2 8 x NVIDIA H200 GPU 1128 GiB (8 x 141 GiB)
Trn1
trn1.2xlarge 32.00 Intel Xeon Ice Lake 8375C 8 4 2 1 x AWS Trainium accelerator 32 GiB (1 x 32 GiB)
trn1.32xlarge 512.00 Intel Xeon Ice Lake 8375C 128 64 2 16 x AWS Trainium accelerators 512 GiB (16 x 32 GiB)
Trn1n
trn1n.32xlarge 512.00 Intel Xeon Ice Lake 128 64 2 16 x AWS Trainium accelerators 512 GiB (16 x 32 GiB)
Trn2
trn2.48xlarge 2048.00 Intel Xeon Sapphire Rapids 192 96 2 16 x AWS Trainium2 accelerators 8192 GiB (16 x 512 GiB)
Trn2u
trn2u.48xlarge 2048.00 Intel Xeon Sapphire Rapids 192 96 2 16 x AWS Trainium2 accelerators 8192 GiB (16 x 512 GiB)
VT1
vt1.3xlarge 24.00 Intel Cascade Lake P-8259CL 12 6 2 1 x Xilinx U30 media accelerator 24 GiB (1 x 24 GiB)
vt1.6xlarge 48.00 Intel Cascade Lake P-8259CL 24 12 2 2 x Xilinx U30 media accelerator 48 GiB (2 x 24 GiB)
vt1.24xlarge 192.00 Intel Cascade Lake P-8259CL 96 48 2 8 x Xilinx U30 media accelerator 192 GiB (8 x 24 GiB)
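
The vCPU, memory, and accelerator columns above can be retrieved programmatically from the EC2 DescribeInstanceTypes API. A minimal boto3 sketch follows (example instance types; assumes configured credentials):

```python
import boto3

ec2 = boto3.client("ec2")
resp = ec2.describe_instance_types(InstanceTypes=["g6.12xlarge", "p4d.24xlarge"])  # examples

for it in resp["InstanceTypes"]:
    vcpu = it["VCpuInfo"]
    mem_gib = it["MemoryInfo"]["SizeInMiB"] / 1024
    print(f'{it["InstanceType"]}: {vcpu["DefaultVCpus"]} vCPUs '
          f'({vcpu["DefaultCores"]} cores x {vcpu["DefaultThreadsPerCore"]} threads per core), '
          f'{mem_gib:.0f} GiB memory')
    # GpuInfo is present only for GPU-based types; FPGA, Inferentia, and Trainium
    # devices are reported under other structures (for example FpgaInfo or
    # InferenceAcceleratorInfo), so only GPUs are printed here.
    for gpu in it.get("GpuInfo", {}).get("Gpus", []):
        print(f'  {gpu["Count"]} x {gpu["Manufacturer"]} {gpu["Name"]}, '
              f'{gpu["MemoryInfo"]["SizeInMiB"] / 1024:.0f} GiB each')
```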

Network specifications

Instance type Baseline / Burst bandwidth (Gbps) EFA ENA ENA Express Network cards Max. network interfaces IP addresses per interface IPv6
DL1
dl1.24xlarge 4x 100 Gigabit 4 60 50
DL2q
dl2q.24xlarge 100 Gigabit 1 15 50
F1
f1.2xlarge 1 Up to 10 Gigabit 1 4 15
f1.4xlarge 1 Up to 10 Gigabit 1 8 30
f1.16xlarge 25 Gigabit 1 8 50
F2
f2.12xlarge 25 Gigabit 1 8 30
f2.48xlarge 100 Gigabit 1 15 50
G4ad
g4ad.xlarge 1 2.0 / 10.0 1 2 4
g4ad.2xlarge 1 4.167 / 10.0 1 2 4
g4ad.4xlarge 1 8.333 / 10.0 1 3 10
g4ad.8xlarge 15 Gigabit 1 4 15
g4ad.16xlarge 25 Gigabit 1 8 30
G4dn
g4dn.xlarge 1 5.0 / 25.0 1 3 10
g4dn.2xlarge 1 10.0 / 25.0 1 3 10
g4dn.4xlarge 1 20.0 / 25.0 1 3 10
g4dn.8xlarge 50 Gigabit 1 4 15
g4dn.12xlarge 50 Gigabit 1 8 30
g4dn.16xlarge 50 Gigabit 1 4 15
g4dn.metal 100 Gigabit 1 15 50
G5
g5.xlarge 1 2.5 / 10.0 1 4 15
g5.2xlarge 1 5.0 / 10.0 1 4 15
g5.4xlarge 1 10.0 / 25.0 1 8 30
g5.8xlarge 25 Gigabit 1 8 30
g5.12xlarge 40 Gigabit 1 15 50
g5.16xlarge 25 Gigabit 1 8 30
g5.24xlarge 50 Gigabit 1 15 50
g5.48xlarge 100 Gigabit 1 7 50
G5g
g5g.xlarge 1 1.25 / 10.0 1 4 15
g5g.2xlarge 1 2.5 / 10.0 1 4 15
g5g.4xlarge 1 5.0 / 10.0 1 8 30
g5g.8xlarge 12 Gigabit 1 8 30
g5g.16xlarge 25 Gigabit 1 15 50
g5g.metal 25 Gigabit 1 15 50
G6
g6.xlarge 1 2.5 / 10.0 1 4 15
g6.2xlarge 1 5.0 / 10.0 1 4 15
g6.4xlarge 1 10.0 / 25.0 1 8 30
g6.8xlarge 25 Gigabit 1 8 30
g6.12xlarge 40 Gigabit 1 8 30
g6.16xlarge 25 Gigabit 1 15 50
g6.24xlarge 50 Gigabit 1 15 50
g6.48xlarge 100 Gigabit 1 15 50
G6e
g6e.xlarge 1 2.5 / 20.0 1 4 15
g6e.2xlarge 1 5.0 / 20.0 1 4 15
g6e.4xlarge 20 Gigabit 1 8 30
g6e.8xlarge 25 Gigabit 1 8 30
g6e.12xlarge 100 Gigabit 1 10 30
g6e.16xlarge 35 Gigabit 1 15 50
g6e.24xlarge 200 Gigabit 2 20 50
g6e.48xlarge 400 Gigabit 4 40 50
Gr6
gr6.4xlarge 1 10.0 / 25.0 1 8 30
gr6.8xlarge 25 Gigabit 1 8 30
Inf1
inf1.xlarge 1 5.0 / 25.0 1 4 10
inf1.2xlarge 1 5.0 / 25.0 1 4 10
inf1.6xlarge 25 Gigabit 1 8 30
inf1.24xlarge 100 Gigabit 1 11 30
Inf2
inf2.xlarge 1 2.083 / 15.0 1 4 15
inf2.8xlarge 1 16.667 / 25.0 1 8 30
inf2.24xlarge 50 Gigabit 1 15 50
inf2.48xlarge 100 Gigabit 1 15 50
P2
p2.xlarge High 1 4 15
p2.8xlarge 10 Gigabit 1 8 30
p2.16xlarge 25 Gigabit 1 8 30
P3
p3.2xlarge 1 Up to 10 Gigabit 1 4 15
p3.8xlarge 10 Gigabit 1 8 30
p3.16xlarge 25 Gigabit 1 8 30
P3dn
p3dn.24xlarge 100 Gigabit 1 15 50
P4d
p4d.24xlarge 4x 100 Gigabit 4 60 50
P4de
p4de.24xlarge 4x 100 Gigabit 4 60 50
P5
p5.48xlarge 3200 Gigabit 32 64 50
P5e
p5e.48xlarge 3200 Gigabit 32 64 50
P5en
p5en.48xlarge 3200 Gigabit 16 64 50
Trn1
trn1.2xlarge 1 3.125 / 12.5 1 4 15
trn1.32xlarge 8x 100 Gigabit 8 40 50
Trn1n
trn1n.32xlarge 16x 100 Gigabit 16 80 50
Trn2
trn2.48xlarge 16x 200 Gigabit 16 32 50
Trn2u
trn2u.48xlarge 16x 200 Gigabit 16 32 50
VT1
vt1.3xlarge 3.12 Gigabit 1 4 15
vt1.6xlarge 6.25 Gigabit 1 8 30
vt1.24xlarge 25 Gigabit 1 15 50
Note

1 These instances have a baseline bandwidth and can use a network I/O credit mechanism to burst beyond their baseline bandwidth on a best-effort basis. Other instance types can sustain their maximum performance indefinitely. For more information, see instance network bandwidth.
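
The network columns above correspond to the NetworkInfo structure returned by the EC2 DescribeInstanceTypes API. A boto3 sketch with example instance types:

```python
import boto3

ec2 = boto3.client("ec2")
resp = ec2.describe_instance_types(InstanceTypes=["g6e.48xlarge", "trn1n.32xlarge"])  # examples

for it in resp["InstanceTypes"]:
    net = it["NetworkInfo"]
    print(it["InstanceType"])
    print("  performance:        ", net["NetworkPerformance"])        # e.g. "400 Gigabit"
    print("  network cards:      ", net["MaximumNetworkCards"])
    print("  max interfaces:     ", net["MaximumNetworkInterfaces"])
    print("  IPv4 per interface: ", net["Ipv4AddressesPerInterface"])
    print("  IPv6 supported:     ", net["Ipv6Supported"])
    print("  EFA supported:      ", net["EfaSupported"])
    print("  ENA support:        ", net["EnaSupport"])                # "required" on Nitro-based types
```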

Amazon EBS specifications

The following table indicates which instance types are EBS optimized by default and which can optionally be enabled for EBS optimization. It also describes their EBS-optimized performance, including the dedicated bandwidth to Amazon EBS, the typical maximum aggregate throughput that can be achieved on that dedicated connection with a streaming read workload and a 128 KiB I/O size, and the maximum IOPS the instance type can support when using a 16 KiB I/O size. Instance types that are not listed do not support Amazon EBS optimization.

Important

An instance's EBS performance is bounded by the instance's performance limits, or the aggregated performance of its attached volumes, whichever is smaller. To achieve maximum EBS performance, an instance must have attached volumes that provide a combined performance equal to or greater than the maximum instance performance. For example, to achieve 80,000 IOPS for r6i.16xlarge, the instance must have at least 5 gp3 volumes provisioned with 16,000 IOPS each (5 volumes x 16,000 IOPS = 80,000 IOPS).

We recommend that you choose an EBS–optimized instance type that provides more dedicated Amazon EBS throughput than your application needs; otherwise, the connection between Amazon EBS and Amazon EC2 can become a performance bottleneck.
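
As a worked version of the arithmetic above, the following sketch computes how many identically provisioned gp3 volumes are needed so that the attached volumes are not the limiting factor. The IOPS figure is the r6i.16xlarge example quoted above; the instance throughput value is a hypothetical number used only for illustration.

```python
import math

# Instance-level limits: IOPS from the r6i.16xlarge example in the text;
# throughput is an assumed value for illustration.
instance_max_iops = 80_000
instance_max_throughput_mb_per_s = 2_500

# gp3 per-volume provisioning ceilings.
gp3_max_iops_per_volume = 16_000
gp3_max_throughput_per_volume = 1_000   # MB/s

volumes_for_iops = math.ceil(instance_max_iops / gp3_max_iops_per_volume)
volumes_for_throughput = math.ceil(instance_max_throughput_mb_per_s / gp3_max_throughput_per_volume)

# The attached volumes must satisfy both dimensions, so take the larger count.
print("gp3 volumes needed:", max(volumes_for_iops, volumes_for_throughput))   # -> 5
```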

Instance type Baseline / Maximum bandwidth (Mbps) Baseline / Maximum throughput (MB/s, 128 KiB I/O) Baseline / Maximum IOPS (16 KiB I/O) NVMe EBS optimization 2
DL1
dl1.24xlarge 19000.00 2375.00 80000.00 default
DL2q
dl2q.24xlarge 19000.00 2375.00 80000.00 default
F1
f1.2xlarge 1700.00 212.50 12000.00 default
f1.4xlarge 3500.00 437.50 44000.00 default
f1.16xlarge 14000.00 1750.00 75000.00 default
F2
f2.12xlarge 15000.00 1875.00 60000.00 default
f2.48xlarge 60000.00 7500.00 240000.00 default
G4ad
g4ad.xlarge 1 400.00 / 3170.00 50.00 / 396.25 1700.00 / 13333.00 default
g4ad.2xlarge 1 800.00 / 3170.00 100.00 / 396.25 3400.00 / 13333.00 default
g4ad.4xlarge 1 1580.00 / 3170.00 197.50 / 396.25 6700.00 / 13333.00 default
g4ad.8xlarge 3170.00 396.25 13333.00 default
g4ad.16xlarge 6300.00 787.50 26667.00 default
G4dn
g4dn.xlarge 1 950.00 / 3500.00 118.75 / 437.50 3000.00 / 20000.00 default
g4dn.2xlarge 1 1150.00 / 3500.00 143.75 / 437.50 6000.00 / 20000.00 default
g4dn.4xlarge 4750.00 593.75 20000.00 default
g4dn.8xlarge 9500.00 1187.50 40000.00 default
g4dn.12xlarge 9500.00 1187.50 40000.00 default
g4dn.16xlarge 9500.00 1187.50 40000.00 default
g4dn.metal 19000.00 2375.00 80000.00 default
G5
g5.xlarge 1 700.00 / 3500.00 87.50 / 437.50 3000.00 / 15000.00 default
g5.2xlarge 1 850.00 / 3500.00 106.25 / 437.50 3500.00 / 15000.00 default
g5.4xlarge 4750.00 593.75 20000.00 default
g5.8xlarge 16000.00 2000.00 65000.00 default
g5.12xlarge 16000.00 2000.00 65000.00 default
g5.16xlarge 16000.00 2000.00 65000.00 default
g5.24xlarge 19000.00 2375.00 80000.00 default
g5.48xlarge 19000.00 2375.00 80000.00 default
G5g
g5g.xlarge 1 1188.00 / 4750.00 148.50 / 593.75 6000.00 / 20000.00 default
g5g.2xlarge 1 2375.00 / 4750.00 296.88 / 593.75 12000.00 / 20000.00 default
g5g.4xlarge 4750.00 593.75 20000.00 default
g5g.8xlarge 9500.00 1187.50 40000.00 default
g5g.16xlarge 19000.00 2375.00 80000.00 default
g5g.metal 19000.00 2375.00 80000.00 default
G6
g6.xlarge 1 1000.00 / 5000.00 125.00 / 625.00 4000.00 / 20000.00 default
g6.2xlarge 1 2000.00 / 5000.00 250.00 / 625.00 8000.00 / 20000.00 default
g6.4xlarge 8000.00 1000.00 32000.00 default
g6.8xlarge 16000.00 2000.00 64000.00 default
g6.12xlarge 20000.00 2500.00 80000.00 default
g6.16xlarge 20000.00 2500.00 80000.00 default
g6.24xlarge 30000.00 3750.00 120000.00 default
g6.48xlarge 60000.00 7500.00 240000.00 default
G6e
g6e.xlarge 1 1000.00 / 5000.00 125.00 / 625.00 4000.00 / 20000.00 default
g6e.2xlarge 1 2000.00 / 5000.00 250.00 / 625.00 8000.00 / 20000.00 default
g6e.4xlarge 8000.00 1000.00 32000.00 default
g6e.8xlarge 16000.00 2000.00 64000.00 default
g6e.12xlarge 20000.00 2500.00 80000.00 default
g6e.16xlarge 20000.00 2500.00 80000.00 default
g6e.24xlarge 30000.00 3750.00 120000.00 default
g6e.48xlarge 60000.00 7500.00 240000.00 default
Gr6
gr6.4xlarge 8000.00 1000.00 32000.00 default
gr6.8xlarge 16000.00 2000.00 64000.00 default
Inf1
inf1.xlarge 1 1190.00 / 4750.00 148.75 / 593.75 4000.00 / 20000.00 default
inf1.2xlarge 1 1190.00 / 4750.00 148.75 / 593.75 6000.00 / 20000.00 default
inf1.6xlarge 4750.00 593.75 20000.00 default
inf1.24xlarge 19000.00 2375.00 80000.00 default
Inf2
inf2.xlarge 1 1250.00 / 10000.00 156.25 / 1250.00 6000.00 / 40000.00 default
inf2.8xlarge 10000.00 1250.00 40000.00 default
inf2.24xlarge 30000.00 3750.00 120000.00 default
inf2.48xlarge 60000.00 7500.00 240000.00 default
P2
p2.xlarge 750.00 93.75 6000.00 default
p2.8xlarge 5000.00 625.00 32500.00 default
p2.16xlarge 10000.00 1250.00 65000.00 default
P3
p3.2xlarge 1750.00 218.75 10000.00 default
p3.8xlarge 7000.00 875.00 40000.00 default
p3.16xlarge 14000.00 1750.00 80000.00 default
P3dn
p3dn.24xlarge 19000.00 2375.00 80000.00 default
P4d
p4d.24xlarge 19000.00 2375.00 80000.00 default
P4de
p4de.24xlarge 19000.00 2375.00 80000.00 default
P5
p5.48xlarge 80000.00 10000.00 260000.00 default
P5e
p5e.48xlarge 80000.00 10000.00 260000.00 default
P5en
p5en.48xlarge 100000.00 12500.00 400000.00 default
Trn1
trn1.2xlarge 1 5000.00 / 20000.00 625.00 / 2500.00 16250.00 / 65000.00 default
trn1.32xlarge 80000.00 10000.00 260000.00 default
Trn1n
trn1n.32xlarge 80000.00 10000.00 260000.00 default
Trn2
trn2.48xlarge 80000.00 10000.00 260000.00 default
Trn2u
trn2u.48xlarge 80000.00 10000.00 260000.00 default
VT1
vt1.3xlarge 1 2375.00 / 4750.00 296.88 / 593.75 10000.00 / 20000.00 default
vt1.6xlarge 4750.00 593.75 20000.00 default
vt1.24xlarge 19000.00 2375.00 80000.00 default
Note

1 These instances can sustain their maximum performance for 30 minutes at least once every 24 hours, after which they revert to their baseline performance. Other instance types can sustain their maximum performance indefinitely. If your workload requires sustained maximum performance for longer than 30 minutes, choose an instance type whose baseline performance meets your needs.

2 default indicates that instances are enabled for EBS optimization by default. supported indicates that instances can optionally be enabled for EBS optimization. For more information, see Amazon EBS–optimized instances.
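
The baseline and maximum EBS figures in this table are also exposed through the EbsInfo structure of the DescribeInstanceTypes API, as in this boto3 sketch (example instance types):

```python
import boto3

ec2 = boto3.client("ec2")
resp = ec2.describe_instance_types(InstanceTypes=["g5.2xlarge", "p5.48xlarge"])  # examples

for it in resp["InstanceTypes"]:
    ebs = it["EbsInfo"]
    opt = ebs["EbsOptimizedInfo"]
    print(it["InstanceType"], "-", ebs["EbsOptimizedSupport"])    # "default" or "supported"
    print("  bandwidth (Mbps): ", opt["BaselineBandwidthInMbps"], "/", opt["MaximumBandwidthInMbps"])
    print("  throughput (MB/s):", opt["BaselineThroughputInMBps"], "/", opt["MaximumThroughputInMBps"])
    print("  IOPS:             ", opt["BaselineIops"], "/", opt["MaximumIops"])
```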

Instance store specifications

The following table shows the instance store volume configuration for supported instance types, along with the aggregate IOPS performance at a 4,096-byte block size and queue depth saturation.

Instance type Instance store volumes Instance store type 100% random read IOPS / Write IOPS Needs initialization 1 TRIM support 2
DL1
dl1.24xlarge 4 x 1000 GB NVMe SSD 1,000,000 / 800,000
F1
f1.2xlarge 1 x 470 GB NVMe SSD
f1.4xlarge 1 x 940 GB NVMe SSD
f1.16xlarge 4 x 940 GB NVMe SSD
F2
f2.12xlarge 2 x 940 GB NVMe SSD 800,000 / 250,000
f2.48xlarge 8 x 940 GB NVMe SSD 3,200,000 / 1,000,000
G4ad
g4ad.xlarge 1 x 150 GB NVMe SSD 10,417 / 8,333
g4ad.2xlarge 1 x 300 GB NVMe SSD 20,833 / 16,667
g4ad.4xlarge 1 x 600 GB NVMe SSD 41,667 / 33,333
g4ad.8xlarge 1 x 1200 GB NVMe SSD 83,333 / 66,667
g4ad.16xlarge 2 x 1200 GB NVMe SSD 166,666 / 133,332
G4dn
g4dn.xlarge 1 x 125 GB NVMe SSD 42,500 / 32,500
g4dn.2xlarge 1 x 225 GB NVMe SSD 42,500 / 32,500
g4dn.4xlarge 1 x 225 GB NVMe SSD 85,000 / 65,000
g4dn.8xlarge 1 x 900 GB NVMe SSD 250,000 / 200,000
g4dn.12xlarge 1 x 900 GB NVMe SSD 250,000 / 200,000
g4dn.16xlarge 1 x 900 GB NVMe SSD 250,000 / 200,000
g4dn.metal 2 x 900 GB NVMe SSD 500,000 / 400,000
G5
g5.xlarge 1 x 250 GB NVMe SSD 40,625 / 20,313
g5.2xlarge 1 x 450 GB NVMe SSD 40,625 / 20,313
g5.4xlarge 1 x 600 GB NVMe SSD 125,000 / 62,500
g5.8xlarge 1 x 900 GB NVMe SSD 250,000 / 125,000
g5.12xlarge 1 x 3800 GB NVMe SSD 312,500 / 156,250
g5.16xlarge 1 x 1900 GB NVMe SSD 250,000 / 125,000
g5.24xlarge 1 x 3800 GB NVMe SSD 312,500 / 156,250
g5.48xlarge 2 x 3800 GB NVMe SSD 625,000 / 312,500
G6
g6.xlarge 1 x 250 GB NVMe SSD 40,625 / 20,000
g6.2xlarge 1 x 450 GB NVMe SSD 40,625 / 20,000
g6.4xlarge 1 x 600 GB NVMe SSD 125,000 / 40,000
g6.8xlarge 2 x 450 GB NVMe SSD 250,000 / 80,000
g6.12xlarge 4 x 940 GB NVMe SSD 312,500 / 125,000
g6.16xlarge 2 x 940 GB NVMe SSD 250,000 / 80,000
g6.24xlarge 4 x 940 GB NVMe SSD 312,500 / 156,248
g6.48xlarge 8 x 940 GB NVMe SSD 625,000 / 312,496
G6e
g6e.xlarge 1 x 250 GB NVMe SSD 40,625 / 20,000
g6e.2xlarge 1 x 450 GB NVMe SSD 40,625 / 20,000
g6e.4xlarge 1 x 600 GB NVMe SSD 125,000 / 40,000
g6e.8xlarge 2 x 450 GB NVMe SSD 250,000 / 80,000
g6e.12xlarge 2 x 1900 GB NVMe SSD 312,500 / 125,000
g6e.16xlarge 2 x 950 GB NVMe SSD 250,000 / 80,000
g6e.24xlarge 2 x 1900 GB NVMe SSD 312,500 / 156,250
g6e.48xlarge 4 x 1900 GB NVMe SSD 625,000 / 312,500
Gr6
gr6.4xlarge 1 x 600 GB NVMe SSD 125,000 / 40,000
gr6.8xlarge 2 x 450 GB NVMe SSD 250,000 / 80,000
P3dn
p3dn.24xlarge 2 x 900 GB NVMe SSD 700,000 / 340,000
P4d
p4d.24xlarge 8 x 1000 GB NVMe SSD 2,000,000 / 1,600,000
P4de
p4de.24xlarge 8 x 1000 GB NVMe SSD 2,000,000 / 1,600,000
P5
p5.48xlarge 8 x 3800 GB NVMe SSD 4,400,000 / 2,200,000
P5e
p5e.48xlarge 8 x 3800 GB NVMe SSD 4,400,000 / 2,200,000
P5en
p5en.48xlarge 8 x 3800 GB NVMe SSD 4,400,000 / 2,200,000
Trn1
trn1.2xlarge 1 x 474 GB NVMe SSD 107,500 / 45,000
trn1.32xlarge 4 x 1900 GB NVMe SSD 1,720,000 / 720,000
Trn1n
trn1n.32xlarge 4 x 1900 GB NVMe SSD 1,720,000 / 720,000
Trn2u
trn2u.48xlarge 4 x 1900 GB NVMe SSD 1,720,000 / 720,000

1 Volumes attached to certain instances suffer a first-write penalty unless initialized. For more information, see Optimize disk performance for instance store volumes.

2 For more information, see Instance store volume TRIM support.
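
Instance store volume counts and sizes can likewise be read from the InstanceStorageInfo structure of the DescribeInstanceTypes API. A boto3 sketch (example instance types; types without instance store volumes omit the structure):

```python
import boto3

ec2 = boto3.client("ec2")
resp = ec2.describe_instance_types(
    InstanceTypes=["p5.48xlarge", "g4dn.8xlarge", "inf2.xlarge"]  # examples
)

for it in resp["InstanceTypes"]:
    storage = it.get("InstanceStorageInfo")   # absent when the type has no instance store
    if not storage:
        print(it["InstanceType"], "- no instance store volumes")
        continue
    for disk in storage["Disks"]:
        print(f'{it["InstanceType"]}: {disk["Count"]} x {disk["SizeInGB"]} GB '
              f'{disk["Type"].upper()} (total {storage["TotalSizeInGB"]} GB)')
```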

Security specifications

Instance type EBS encryption Instance store encryption Encryption in transit AMD SEV-SNP NitroTPM Nitro Enclaves
DL1
dl1.24xlarge
DL2q
dl2q.24xlarge Instance store not supported
F1
f1.2xlarge
f1.4xlarge
f1.16xlarge
F2
f2.12xlarge
f2.48xlarge
G4ad
g4ad.xlarge
g4ad.2xlarge
g4ad.4xlarge
g4ad.8xlarge
g4ad.16xlarge
G4dn
g4dn.xlarge
g4dn.2xlarge
g4dn.4xlarge
g4dn.8xlarge
g4dn.12xlarge
g4dn.16xlarge
g4dn.metal
G5
g5.xlarge
g5.2xlarge
g5.4xlarge
g5.8xlarge
g5.12xlarge
g5.16xlarge
g5.24xlarge
g5.48xlarge
G5g
g5g.xlarge Instance store not supported
g5g.2xlarge Instance store not supported
g5g.4xlarge Instance store not supported
g5g.8xlarge Instance store not supported
g5g.16xlarge Instance store not supported
g5g.metal Instance store not supported
G6
g6.xlarge
g6.2xlarge
g6.4xlarge
g6.8xlarge
g6.12xlarge
g6.16xlarge
g6.24xlarge
g6.48xlarge
G6e
g6e.xlarge
g6e.2xlarge
g6e.4xlarge
g6e.8xlarge
g6e.12xlarge
g6e.16xlarge
g6e.24xlarge
g6e.48xlarge
Gr6
gr6.4xlarge
gr6.8xlarge
Inf1
inf1.xlarge Instance store not supported
inf1.2xlarge Instance store not supported
inf1.6xlarge Instance store not supported
inf1.24xlarge Instance store not supported
Inf2
inf2.xlarge Instance store not supported
inf2.8xlarge Instance store not supported
inf2.24xlarge Instance store not supported
inf2.48xlarge Instance store not supported
P2
p2.xlarge Instance store not supported
p2.8xlarge Instance store not supported
p2.16xlarge Instance store not supported
P3
p3.2xlarge Instance store not supported
p3.8xlarge Instance store not supported
p3.16xlarge Instance store not supported
P3dn
p3dn.24xlarge
P4d
p4d.24xlarge
P4de
p4de.24xlarge
P5
p5.48xlarge
P5e
p5e.48xlarge
P5en
p5en.48xlarge
Trn1
trn1.2xlarge
trn1.32xlarge
Trn1n
trn1n.32xlarge
Trn2
trn2.48xlarge Instance store not supported
Trn2u
trn2u.48xlarge
VT1
vt1.3xlarge Instance store not supported
vt1.6xlarge Instance store not supported
vt1.24xlarge Instance store not supported
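
Several of these security columns map to DescribeInstanceTypes fields. The boto3 sketch below queries NitroTPM and Nitro Enclaves support and checks for the amd-sev-snp processor feature; EBS encryption, instance store encryption, and encryption in transit are not simple per-type booleans in this API and are not queried here.

```python
import boto3

ec2 = boto3.client("ec2")
resp = ec2.describe_instance_types(InstanceTypes=["p5.48xlarge", "g6e.48xlarge"])  # examples

for it in resp["InstanceTypes"]:
    print(it["InstanceType"])
    print("  NitroTPM:      ", it.get("NitroTpmSupport", "unsupported"))
    print("  Nitro Enclaves:", it.get("NitroEnclavesSupport", "unsupported"))
    # AMD SEV-SNP is surfaced as a processor feature on the types that support it.
    features = it["ProcessorInfo"].get("SupportedFeatures", [])
    print("  AMD SEV-SNP:   ", "amd-sev-snp" in features)
```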