What are AWS Deep Learning Containers?

AWS Deep Learning Containers are pre-built Docker images that make it easier to run popular deep learning frameworks and tools on AWS. They provide a consistent, up-to-date, secure, and optimized runtime environment for your deep learning applications hosted on AWS infrastructure. To get started, see Getting Started with AWS Deep Learning Containers.

Key Features

Pre-Installed Deep Learning Frameworks

AWS Deep Learning Containers include pre-installed and configured versions of leading deep learning frameworks such as TensorFlow and PyTorch. This eliminates the need to build and maintain your own Docker images from scratch.
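As a minimal sketch (assuming the SageMaker Python SDK is installed, and with the region, framework version, and instance type below as placeholders to be matched to a published release), the URI of a pre-built framework image can be looked up instead of building an image yourself:

```python
from sagemaker import image_uris

# Resolve the URI of a pre-built PyTorch training image from the
# AWS Deep Learning Containers registry for a given region and version.
# The version, region, and instance type here are illustrative placeholders.
image_uri = image_uris.retrieve(
    framework="pytorch",
    region="us-east-1",
    version="2.1",
    py_version="py310",
    image_scope="training",
    instance_type="ml.g5.xlarge",
)
print(image_uri)
```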

Hardware Acceleration

AWS Deep Learning Containers are optimized for CPU-based, GPU-accelerated, and AWS silicon-based deep learning. They support CUDA, cuDNN, and the other libraries needed to take advantage of GPU-based Amazon EC2 instances, as well as AWS-designed chips such as Graviton CPUs and AWS Trainium, and Intel's Habana Gaudi processors.
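As a quick check inside a GPU-enabled PyTorch Deep Learning Container (assuming the container is running on a GPU instance), the bundled CUDA stack can be verified from Python:

```python
import torch

# CUDA and cuDNN ship with the GPU variants of the containers;
# this only confirms that the runtime can see the attached GPUs.
if torch.cuda.is_available():
    print(f"CUDA devices visible: {torch.cuda.device_count()}")
    print(f"Device 0: {torch.cuda.get_device_name(0)}")
else:
    print("No CUDA device detected; running on CPU.")
```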

AWS Service Integration

AWS Deep Learning Containers seamlessly integrate with a variety of AWS services, including SageMaker AI, Amazon Elastic Container Service (ECS), Amazon Elastic Kubernetes Service (EKS), Amazon EC2, and AWS ParallelCluster. This makes it easy to deploy and run your deep learning models and applications on AWS infrastructure.

Secure and Regularly Updated

AWS regularly maintains and updates AWS Deep Learning Containers so that you have access to recent versions of deep learning frameworks and their dependencies. Keeping container images patched with the latest security fixes is a resource-intensive task; the regular updates to AWS Deep Learning Containers remove that burden and keep your AWS-based deep learning environment secure and current without significant manual effort. This lets your team focus on developing deep learning models and applications rather than on infrastructure and security upkeep, and makes it easier to adopt the latest deep learning capabilities in your AWS-hosted projects.

Use Cases

AWS Deep Learning Containers are particularly useful in the following AWS-based deep learning scenarios:

Model Training

Use AWS Deep Learning Containers to train your deep learning models on CPU-based, GPU-accelerated, or AWS silicon-powered Amazon EC2 instances, or leverage multi-node training on AWS ParallelCluster or SageMaker HyperPod.
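A minimal training sketch with the SageMaker Python SDK is shown below; the entry point script, IAM role, framework version, and instance settings are hypothetical and should be replaced with values valid for your account and an available container release:

```python
from sagemaker.pytorch import PyTorch

# The SageMaker PyTorch estimator selects a matching Deep Learning Container
# image for the requested framework version and instance type.
estimator = PyTorch(
    entry_point="train.py",  # hypothetical training script
    role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",  # placeholder role
    framework_version="2.1",
    py_version="py310",
    instance_count=2,        # >1 enables multi-node training
    instance_type="ml.g5.xlarge",
)

# Hypothetical S3 location for the training data channel.
estimator.fit({"training": "s3://my-bucket/train-data/"})
```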

Model Deployment

Deploy your trained models with AWS Deep Learning Containers for scalable, production-ready inference on AWS, for example through SageMaker AI.
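As a hedged example of deploying to a SageMaker AI endpoint (the model artifact path, IAM role, and versions below are placeholders), a pre-built inference image can be paired with your trained model artifacts:

```python
from sagemaker import image_uris
from sagemaker.model import Model

# Resolve a pre-built PyTorch inference image; values are illustrative.
image_uri = image_uris.retrieve(
    framework="pytorch",
    region="us-east-1",
    version="2.1",
    py_version="py310",
    image_scope="inference",
    instance_type="ml.g5.xlarge",
)

model = Model(
    image_uri=image_uri,
    model_data="s3://my-bucket/model/model.tar.gz",  # placeholder artifact
    role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",  # placeholder role
)

# Creates a real-time inference endpoint backed by the container image.
model.deploy(initial_instance_count=1, instance_type="ml.g5.xlarge")
```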

Experimentation and Prototyping

Quickly spin up deep learning development environments on AWS using the pre-configured containers. AWS Deep Learning Containers are the default option for notebooks in SageMaker AI Studio, making it easy to get started with experimentation and prototyping.

Continuous Integration and Delivery

Integrate the containers into your AWS-based CI/CD pipelines, such as those using Amazon ECS or Amazon EKS, for consistent, automated deep learning workloads.
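One way such a pipeline step might look (a sketch only; the task family, image URI, and sizing are placeholders that your pipeline would substitute with the image it just built or validated) is registering an Amazon ECS task definition that pins a specific container image:

```python
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

# Register a task definition that pins a specific Deep Learning Container
# image tag; a CI/CD stage could run this after tests pass to roll it out.
ecs.register_task_definition(
    family="dl-inference-task",  # hypothetical task family
    requiresCompatibilities=["EC2"],
    containerDefinitions=[
        {
            "name": "dl-container",
            # Placeholder image URI; use the DLC tag published for your region.
            "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/pytorch-inference:latest",
            "cpu": 1024,
            "memory": 4096,
            "essential": True,
        }
    ],
)
```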
