Accessing and listing an Amazon S3 bucket

To list and access your Amazon S3 buckets, you can use various tools. Review the following tools to determine which approach fits your use case:

  • Amazon S3 console: With the Amazon S3 console, you can easily access a bucket and modify the bucket's properties. You can also perform most bucket operations by using the console UI, without having to write any code.

  • AWS CLI: If you need to access multiple buckets, you can save time by using the AWS Command Line Interface (AWS CLI) to automate common and repetitive tasks. As your organization scales, being able to script and repeat these actions becomes increasingly important. A short scripted example follows this list. For more information, see Developing with Amazon S3 using the AWS CLI.

  • Amazon S3 REST API: You can use the Amazon S3 REST API to write your own programs and access buckets programmatically. Amazon S3 supports an API architecture in which your buckets and objects are resources, each with a resource URI that uniquely identifies the resource. For more information, see Developing with Amazon S3 using the REST API.
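For example, the following commands are a minimal sketch of scripting one common task with the AWS CLI: syncing a local directory to more than one bucket. The bucket names and the local ./site directory are placeholders for this example; replace them with your own.

# Sync the local ./site directory to each bucket, deleting remote objects that no longer exist locally.
for bucket in DOC-EXAMPLE-BUCKET1 DOC-EXAMPLE-BUCKET2; do
    aws s3 sync ./site "s3://${bucket}" --delete
done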

Depending on the use case for your Amazon S3 bucket, there are different recommended methods to access the underlying data in your buckets. The following list includes common use cases for accessing your data.

  • Static websites – You can use Amazon S3 to host a static website. In this use case, you can configure your S3 bucket to function like a website. For an example that walks you through the steps of hosting a website on Amazon S3, see Tutorial: Configuring a static website on Amazon S3.

    To host a static website with security settings like Block Public Access enabled, we recommend using Amazon CloudFront with Origin Access Control (OAC), requiring HTTPS, and implementing additional security headers. For more information, see Getting started with a secure static website. An example AWS CLI command that enables website hosting on a bucket follows this list.

    Note

    Amazon S3 supports both virtual-hosted–style and path-style URLs for static website access. Because virtual-hosted–style URLs include the bucket name as part of the hostname, we recommend that you create buckets with DNS-compliant bucket names. For more information, see Bucket restrictions and limitations.

  • Shared datasets – As you scale on Amazon S3, it's common to adopt a multi-tenant model, where you assign different end customers or business units to unique prefixes within a shared bucket. By using Amazon S3 access points, you can divide one large bucket policy into separate, discrete access point policies for each application that needs to access the shared dataset. This approach makes it simpler to focus on building the right access policy for an application without disrupting what any other application is doing within the shared dataset. An example command that creates an access point follows this list. For more information, see Managing data access with Amazon S3 access points.

  • High-throughput workloads – Mountpoint for Amazon S3 is a high-throughput open source file client for mounting an Amazon S3 bucket as a local file system. With Mountpoint, your applications can access objects stored in Amazon S3 through file-system operations, such as open and read. Mountpoint automatically translates these operations into S3 object API calls, giving your applications access to the elastic storage and throughput of Amazon S3 through a file interface. An example that mounts a bucket and reads an object follows this list. For more information, see Working with Mountpoint for Amazon S3.

  • Multi-Region applications – Amazon S3 Multi-Region Access Points provide a global endpoint that applications can use to fulfill requests from S3 buckets that are located in multiple AWS Regions. You can use Multi-Region Access Points to build multi-Region applications with the same architecture that's used in a single Region, and then run those applications anywhere in the world. Multi-Region Access Points provide built-in network resilience, accelerating internet-based requests to Amazon S3 by routing them over the AWS global network instead of the public internet. For more information, see Multi-Region Access Points in Amazon S3.

  • Building new applications – You can use the AWS SDKs when developing applications with Amazon S3. The AWS SDKs simplify your programming tasks by wrapping the underlying Amazon S3 REST API. To build connected mobile and web applications, you can use the AWS Mobile SDKs and the AWS Amplify JavaScript library. For more information, see Developing with Amazon S3 using the AWS SDKs, and explorers.

  • Secure Shell (SSH) File Transfer Protocol (SFTP) – If you need to securely transfer sensitive data over the internet, you can use an SFTP-enabled server with your Amazon S3 bucket. SFTP is a network protocol that supports the full security and authentication functionality of SSH. With this protocol, you have fine-grained control over user identities, permissions, and keys, or you can use IAM policies to manage access. To associate an SFTP-enabled server with your Amazon S3 bucket, create the SFTP-enabled server first. Then, set up user accounts and associate the server with the bucket. For a walkthrough of this process, see AWS Transfer for SFTP – Fully Managed SFTP Service for Amazon S3 on the AWS Blog.
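The following AWS CLI command is a minimal sketch of enabling static website hosting on a bucket, as referenced in the static website item earlier in this list. The bucket name and document names are placeholders; depending on your security posture, you might serve the site through CloudFront with OAC instead of making the bucket publicly readable.

$ aws s3 website s3://DOC-EXAMPLE-BUCKET1 --index-document index.html --error-document error.html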
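As a sketch of the shared-dataset pattern described earlier in this list, the following AWS CLI command creates an access point that one application can use with a shared bucket. The account ID, access point name, and bucket name are placeholders.

$ aws s3control create-access-point --account-id 111122223333 --name finance-app-ap --bucket DOC-EXAMPLE-BUCKET1

You can then attach a policy scoped to that application by using the put-access-point-policy command, instead of adding another statement to the shared bucket policy.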
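To illustrate the Mountpoint workflow from the high-throughput item in this list, the following commands mount a bucket and read an object through the file interface. This sketch assumes that Mountpoint for Amazon S3 (the mount-s3 client) is already installed; the bucket name, mount directory, and object key are placeholders.

$ mkdir ~/s3-mount
$ mount-s3 DOC-EXAMPLE-BUCKET1 ~/s3-mount
$ cat ~/s3-mount/reports/summary.csv
$ umount ~/s3-mount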

Listing a bucket

To list all of your buckets, you must have the s3:ListAllMyBuckets permission. To access a bucket, you must also have the required AWS Identity and Access Management (IAM) permissions to list the contents of that bucket. For an example bucket policy that grants access to an S3 bucket, see Allowing an IAM user access to one of your buckets. If you encounter an HTTP 403 (Access Denied) error, see Bucket policies and IAM policies.
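For example, the following identity-based policy is a minimal sketch that grants permission to list all buckets in the account and to list the contents of one bucket. The bucket name is a placeholder; replace it with your own.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ListAllBuckets",
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "*"
        },
        {
            "Sid": "ListObjectsInOneBucket",
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::DOC-EXAMPLE-BUCKET1"
        }
    ]
}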

You can list your buckets by using the Amazon S3 console, the AWS CLI, or the AWS SDKs. To view your buckets in the Amazon S3 console, follow these steps:

  1. Sign in to the AWS Management Console and open the Amazon S3 console at https://console.aws.amazon.com/s3/.

  2. In the left navigation pane, choose Buckets.

  3. From the General purpose buckets list, choose the bucket that you want to view.

    Note

    The General purpose buckets list includes buckets that are located in all AWS Regions.

To use the AWS CLI to access an S3 bucket or to list your S3 buckets, use the ls command. To list the objects in a bucket, you must have the s3:ListBucket permission.

To use this example command, replace DOC-EXAMPLE-BUCKET1 with the name of your bucket.

$ aws s3 ls s3://DOC-EXAMPLE-BUCKET1

The following example command lists all the Amazon S3 buckets in your account:

$ aws s3 ls

For more information and examples, see List bucket and objects.

You can also list your Amazon S3 buckets programmatically by using the ListBuckets API operation. For examples of how to use this operation with different AWS SDKs, see Use ListBuckets with an AWS SDK or command line tool.
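For example, the following AWS CLI command calls the ListBuckets operation directly through the s3api command set and returns only the bucket names:

$ aws s3api list-buckets --query 'Buckets[].Name' --output text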