Use PutObject with an AWS SDK or CLI - AWS SDK Code Examples

More AWS SDK examples are available in the AWS Doc SDK Examples repository on GitHub.


Use PutObject with an AWS SDK or CLI

The following code examples show how to use PutObject.

Action examples are code excerpts from larger programs and must be run in context. You can see this action in context in the following code examples:

.NET
AWS SDK for .NET
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

/// <summary>
/// Shows how to upload a file from the local computer to an Amazon S3
/// bucket.
/// </summary>
/// <param name="client">An initialized Amazon S3 client object.</param>
/// <param name="bucketName">The Amazon S3 bucket to which the object
/// will be uploaded.</param>
/// <param name="objectName">The object to upload.</param>
/// <param name="filePath">The path, including file name, of the object
/// on the local computer to upload.</param>
/// <returns>A boolean value indicating the success or failure of the
/// upload procedure.</returns>
public static async Task<bool> UploadFileAsync(
    IAmazonS3 client,
    string bucketName,
    string objectName,
    string filePath)
{
    var request = new PutObjectRequest
    {
        BucketName = bucketName,
        Key = objectName,
        FilePath = filePath,
    };

    var response = await client.PutObjectAsync(request);
    if (response.HttpStatusCode == System.Net.HttpStatusCode.OK)
    {
        Console.WriteLine($"Successfully uploaded {objectName} to {bucketName}.");
        return true;
    }
    else
    {
        Console.WriteLine($"Could not upload {objectName} to {bucketName}.");
        return false;
    }
}

Upload an object with server-side encryption.

using System;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

/// <summary>
/// This example shows how to upload an object to an Amazon Simple Storage
/// Service (Amazon S3) bucket with server-side encryption enabled.
/// </summary>
public class ServerSideEncryption
{
    public static async Task Main()
    {
        string bucketName = "amzn-s3-demo-bucket";
        string keyName = "samplefile.txt";

        // If the AWS Region defined for your default user is different
        // from the Region where your Amazon S3 bucket is located,
        // pass the Region name to the Amazon S3 client object's constructor.
        // For example: RegionEndpoint.USWest2.
        IAmazonS3 client = new AmazonS3Client();

        await WritingAnObjectAsync(client, bucketName, keyName);
    }

    /// <summary>
    /// Upload a sample object including a setting for encryption.
    /// </summary>
    /// <param name="client">The initialized Amazon S3 client object used
    /// to upload a file and apply server-side encryption.</param>
    /// <param name="bucketName">The name of the Amazon S3 bucket where the
    /// encrypted object will reside.</param>
    /// <param name="keyName">The name for the object that you want to
    /// create in the supplied bucket.</param>
    public static async Task WritingAnObjectAsync(IAmazonS3 client, string bucketName, string keyName)
    {
        try
        {
            var putRequest = new PutObjectRequest
            {
                BucketName = bucketName,
                Key = keyName,
                ContentBody = "sample text",
                ServerSideEncryptionMethod = ServerSideEncryptionMethod.AES256,
            };

            var putResponse = await client.PutObjectAsync(putRequest);

            // Determine the encryption state of an object.
            GetObjectMetadataRequest metadataRequest = new GetObjectMetadataRequest
            {
                BucketName = bucketName,
                Key = keyName,
            };

            GetObjectMetadataResponse response = await client.GetObjectMetadataAsync(metadataRequest);
            ServerSideEncryptionMethod objectEncryption = response.ServerSideEncryptionMethod;

            Console.WriteLine($"Encryption method used: {objectEncryption}");
        }
        catch (AmazonS3Exception ex)
        {
            Console.WriteLine($"Error: '{ex.Message}' when writing an object");
        }
    }
}

Put an object using a conditional request.

/// <summary>
/// Uploads an object to Amazon S3 with a conditional request. Prevents overwrite using an IfNoneMatch condition for the object key.
/// </summary>
/// <param name="objectKey">The key of the object to upload.</param>
/// <param name="bucket">The source bucket of the object.</param>
/// <param name="content">The content to upload as a string.</param>
/// <returns>The ETag if the conditional write is successful, empty otherwise.</returns>
public async Task<string> PutObjectConditional(string objectKey, string bucket, string content)
{
    try
    {
        var putObjectRequest = new PutObjectRequest
        {
            BucketName = bucket,
            Key = objectKey,
            ContentBody = content,
            IfNoneMatch = "*"
        };

        var putResult = await _amazonS3.PutObjectAsync(putObjectRequest);
        _logger.LogInformation($"Conditional write successful for key {objectKey} in bucket {bucket}.");
        return putResult.ETag;
    }
    catch (AmazonS3Exception e)
    {
        if (e.ErrorCode == "PreconditionFailed")
        {
            _logger.LogError("Conditional write failed: Precondition failed");
        }
        else
        {
            _logger.LogError($"Unexpected error: {e.ErrorCode}");
            throw;
        }
        return string.Empty;
    }
}
  • For API details, see PutObject in the AWS SDK for .NET API Reference.

Bash
AWS CLI with Bash script
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

###############################################################################
# function errecho
#
# This function outputs everything sent to it to STDERR (standard error output).
###############################################################################
function errecho() {
  printf "%s\n" "$*" 1>&2
}

###############################################################################
# function copy_file_to_bucket
#
# This function creates a file in the specified bucket.
#
# Parameters:
#       $1 - The name of the bucket to copy the file to.
#       $2 - The path and file name of the local file to copy to the bucket.
#       $3 - The key (name) to call the copy of the file in the bucket.
#
# Returns:
#       0 - If successful.
#       1 - If it fails.
###############################################################################
function copy_file_to_bucket() {
  local response bucket_name source_file destination_file_name
  bucket_name=$1
  source_file=$2
  destination_file_name=$3

  response=$(aws s3api put-object \
    --bucket "$bucket_name" \
    --body "$source_file" \
    --key "$destination_file_name")

  # shellcheck disable=SC2181
  if [[ ${?} -ne 0 ]]; then
    errecho "ERROR: AWS reports put-object operation failed.\n$response"
    return 1
  fi
}
  • For API details, see PutObject in the AWS CLI Command Reference.

C++
SDK for C++
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

bool AwsDoc::S3::putObject(const Aws::String &bucketName,
                           const Aws::String &fileName,
                           const Aws::S3::S3ClientConfiguration &clientConfig) {
    Aws::S3::S3Client s3Client(clientConfig);

    Aws::S3::Model::PutObjectRequest request;
    request.SetBucket(bucketName);
    //We are using the name of the file as the key for the object in the bucket.
    //However, this is just a string and can be set according to your retrieval needs.
    request.SetKey(fileName);

    std::shared_ptr<Aws::IOStream> inputData =
            Aws::MakeShared<Aws::FStream>("SampleAllocationTag",
                                          fileName.c_str(),
                                          std::ios_base::in | std::ios_base::binary);

    if (!*inputData) {
        std::cerr << "Error unable to read file " << fileName << std::endl;
        return false;
    }

    request.SetBody(inputData);

    Aws::S3::Model::PutObjectOutcome outcome =
            s3Client.PutObject(request);

    if (!outcome.IsSuccess()) {
        std::cerr << "Error: putObject: " <<
                  outcome.GetError().GetMessage() << std::endl;
    }
    else {
        std::cout << "Added object '" << fileName << "' to bucket '"
                  << bucketName << "'.";
    }

    return outcome.IsSuccess();
}
  • For API details, see PutObject in the AWS SDK for C++ API Reference.

CLI
AWS CLI

Example 1: Upload an object to Amazon S3

The following put-object command example uploads an object to Amazon S3.

aws s3api put-object \
    --bucket amzn-s3-demo-bucket \
    --key my-dir/MySampleImage.png \
    --body MySampleImage.png

For more information about uploading objects, see Uploading Objects <http://docs.aws.amazon.com/AmazonS3/latest/dev/UploadingObjects.html> in the Amazon S3 Developer Guide.

Example 2: Upload a video file to Amazon S3

The following put-object command example uploads a video file.

aws s3api put-object \
    --bucket amzn-s3-demo-bucket \
    --key my-dir/big-video-file.mp4 \
    --body /media/videos/f-sharp-3-data-services.mp4

For more information about uploading objects, see Uploading Objects <http://docs.aws.amazon.com/AmazonS3/latest/dev/UploadingObjects.html> in the Amazon S3 Developer Guide.

  • For API details, see PutObject in the AWS CLI Command Reference.

Go
SDK for Go V2
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Put an object into a bucket using the low-level API.

import ( "bytes" "context" "errors" "fmt" "io" "log" "os" "time" "github.com/aws/aws-sdk-go-v2/aws" "github.com/aws/aws-sdk-go-v2/feature/s3/manager" "github.com/aws/aws-sdk-go-v2/service/s3" "github.com/aws/aws-sdk-go-v2/service/s3/types" "github.com/aws/smithy-go" ) // BucketBasics encapsulates the Amazon Simple Storage Service (Amazon S3) actions // used in the examples. // It contains S3Client, an Amazon S3 service client that is used to perform bucket // and object actions. type BucketBasics struct { S3Client *s3.Client } // UploadFile reads from a file and puts the data into an object in a bucket. func (basics BucketBasics) UploadFile(ctx context.Context, bucketName string, objectKey string, fileName string) error { file, err := os.Open(fileName) if err != nil { log.Printf("Couldn't open file %v to upload. Here's why: %v\n", fileName, err) } else { defer file.Close() _, err = basics.S3Client.PutObject(ctx, &s3.PutObjectInput{ Bucket: aws.String(bucketName), Key: aws.String(objectKey), Body: file, }) if err != nil { var apiErr smithy.APIError if errors.As(err, &apiErr) && apiErr.ErrorCode() == "EntityTooLarge" { log.Printf("Error while uploading object to %s. The object is too large.\n"+ "To upload objects larger than 5GB, use the S3 console (160GB max)\n"+ "or the multipart upload API (5TB max).", bucketName) } else { log.Printf("Couldn't upload file %v to %v:%v. Here's why: %v\n", fileName, bucketName, objectKey, err) } } else { err = s3.NewObjectExistsWaiter(basics.S3Client).Wait( ctx, &s3.HeadObjectInput{Bucket: aws.String(bucketName), Key: aws.String(objectKey)}, time.Minute) if err != nil { log.Printf("Failed attempt to wait for object %s to exist.\n", objectKey) } } } return err }

Upload an object to a bucket using a transfer manager.

import ( "bytes" "context" "errors" "fmt" "log" "time" "github.com/aws/aws-sdk-go-v2/aws" "github.com/aws/aws-sdk-go-v2/feature/s3/manager" "github.com/aws/aws-sdk-go-v2/service/s3" "github.com/aws/aws-sdk-go-v2/service/s3/types" "github.com/aws/smithy-go" ) // S3Actions wraps S3 service actions. type S3Actions struct { S3Client *s3.Client S3Manager *manager.Uploader } // UploadObject uses the S3 upload manager to upload an object to a bucket. func (actor S3Actions) UploadObject(ctx context.Context, bucket string, key string, contents string) (string, error) { var outKey string input := &s3.PutObjectInput{ Bucket: aws.String(bucket), Key: aws.String(key), Body: bytes.NewReader([]byte(contents)), ChecksumAlgorithm: types.ChecksumAlgorithmSha256, } output, err := actor.S3Manager.Upload(ctx, input) if err != nil { var noBucket *types.NoSuchBucket if errors.As(err, &noBucket) { log.Printf("Bucket %s does not exist.\n", bucket) err = noBucket } } else { err := s3.NewObjectExistsWaiter(actor.S3Client).Wait(ctx, &s3.HeadObjectInput{ Bucket: aws.String(bucket), Key: aws.String(key), }, time.Minute) if err != nil { log.Printf("Failed attempt to wait for object %s to exist in %s.\n", key, bucket) } else { outKey = *output.Key } } return outKey, err }
  • For API details, see PutObject in the AWS SDK for Go API Reference.

Java
SDK for Java 2.x
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Upload a file to a bucket using an S3Client.

/**
 * Uploads a local file to an AWS S3 bucket asynchronously.
 *
 * @param bucketName the name of the S3 bucket to upload the file to
 * @param key        the key (object name) to use for the uploaded file
 * @param objectPath the local file path of the file to be uploaded
 * @return a {@link CompletableFuture} that completes with the {@link PutObjectResponse} when the upload is successful, or throws a {@link RuntimeException} if the upload fails
 */
public CompletableFuture<PutObjectResponse> uploadLocalFileAsync(String bucketName, String key, String objectPath) {
    PutObjectRequest objectRequest = PutObjectRequest.builder()
        .bucket(bucketName)
        .key(key)
        .build();

    CompletableFuture<PutObjectResponse> response = getAsyncClient().putObject(objectRequest,
        AsyncRequestBody.fromFile(Paths.get(objectPath)));
    return response.whenComplete((resp, ex) -> {
        if (ex != null) {
            throw new RuntimeException("Failed to upload file", ex);
        }
    });
}

Use an S3 TransferManager to upload a file to a bucket. View the complete file and test.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import software.amazon.awssdk.transfer.s3.S3TransferManager;
import software.amazon.awssdk.transfer.s3.model.CompletedFileUpload;
import software.amazon.awssdk.transfer.s3.model.FileUpload;
import software.amazon.awssdk.transfer.s3.model.UploadFileRequest;
import software.amazon.awssdk.transfer.s3.progress.LoggingTransferListener;
import java.net.URI;
import java.net.URISyntaxException;
import java.net.URL;
import java.nio.file.Paths;
import java.util.UUID;

public String uploadFile(S3TransferManager transferManager, String bucketName,
                         String key, URI filePathURI) {
    UploadFileRequest uploadFileRequest = UploadFileRequest.builder()
        .putObjectRequest(b -> b.bucket(bucketName).key(key))
        .source(Paths.get(filePathURI))
        .build();

    FileUpload fileUpload = transferManager.uploadFile(uploadFileRequest);

    CompletedFileUpload uploadResult = fileUpload.completionFuture().join();
    return uploadResult.response().eTag();
}

Upload an object to a bucket and set tags using an S3Client.

/**
 * Puts tags on an Amazon S3 object.
 *
 * @param s3         An {@link S3Client} object that represents the Amazon S3 client.
 * @param bucketName The name of the Amazon S3 bucket.
 * @param objectKey  The key of the Amazon S3 object.
 * @param objectPath The file path of the object to be uploaded.
 */
public static void putS3ObjectTags(S3Client s3, String bucketName, String objectKey, String objectPath) {
    try {
        Tag tag1 = Tag.builder()
            .key("Tag 1")
            .value("This is tag 1")
            .build();

        Tag tag2 = Tag.builder()
            .key("Tag 2")
            .value("This is tag 2")
            .build();

        List<Tag> tags = new ArrayList<>();
        tags.add(tag1);
        tags.add(tag2);

        Tagging allTags = Tagging.builder()
            .tagSet(tags)
            .build();

        PutObjectRequest putOb = PutObjectRequest.builder()
            .bucket(bucketName)
            .key(objectKey)
            .tagging(allTags)
            .build();

        s3.putObject(putOb, RequestBody.fromBytes(getObjectFile(objectPath)));

    } catch (S3Exception e) {
        System.err.println(e.getMessage());
        System.exit(1);
    }
}

/**
 * Updates the tags associated with an object in an Amazon S3 bucket.
 *
 * @param s3         an instance of the S3Client class, which is used to interact with the Amazon S3 service
 * @param bucketName the name of the S3 bucket containing the object
 * @param objectKey  the key (or name) of the object in the S3 bucket
 * @throws S3Exception if there is an error updating the object's tags
 */
public static void updateObjectTags(S3Client s3, String bucketName, String objectKey) {
    try {
        GetObjectTaggingRequest taggingRequest = GetObjectTaggingRequest.builder()
            .bucket(bucketName)
            .key(objectKey)
            .build();

        GetObjectTaggingResponse getTaggingRes = s3.getObjectTagging(taggingRequest);
        List<Tag> obTags = getTaggingRes.tagSet();
        for (Tag sinTag : obTags) {
            System.out.println("The tag key is: " + sinTag.key());
            System.out.println("The tag value is: " + sinTag.value());
        }

        // Replace the object's tags with two new tags.
        Tag tag3 = Tag.builder()
            .key("Tag 3")
            .value("This is tag 3")
            .build();

        Tag tag4 = Tag.builder()
            .key("Tag 4")
            .value("This is tag 4")
            .build();

        List<Tag> tags = new ArrayList<>();
        tags.add(tag3);
        tags.add(tag4);

        Tagging updatedTags = Tagging.builder()
            .tagSet(tags)
            .build();

        PutObjectTaggingRequest taggingRequest1 = PutObjectTaggingRequest.builder()
            .bucket(bucketName)
            .key(objectKey)
            .tagging(updatedTags)
            .build();

        s3.putObjectTagging(taggingRequest1);
        GetObjectTaggingResponse getTaggingRes2 = s3.getObjectTagging(taggingRequest);
        List<Tag> modTags = getTaggingRes2.tagSet();
        for (Tag sinTag : modTags) {
            System.out.println("The tag key is: " + sinTag.key());
            System.out.println("The tag value is: " + sinTag.value());
        }

    } catch (S3Exception e) {
        System.err.println(e.getMessage());
        System.exit(1);
    }
}

/**
 * Retrieves the contents of a file as a byte array.
 *
 * @param filePath the path of the file to be read
 * @return a byte array containing the contents of the file, or null if an error occurs
 */
private static byte[] getObjectFile(String filePath) {
    FileInputStream fileInputStream = null;
    byte[] bytesArray = null;

    try {
        File file = new File(filePath);
        bytesArray = new byte[(int) file.length()];
        fileInputStream = new FileInputStream(file);
        fileInputStream.read(bytesArray);

    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (fileInputStream != null) {
            try {
                fileInputStream.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
    return bytesArray;
}

Upload an object to a bucket and set metadata using an S3Client.

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.services.s3.model.S3Exception;
import java.io.File;
import java.util.HashMap;
import java.util.Map;

/**
 * Before running this Java V2 code example, set up your development
 * environment, including your credentials.
 * <p>
 * For more information, see the following documentation topic:
 * <p>
 * https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/get-started.html
 */
public class PutObjectMetadata {
    public static void main(String[] args) {
        final String USAGE = """

            Usage:
              <bucketName> <objectKey> <objectPath>\s

            Where:
              bucketName - The Amazon S3 bucket to upload an object into.
              objectKey - The object to upload (for example, book.pdf).
              objectPath - The path where the file is located (for example, C:/AWS/book2.pdf).\s
            """;

        if (args.length != 3) {
            System.out.println(USAGE);
            System.exit(1);
        }

        String bucketName = args[0];
        String objectKey = args[1];
        String objectPath = args[2];
        System.out.println("Putting object " + objectKey + " into bucket " + bucketName);
        System.out.println("  in bucket: " + bucketName);
        Region region = Region.US_EAST_1;
        S3Client s3 = S3Client.builder()
            .region(region)
            .build();

        putS3Object(s3, bucketName, objectKey, objectPath);
        s3.close();
    }

    /**
     * Uploads an object to an Amazon S3 bucket with metadata.
     *
     * @param s3         the S3Client object used to interact with the Amazon S3 service
     * @param bucketName the name of the S3 bucket to upload the object to
     * @param objectKey  the name of the object to be uploaded
     * @param objectPath the local file path of the object to be uploaded
     */
    public static void putS3Object(S3Client s3, String bucketName, String objectKey, String objectPath) {
        try {
            Map<String, String> metadata = new HashMap<>();
            metadata.put("author", "Mary Doe");
            metadata.put("version", "1.0.0.0");
            PutObjectRequest putOb = PutObjectRequest.builder()
                .bucket(bucketName)
                .key(objectKey)
                .metadata(metadata)
                .build();

            s3.putObject(putOb, RequestBody.fromFile(new File(objectPath)));
            System.out.println("Successfully placed " + objectKey + " into bucket " + bucketName);

        } catch (S3Exception e) {
            System.err.println(e.getMessage());
            System.exit(1);
        }
    }
}

Upload an object to a bucket and set an object retention value using an S3Client.

import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRetentionRequest;
import software.amazon.awssdk.services.s3.model.ObjectLockRetention;
import software.amazon.awssdk.services.s3.model.S3Exception;
import java.time.Instant;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.ZoneOffset;

/**
 * Before running this Java V2 code example, set up your development
 * environment, including your credentials.
 * <p>
 * For more information, see the following documentation topic:
 * <p>
 * https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/get-started.html
 */
public class PutObjectRetention {
    public static void main(String[] args) {
        final String usage = """

            Usage:
              <key> <bucketName>\s

            Where:
              key - The name of the object (for example, book.pdf).\s
              bucketName - The Amazon S3 bucket name that contains the object (for example, bucket1).\s
            """;

        if (args.length != 2) {
            System.out.println(usage);
            System.exit(1);
        }

        String key = args[0];
        String bucketName = args[1];
        Region region = Region.US_EAST_1;
        S3Client s3 = S3Client.builder()
            .region(region)
            .build();

        setRentionPeriod(s3, key, bucketName);
        s3.close();
    }

    /**
     * Sets the retention period for an object in an Amazon S3 bucket.
     *
     * @param s3     the S3Client object used to interact with the Amazon S3 service
     * @param key    the key (name) of the object in the S3 bucket
     * @param bucket the name of the S3 bucket where the object is stored
     *
     * @throws S3Exception if an error occurs while setting the object retention period
     */
    public static void setRentionPeriod(S3Client s3, String key, String bucket) {
        try {
            LocalDate localDate = LocalDate.parse("2020-07-17");
            LocalDateTime localDateTime = localDate.atStartOfDay();
            Instant instant = localDateTime.toInstant(ZoneOffset.UTC);

            ObjectLockRetention lockRetention = ObjectLockRetention.builder()
                .mode("COMPLIANCE")
                .retainUntilDate(instant)
                .build();

            PutObjectRetentionRequest retentionRequest = PutObjectRetentionRequest.builder()
                .bucket(bucket)
                .key(key)
                .bypassGovernanceRetention(true)
                .retention(lockRetention)
                .build();

            // To set Retention on an object, the Amazon S3 bucket must support object
            // locking, otherwise an exception is thrown.
            s3.putObjectRetention(retentionRequest);
            System.out.print("An object retention configuration was successfully placed on the object");

        } catch (S3Exception e) {
            System.err.println(e.awsErrorDetails().errorMessage());
            System.exit(1);
        }
    }
}
  • For API details, see PutObject in the AWS SDK for Java 2.x API Reference.

JavaScript
SDK for JavaScript (v3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Upload the object.

import { readFile } from "node:fs/promises";

import {
  PutObjectCommand,
  S3Client,
  S3ServiceException,
} from "@aws-sdk/client-s3";

/**
 * Upload a file to an S3 bucket.
 * @param {{ bucketName: string, key: string, filePath: string }}
 */
export const main = async ({ bucketName, key, filePath }) => {
  const client = new S3Client({});
  const command = new PutObjectCommand({
    Bucket: bucketName,
    Key: key,
    Body: await readFile(filePath),
  });

  try {
    const response = await client.send(command);
    console.log(response);
  } catch (caught) {
    if (
      caught instanceof S3ServiceException &&
      caught.name === "EntityTooLarge"
    ) {
      console.error(
        `Error from S3 while uploading object to ${bucketName}. \
The object was too large. To upload objects larger than 5GB, use the S3 console (160GB max) \
or the multipart upload API (5TB max).`,
      );
    } else if (caught instanceof S3ServiceException) {
      console.error(
        `Error from S3 while uploading object to ${bucketName}. ${caught.name}: ${caught.message}`,
      );
    } else {
      throw caught;
    }
  }
};

Kotlin
SDK for Kotlin
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

suspend fun putS3Object(
    bucketName: String,
    objectKey: String,
    objectPath: String,
) {
    val metadataVal = mutableMapOf<String, String>()
    metadataVal["myVal"] = "test"

    val request =
        PutObjectRequest {
            bucket = bucketName
            key = objectKey
            metadata = metadataVal
            body = File(objectPath).asByteStream()
        }

    S3Client { region = "us-east-1" }.use { s3 ->
        val response = s3.putObject(request)
        println("Tag information is ${response.eTag}")
    }
}
  • For API details, see PutObject in the AWS SDK for Kotlin API reference.

PHP
SDK for PHP
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Upload an object to a bucket.

$s3client = new Aws\S3\S3Client(['region' => 'us-west-2']);

$fileName = __DIR__ . "/local-file-" . uniqid();
try {
    $this->s3client->putObject([
        'Bucket' => $this->bucketName,
        'Key' => $fileName,
        'SourceFile' => __DIR__ . '/testfile.txt'
    ]);
    echo "Uploaded $fileName to $this->bucketName.\n";
} catch (Exception $exception) {
    echo "Failed to upload $fileName with error: " . $exception->getMessage();
    exit("Please fix error with file upload before continuing.");
}
  • For API details, see PutObject in the AWS SDK for PHP API Reference.

PowerShell
Tools for PowerShell

Example 1: This command uploads the single file "local-sample.txt" to Amazon S3, creating an object with the key "sample.txt" in the bucket "amzn-s3-demo-bucket".

Write-S3Object -BucketName amzn-s3-demo-bucket -Key "sample.txt" -File .\local-sample.txt

Example 2: This command uploads the single file "sample.txt" to Amazon S3, creating an object with the key "sample.txt" in the bucket "amzn-s3-demo-bucket". If the -Key parameter is not provided, the file name is used as the S3 object key.

Write-S3Object -BucketName amzn-s3-demo-bucket -File .\sample.txt

Example 3: This command uploads the single file "local-sample.txt" to Amazon S3, creating an object with the key "prefix/to/sample.txt" in the bucket "amzn-s3-demo-bucket".

Write-S3Object -BucketName amzn-s3-demo-bucket -Key "prefix/to/sample.txt" -File .\local-sample.txt

Example 4: This command uploads all files in the subdirectory "Scripts" to the bucket "amzn-s3-demo-bucket" and applies the common key prefix "SampleScripts" to each object. Each uploaded file has a key of "SampleScripts/filename", where "filename" varies.

Write-S3Object -BucketName amzn-s3-demo-bucket -Folder .\Scripts -KeyPrefix SampleScripts\

Example 5: This command uploads all *.ps1 files in the local directory "Scripts" to the bucket "amzn-s3-demo-bucket" and applies the common key prefix "SampleScripts" to each object. Each uploaded file has a key of "SampleScripts/filename.ps1", where "filename" varies.

Write-S3Object -BucketName amzn-s3-demo-bucket -Folder .\Scripts -KeyPrefix SampleScripts\ -SearchPattern *.ps1

Example 6: This command creates a new S3 object containing the specified content string, with the key "sample.txt".

Write-S3Object -BucketName amzn-s3-demo-bucket -Key "sample.txt" -Content "object contents"

Example 7: This command uploads the specified file (the file name is used as the key) and applies the specified tags to the new object.

Write-S3Object -BucketName amzn-s3-demo-bucket -File "sample.txt" -TagSet @{Key="key1";Value="value1"},@{Key="key2";Value="value2"}

Example 8: This command recursively uploads the specified folder and applies the specified tags to all the new objects.

Write-S3Object -BucketName amzn-s3-demo-bucket -Folder . -KeyPrefix "TaggedFiles" -Recurse -TagSet @{Key="key1";Value="value1"},@{Key="key2";Value="value2"}
  • For API details, see PutObject in the AWS Tools for PowerShell Cmdlet Reference.

Python
SDK for Python (Boto3)
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

class ObjectWrapper:
    """Encapsulates S3 object actions."""

    def __init__(self, s3_object):
        """
        :param s3_object: A Boto3 Object resource. This is a high-level resource in Boto3
                          that wraps object actions in a class-like structure.
        """
        self.object = s3_object
        self.key = self.object.key

    def put(self, data):
        """
        Upload data to the object.

        :param data: The data to upload. This can either be bytes or a string. When this
                     argument is a string, it is interpreted as a file name, which is
                     opened in read bytes mode.
        """
        put_data = data
        if isinstance(data, str):
            try:
                put_data = open(data, "rb")
            except IOError:
                logger.exception("Expected file name or binary data, got '%s'.", data)
                raise

        try:
            self.object.put(Body=put_data)
            self.object.wait_until_exists()
            logger.info(
                "Put object '%s' to bucket '%s'.",
                self.object.key,
                self.object.bucket_name,
            )
        except ClientError:
            logger.exception(
                "Couldn't put object '%s' to bucket '%s'.",
                self.object.key,
                self.object.bucket_name,
            )
            raise
        finally:
            if getattr(put_data, "close", None):
                put_data.close()
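
The excerpt above expects a Boto3 Object resource and assumes that logger and ClientError are defined at module level in the full program. The following is a minimal usage sketch (not part of the original example), assuming Boto3 is configured with credentials and a default Region; the bucket name and object key are placeholders.

import logging

import boto3
from botocore.exceptions import ClientError

# Module-level names the ObjectWrapper excerpt relies on.
logger = logging.getLogger(__name__)

# Hypothetical bucket and key, for illustration only.
bucket_name = "amzn-s3-demo-bucket"
object_key = "sample.txt"

# Build a Boto3 Object resource, wrap it, and upload bytes.
# Passing a string instead of bytes would be treated as a local file name
# and opened in binary mode by the wrapper.
s3_resource = boto3.resource("s3")
wrapper = ObjectWrapper(s3_resource.Object(bucket_name, object_key))
wrapper.put(b"Hello, Amazon S3!")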

Upload an object using a conditional request.

class S3ConditionalRequests:
    """Encapsulates S3 conditional request operations."""

    def __init__(self, s3_client):
        self.s3 = s3_client

    @classmethod
    def from_client(cls):
        """
        Instantiates this class from a Boto3 client.
        """
        s3_client = boto3.client("s3")
        return cls(s3_client)

    def put_object_conditional(self, object_key: str, source_bucket: str, data: bytes):
        """
        Uploads an object to Amazon S3 with a conditional request. Prevents overwrite
        using an IfNoneMatch condition for the object key.

        :param object_key: The key of the object to upload.
        :param source_bucket: The source bucket of the object.
        :param data: The data to upload.
        """
        try:
            self.s3.put_object(
                Bucket=source_bucket, Key=object_key, Body=data, IfNoneMatch="*"
            )
            print(
                f"\tConditional write successful for key {object_key} in bucket {source_bucket}."
            )
        except ClientError as e:
            error_code = e.response["Error"]["Code"]
            if error_code == "PreconditionFailed":
                print("\tConditional write failed: Precondition failed")
            else:
                logger.error(f"Unexpected error: {error_code}")
                raise
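
As a rough illustration of the call pattern (not part of the original example), the class can be created from a default Boto3 client and invoked twice against the same key; with IfNoneMatch="*", only the first write succeeds and the second one hits the PreconditionFailed branch. The bucket and key names below are placeholders, and the excerpt assumes boto3, ClientError, and logger are available at module level.

import logging

import boto3
from botocore.exceptions import ClientError

logger = logging.getLogger(__name__)

# Hypothetical bucket and key, for illustration only.
demo_bucket = "amzn-s3-demo-bucket"
demo_key = "conditional-sample.txt"

requests_wrapper = S3ConditionalRequests.from_client()

# First write succeeds because the key does not exist yet.
requests_wrapper.put_object_conditional(demo_key, demo_bucket, b"first version")

# Second write is rejected with PreconditionFailed because the object now exists.
requests_wrapper.put_object_conditional(demo_key, demo_bucket, b"second version")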
  • For API details, see PutObject in the AWS SDK for Python (Boto3) API Reference.

Ruby
SDK for Ruby
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

Upload a file using a managed uploader (Object.upload_file).

require 'aws-sdk-s3'

# Wraps Amazon S3 object actions.
class ObjectUploadFileWrapper
  attr_reader :object

  # @param object [Aws::S3::Object] An existing Amazon S3 object.
  def initialize(object)
    @object = object
  end

  # Uploads a file to an Amazon S3 object by using a managed uploader.
  #
  # @param file_path [String] The path to the file to upload.
  # @return [Boolean] True when the file is uploaded; otherwise false.
  def upload_file(file_path)
    @object.upload_file(file_path)
    true
  rescue Aws::Errors::ServiceError => e
    puts "Couldn't upload file #{file_path} to #{@object.key}. Here's why: #{e.message}"
    false
  end
end

# Example usage:
def run_demo
  bucket_name = "amzn-s3-demo-bucket"
  object_key = "my-uploaded-file"
  file_path = "object_upload_file.rb"

  wrapper = ObjectUploadFileWrapper.new(Aws::S3::Object.new(bucket_name, object_key))
  return unless wrapper.upload_file(file_path)

  puts "File #{file_path} successfully uploaded to #{bucket_name}:#{object_key}."
end

run_demo if $PROGRAM_NAME == __FILE__

Upload a file using Object.put.

require 'aws-sdk-s3'

# Wraps Amazon S3 object actions.
class ObjectPutWrapper
  attr_reader :object

  # @param object [Aws::S3::Object] An existing Amazon S3 object.
  def initialize(object)
    @object = object
  end

  def put_object(source_file_path)
    File.open(source_file_path, 'rb') do |file|
      @object.put(body: file)
    end
    true
  rescue Aws::Errors::ServiceError => e
    puts "Couldn't put #{source_file_path} to #{object.key}. Here's why: #{e.message}"
    false
  end
end

# Example usage:
def run_demo
  bucket_name = "amzn-s3-demo-bucket"
  object_key = "my-object-key"
  file_path = "my-local-file.txt"

  wrapper = ObjectPutWrapper.new(Aws::S3::Object.new(bucket_name, object_key))
  success = wrapper.put_object(file_path)
  return unless success

  puts "Put file #{file_path} into #{object_key} in #{bucket_name}."
end

run_demo if $PROGRAM_NAME == __FILE__

Upload a file using Object.put and add server-side encryption.

require 'aws-sdk-s3'

# Wraps Amazon S3 object actions.
class ObjectPutSseWrapper
  attr_reader :object

  # @param object [Aws::S3::Object] An existing Amazon S3 object.
  def initialize(object)
    @object = object
  end

  def put_object_encrypted(object_content, encryption)
    @object.put(body: object_content, server_side_encryption: encryption)
    true
  rescue Aws::Errors::ServiceError => e
    puts "Couldn't put your content to #{object.key}. Here's why: #{e.message}"
    false
  end
end

# Example usage:
def run_demo
  bucket_name = "amzn-s3-demo-bucket"
  object_key = "my-encrypted-content"
  object_content = "This is my super-secret content."
  encryption = "AES256"

  wrapper = ObjectPutSseWrapper.new(Aws::S3::Object.new(bucket_name, object_key))
  return unless wrapper.put_object_encrypted(object_content, encryption)

  puts "Put your content into #{bucket_name}:#{object_key} and encrypted it with #{encryption}."
end

run_demo if $PROGRAM_NAME == __FILE__
  • For API details, see PutObject in the AWS SDK for Ruby API Reference.

Rust
SDK for Rust
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

pub async fn upload_object(
    client: &aws_sdk_s3::Client,
    bucket_name: &str,
    file_name: &str,
    key: &str,
) -> Result<aws_sdk_s3::operation::put_object::PutObjectOutput, S3ExampleError> {
    let body = aws_sdk_s3::primitives::ByteStream::from_path(std::path::Path::new(file_name)).await;
    client
        .put_object()
        .bucket(bucket_name)
        .key(key)
        .body(body.unwrap())
        .send()
        .await
        .map_err(S3ExampleError::from)
}
  • For API details, see PutObject in the AWS SDK for Rust API reference.

SAP ABAP
SDK for SAP ABAP
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

"Get contents of file from application server." DATA lv_body TYPE xstring. OPEN DATASET iv_file_name FOR INPUT IN BINARY MODE. READ DATASET iv_file_name INTO lv_body. CLOSE DATASET iv_file_name. "Upload/put an object to an S3 bucket." TRY. lo_s3->putobject( iv_bucket = iv_bucket_name iv_key = iv_file_name iv_body = lv_body ). MESSAGE 'Object uploaded to S3 bucket.' TYPE 'I'. CATCH /aws1/cx_s3_nosuchbucket. MESSAGE 'Bucket does not exist.' TYPE 'E'. ENDTRY.
  • For API details, see PutObject in the AWS SDK for SAP ABAP API reference.

Swift
SDK for Swift
Note

There's more on GitHub. Find the complete example and learn how to set up and run in the AWS Code Examples Repository.

import AWSS3
import Smithy

public func uploadFile(bucket: String, key: String, file: String) async throws {
    let fileUrl = URL(fileURLWithPath: file)
    do {
        let fileData = try Data(contentsOf: fileUrl)
        let dataStream = ByteStream.data(fileData)

        let input = PutObjectInput(
            body: dataStream,
            bucket: bucket,
            key: key
        )

        _ = try await client.putObject(input: input)
    } catch {
        print("ERROR: ", dump(error, name: "Putting an object."))
        throw error
    }
}
import AWSS3
import Smithy

public func createFile(bucket: String, key: String, withData data: Data) async throws {
    let dataStream = ByteStream.data(data)

    let input = PutObjectInput(
        body: dataStream,
        bucket: bucket,
        key: key
    )

    do {
        _ = try await client.putObject(input: input)
    } catch {
        print("ERROR: ", dump(error, name: "Putting an object."))
        throw error
    }
}
  • For API details, see PutObject in the AWS SDK for Swift API reference.