AWS Code Sample Catalog

MultipartCopy.php

MultipartCopy.php demonstrates how to copy Amazon S3 objects that are larger than 5 GB (up to the 5 TB object size limit) by using a multipart copy.

<?php
/**
 * Copyright 2010-2018 Amazon.com, Inc. or its affiliates. All Rights Reserved.
 *
 * This file is licensed under the Apache License, Version 2.0 (the "License").
 * You may not use this file except in compliance with the License. A copy of
 * the License is located at
 *
 * http://aws.amazon.com/apache2.0/
 *
 * This file is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
 * CONDITIONS OF ANY KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations under the License.
 *
 * ABOUT THIS PHP SAMPLE: This sample is part of the SDK for PHP Developer Guide topic at
 * https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/service/s3-multipart-upload.html
 */

require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;
use Aws\S3\MultipartCopy;
use Aws\Exception\MultipartUploadException;

// Create an S3Client.
$s3Client = new S3Client([
    'profile' => 'default',
    'region'  => 'us-west-2',
    'version' => '2006-03-01'
]);

// Copy an object within Amazon S3 by using a multipart copy.
// The source is given as /<source-bucket>/<source-key>, optionally followed
// by ?versionId=<version-id> to copy a specific version of the object.
$copier = new MultipartCopy($s3Client, '/bucket/key?versionId=foo', [
    'bucket' => 'your-bucket',
    'key'    => 'my-file.zip',
]);

try {
    $result = $copier->copy();
    echo "Copy complete: {$result['ObjectURL']}\n";
} catch (MultipartUploadException $e) {
    echo $e->getMessage() . "\n";
}
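A multipart copy is only required for objects larger than 5 GB. For objects of 5 GB or less, a single CopyObject request is enough. The following is a minimal sketch under that assumption; the bucket and key names ('source-bucket', 'source-key', 'your-bucket', 'my-file.zip') are placeholders, not part of the sample above.

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;

// Create an S3Client.
$s3Client = new S3Client([
    'profile' => 'default',
    'region'  => 'us-west-2',
    'version' => '2006-03-01'
]);

// Copy an object of 5 GB or less in a single CopyObject request.
// CopySource is given as <source-bucket>/<source-key> (placeholder names).
try {
    $result = $s3Client->copyObject([
        'Bucket'     => 'your-bucket',
        'Key'        => 'my-file.zip',
        'CopySource' => 'source-bucket/source-key',
    ]);
    echo "Copy complete.\n";
} catch (AwsException $e) {
    echo $e->getMessage() . "\n";
}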

Sample Details

Service: s3

Last tested: 2018-09-20

Author: jschwarzwalder (AWS)

Type: full-example
