What Is Amazon S3?
Amazon Simple Storage Service, popularly known as S3, is a storage service for the Internet. It provides a simple web services interface that you can use to store and retrieve any amount of data, at any time, from anywhere on the Internet.
Store Data in Buckets:
Data is stored in a container called a bucket. You can upload as many objects as you like into an Amazon S3 bucket, and each object can contain up to 5 TB of data. For example, if the object named photos/puppy.jpg is stored in the johnsmith bucket, then it can be addressed using the URL https://johnsmith.s3.amazonaws.com/photos/puppy.jpg
Download Data:
Download your data at any time you like, or allow others to do the same.
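For example, once the SDK client from connection.php (introduced later in this article) is available, downloading an object to disk is a single call. A minimal sketch; the object key and local path are placeholders:

<?php
require_once 'connection.php';

// Fetch an object and write its body straight to a local file.
$s3->getObject([
    'Bucket' => $config['bucket'],
    'Key'    => 'uploads-azeez/photo.jpg',   // placeholder key
    'SaveAs' => 'downloads/photo.jpg'        // placeholder local path
]);
?>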
Permissions:
Grant or deny access to others who want to upload data to or download data from your Amazon S3 bucket.
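With the PHP SDK (configured later in this article), object-level permissions can be adjusted after upload. A hedged sketch using the PutObjectAcl operation; the key name is a placeholder:

<?php
require_once 'connection.php';

// Make a previously public object private again. Canned ACLs such as
// 'private', 'public-read', and 'authenticated-read' are accepted.
$s3->putObjectAcl([
    'Bucket' => $config['bucket'],
    'Key'    => 'uploads-azeez/photo.jpg',   // placeholder key
    'ACL'    => 'private'
]);
?>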
Here I will be using the PHP AWS SDK on a Windows machine to upload, download, and delete data from S3.
To access any AWS service, you first need an AWS account.
Log in to Amazon AWS and install/configure the AWS SDK on your local system.
AWS accepts only HTTPS requests, so make sure SSL is set up in your local WAMP/XAMPP (a dummy self-signed certificate is fine for development).
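Alternatively, if you are on version 3 of the SDK and the local cURL setup cannot verify AWS's certificate, you can point the client at a CA bundle through Guzzle's HTTP options. A hedged sketch; the bundle path is a placeholder, and verification should never be disabled outside local development:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// SDK v3 style client with an explicit CA bundle for a local XAMPP setup.
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
    'http'    => ['verify' => 'C:\xampp\php\extras\ssl\cacert.pem']  // placeholder path
]);
?>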
Creating a Bucket Using CreateBucket:
To upload your data (photos, videos, documents, etc.), you first create a bucket in one of the AWS Regions. By default, you can create up to 100 buckets in each of your AWS accounts. The steps below use the S3 console; a programmatic sketch using the SDK follows the list.
Steps to Create a Bucket:
Go to aws.amazon.com and login using your credentials.
Go to Services -> Storage -> S3
Click on create bucket button.
1. Provide the bucket name and Region
- Bucket name – storing-data
- Region – US East (N. Virginia)
2. Set bucket properties by enabling/disabling
- Enable versioning
- Server access logging
- Default encryption – AES-256 (Advanced Encryption Standard), i.e. server-side encryption with Amazon S3-managed keys (SSE-S3)
3. Set permissions
- Manage public permissions – Do not grant public read access to this bucket (Recommended)
- Manage system permissions – Do not grant Amazon S3 Log Delivery group write access to this bucket
Finally click on Create bucket.
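The same bucket can also be created programmatically, which is what the CreateBucket operation in the section title refers to. A minimal sketch, assuming the client from connection.php shown later in this article:

<?php
require_once 'connection.php';

// Create the bucket and block until S3 reports that it exists.
$s3->createBucket(['Bucket' => 'storing-data']);
$s3->waitUntil('BucketExists', ['Bucket' => 'storing-data']);

echo "Bucket created";
?>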
How Do You Get an Access Key for Amazon S3?
Use an access key ID and secret access key to authenticate requests to your Amazon Web Services (AWS) account from the SDK. A secret access key cannot be retrieved after it has been created; if it is lost, it cannot be recovered and a new access key must be created.
Follow the steps below to create a new secret access key for an AWS account:
- Sign in to the AWS Management Console and open the IAM console.
- In the navigation pane, choose Users.
- Add a checkmark next to the name of the desired user, and then choose User Actions from the top.
Note: The selected user must have read and write access to the AWS S3 bucket.
Click on Manage Access Keys:
- Click on Create Access Key.
Note: Each IAM user can have at most two access keys at a time. If the secret access key is lost, one of the existing access keys must be deleted and a new one created.
- Click on Show User Security Credentials.
- Copy and paste the Access Key ID and Secret Access Key values, or click on Download Credentials to download them as a CSV file.
Code to Upload, Retrieve and Delete an Object in a Bucket:
Let's write the code to upload, retrieve, and delete an object in a bucket. Either WAMP or XAMPP can be used; I have used XAMPP.
Create a project folder named s3 under the web root (C:\xampp\htdocs\s3). Based on the files used in this article, the folder structure should look like this (the AWS SDK resides inside the vendor folder):

s3/
  config.php
  connection.php
  start.php
  upload.php
  listing.php
  files/        (temporary local copies of uploaded files)
  js/jquery.js
  vendor/       (AWS SDK for PHP; provides vendor/autoload.php)
Define the S3 configuration in C:\xampp\htdocs\s3\config.php:
<?php define("KEY", 'AKIAJFWC3NTWAFASA'); define("SECRET", 'MgAduMo4gBCXM+kVr/ZADwefdsFASDASD'); define("BUCKET", storing-data'); ?>
Use the AWS services provided by the AWS SDK in C:\xampp\htdocs\s3\connection.php:
<?php
use Aws\S3\S3Client;

require_once 'vendor/autoload.php';
require_once 'config.php';

$config = ['key' => KEY, 'secret' => SECRET, 'bucket' => BUCKET];
$s3 = S3Client::factory($config);
?>
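Before wiring up the upload pages, it can help to confirm that the client and credentials actually work. A quick hedged check, assuming the bucket from config.php already exists:

<?php
require_once 'connection.php';

// doesBucketExist() returns true when the bucket is reachable with these credentials.
if ($s3->doesBucketExist($config['bucket'])) {
    echo "Connected to S3; bucket is reachable";
} else {
    echo "Bucket not found or credentials lack access";
}
?>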
Upload files using C:\xampp\htdocs\s3\start.php:
<html>
<head>
    <title>Upload Data</title>
    <script type="text/javascript" src="js/jquery.js"></script>
</head>
<body>
    <h3>Upload the files</h3>
    <form name="upload" action="upload.php" method="post" enctype="multipart/form-data">
        <input type="file" name="uploaduser" id="uploaduser" />
        <input type="submit" name="submit" value="upload"/>
    </form>
    <h3>List of items uploaded</h3>
    <?php include_once 'listing.php'; ?>
</body>
</html>
Upload action page to capture and move the object: C:\xampp\htdocs\s3\upload.php
<?php
use Aws\S3\Exception\S3Exception;

require_once 'connection.php';

if (isset($_FILES['uploaduser'])) {
    $files = $_FILES['uploaduser'];
    $name = $files['name'];
    $tmpName = $files['tmp_name'];
    $size = $files['size'];
    $extension = explode('.', $files['name']);
    $extension = strtolower(end($extension));
    $key = md5(uniqid());
    $tmp_file_name = "{$key}.{$extension}";
    $tmp_file_path = "files/{$tmp_file_name}";
    move_uploaded_file($tmpName, $tmp_file_path);

    try {
        $s3->putObject([
            'Bucket' => $config['bucket'],
            'Key'    => "uploads-azeez/{$name}",
            'Body'   => fopen($tmp_file_path, 'rb'),
            'ACL'    => 'public-read'
        ]);
        // Remove the file from the local folder.
        unlink($tmp_file_path);
    } catch (S3Exception $ex) {
        die("Error uploading the file to S3");
    }
    header("Location: start.php");
    exit();
} else if (isset($_POST) && !empty($_POST)) {
    $name = $_POST['key'];
    // Delete an object from the bucket.
    $s3->deleteObject([
        'Bucket' => $config['bucket'],
        'Key'    => "$name"
    ]);
}
List the uploaded objects using C:\xampp\htdocs\s3\listing.php:
<?php
require_once 'connection.php';

$objects = $s3->getIterator('ListObjects', [
    'Bucket' => $config['bucket'],
    'Prefix' => 'uploads-azeez/'
]);
?>
<html>
<head>
    <title>Listing Bucket data</title>
    <style>
        table, th, td { border: 1px solid black; border-collapse: collapse; }
    </style>
    <script type="text/javascript" src="js/jquery.js"></script>
    <script>
        function ConfirmDelete(key) {
            var x = confirm("Are you sure you want to delete?");
            if (x) {
                $.ajax({
                    url: 'upload.php',
                    type: "POST",
                    data: {'key': key},
                    success: function(response) {
                        console.log(response);
                        window.location.reload();
                    },
                    error: function(jqXHR, textStatus, errorThrown) {
                        console.log(textStatus, errorThrown);
                    }
                });
            } else {
                return false;
            }
        }
    </script>
</head>
<body>
    <table>
        <thead>
            <tr>
                <td>File Name</td>
                <td>Download Link</td>
                <td>Delete</td>
            </tr>
        </thead>
        <tbody>
            <?php foreach ($objects as $object): ?>
            <tr>
                <td><?php echo $object['Key']; ?></td>
                <td><a href="<?php echo $s3->getObjectUrl(BUCKET, $object['Key']); ?>" download="<?php echo $object['Key']; ?>">Download</a></td>
                <td><a href="" name="delete" onclick='ConfirmDelete("<?php echo $object['Key']; ?>")'>Delete</a></td>
            </tr>
            <?php endforeach; ?>
        </tbody>
    </table>
</body>
</html>
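Note that the download links above rely on the objects being uploaded with a public-read ACL, while the bucket settings earlier recommended not granting public read access. If you keep objects private, a time-limited pre-signed URL can be generated instead; with the SDK v2 client used here, getObjectUrl() accepts an expiration. A minimal sketch, with a placeholder key:

<?php
require_once 'connection.php';

// Generate a pre-signed download link valid for 10 minutes,
// so the object itself can stay private.
$url = $s3->getObjectUrl($config['bucket'], 'uploads-azeez/photo.jpg', '+10 minutes');
echo $url;
?>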
Finally, this is how the application looks (screenshot: the Amazon AWS S3 upload and listing page).
Upload Larger Files Using Multi-part:
You can upload large files to Amazon S3 in multiple parts. You must use a multipart upload for files larger than 5 GB. The AWS SDK for PHP exposes the MultipartUploader class that simplifies multipart uploads.
The upload method of the MultipartUploader class is best used for a simple multipart upload.
<?php
require 'vendor/autoload.php';
require_once 'config.php';

use Aws\Exception\MultipartUploadException;
use Aws\S3\MultipartUploader;
use Aws\S3\S3Client;

$bucket = BUCKET;
$keyname = $_POST['key'];

$s3 = new S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-1',
    // Reuse the keys defined in config.php.
    'credentials' => ['key' => KEY, 'secret' => SECRET]
]);

// Prepare the upload parameters.
$uploader = new MultipartUploader($s3, '/path/to/large/file.zip', [
    'bucket' => $bucket,
    'key'    => $keyname
]);

// Perform the upload.
try {
    $result = $uploader->upload();
    echo "Upload complete: {$result['ObjectURL']}" . PHP_EOL;
} catch (MultipartUploadException $e) {
    echo $e->getMessage() . PHP_EOL;
}
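If a part fails, the upload does not have to start over: MultipartUploadException carries the upload state, which can be passed back into a new MultipartUploader to resume. A hedged sketch that could replace the single try/catch in the example above:

// Retry until the upload succeeds, resuming from the saved state on failure.
do {
    try {
        $result = $uploader->upload();
    } catch (MultipartUploadException $e) {
        // Rebuild the uploader from the state captured in the exception.
        $uploader = new MultipartUploader($s3, '/path/to/large/file.zip', [
            'state' => $e->getState()
        ]);
    }
} while (!isset($result));

echo "Upload complete: {$result['ObjectURL']}" . PHP_EOL;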