How to Manage AWS S3 Buckets and Objects from the CLI

Akkireddy
8 min read · Jul 14, 2020

Managing AWS S3 buckets and objects is easy from the CLI. This article explains the basics of managing S3 buckets and their objects with the aws s3 CLI, using the following examples:


For quick reference, here are the commands. For details on how these commands work, read the rest of the article.

# s3 make bucket (create bucket)

aws s3 mb s3://tvgbucket --region us-east-1

# s3 remove bucket

aws s3 rb s3://tvgbucket

aws s3 rb s3://tvgbucket --force

# s3 ls commands

aws s3 ls

aws s3 ls s3://tvgbucket

aws s3 ls s3://tvgbucket --recursive

aws s3 ls s3://tvgbucket --recursive --human-readable --summarize

# s3 cp commands

aws s3 cp getdata.php s3://tvgbucket

aws s3 cp /local/dir/data s3://tvgbucket --recursive

aws s3 cp s3://tvgbucket/getdata.php /local/dir/data

aws s3 cp s3://tvgbucket/ /local/dir/data --recursive

aws s3 cp s3://tvgbucket/init.xml s3://backup-bucket

aws s3 cp s3://tvgbucket s3://backup-bucket --recursive

# s3 mv commands

aws s3 mv source.json s3://tvgbucket

aws s3 mv s3://tvgbucket/getdata.php /home/project

aws s3 mv s3://tvgbucket/source.json s3://backup-bucket

aws s3 mv /local/dir/data s3://tvgbucket/data --recursive

aws s3 mv s3://tvgbucket s3://backup-bucket --recursive

# s3 rm commands

aws s3 rm s3://tvgbucket/queries.txt

aws s3 rm s3://tvgbucket --recursive

# s3 sync commands

aws s3 sync backup s3://tvgbucket

aws s3 sync s3://tvgbucket/backup /tmp/backup

aws s3 sync s3://tvgbucket s3://backup-bucket

# s3 bucket website

aws s3 website s3://tvgbucket/ --index-document index.html --error-document error.html

# s3 presign url (default 3600 seconds)

aws s3 presign s3://tvgbucket/dnsrecords.txt

aws s3 presign s3://tvgbucket/dnsrecords.txt --expires-in 60

1. Create New S3 Bucket

Use the mb option for this. mb stands for make bucket.

The following will create a new S3 bucket:

$ aws s3 mb s3://tvgbucket
make_bucket: tvgbucket

In the above example, the bucket is created in the us-east-1 region, as that is what is specified in the user’s config file as shown below.

$ cat ~/.aws/config
[profile ramesh]
region = us-east-1
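Instead of reading the config file directly, you can ask the CLI which region it resolves. A hedged sketch (the profile name "ramesh" matches the config shown above; the fallback message covers machines where the aws CLI or that profile is not set up):

```shell
# Ask the CLI which region it resolves for a given profile.
# Falls back to a note when the aws CLI or profile is unavailable.
region=$(aws configure get region --profile ramesh 2>/dev/null || true)
echo "${region:-no region configured (or aws CLI not on PATH)}"
```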

2. Create New S3 Bucket — Different Region

To create a bucket in a specific region (different from the one in your config file), use the --region option as shown below.

$ aws s3 mb s3://tvgbucket --region us-west-2
make_bucket: tvgbucket

3. Delete S3 Bucket (That is empty)

Use the rb option for this. rb stands for remove bucket.

The following deletes the given bucket.

$ aws s3 rb s3://tvgbucket
remove_bucket: tvgbucket

If the bucket you are trying to delete doesn't exist, you'll get the following error message.

$ aws s3 rb s3://tvgbucket1

remove_bucket failed: s3://tvgbucket1 An error occurred (NoSuchBucket) when calling the DeleteBucket operation: The specified bucket does not exist

4. Delete S3 Bucket (And all its objects)

If the bucket contains any objects, you'll get the following error message:

$ aws s3 rb s3://tvgbucket

remove_bucket failed: s3://tvgbucket An error occurred (BucketNotEmpty) when calling the DeleteBucket operation: The bucket you tried to delete is not empty

To delete a bucket along with all its objects, use the --force option as shown below.

$ aws s3 rb s3://tvgbucket --force

delete: s3://tvgbucket/demo/getdata.php

delete: s3://tvgbucket/ipallow.txt

delete: s3://tvgbucket/demo/servers.txt

delete: s3://tvgbucket/demo/

remove_bucket: tvgbucket

5. List All S3 Buckets

To view all the buckets owned by the user, execute the following ls command.

$ aws s3 ls

2020-02-06 11:38:55 tvgbucket

2019-12-18 18:02:27 etclinux

2019-12-08 18:05:15 readynas

In the above output, the timestamp is the date the bucket was created, displayed adjusted to your local timezone.

The following command is the same as the above:

aws s3 ls s3://

6. List All Objects in a Bucket

The following command displays all objects and prefixes under the tvgbucket.

$ aws s3 ls s3://tvgbucket

PRE config/

PRE data/

2020-04-07 11:38:20 13 getdata.php

2020-04-07 11:38:20 2546 ipallow.php

2020-04-07 11:38:20 9 license.php

2020-04-07 11:38:20 3677 servers.txt

In the above output:

  • Inside the tvgbucket, there are two folders, config and data (indicated by PRE)
  • PRE stands for the prefix of an S3 object.
  • Inside the tvgbucket, there are 4 files at the / level
  • The timestamp is the object's last-modified time
  • The 2nd column displays the size of the S3 object

Note: The above output doesn't display the contents of the sub-folders config and data.

7. List all Objects in a Bucket Recursively

To display all the objects recursively including the content of the sub-folders, execute the following command.

$ aws s3 ls s3://tvgbucket --recursive

2020-04-07 11:38:19 2777 config/init.xml

2020-04-07 11:38:20 52 config/support.txt

2020-04-07 11:38:20 1758 data/database.txt

2020-04-07 11:38:20 13 getdata.php

2020-04-07 11:38:20 2546 ipallow.php

2020-04-07 11:38:20 9 license.php

2020-04-07 11:38:20 3677 servers.txt

Note: When you list all the files, notice that there is no PRE indicator for the folders.

8. Total Size of All Objects in an S3 Bucket

You can identify the total size of all the files in your S3 bucket by using the combination of the following three options: --recursive, --human-readable, --summarize.

Note: The following displays both the total file size in the S3 bucket and the total number of files in the S3 bucket.

$ aws s3 ls s3://tvgbucket --recursive --human-readable --summarize

2020-04-07 11:38:19 2.7 KiB config/init.xml

2020-04-07 11:38:20 52 Bytes config/support.txt

2020-04-07 11:38:20 1.7 KiB data/database.txt

2020-04-07 11:38:20 13 Bytes getdata.php

2020-04-07 11:38:20 2.5 KiB ipallow.php

Total Objects: 7
Total Size: 10.6 KiB

In the above output:

  • The --recursive option makes sure that all the files in the S3 bucket, including those in sub-folders, are displayed
  • The --human-readable option displays file sizes in a readable format. Possible values you'll see in the size column are: Bytes/KiB/MiB/GiB/TiB/PiB/EiB
  • The --summarize option displays the last two lines in the above output, which show the total number of objects in the S3 bucket and the total size of all those objects
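As a hedged local sketch, the same totals that --summarize prints can be cross-checked by summing the size column of a saved listing with awk. The listing.txt below is sample data modeled on the output shown earlier, not a live bucket listing:

```shell
# Sample data modeled on `aws s3 ls --recursive` output.
cat <<'EOF' > listing.txt
2020-04-07 11:38:19 2777 config/init.xml
2020-04-07 11:38:20 52 config/support.txt
2020-04-07 11:38:20 13 getdata.php
EOF
# Field 3 is the object size in bytes; count the lines for the object total.
awk '{sum += $3; n++} END {printf "Total Objects: %d Total Size: %d\n", n, sum}' listing.txt
# → Total Objects: 3 Total Size: 2842
```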

9. Request Payer Listing

If a bucket is configured as a requester pays bucket, then by accessing its objects you agree that you are responsible for the payment of that request access. In this case, the bucket owner doesn't have to pay for the access.

To indicate this in your ls command, specify the --request-payer option as shown below.

$ aws s3 ls s3://tvgbucket --recursive --request-payer requester

2020-04-07 11:38:19 2777 config/init.xml

2020-04-07 11:38:20 52 config/support.txt

2020-04-07 11:38:20 1758 data/database.txt

2020-04-07 11:38:20 13 getdata.php

2020-04-07 11:38:20 2546 ipallow.php

2020-04-07 11:38:20 9 license.php

2020-04-07 11:38:20 3677 servers.txt

For signed URLs, make sure to include x-amz-request-payer=requester in the request.

10. Copy Local File to S3 Bucket

In the following example, we are copying the getdata.php file from the local laptop to the S3 bucket.

$ aws s3 cp getdata.php s3://tvgbucket

upload: ./getdata.php to s3://tvgbucket/getdata.php

If you want to copy getdata.php to the S3 bucket under a different name, do the following:

$ aws s3 cp getdata.php s3://tvgbucket/getdata-new.php

upload: ./getdata.php to s3://tvgbucket/getdata-new.php

For the local file, you can also specify the full path as shown below.

$ aws s3 cp /home/project/getdata.php s3://tvgbucket

upload: ../../home/project/getdata.php to s3://tvgbucket/getdata.php

11. Copy Local Folder with all Files to S3 Bucket

In this example, we are copying all the files from the "data" folder under the /home/projects directory to the S3 bucket.

$ cd /home/projects

$ aws s3 cp data s3://tvgbucket --recursive

upload: data/parameters.txt to s3://tvgbucket/parameters.txt

upload: data/common.txt to s3://tvgbucket/common.txt

In the above example, note that only the files inside the local data/ folder are uploaded, not the "data" folder itself.

If you'd like to upload the local data folder to the S3 bucket as a data folder, specify the folder name after the bucket name as shown below.

$ aws s3 cp data s3://tvgbucket/data --recursive

upload: data/parameters.txt to s3://tvgbucket/data/parameters.txt

upload: data/common.txt to s3://tvgbucket/data/common.txt

12. Download a File from S3 Bucket

To download a specific file from an S3 bucket do the following. The following copies getdata.php from the given s3 bucket to the current directory.

$ aws s3 cp s3://tvgbucket/getdata.php .

download: s3://tvgbucket/getdata.php to ./getdata.php

You can download the file to the local machine with a different name, as shown below.

$ aws s3 cp s3://tvgbucket/getdata.php getdata-local.php

download: s3://tvgbucket/getdata.php to ./getdata-local.php

Download the file from the S3 bucket to a specific folder in the local machine as shown below. The following will download the getdata.php file to /home/project folder on the local machine.

$ aws s3 cp s3://tvgbucket/getdata.php /home/project/

download: s3://tvgbucket/getdata.php to ../../home/project/getdata.php

13. Download All Files Recursively from an S3 Bucket (Using Copy)

The following will download all the files from the given bucket to the current directory on your laptop.

$ aws s3 cp s3://tvgbucket/ . --recursive

download: s3://tvgbucket/getdata.php to ./getdata.php

download: s3://tvgbucket/config/init.xml to ./config/init.xml

If you want to download all the files from an S3 bucket to a specific folder locally, please specify the full path of the local directory as shown below.

$ aws s3 cp s3://tvgbucket/ /home/projects/tvgbucket --recursive

download: s3://tvgbucket/getdata.php to ../../home/projects/tvgbucket/getdata.php

download: s3://tvgbucket/config/init.xml to ../../home/projects/tvgbucket/config/init.xml

In the above command, if the tvgbucket folder doesn't exist under /home/projects, it will be created automatically.

14. Copy a File from One Bucket to Another Bucket

The following command will copy config/init.xml from tvgbucket to the backup-bucket, as shown below.

$ aws s3 cp s3://tvgbucket/config/init.xml s3://backup-bucket

copy: s3://tvgbucket/config/init.xml to s3://backup-bucket/init.xml

In the above example, even though init.xml was under the config folder in the source bucket, it was copied to the top level / of the destination backup-bucket.

If you want to keep the same folder structure in the destination, specify the folder name in the destination bucket as shown below.

$ aws s3 cp s3://tvgbucket/config/init.xml s3://backup-bucket/config

copy: s3://tvgbucket/config/init.xml to s3://backup-bucket/config/init.xml

If the destination bucket doesn’t exist, you’ll get the following error message.

$ aws s3 cp s3://tvgbucket/test.txt s3://backup-bucket-777

copy failed: s3://tvgbucket/test.txt to s3://backup-bucket-777/test.txt An error occurred (NoSuchBucket) when calling the CopyObject operation: The specified bucket does not exist
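One hedged way to avoid that error is to probe the destination before copying. This sketch uses aws s3api head-bucket, which exits non-zero when the bucket is missing or inaccessible (bucket and object names are the ones from the example above; requires valid AWS credentials to do anything real):

```shell
# Probe the destination bucket before a cross-bucket copy; only attempt
# the copy when the probe succeeds.
if aws s3api head-bucket --bucket backup-bucket 2>/dev/null; then
  aws s3 cp s3://tvgbucket/test.txt s3://backup-bucket
else
  echo "backup-bucket does not exist or is not accessible"
fi
```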

15. Copy All Files Recursively from One Bucket to Another

The following will copy all the files from the source bucket including files under sub-folders to the destination bucket.

$ aws s3 cp s3://tvgbucket s3://backup-bucket --recursive

copy: s3://tvgbucket/getdata.php to s3://backup-bucket/getdata.php

copy: s3://tvgbucket/config/init.xml to s3://backup-bucket/config/init.xml

16. Move a File from Local to S3 Bucket

When you move a file from the local machine to an S3 bucket, as you would expect, the file is uploaded to the bucket and removed from the local machine.

$ ls -l source.json

-rw-r--r-- 1 akkisysadmin 1404 Apr 2 13:25 source.json

$ aws s3 mv source.json s3://tvgbucket

move: ./source.json to s3://tvgbucket/source.json

As you can see, the file no longer exists on the local machine after the move. It's only in the S3 bucket now.

$ ls -l source.json
ls: source.json: No such file or directory

17. Move a File from S3 Bucket to Local

The following is the reverse of the previous example. Here, the file will be moved from the S3 bucket to the local machine.

As you can see below, the file currently exists in the S3 bucket.

$ aws s3 ls s3://tvgbucket/getdata.php

2020-04-06 06:24:29 1758 getdata.php

Move the file from the S3 bucket to /home/project directory on the local machine.

$ aws s3 mv s3://tvgbucket/getdata.php /home/project
move: s3://tvgbucket/getdata.php to ../../../home/project/getdata.php

After the move, the file doesn’t exist on the S3 bucket anymore.

$ aws s3 ls s3://tvgbucket/getdata.php

18. Move a File from One S3 Bucket to Another S3 Bucket

Before the move, the file source.json is in tvgbucket.

$ aws s3 ls s3://tvgbucket/source.json

2020-04-06 06:51:39 1404 source.json

This file is not in backup-bucket.

$ aws s3 ls s3://backup-bucket/source.json
$

Move the file from tvgbucket to backup-bucket.

$ aws s3 mv s3://tvgbucket/source.json s3://backup-bucket
move: s3://tvgbucket/source.json to s3://backup-bucket/source.json

Now, the file is only on the backup-bucket.

$ aws s3 ls s3://tvgbucket/source.json
$
$ aws s3 ls s3://backup-bucket/source.json
2020-04-06 06:56:00 1404 source.json
