S3 command-line client s5cmd

On this page you will find documentation for the s5cmd S3 client. We refer to buckets here; bucket is the S3 term for what Swift calls a container, and the two are identical.


The tool s5cmd allows you to parallelise workloads such as data transfers. This is very convenient when you want to copy an entire directory and its contents to an S3 bucket, or vice versa. More information can be found at https://github.com/peak/s5cmd, https://joshua-robinson.medium.com/s5cmd-for-high-performance-object-storage-7071352cc09d and https://aws.amazon.com/blogs/opensource/parallelizing-s3-workloads-s5cmd/.

The key benefit of s5cmd is its greatly improved performance compared to clients such as s3cmd and awscli.


Binaries for Windows, macOS and Linux can be downloaded from https://github.com/peak/s5cmd/releases.

For OpenStack Swift, please use version 1.4.0rc1 or later, since older versions have an issue with specifying regions.


To authenticate you need the same credentials configuration as for the awscli client. See http://doc.swift.surfsara.nl/en/latest/Pages/Clients/awscli.html#configuration.
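For reference, an awscli-style credentials file typically looks like the sketch below. The key values are placeholders, not real credentials; see the link above for the authoritative configuration steps.

```
# ~/.aws/credentials (placeholder values)
[default]
aws_access_key_id = <your access key>
aws_secret_access_key = <your secret key>
```

s5cmd picks up these credentials the same way awscli does, so no separate s5cmd configuration is needed.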

Upload/Download an object to/from a bucket

An object can be uploaded to a bucket with the following command:

s5cmd --endpoint-url https://proxy.swift.surfsara.nl cp --destination-region NL <file name> s3://mybucket/myobject

It can be downloaded with:

s5cmd --endpoint-url https://proxy.swift.surfsara.nl cp --source-region NL s3://mybucket/myobject <file name>

Upload a folder with contents to a bucket

s5cmd --endpoint-url https://proxy.swift.surfsara.nl cp --destination-region NL /path/to/my/folder s3://mybucket

Download a bucket with contents to a directory

s5cmd --endpoint-url https://proxy.swift.surfsara.nl cp --source-region NL 's3://mybucket/*' /path/to/my/folder/

Creating and deleting buckets and objects, listing buckets and objects

For these operations we recommend using another S3 client, such as awscli.
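As a sketch of what this looks like with awscli (assuming it is configured as described above; the bucket and object names are placeholders):

```shell
# Create a bucket (mb = make bucket)
aws --endpoint-url https://proxy.swift.surfsara.nl s3 mb s3://mybucket
# List all buckets
aws --endpoint-url https://proxy.swift.surfsara.nl s3 ls
# List the objects in a bucket
aws --endpoint-url https://proxy.swift.surfsara.nl s3 ls s3://mybucket
# Delete an object, then remove the (empty) bucket (rb = remove bucket)
aws --endpoint-url https://proxy.swift.surfsara.nl s3 rm s3://mybucket/myobject
aws --endpoint-url https://proxy.swift.surfsara.nl s3 rb s3://mybucket
```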

Large files


Important: by default, s5cmd spawns 256 workers to perform its tasks in parallel, which makes the tool well suited for transferring a large number of small files. For larger files (>= 1 GB) we have found it beneficial to reduce the number of workers, for example to 20, in order to reduce the load on the client side. To do so, use the command-line flag --numworkers <value>. An example is shown below:

s5cmd --endpoint-url https://proxy.swift.surfsara.nl --numworkers 20 cp --destination-region NL /path/to/my/folder/with/big/files s3://mybucket