To stop versioning in an S3 Bucket, we can use the boto3 method put_bucket_versioning() with the VersioningConfiguration parameter set to a Status of 'Suspended'.
Below are 3 methods on how we can stop versioning in an S3 Bucket using AWS-SDK for Python, boto3.
The Python scripts below do the same thing: suspend the versioning of the target S3 Bucket named ‘radishlogic-bucket’.
You may use any method that you like depending on which you are comfortable using.
Interestingly, suspending versioning uses the same boto3 method as enabling versioning, put_bucket_versioning(). The only difference is that the ‘Status’ is ‘Suspended’ instead of ‘Enabled’.
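As a quick illustration, here is a minimal sketch using the boto3 client, assuming your AWS credentials and default region are already configured:

```python
import boto3

s3_client = boto3.client('s3')

# Suspend versioning by setting the Status to 'Suspended'
s3_client.put_bucket_versioning(
    Bucket='radishlogic-bucket',
    VersioningConfiguration={'Status': 'Suspended'}
)
```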
Continue reading How to stop object file versioning in an S3 Bucket using Python boto3
To enable object versioning in an S3 Bucket using Python boto3, you can use the put_bucket_versioning() function. This guide focuses on using this method to control object versions, helping ensure data integrity and secure file handling within your AWS S3 environment.
Below are 3 ways to enable versioning in a target S3 Bucket using AWS-SDK for Python, boto3.
The following examples achieve the same goal: enabling versioning for the AWS S3 Bucket named ‘radishlogic-bucket’.
Feel free to employ any of these methods in your project based on your comfort level.
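For a quick preview, a minimal client-based sketch might look like this (assuming your credentials and default region are already set up):

```python
import boto3

s3_client = boto3.client('s3')

# Enable versioning by setting the Status to 'Enabled'
s3_client.put_bucket_versioning(
    Bucket='radishlogic-bucket',
    VersioningConfiguration={'Status': 'Enabled'}
)
```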
You can also check if versioning is enabled on your S3 bucket via Python boto3.
Continue reading How to enable object file versioning on an S3 Bucket using Python boto3
To check if versioning is enabled in an S3 Bucket using Python boto3, we will need to use the get_bucket_versioning() method of boto3 S3.
Below are 3 ways to get the S3 Bucket versioning status using AWS-SDK for Python, boto3.
The Python scripts below all do the same thing. They check the status of versioning in the target S3 Bucket named radishlogic-bucket.
You can choose whichever method you are comfortable with.
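A short sketch of the idea, assuming your AWS credentials are already configured:

```python
import boto3

s3_client = boto3.client('s3')

# 'Status' is only present if versioning has ever been turned on,
# so the result is 'Enabled', 'Suspended', or no Status key at all
response = s3_client.get_bucket_versioning(Bucket='radishlogic-bucket')
print(response.get('Status', 'Versioning has never been enabled'))
```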
Continue reading How to check if versioning is enabled in an S3 bucket using Python boto3
To create an S3 Bucket in your target AWS Account, you will need to use the create_bucket() method of boto3 S3.
The method requires only the parameter Bucket, which is your target bucket name.
But I highly recommend that you also use the CreateBucketConfiguration parameter to set the region of the S3 Bucket. If you do not set the CreateBucketConfiguration parameter, it will create your S3 Bucket in the N. Virginia region (us-east-1) by default.
Below are two ways to create an S3 Bucket using Python boto3.
Both Python scripts do the same thing: they will create an S3 Bucket named radishlogic-bucket in the Singapore region (ap-southeast-1).
You can choose whichever method you are comfortable with.
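As a preview, the client version could look roughly like this (assuming your credentials are configured):

```python
import boto3

s3_client = boto3.client('s3')

# Create the bucket in the Singapore region (ap-southeast-1)
s3_client.create_bucket(
    Bucket='radishlogic-bucket',
    CreateBucketConfiguration={'LocationConstraint': 'ap-southeast-1'}
)
```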
Continue reading How to create an S3 Bucket using Python boto3
To list the S3 Buckets inside an AWS Account, you will need to use the list_buckets() method of boto3.
Below are two example codes that you can use to retrieve all S3 buckets inside an Amazon Web Services account.
Both example scripts do the same thing: they query AWS for all the S3 Buckets inside the account and return the bucket names in a Python list.
Since both will do the same thing, you can use whichever method you prefer.
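A minimal sketch of the idea, assuming credentials are already in place:

```python
import boto3

s3_client = boto3.client('s3')

# list_buckets() returns metadata for every bucket in the account
response = s3_client.list_buckets()
bucket_names = [bucket['Name'] for bucket in response['Buckets']]
print(bucket_names)
```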
Continue reading How to list all S3 Buckets using Python boto3
If you want to list the files/objects inside a specific folder within an S3 bucket, then you will need to use the list_objects_v2 method with the Prefix parameter in boto3.
Below are 3 example codes on how to list the objects in an S3 bucket folder.
What the code does is that it gets all the files/objects inside the S3 bucket named radishlogic-bucket within the folder named s3_folder/ and adds their keys inside a Python list (s3_object_key_list). It then prints each of the object keys in the list and also prints the number of files in the folder.
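As a rough preview, one way to sketch this uses a paginator so it still works when the folder holds more than 1,000 objects (assuming your credentials are configured):

```python
import boto3

s3_client = boto3.client('s3')

s3_object_key_list = []

# A paginator keeps this working beyond the 1,000-object
# limit of a single list_objects_v2 call
paginator = s3_client.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='radishlogic-bucket', Prefix='s3_folder/'):
    for obj in page.get('Contents', []):
        s3_object_key_list.append(obj['Key'])

for key in s3_object_key_list:
    print(key)
print(f'Number of files in folder: {len(s3_object_key_list)}')
```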
Continue reading How to list files in an S3 bucket folder using boto3 and Python
If you need to list all files/objects inside an AWS S3 Bucket then you will need to use the list_objects_v2 method in boto3.
Below are 3 example codes of how to list all files in a target S3 Bucket.
You can use any of the 3 options since they all do the same thing.
They will get all of the files inside the S3 Bucket radishlogic-bucket using Python boto3, put them inside a Python list, then print each object key. The files will be listed recursively, regardless of whether they are inside a folder or not.
At the end, it will also print the number of items inside the S3 Bucket.
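A rough sketch along the same lines, again using a paginator and assuming credentials are configured:

```python
import boto3

s3_client = boto3.client('s3')

s3_object_key_list = []

# Walk through every object in the bucket, folder or not
paginator = s3_client.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='radishlogic-bucket'):
    for obj in page.get('Contents', []):
        s3_object_key_list.append(obj['Key'])

for key in s3_object_key_list:
    print(key)
print(f'Number of objects in bucket: {len(s3_object_key_list)}')
```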
Continue reading How to list all objects in an S3 Bucket using boto3 and Python
To delete a file inside an AWS S3 Bucket using Python, you will need to use the delete_object function of boto3.
Below are 3 examples to delete an S3 file.
You can use any of the 3 options since they all do the same thing. They will delete the file in S3 with the key of s3_folder/file.txt inside the S3 bucket named radishlogic-bucket using Python boto3.
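A minimal sketch with the boto3 client, assuming your credentials are configured:

```python
import boto3

s3_client = boto3.client('s3')

# Delete the object at key s3_folder/file.txt
s3_client.delete_object(
    Bucket='radishlogic-bucket',
    Key='s3_folder/file.txt'
)
```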
Continue reading How to delete a file in AWS S3 using boto3 and Python
To write a file from a Python string directly to an S3 bucket we need to use the boto3 package.
There are 2 ways to write a file in S3 using boto3. The first is via the boto3 client, and the second is via the boto3 resource. Both of these methods will be shown below.
S3 objects and keys
If you are new to AWS S3, you might be confused with some of the terms. So we’ll define some of them here. If you already know what objects and keys are then you can skip this section.
S3 objects are the same as files. When we run the method put_object, what it means is that we are putting a file into S3.
S3 keys are the same as the filename with its full path. So if we want to create an object in S3 with the name of filename.txt within the foobar folder, then the key is foobar/filename.txt.
Now that we have clarified some of the AWS S3 terms, follow the details below to start writing Python strings directly to objects in S3.
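As a quick preview, a minimal put_object sketch might look like this (the sample string is just a placeholder; swap in your own content):

```python
import boto3

s3_client = boto3.client('s3')

# The string to write; replace with your own content
data = 'Hello from a Python string!'

# put_object writes the string to the key foobar/filename.txt
s3_client.put_object(
    Bucket='radishlogic-bucket',
    Key='foobar/filename.txt',
    Body=data.encode('utf-8')
)
```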
Continue reading How to write Python string to a file in S3 Bucket using boto3
If you want to give your users temporary access to a private S3 file without giving them access to the AWS console, you will need to generate an S3 presigned URL of your target file.
To generate and test the S3 presigned URL, you can try the code below.
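As a rough sketch, a presigned URL for the object key s3_folder/file.txt (an example key, swap in your own) with a 1-hour expiry might look like this:

```python
import boto3

s3_client = boto3.client('s3')

# Presigned GET URL for s3_folder/file.txt that expires in 1 hour
url = s3_client.generate_presigned_url(
    ClientMethod='get_object',
    Params={'Bucket': 'radishlogic-bucket', 'Key': 's3_folder/file.txt'},
    ExpiresIn=3600
)
print(url)
```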
Continue reading How to generate S3 presigned URL using boto3 and Python