How to list all versions of a single object file in an AWS S3 Bucket using Python boto3

To get the object versions of a single object in AWS S3, we will need to use the list_object_versions() method of boto3. Below are 3 ways to list all the versions of a single S3 object using Python and boto3.

You can scroll down to the code below to quickly access the Python scripts.

All 3 scripts do the same thing. The function get_file_versions_list() will accept the bucket name (bucket_name) and the target S3 object key (object_key), then use boto3 to get the list of object versions and delete markers of the target S3 object. It will then sort that list from latest to oldest and count the number of versions the object has.

Getting the list of versions and the delete markers was not as straightforward as I thought it would be. In the latter part of this article, I discuss the complexity of the prefix parameter and the delete markers.
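As a rough sketch, the approach described above might look like the following. The function name and behavior come from the description, but the body here is my own outline, not the article's exact script, and the object key folder/file.txt is just a placeholder.

import boto3

def get_file_versions_list(bucket_name, object_key):
    # List versions and delete markers that share the object key as a prefix
    s3_client = boto3.client('s3')
    response = s3_client.list_object_versions(Bucket=bucket_name, Prefix=object_key)

    # Keep only exact key matches, since Prefix also matches keys that merely
    # start with object_key (e.g. 'file.txt' also matches 'file.txt.bak')
    versions = [v for v in response.get('Versions', []) if v['Key'] == object_key]
    versions += [d for d in response.get('DeleteMarkers', []) if d['Key'] == object_key]

    # Sort from latest to oldest
    versions.sort(key=lambda item: item['LastModified'], reverse=True)
    return versions

file_versions = get_file_versions_list('radishlogic-bucket', 'folder/file.txt')
print(f'Number of versions: {len(file_versions)}')

If an object has more than 1,000 versions, you would also need to handle pagination, for example with get_paginator('list_object_versions').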


How to stop object file versioning in an S3 Bucket using Python boto3

To stop versioning in an S3 Bucket, we can use the boto3 method put_bucket_versioning() with the VersioningConfiguration parameter set to a Status of Suspended.

Below are 3 ways to stop versioning in an S3 Bucket using the AWS SDK for Python, boto3.

The Python scripts below do the same thing: suspend the versioning of the target S3 Bucket named ‘radishlogic-bucket’.

You may use whichever method you are most comfortable with.

Interestingly, suspending versioning uses the same boto3 method as enabling versioning, put_bucket_versioning(). The only difference is that the ‘Status’ is ‘Suspended’ instead of ‘Enabled’.
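In its simplest client-based form, that call looks roughly like the sketch below; the article's three scripts may differ in structure.

import boto3

# Suspend versioning on the target S3 Bucket
s3_client = boto3.client('s3')
s3_client.put_bucket_versioning(
    Bucket='radishlogic-bucket',
    VersioningConfiguration={'Status': 'Suspended'}
)

Note that suspending versioning does not delete existing object versions; it only stops S3 from creating new ones.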


How to enable object file versioning on an S3 Bucket using Python boto3

To enable object versioning in an S3 Bucket using Python boto3, you can use the put_bucket_versioning() method. This guide shows how to use that method to turn on versioning, so that S3 keeps previous versions of your objects and you can recover from accidental overwrites and deletions.

Below are 3 ways to enable versioning in a target S3 Bucket using the AWS SDK for Python, boto3.

The following examples achieve the same goal: enabling versioning for the AWS S3 Bucket named ‘radishlogic-bucket’.
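For a quick preview, the resource-based variant is roughly the sketch below; the full scripts may differ in structure.

import boto3

# Enable versioning using the boto3 resource interface
s3_resource = boto3.resource('s3')
bucket_versioning = s3_resource.BucketVersioning('radishlogic-bucket')
bucket_versioning.enable()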

Feel free to use any of these methods in your project, whichever you are most comfortable with.

You can also check if versioning is enabled on your S3 bucket via Python boto3.


How to check if versioning is enabled in an S3 bucket using Python boto3

To check if versioning is enabled in an S3 Bucket using Python boto3, we will need to use the get_bucket_versioning() method of boto3 S3.

Below are 3 ways to get the S3 Bucket versioning status using the AWS SDK for Python, boto3.

The Python scripts below all do the same thing. They check the status of versioning in the target S3 Bucket named radishlogic-bucket.
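For a quick preview, the client-based check looks roughly like this sketch:

import boto3

# Get the versioning status of the target S3 Bucket
s3_client = boto3.client('s3')
response = s3_client.get_bucket_versioning(Bucket='radishlogic-bucket')

# 'Status' only appears once versioning has been Enabled or Suspended;
# a bucket that has never been versioned returns no 'Status' key at all
print(response.get('Status', 'Versioning not configured'))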

You can choose whichever method you are comfortable with.


How to create an S3 Bucket using Python boto3

To create an S3 Bucket in your target AWS Account, you will need to use the create_bucket() method of boto3 S3.

The method requires only the parameter Bucket, which is your target bucket name.

But I highly recommend that you also use the CreateBucketConfiguration parameter to set the region of the S3 Bucket. If you do not set the CreateBucketConfiguration parameter, it will create your S3 Bucket in the N. Virginia region (us-east-1) by default.

Below are two ways to create an S3 Bucket using Python boto3.

Both Python scripts do the same thing. They will create an S3 Bucket named radishlogic-bucket in the Singapore region (ap-southeast-1).
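For reference, the client-based call at the core of both scripts looks roughly like this sketch:

import boto3

# Create the S3 Bucket in the Singapore region (ap-southeast-1)
s3_client = boto3.client('s3')
s3_client.create_bucket(
    Bucket='radishlogic-bucket',
    CreateBucketConfiguration={'LocationConstraint': 'ap-southeast-1'}
)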

You can choose whichever method you are comfortable with.


How to list all S3 Buckets using Python boto3

To list the S3 Buckets inside an AWS Account, you will need to use the list_buckets() method of boto3.

Below are two example scripts that you can use to retrieve all S3 Buckets inside an Amazon Web Services account.

Both example scripts do the same thing. They query AWS for all the S3 Buckets inside the account and return the bucket names in a Python list.
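A minimal sketch of that logic:

import boto3

# Query AWS for all S3 Buckets in the account
s3_client = boto3.client('s3')
response = s3_client.list_buckets()

# Collect just the bucket names into a Python list
bucket_names = [bucket['Name'] for bucket in response['Buckets']]
print(bucket_names)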

Since both will do the same thing, you can use whichever method you prefer.


Require Multi-Factor Authentication (MFA) for IAM User in AWS

As a security best practice, we should always require IAM Users to have Multi-Factor Authentication (MFA) enabled when accessing the AWS Console.

The problem is, how do we require users to configure MFA?

The IAM policy below can be used to require users to enable their MFA. If they have not enabled MFA, all of their permissions, except the ones needed to set up MFA, will be denied. This makes access to your AWS Account more secure.

IAM Policy that requires IAM Users to have Multi-Factor Authentication (MFA)

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowViewAccountInfo",
            "Effect": "Allow",
            "Action": [
                "iam:ListUsers",
                "iam:ListMFADevices",
                "iam:GetAccountPasswordPolicy",
                "iam:GetAccountSummary"
            ],
            "Resource": "*"
        },
        {
            "Sid": "AllowChangeOwnPasswordsOnFirstLogin",
            "Effect": "Allow",
            "Action": [
                "iam:ChangePassword",
                "iam:GetUser"
            ],
            "Resource": "arn:aws:iam::*:user/${aws:username}"
        },
        {
            "Sid": "AllowChangeOwnPasswordsAfterMFAEnabled",
            "Effect": "Allow",
            "Action": [
                "iam:GetLoginProfile",
                "iam:UpdateLoginProfile"
            ],
            "Resource": "arn:aws:iam::*:user/${aws:username}"
        },
        {
            "Sid": "AllowManageOwnVirtualMFADevice",
            "Effect": "Allow",
            "Action": [
                "iam:CreateVirtualMFADevice",
                "iam:DeleteVirtualMFADevice"
            ],
            "Resource": "arn:aws:iam::*:mfa/${aws:username}"
        },
        {
            "Sid": "AllowManageOwnUserMFA",
            "Effect": "Allow",
            "Action": [
                "iam:DeactivateMFADevice",
                "iam:EnableMFADevice",
                "iam:ListMFADevices",
                "iam:ResyncMFADevice"
            ],
            "Resource": "arn:aws:iam::*:user/${aws:username}"
        },
        {
            "Sid": "DenyAllExceptListedIfNoMFA",
            "Effect": "Deny",
            "NotAction": [
                "iam:ListUsers",
                "iam:ChangePassword",
                "iam:GetUser",
                "iam:CreateVirtualMFADevice",
                "iam:DeleteVirtualMFADevice",
                "iam:DeactivateMFADevice",
                "iam:EnableMFADevice",
                "iam:ListMFADevices",
                "iam:ResyncMFADevice"
            ],
            "Resource": "*",
            "Condition": {
                "BoolIfExists": {
                    "aws:MultiFactorAuthPresent": "false"
                }
            }
        }
    ],
    "Id": "RadishLogic.com MFA Required IAM Policy"
}
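The key part is the last statement, DenyAllExceptListedIfNoMFA: it denies every action except the ones needed to set up MFA whenever aws:MultiFactorAuthPresent is false, and the BoolIfExists operator makes the deny also apply when the key is missing from the request entirely.

If you prefer to create this policy programmatically, a minimal boto3 sketch might look like the one below. It assumes you saved the JSON above to a local file named mfa_policy.json; this script is my own illustration, not part of the original post.

import boto3

# Read the IAM policy document shown above (saved locally as mfa_policy.json)
with open('mfa_policy.json') as policy_file:
    policy_document = policy_file.read()

# Create the customer managed policy in IAM
iam_client = boto3.client('iam')
iam_client.create_policy(
    PolicyName='MFA-Required',
    PolicyDocument=policy_document
)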

The name of my IAM Policy is MFA-Required, but you may use whatever name you prefer.


How to Get Lambda Runtime Region via Python

To get the AWS Region where your Lambda Function is running, you will need to import the os module.

import os

Then, from the os module, get the value of AWS_REGION from the os.environ mapping. This will return the AWS Region where the Lambda Function is running.

runtime_region = os.environ['AWS_REGION']

Note: Getting the runtime AWS Region of your Lambda Function works the same way as getting a Lambda Environment Variable, since AWS_REGION is one of the environment variables that the Lambda runtime sets automatically.
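Putting it together, a minimal Lambda handler that returns its own region might look like this sketch; the handler name is just the Lambda default, not something from the post.

import os

def lambda_handler(event, context):
    # AWS_REGION is set automatically by the Lambda runtime
    runtime_region = os.environ['AWS_REGION']
    return runtime_region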


Access AWS Lambda Environment Variables using Node.js

If you want to get the values of Environment Variables in AWS Lambda using the Node.js runtime, follow the instructions below.

Node.js Code to Access Environment Variables

To access the Environment Variables of Lambda Functions using Node.js (JavaScript), simply use the code below.

const environmentVariable = process.env.ENVIRONMENT_VARIABLE

Let’s say that my environment variable is named DB_USER; I will use the code below to get its value.
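Following the pattern above, that is a one-liner like this (the variable name is my own choice):

const dbUser = process.env.DB_USER;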
