Access AWS Lambda Environment Variables using Ruby

AWS Lambda Environment Variables are a useful way to pass configuration values to your AWS Lambda runtime, especially when configurations differ between your Development and Production environments, such as the names of DynamoDB tables or MySQL databases.

Below we discuss how we can retrieve the values of Environment Variables in AWS Lambda using Ruby.


Ruby Code to Access Environment Variables

The code for accessing Environment Variables on AWS Lambda is exactly the same as the code for accessing environment variables on your local computer or server.

Here is the code to access environment variables using Ruby.

env_var = ENV['ENVIRONMENT_VARIABLE']

If we want to get the value of an environment variable with the key of DB_HOST, then we will use the code below.

db_host = ENV['DB_HOST']

Continue reading Access AWS Lambda Environment Variables using Ruby

How to get AWS Lambda remaining time using Python

To get the remaining time of a running AWS Lambda Function using Python, use the context object and its get_remaining_time_in_millis function.
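For example, a minimal handler sketch (the log format here is illustrative):

def lambda_handler(event, context):
    # The Lambda runtime injects the context object
    remaining_ms = context.get_remaining_time_in_millis()
    print(f'Remaining time: {remaining_ms} ms')
    return remaining_ms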

Continue reading How to get AWS Lambda remaining time using Python

How to get the remaining time of a running AWS Lambda Function using Node.js

To get the remaining time of a running Lambda Function using Node.js, we will use the context object’s getRemainingTimeInMillis() function. This returns the number of milliseconds that the Lambda Function can still run before it times out.

Below is a simple piece of code that fetches the remaining time inside a Lambda Function. I have set the timeout to only 3 seconds, which is why the output is 2,999 milliseconds, or approximately 3 seconds.
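A minimal sketch along those lines (an illustration, not the post's exact code):

exports.handler = async (event, context) => {
    // The Lambda runtime injects the context object
    const remainingMs = context.getRemainingTimeInMillis();
    console.log(`Remaining time: ${remainingMs} ms`);
    return remainingMs;
};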

Continue reading How to get the remaining time of a running AWS Lambda Function using Node.js

How to list files in an S3 bucket folder using boto3 and Python

If you want to list the files/objects inside a specific folder within an S3 bucket, then you will need to use the list_objects_v2 method with the Prefix parameter in boto3.

Below are 3 example codes on how to list the objects in an S3 bucket folder.

The code gets all the files/objects inside the S3 bucket named radishlogic-bucket within the folder named s3_folder/ and adds their keys to a Python list (s3_object_key_list). It then prints each of the object keys in the list, as well as the number of files in the folder.
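As a minimal sketch of one such approach (pagination is included since list_objects_v2 returns at most 1,000 keys per response):

import boto3

s3_client = boto3.client('s3')

# Collect the keys of all objects under the s3_folder/ prefix
s3_object_key_list = []
paginator = s3_client.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='radishlogic-bucket', Prefix='s3_folder/'):
    for obj in page.get('Contents', []):
        s3_object_key_list.append(obj['Key'])

for key in s3_object_key_list:
    print(key)
print(f'Number of files in folder: {len(s3_object_key_list)}')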

Continue reading How to list files in an S3 bucket folder using boto3 and Python

How to list all objects in an S3 Bucket using boto3 and Python

If you need to list all files/objects inside an AWS S3 Bucket, then you will need to use the list_objects_v2 method in boto3.

Below are 3 example codes of how to list all files in a target S3 Bucket.

You can use any of the 3 options since they all do the same thing.

Each one will get all of the files inside the S3 Bucket radishlogic-bucket using Python boto3, put them inside a Python list, then print each object key. It will list the files recursively, regardless of whether they are inside a folder or not.

At the end, it will also print the number of items inside the S3 Bucket.
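A minimal sketch of one such option, using the boto3 resource interface:

import boto3

s3 = boto3.resource('s3')
bucket = s3.Bucket('radishlogic-bucket')

# objects.all() iterates every object key, regardless of folder depth
object_keys = [obj.key for obj in bucket.objects.all()]

for key in object_keys:
    print(key)
print(f'Number of objects: {len(object_keys)}')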

Continue reading How to list all objects in an S3 Bucket using boto3 and Python

How to delete a file in AWS S3 using boto3 and Python

To delete a file inside an AWS S3 Bucket using Python, you will need to use the delete_object function of boto3.

Below are 3 example codes on how to delete an S3 file.

You can use any of the 3 options since they all do the same thing. Each will delete the file in S3 with the key of s3_folder/file.txt inside the S3 bucket named radishlogic-bucket using Python boto3.
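A minimal sketch of the client-based option:

import boto3

s3_client = boto3.client('s3')

# Delete the object with the key s3_folder/file.txt
s3_client.delete_object(Bucket='radishlogic-bucket', Key='s3_folder/file.txt')

Note that delete_object returns successfully even if the key does not exist, so do not rely on it to tell you whether the file was actually there.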

Continue reading How to delete a file in AWS S3 using boto3 and Python

How to write Python string to a file in S3 Bucket using boto3

To write a file from a Python string directly to an S3 bucket, we need to use the boto3 package.

There are 2 ways to write a file in S3 using boto3. The first is via the boto3 client, and the second is via the boto3 resource. Both of these methods will be shown below.

S3 objects and keys

If you are new to AWS S3, you might be confused by some of the terms, so we'll define a few of them here. If you already know what objects and keys are, then you can skip this section.

S3 objects are the same as files. When we run the put_object method, it means we are putting a file into S3.

S3 keys are the same as a filename with its full path. So if we want to create an object in S3 with the name filename.txt within the foobar folder, then the key is foobar/filename.txt.

Now that we have clarified some of the AWS S3 terms, follow the details below to start writing Python strings directly to objects in S3.
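As a preview, here is a minimal client-based sketch using the foobar/filename.txt key from the example above (the bucket name and string content are illustrative):

import boto3

s3_client = boto3.client('s3')

# put_object uploads the encoded string as the object's body
s3_client.put_object(
    Bucket='radishlogic-bucket',  # illustrative bucket name
    Key='foobar/filename.txt',
    Body='Hello from a Python string!'.encode('utf-8')
)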

Continue reading How to write Python string to a file in S3 Bucket using boto3

How to generate S3 presigned URL using boto3 and Python

If you want to give your users temporary access to a private S3 file without giving them access to the AWS console, you will need to generate an S3 presigned URL of your target file.

To generate and test the S3 presigned URL, you can try the code below.
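A minimal sketch along those lines (the bucket and key names are placeholders):

import boto3

s3_client = boto3.client('s3')

# Generate a URL that grants temporary GET access to the object
url = s3_client.generate_presigned_url(
    ClientMethod='get_object',
    Params={'Bucket': 'radishlogic-bucket', 'Key': 's3_folder/file.txt'},
    ExpiresIn=3600  # the link expires after 1 hour
)
print(url)

Anyone with the URL can then download the file until it expires.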

Continue reading How to generate S3 presigned URL using boto3 and Python

How to read a JSON file in S3 and store it in a Dictionary using boto3 and Python

If you want to get a JSON file from an S3 Bucket and load it into a Python Dictionary, then you can use the example codes below.

There are 4 scenarios for the example scripts below.

  1. Basic JSON file from S3 to Python Dictionary
  2. With Try/Except block
  3. With datetime, date, and time conversions
  4. Running the code in a Lambda Function

AWS boto3 provides 2 ways to access S3 files: boto3.client('s3') and boto3.resource('s3'). For each of the example scenarios above, code will be provided for the two methods.

Related: Writing a Dictionary to JSON file in S3 using boto3 and Python

Since both methods will function the same, you can choose whichever method you like.
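A minimal sketch of the basic client scenario (the bucket and key names are placeholders):

import json
import boto3

s3_client = boto3.client('s3')

# Fetch the object and parse its body into a Python dictionary
response = s3_client.get_object(Bucket='radishlogic-bucket', Key='data/sample.json')
data = json.loads(response['Body'].read().decode('utf-8'))

print(data)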

Continue reading How to read a JSON file in S3 and store it in a Dictionary using boto3 and Python