[AWS Lab] Lambda – /tmp Space

In this lab, we will learn how to use the local ephemeral storage (/tmp) folder in a Lambda function as a cache.

  • Overview
  • S3
    • Create a bucket
    • Upload a configuration file
  • Lambda – Function
    • Retrieves the configuration file from S3 or /tmp
    • Python, boto3

1. Create a configuration file

  • Open any text editor
  • Create a configuration file
    • File name: “config.json”
{
    "DynamoDBTable": "OrderHistory"
}

Any JSON file will do.
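If you prefer to script this step, the file can be generated with a few lines of Python (the file name matches the lab; nothing else is assumed):

```python
import json

# the configuration used throughout this lab
config = {"DynamoDBTable": "OrderHistory"}

with open("config.json", "w") as f:
    json.dump(config, f, indent=4)

# read it back to confirm it is valid JSON
with open("config.json") as f:
    print(json.load(f)["DynamoDBTable"])   # prints OrderHistory
```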


2. S3 – Create a bucket

  • Click “Create bucket”
    • Bucket name: unique name of your choice, such as “my-bucket-2022-12-31”
  • Accept all defaults and click “Create bucket”
  • Click the bucket name you just created
  • Click “Upload” and upload the “config.json” file
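The same two steps can also be done from the AWS CLI; this is a sketch assuming the CLI is configured with your credentials, and the bucket name below must be replaced with your own:

```shell
# create the bucket (bucket names must be globally unique)
aws s3 mb s3://my-bucket-2022-12-31

# upload the configuration file to the bucket
aws s3 cp config.json s3://my-bucket-2022-12-31/config.json
```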

3. IAM – Setup the Lambda execution role

  • Click “Roles” on the left pane
  • Click “Create role”
  • Trusted entity type
    • Select “AWS service”
  • Use case
    • Select “Lambda”
  • Add permissions
    • Search “CloudWatchLogsFullAccess” and select it
    • Search “AmazonS3ReadOnlyAccess” and select it
  • Name, review
    • Role name: “LambdaS3ExecutionRole”
  • Click “Create role”

The attached CloudWatchLogsFullAccess policy document looks like this:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "logs:*"
            ],
            "Effect": "Allow",
            "Resource": "*"
        }
    ]
}

And the AmazonS3ReadOnlyAccess policy document:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:Get*",
                "s3:List*",
                "s3-object-lambda:Get*",
                "s3-object-lambda:List*"
            ],
            "Resource": "*"
        }
    ]
}
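Selecting the “Lambda” use case also attaches a trust policy like the following, which allows the Lambda service to assume the role:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "lambda.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}
```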

4. Lambda – Create a Function

  • Function name: “ReadConfig”
  • Runtime: Python 3.9
  • Permissions – Change default execution role
    • Check “Use an existing role”
    • Select “LambdaS3ExecutionRole”
  • Click “Create function”

5. Lambda – Function Code (Python)

  • Type the following code
  • Click “Deploy”
import json
import logging
import os

import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)

client = boto3.client('s3')

def lambda_handler(event, context):
    bucketName = "my-bucket-2022-12-31"   # replace with your bucket name
    fileName = "config.json"
    localFileName = '/tmp/config.json'

    # read the config file from /tmp first
    if os.path.isfile(localFileName):
        # the file is already cached in /tmp
        logger.info('Read config from /tmp')
        with open(localFileName, 'r') as tmpFile:
            fileContent = tmpFile.read()
    else:
        # not cached yet: read it from S3 and save it to /tmp
        logger.info('Read config from S3')
        response = client.get_object(
            Bucket=bucketName,
            Key=fileName,
        )
        fileContent = response['Body'].read().decode('utf-8')

        with open(localFileName, 'w') as tmpFile:
            tmpFile.write(fileContent)

    jsonConfig = json.loads(fileContent)
    logger.info(jsonConfig)

    return jsonConfig
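The caching logic in the handler can be exercised locally without AWS. Here is a minimal sketch: the helper name `read_config` and the fake fetcher are hypothetical, standing in for the `get_object` call, but the /tmp-first logic is the same.

```python
import json
import os
import tempfile

def read_config(local_path, fetch_remote):
    # hypothetical helper mirroring the handler's logic:
    # return the cached copy if present, otherwise fetch and cache it
    if os.path.isfile(local_path):
        with open(local_path, 'r') as f:
            return json.load(f), '/tmp cache'
    content = fetch_remote()          # stands in for s3.get_object(...)
    with open(local_path, 'w') as f:
        f.write(content)
    return json.loads(content), 'S3'

# simulate two invocations against a scratch file
path = os.path.join(tempfile.mkdtemp(), 'config.json')
fetch = lambda: json.dumps({"DynamoDBTable": "OrderHistory"})

config, source = read_config(path, fetch)
print(source)   # first call fetches from the "remote" side: S3
config, source = read_config(path, fetch)
print(source)   # second call hits the local copy: /tmp cache
```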

6. Lambda – Test

  • Click the “Test” tab
  • Select “Create new event”
    • Event name: “LambdaTmpTest”
    • Accept all default settings
  • Click “Save”
  • Click “Test” several times

7. CloudWatch – Check the Lambda logs

  1. On the Lambda function page, click the “Monitor” tab
  2. Check some metrics on the page
  3. Click the “View logs in CloudWatch” button
  4. On the CloudWatch page, click the “Log stream”
  5. Check the logs

You can see that the first invocation reads the config file from S3, while subsequent invocations read it from “/tmp”. Note that /tmp persists only while the execution environment stays warm; after a cold start, the file is fetched from S3 again.
