Here’s how to create an AWS Lambda function that loads Pigment API data into an S3 bucket.
In the following example, we use a View ID; however, you can use a List ID, Metric ID, or Table ID as appropriate.
In AWS Secrets Manager, create a secret using your Export API key as a value.
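The Lambda function below expects the secret value to be a JSON object whose key matches the SECRET_KEY environment variable you configure later. A minimal sketch of that shape, using a hypothetical key name (`export_api_key` is an example, not a requirement):

```python
import json

# Hypothetical secret value as stored in AWS Secrets Manager.
# The key name "export_api_key" is illustrative; it must simply match
# the SECRET_KEY environment variable configured for the Lambda function.
secret_string = json.dumps({"export_api_key": "YOUR_EXPORT_API_KEY"})

# The Lambda function later recovers the API key like this:
api_key = json.loads(secret_string)["export_api_key"]
```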
In Lambda, create a new AWS Lambda function, and complete these steps:
a. In the Function name field, enter your new function name.
b. In the Runtime menu, select Python 3.7.
Create a Lambda function
Click Configuration.
Enter the following environment variable values. Note that environment variable names are case-sensitive and must match the names used in the code:
- VIEW_ID: The View ID of the block you want to export from Pigment. For information on obtaining this ID, see here.
- AWS_SECRET_NAME: The secret name that corresponds to the Export API key in AWS Secrets Manager. For example: pigment_export_api_key
- AWS_SECRET_REGION: The region your AWS Secrets Manager entry resides in. For example: us-east-1
- S3_BUCKET_NAME: The name of the S3 bucket that holds your files.
- SECRET_KEY: The key corresponding to your Pigment Export API key entry in AWS Secrets Manager.
Define environment variables
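Because a missing environment variable only surfaces as a runtime error, it can help to validate them up front. A minimal sketch (this helper is illustrative and not part of the code later in this article):

```python
import os

# The five variables the function expects; names are case-sensitive
# and must match the Lambda configuration exactly.
REQUIRED_VARS = ("VIEW_ID", "AWS_SECRET_NAME", "AWS_SECRET_REGION",
                 "S3_BUCKET_NAME", "SECRET_KEY")

def read_config():
    """Return the required settings, failing fast if any variable is missing."""
    missing = [name for name in REQUIRED_VARS if name not in os.environ]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {name: os.environ[name] for name in REQUIRED_VARS}
```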
In the Configuration pane, click Permissions, and complete the following steps:
a. Under Role name, click the required role for the export.
Select the role for the export
b. In AWS IAM, configure permissions so your Lambda function's execution role can:
- read/write to S3
- read from AWS Secrets Manager
c. Save your changes.
Depending on your organization's access management policy, you can use either a role or a policy.
Permissions policies
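As a sketch, a minimal identity-based policy granting the access described above might look like the following. The bucket name is a placeholder you must replace:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    },
    {
      "Effect": "Allow",
      "Action": "secretsmanager:GetSecretValue",
      "Resource": "*"
    }
  ]
}
```

In practice, scope the secretsmanager:GetSecretValue statement to the ARN of your specific secret rather than "*".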
Click the Code tab, and add the following Python code:
```python
from datetime import datetime
import os
import json
import boto3
import requests
from botocore.exceptions import ClientError

# Replace with your S3 bucket name and desired S3 name (object name).
# A folder is added for each export attempt; feel free to remove if overkill.
S3_BUCKET_NAME = os.environ['S3_BUCKET_NAME']
DT = datetime.now()
DT_ISO = DT.strftime("%Y-%m-%dT%H:%M:%SZ")
S3_FILE = f'pigment_exports/{DT_ISO}/export.csv'


def get_secret():
    # Replace with your AWS Secrets Manager details via the environment variables
    SECRET_NAME = os.environ['AWS_SECRET_NAME']
    REGION_NAME = os.environ['AWS_SECRET_REGION']
    SECRET_KEY = os.environ['SECRET_KEY']

    # Init AWS Secrets Manager
    client = boto3.client('secretsmanager', region_name=REGION_NAME)

    try:
        get_secret_value_response = client.get_secret_value(SecretId=SECRET_NAME)
    except ClientError as e:
        # For a list of exceptions thrown, see
        # https://docs.aws.amazon.com/secretsmanager/latest/apireference/API_GetSecretValue.html
        raise e

    # Decrypts secret using the associated KMS key.
    secret_response = get_secret_value_response['SecretString']
    secret_dict = json.loads(secret_response)
    return secret_dict[SECRET_KEY]


def lambda_handler(event, context):
    # Your Pigment block's View ID
    VIEW_ID = os.environ['VIEW_ID']

    # Compose the export API URL
    API_URL = f'https://pigment.app/api/export/view/{VIEW_ID}'
    KEY = get_secret()
    HEADERS = {'Authorization': 'Bearer ' + KEY}

    # Initialize S3 client
    s3 = boto3.client('s3')

    try:
        # Make the HTTP GET request to the API
        response = requests.get(API_URL, headers=HEADERS)

        if response.status_code == 200:
            # Save the content to S3
            s3.put_object(Bucket=S3_BUCKET_NAME, Key=S3_FILE, Body=response.content)
            return {
                'statusCode': 200,
                'body': json.dumps('File saved to S3 successfully!')
            }
        else:
            return {
                'statusCode': response.status_code,
                'body': json.dumps('Failed to retrieve data from the API.')
            }
    except Exception as e:
        return {
            'statusCode': 500,
            'body': json.dumps(f'Error: {str(e)}')
        }
```
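The handler writes each export under a folder named for the timestamp of the run. Isolating that naming logic as a helper (a sketch; the deployed code inlines it) makes it easy to verify:

```python
from datetime import datetime

def build_s3_key(now):
    """Build a per-export object key such as pigment_exports/2024-05-01T09:00:00Z/export.csv."""
    return f"pigment_exports/{now.strftime('%Y-%m-%dT%H:%M:%SZ')}/export.csv"

print(build_s3_key(datetime(2024, 5, 1, 9, 0, 0)))
# prints pigment_exports/2024-05-01T09:00:00Z/export.csv
```

Note that `datetime.now()` returns local time, so the trailing Z (which conventionally denotes UTC) is only accurate if your Lambda runs in a UTC environment; a timezone-aware `datetime.now(timezone.utc)` would make it exact.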
Click Deploy and then Test.
Go to your S3 bucket.
Locate the folder that corresponds to the ISO 8601 date and time when you ran the test. If your test was successful, you’ll find your exported Pigment data in the CSV file.
Successful export of Pigment data to CSV file
Schedule regular exports with EventBridge Scheduler
In AWS, go to EventBridge Scheduler and click Create rule.
In the Schedule name field, enter your rule name.
Specify your schedule name
Define your schedule, and click Next.
Specify your schedule pattern
In the Select Target pane, select AWS Lambda.
Select AWS Lambda
Select your Lambda function.
Select your Lambda function
Complete the remaining setup tasks. Your scheduling job is set up and, in this example, runs daily at 9:00 AM GMT.
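EventBridge cron expressions use six fields (minutes, hours, day-of-month, month, day-of-week, year). A small helper, purely illustrative, shows the expression behind the daily 9:00 AM GMT schedule in this example:

```python
def daily_cron(hour_utc, minute=0):
    """Return an EventBridge cron expression for a daily run at the given UTC time."""
    # Fields: minutes hours day-of-month month day-of-week year.
    # day-of-week must be "?" when day-of-month is "*".
    return f"cron({minute} {hour_utc} * * ? *)"

print(daily_cron(9))
# prints cron(0 9 * * ? *)
```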