
You can transfer data from your Pigment Views, Lists, Metrics, and Tables to Amazon S3 with our REST API.

First, you create an AWS Lambda function that runs a Python script. The script retrieves data from the Pigment API and stores it in an Amazon S3 bucket.
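To see what the Lambda function automates, here is a minimal sketch of the export call itself, using the same endpoint and bearer authorization as the Lambda code later in this article. The View ID and API key values are placeholders; replace them with your own.

    import requests

    # Placeholders: substitute your own View ID and Export API key
    VIEW_ID = '<your-view-id>'
    API_KEY = '<your-pigment-export-api-key>'

    response = requests.get(
        f'https://pigment.app/api/export/view/{VIEW_ID}',
        headers={'Authorization': 'bearer ' + API_KEY},
    )
    response.raise_for_status()

    # The endpoint returns the block's data as CSV content
    with open('export.csv', 'wb') as f:
        f.write(response.content)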

Recommended reading

 

In case you need it, we’ve compiled some useful background information for you:   

Before you begin

 

Before you start your export, we recommend that you complete the following tasks in Amazon S3 and Pigment. 

Amazon S3

  • You need an AWS account with access to Amazon S3 to perform this export. 
  • Consult with your Amazon S3 admin. You may need to obtain permissions and information to successfully complete your export.

Pigment

Only Pigment members with the account type Workspace Admin or Security Admin can access the API key management page and manage API keys. 

Create an AWS Lambda function 

 

Here’s how you create an AWS Lambda function that loads Pigment API data into an S3 bucket. 

In the following example, we use a View ID (VIEW_ID); however, you can use a List, Metric, or Table ID as appropriate. 

  1. In AWS Secrets Manager, create a secret using your Export API key as the value. (If you prefer to script this step, see the sketch after this procedure.)
  2. In AWS Lambda, create a new function, and complete these steps:
    a. In the Function name field, enter your new function name. 
    b. In the Runtime menu, select Python 3.7 or a later supported Python runtime.
     
    Create a Lambda function
  3. Click the Configuration tab.
  4. Enter the following environment variable values (the names must match those used in the code in step 6):
    - VIEW_ID: The View ID of the block you want to export from Pigment.
    For information on obtaining this ID, see here.
    - AWS_SECRET_NAME: The name of the secret that holds your Export API key in AWS Secrets Manager.
    For example: pigment_export_api_key
    - AWS_SECRET_REGION: The region where your AWS Secrets Manager entry resides.
    For example: us-east-1
    - S3_BUCKET_NAME: The name of the S3 bucket that holds your exported files.
    - SECRET_KEY: The key within the secret that corresponds to your Pigment Export API key entry in AWS Secrets Manager. 
     
    Define environment variables 

     

  5. In the Configuration pane, click Permissions, and complete the following steps:
    a. Under Role name, click the role used for the export. 
     
    Select the role for the export
    b. In IAM, configure your Lambda function's execution role so that it can:
    - read/write to S3
    - read from AWS Secrets Manager
    (See the sample policy sketch after this procedure.)
    c. Save your changes. 

    Depending on your organization's access management policy, you can use either a role or a policy. 

    Permissions policies
     
  6. Click the Code tab, and add the following Python code: 
    from datetime import datetime
    import os
    import json
    import boto3
    import requests  # not in the Lambda runtime by default; bundle it or add it via a Lambda layer
    from botocore.exceptions import ClientError

    # Read the Pigment Export API key from AWS Secrets Manager
    def get_secret():
        # Your AWS Secrets Manager details, supplied via the environment variables
        SECRET_NAME = os.environ['AWS_SECRET_NAME']
        REGION_NAME = os.environ['AWS_SECRET_REGION']
        SECRET_KEY = os.environ['SECRET_KEY']

        # Create a Secrets Manager client
        session = boto3.session.Session()
        client = session.client(
            service_name='secretsmanager',
            region_name=REGION_NAME
        )

        try:
            get_secret_value_response = client.get_secret_value(
                SecretId=SECRET_NAME
            )
        except ClientError as e:
            # For a list of exceptions thrown, see
            # https://docs.aws.amazon.com/secretsmanager/latest/apireference/API_GetSecretValue.html
            raise e

        # Decrypt the secret using the associated KMS key
        secret_response = get_secret_value_response['SecretString']
        secret_dict = json.loads(secret_response)
        return secret_dict[SECRET_KEY]

    # Your S3 bucket name and desired S3 object name
    # A timestamped folder is added for each export attempt; remove if overkill
    # (Lambda runs in UTC by default, so this timestamp matches its Z suffix)
    S3_BUCKET_NAME = os.environ['S3_BUCKET_NAME']
    DT = datetime.now()
    DT_ISO = DT.strftime("%Y-%m-%dT%H:%M:%SZ")
    S3_FILE = f'pigment_exports/{DT_ISO}/export.csv'

    def lambda_handler(event, context):
        # Your Pigment block's View ID
        VIEW_ID = os.environ['VIEW_ID']

        # Compose the export API URL
        API_URL = f'https://pigment.app/api/export/view/{VIEW_ID}'
        KEY = get_secret()
        HEADERS = {'Authorization': 'bearer ' + KEY}

        # Initialize the S3 client
        s3 = boto3.client('s3')

        try:
            # Make the HTTP GET request to the API
            response = requests.get(API_URL, headers=HEADERS)

            if response.status_code == 200:
                # Save the content to S3
                s3.put_object(Bucket=S3_BUCKET_NAME, Key=S3_FILE, Body=response.content)
                return {
                    'statusCode': 200,
                    'body': json.dumps('File saved to S3 successfully!')
                }
            else:
                return {
                    'statusCode': response.status_code,
                    'body': json.dumps('Failed to retrieve data from the API.')
                }
        except Exception as e:
            return {
                'statusCode': 500,
                'body': json.dumps(f'Error: {str(e)}')
            }
     
  7. Click Deploy, and then click Test.
  8. Go to your S3 bucket.
  9. Locate the folder that corresponds to the ISO 8601 date and time when you ran the test. 
    If your test was successful, you’ll find your exported Pigment data in the CSV file. 

    Successful export of Pigment data to CSV file
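If you prefer to script step 1 rather than use the console, a minimal boto3 sketch might look like the following. The secret name (pigment_export_api_key), key (api_key), and region are assumptions; use whatever values you set in the AWS_SECRET_NAME, SECRET_KEY, and AWS_SECRET_REGION environment variables.

    import json
    import boto3

    # Assumed names: align these with the AWS_SECRET_NAME, SECRET_KEY,
    # and AWS_SECRET_REGION environment variables from step 4
    client = boto3.client('secretsmanager', region_name='us-east-1')
    client.create_secret(
        Name='pigment_export_api_key',
        SecretString=json.dumps({'api_key': '<your-pigment-export-api-key>'}),
    )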
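Similarly, for step 5b, here is a sketch of an inline policy granting the minimum access described in that step. The role name, bucket name, and secret ARN are hypothetical placeholders; scope them to your own resources.

    import json
    import boto3

    # Hypothetical names/ARNs: replace with your own role, bucket, and secret
    policy = {
        'Version': '2012-10-17',
        'Statement': [
            {   # read/write objects in the export bucket
                'Effect': 'Allow',
                'Action': ['s3:GetObject', 's3:PutObject'],
                'Resource': 'arn:aws:s3:::your-export-bucket/*',
            },
            {   # read the Export API key from Secrets Manager
                'Effect': 'Allow',
                'Action': 'secretsmanager:GetSecretValue',
                'Resource': 'arn:aws:secretsmanager:us-east-1:123456789012:secret:pigment_export_api_key-*',
            },
        ],
    }

    iam = boto3.client('iam')
    iam.put_role_policy(
        RoleName='your-lambda-execution-role',
        PolicyName='pigment-export-access',
        PolicyDocument=json.dumps(policy),
    )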

     

Schedule regular exports with EventBridge Scheduler

  1. In AWS, go to EventBridge Scheduler, and click Create rule.
  2. In the Schedule name field, enter your rule name. 
    Specify your schedule name

     

  3. Define your schedule, and click Next.
    Specify your schedule pattern

     

  4. In the Select Target pane, select AWS Lambda. 
    Select AWS Lambda

     

  5. Select your Lambda function. 

    Select your Lambda function

     

  6. Complete the remaining setup tasks.
    In this example, the scheduling job is set up to run daily at 9:00 AM GMT (see the scripted sketch below). 
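If you'd rather create the schedule programmatically, here is a boto3 sketch of an equivalent EventBridge Scheduler schedule. The schedule name and the two ARNs (your Lambda function, and an IAM role that EventBridge Scheduler can assume to invoke it) are hypothetical placeholders; note that EventBridge Scheduler evaluates cron expressions in UTC (GMT) unless you set a timezone.

    import boto3

    # Hypothetical ARNs: replace with your Lambda function's ARN and an IAM
    # role that EventBridge Scheduler can assume to invoke it
    scheduler = boto3.client('scheduler')
    scheduler.create_schedule(
        Name='pigment-daily-export',
        ScheduleExpression='cron(0 9 * * ? *)',  # 9:00 AM UTC/GMT daily
        FlexibleTimeWindow={'Mode': 'OFF'},
        Target={
            'Arn': 'arn:aws:lambda:us-east-1:123456789012:function:pigment-export',
            'RoleArn': 'arn:aws:iam::123456789012:role/pigment-scheduler-invoke-role',
        },
    )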
