
You can transfer data from your Pigment Views, Lists, Metrics, and Tables to Google Cloud Storage (GCS) with our REST API.

To do this, you create a Google Cloud function that runs a Python script. This script imports data from the Pigment API and stores it in CSV format in a storage bucket within a Google Cloud project.
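If you want to confirm that the endpoint and your key work before building anything in Google Cloud, the export is a single authenticated GET request. Here's a minimal sketch; the view ID and API key values are placeholders, and the URL is the same one the function below uses:

    import requests

    # Placeholders: substitute your own view ID and export API key.
    VIEW_ID = 'your-view-id'
    EXPORT_API_KEY = 'your-export-api-key'

    response = requests.get(
        f'https://pigment.app/api/export/view/{VIEW_ID}',
        headers={'Authorization': 'Bearer ' + EXPORT_API_KEY},
    )
    response.raise_for_status()  # fail fast on a bad key or view ID

    # The response body is the exported data in CSV format.
    print(response.content[:200])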

Recommended reading

In case you need it, we’ve compiled some useful background information for you:  

Before you begin

We recommend that you use environment variables and Secret Manager to pass your export API key and view ID to the function. You can hardcode the values for export_api_key and view_ID instead, but this doesn't comply with security best practices.
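The steps below expose the secret to the function as an environment variable, so no extra code is needed to read it. If you'd rather fetch the key at runtime, the Secret Manager client library can do that instead; here's a minimal sketch, with placeholder project and secret names:

    from google.cloud import secretmanager

    # Placeholders: use your own project ID and secret name.
    PROJECT_ID = 'your-project-id'
    SECRET_ID = 'pigment-export-api-key'

    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{PROJECT_ID}/secrets/{SECRET_ID}/versions/latest"

    # Read the latest version of the secret and decode its payload.
    response = client.access_secret_version(request={"name": name})
    export_api_key = response.payload.data.decode("UTF-8")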

Complete the following tasks in Google and Pigment before you start your export.

Google

  • Create a Google account
  • Create a Google Cloud project
  • Get your cloud project ID
  • Create a Google Cloud storage bucket to store your Pigment API data
  • Obtain your storage bucket ID
  • Consult with your GCP admin to ensure a successful export workflow

Pigment

Only Pigment members with the account type Workspace Admin or Security Admin can access the API key management page and manage API keys. Create an export API key there; you'll store it as a secret in step 1 below.

Create a Google Cloud Function 

Here’s how you create a Google Cloud function that loads Pigment API data into a storage bucket.

In the following example, we use view_ID; however, you can use listID, metricID, or tableID as appropriate.

  1. In Google Secret Manager, create a secret using your export API key as a value. 
  2. In the Cloud Functions dashboard, click Create Function.
  3. Update the values in the Basics panel: 
    - Environment: select 1st gen
    - Function name: enter your function name 
    - Region name: select your region name 
  4. In the Trigger menu, select Cloud Pub/Sub.
  5. Create a new topic, and in the Topic ID field enter the topic name: scheduler 
     
  6. Click the Runtime tab and create a new environment variable named view_ID, using your own view ID as its value.
  7. Click the Security and image repo tab and complete these steps:
    a. In the Secret menu, select your secret. 
    b. In the Reference method menu, select: Exposed as environment variable 
    c. In the Name 1 field, enter: export_api_key
    d. Click Next
  8. In the Code pane, do the following:
    a. In the Runtime menu, select Python 3.8
    b. Copy and paste this code into the file: main.py
    import os
    import requests
    from google.cloud import storage

    # These names must match the environment variable (step 6) and the
    # secret reference (step 7c) configured on the function.
    EXPORT_API_KEY = os.environ.get('export_api_key')
    VIEW_ID = os.environ.get('view_ID')
    API_URL = f'https://pigment.app/api/export/view/{VIEW_ID}'
    HEADERS = {'Authorization': 'Bearer ' + EXPORT_API_KEY}

    # Replace these with your own values
    GCS_BUCKET_NAME = 'bucket-id'
    GCS_FILE_NAME = 'filename.csv'

    def download_csv_data_from_pigment(api_url):
        # Call the Pigment export API and return the raw CSV payload.
        response = requests.get(api_url, headers=HEADERS)
        if response.status_code == 200:
            return response.content
        raise Exception(f"Failed to download data from API. Status code: {response.status_code}")

    def upload_to_gcs(bucket_name, file_name, data):
        storage_client = storage.Client()
        bucket = storage_client.bucket(bucket_name)
        blob = bucket.blob(file_name)

        # Upload the CSV data to the GCS bucket
        blob.upload_from_string(data, content_type='text/csv')

    def main(data, context):
        # Entry point: fetch the export and store it in the bucket.
        csv_data = download_csv_data_from_pigment(API_URL)
        upload_to_gcs(GCS_BUCKET_NAME, GCS_FILE_NAME, csv_data)
        return "CSV data successfully fetched from API and uploaded to GCS."

    c. In the Entry point field, enter the value: main 
    d. In your code, replace the value bucket-id with your Cloud Storage bucket ID.
    e. (Optional) Rename the file: filename.csv
    This is the CSV file where Pigment data is stored in your bucket. 
  9. In the Code pane, locate and open the file: requirements.txt
  10. Paste this text. The storage client library is needed for the upload, and requests is listed explicitly because main.py imports it:
    google-cloud-storage>=1.36.0
    requests>=2.25.0
  11. Click Deploy.
  12. (Optional) When your updates are deployed, click Test Function in the Actions menu to test your function. You can also run the entry point locally; see the sketch after these steps.
  13. Go to your bucket in GCS and locate the file: filename.csv
    You may have renamed this file in step 8e. It contains your exported Pigment API data. 
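
If you'd like to verify the script outside of Cloud Functions, you can also run the entry point locally. This is a sketch, assuming you saved the code from step 8 as main.py and are authenticated locally with Application Default Credentials (for example, via gcloud auth application-default login); the key and ID values are placeholders:

    import os

    # Placeholders: supply your real export API key and view ID.
    os.environ['export_api_key'] = 'your-export-api-key'
    os.environ['view_ID'] = 'your-view-id'

    import main  # the main.py from step 8; it reads the variables at import time

    # Simulate a Pub/Sub invocation; the function ignores both arguments.
    print(main.main(None, None))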

Schedule regular exports with Cloud Scheduler 

You can schedule export jobs so that your Pigment data is automatically written to your CSV file on a regular basis. The example below uses Cloud Scheduler to export API data to your bucket at 9:00AM GMT daily. If you prefer to create the job from code rather than in the console, see the sketch after these steps.

  1. Go to Cloud Scheduler and select Create Job.
  2. In the Define the schedule pane, do the following:
    a. In the Name field, enter your Scheduler job name.
    b. In the remaining fields, set the frequency using a unix-cron format.

    For more information on using a unix-cron format, see crontab.guru

    For a job that runs every day at 9:00AM GMT, as in this example, enter the frequency 0 9 * * * and select GMT as the timezone.
    c. Click Continue
  3. In the Configure the execution pane, do the following:
    a. In the Target type menu, select Pub/Sub
    b. In the Select a Cloud Pub/Sub topic menu, select the topic that you created earlier with the Cloud Function.

    c. Click Continue
    Your scheduling job is set up and runs at 9:00AM GMT daily. 
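
If you prefer to create the job from code, the Cloud Scheduler client library (pip install google-cloud-scheduler) can set up the same Pub/Sub job. Here's a minimal sketch, assuming the scheduler topic created earlier and placeholder project and region values:

    from google.cloud import scheduler_v1

    # Placeholders: use your own project ID and your function's region.
    PROJECT_ID = 'your-project-id'
    LOCATION = 'us-central1'

    client = scheduler_v1.CloudSchedulerClient()
    parent = f"projects/{PROJECT_ID}/locations/{LOCATION}"

    job = scheduler_v1.Job(
        name=f"{parent}/jobs/pigment-daily-export",  # hypothetical job name
        schedule="0 9 * * *",  # every day at 9:00AM
        time_zone="GMT",
        pubsub_target=scheduler_v1.PubsubTarget(
            # The topic created earlier as the Cloud Function trigger.
            topic_name=f"projects/{PROJECT_ID}/topics/scheduler",
            data=b"run",  # payload bytes; the function ignores them
        ),
    )

    client.create_job(parent=parent, job=job)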

 
