
You can transfer data from your Pigment Views, Lists, Metrics, and Tables to Azure Blob Storage or Azure Data Lake Storage Gen2 (ADLS Gen2) using Azure Data Factory. 

In Data Factory Studio, you create two linked services that establish connections to the Pigment API and to ADLS Gen2. After you set up the linked services, you create two datasets that provide a structured representation of your exported data and connect it to the linked services. Next, you create activities that perform the specific data transfer operations in the pipeline. When you connect the three activities, the result is the transfer of a CSV file containing your Pigment data to an assigned location in ADLS Gen2.

We use specific Azure services and information in this example. If you don’t want to use these services, you’ll need to adjust the information provided here to your own requirements. 


Before you begin

We recommend that you complete the following tasks in Azure Services and Pigment before you begin your export. 

Azure services

  1. Consult with your Azure admins. In some organizations, services like Key Vault are controlled by separate teams, and you may need to obtain permissions and information from those teams to complete your export. 
  2. Create an Azure Key Vault: This stores your Pigment API key. Ensure that this vault has the correct permissions to complete the export. To comply with security best practices, we recommend that you use a vault rather than hardcoding any secrets.
  3. Verify access to your storage account: In our example, we use ADLS Gen2. 
  4. Assign permissions in Azure Identity and Access Management (IAM): The linked services in our example use a system-assigned managed identity. Ensure that you assign the correct role to that identity within Azure’s IAM.

Pigment

 

Only Pigment members with the account type Workspace Admin or Security Admin can access the Access API key management page and manage API keys. 

1. Create linked services 

 

Here you create two linked services in Data Factory Studio. In this example, the first linked service is to the Pigment API, and the second one is to the ADLS Gen2 storage. 

  1. Open Data Factory Studio. 
  2. Click Manage and then click Linked Services
  3. Click +New
  4. For the HTTP linked service, do the following: 
    a. Search for HTTP, and then click Continue
    b. Enter the following values: 
    - Name: PigmentExportAPI
    - BaseURL: https://pigment.app/api/export/view

    In this BaseURL, we use view; however, you can use list, metric, or table as appropriate. 

    - Authentication Type: Anonymous
    c. Keep the remaining default values. 
    d. Click Save
    [Image: Create an HTTP linked service]

     

  5. For the ADLS Gen2 linked service, do the following: 
    a. Search for ADLS Gen2, and then click Continue
    b. Complete the details for your ADLS Gen2 setup. 
    If you need help configuring this linked service, see the Azure documentation.
     
    [Image: Create an ADLS Gen2 linked service]

     
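Under the hood, the copy activity built later in this walkthrough issues an HTTP GET against this linked service's base URL plus the dataset's relative URL, with the Pigment API key sent as a bearer header. The following Python sketch reproduces that request shape as a sanity check; it assumes Data Factory joins base and relative URLs with a slash, and the view ID and API key shown are placeholders, not real values:

```python
# Sketch of the request Data Factory issues via this linked service.
# BASE_URL matches the linked service; the other values are placeholders.
BASE_URL = "https://pigment.app/api/export/view"
VIEW_ID = "YOUR_VIEW_ID"            # placeholder
API_KEY = "YOUR_PIGMENT_API_KEY"    # placeholder


def build_export_request(base_url: str, view_id: str, api_key: str):
    """Combine the base URL, the relative URL (the view ID), and the
    Authorization header, mirroring the linked service and copy activity."""
    url = f"{base_url.rstrip('/')}/{view_id}"
    headers = {"Authorization": f"Bearer {api_key}"}
    return url, headers


url, headers = build_export_request(BASE_URL, VIEW_ID, API_KEY)
```

The Authentication Type on the linked service is Anonymous because the API key travels in this header rather than in the linked service's own credential settings.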

2. Create datasets 

 

Datasets provide a structured representation of the exported data, and connect the data to the linked services you just created. For this export, you need two datasets: 

  • CSV export from HTTP 
  • ADLS Gen2 

Information on how to create each dataset is provided in detail below. 

CSV Export from HTTP

In the following example, we use view_id; however, you can use list_id, metric_id, or table_id as appropriate. 

In Data Factory Studio, do the following: 

  1. Click the Author tab.
  2. Click Datasets > … > New dataset
  3. Search for HTTP, and then click Continue
  4. Select DelimitedText, and then click Continue

    You need to select DelimitedText because Pigment’s API returns CSV data in a semi-colon delimited format. 

  5. On the Set properties page, enter the following values: 
    - Name: PigmentExport
    - LinkedService: PigmentExportAPI
    - First row as header: Checked
  6. Keep the remaining default values. 
  7. Click OK
  8. On the Parameters tab, create the first parameter with the following values: 
    - Name: application_id
    - Type: String
    - Default value: Pigment_Application_ID

    For information on obtaining your Pigment application and view IDs, see the Pigment documentation.

  9. Create a second parameter with the following values: 
    - Name: view_id
    - Type: String
    - Default value: VIEW_ID
  10. On the Connection tab, enter the following values: 
    - Relative URL: @dataset().view_id 
    - Column delimiter: Semicolon (;)

  11. Keep the remaining default values. 
  12. Click Save.
     
    [Image: Create a CSV Export from HTTP dataset]

     
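Because the export arrives as semicolon-delimited text with a header row, the dataset settings above map directly onto a standard CSV parse. A minimal Python sketch of reading such a payload (the sample data is invented for illustration):

```python
import csv
import io

# Sample payload in the shape Pigment's export API returns:
# semicolon-delimited, first row as header. Values are invented.
payload = "Country;Quarter;Revenue\nFrance;Q1;100\nSpain;Q1;250\n"

# DictReader with delimiter=";" mirrors the dataset settings:
# DelimitedText, column delimiter Semicolon (;), first row as header.
rows = list(csv.DictReader(io.StringIO(payload), delimiter=";"))
```

Each row comes back keyed by the header names, which is why "First row as header" must stay checked.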

ADLS Gen2

 

In Data Factory Studio, do the following: 

  1. Click the Author tab.
  2. Click Datasets > … > New dataset
  3. Search for ADLS Gen2, and then click Continue.  
  4. Select DelimitedText, and then click Continue.
  5. On the Set properties page, enter the following values: 
    - Name: ADLSGen2
    - Linked Service: AzureDataLakeStorage
    - File path: The target location in Blob storage for your CSV file. 
    - First row as header: Checked
  6. Keep the remaining default values. 
  7. Click OK
  8. On the Connection tab, enter the following values: 
    - File path: The target location in Blob storage for your CSV file. 
    - Column delimiter: Semicolon (;) 
     
    [Image: Create an ADLS Gen2 dataset]
     
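The sink dataset simply describes where and how the CSV lands: semicolon delimiter, first row as header, written to a file path. Locally, the equivalent write looks like the following Python sketch (the rows are invented sample data, and a temp file stands in for the ADLS Gen2 target path):

```python
import csv
import tempfile
from pathlib import Path

# Invented sample rows standing in for the exported Pigment data.
rows = [
    {"Country": "France", "Revenue": "100"},
    {"Country": "Spain", "Revenue": "250"},
]

# Write semicolon-delimited CSV with a header row, mirroring the
# sink dataset settings. The target here is a temp file, not ADLS.
target = Path(tempfile.gettempdir()) / "pigment_export.csv"
with target.open("w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["Country", "Revenue"], delimiter=";")
    writer.writeheader()
    writer.writerows(rows)

written = target.read_text()
```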

3. Create activities for your pipeline

 

In Data Factory Studio, you need to create these three activities: 

  1. Get the Pigment API key using the Azure Key Vault
  2. Set the export API key as a variable
  3. Load data from Pigment to ADLS Gen2

Information on how to create each activity is provided in detail below. 

Get the Pigment API key using the Azure Key Vault

 

This activity takes the Pigment API Key stored in Azure Key Vault and passes it into the API call as a variable.

  • Activity Type: Web
  • General:
    - Name: GetAPIKey 
    - Secure Output: True 
    - Secure Input: True 

    It’s recommended that you assign a True value to both the Secure Input and Secure Output fields. 

  • Settings: 
    - URL: Your Azure Key Vault secret URL, with ?api-version=7.0 appended to the end.
    If you need help obtaining this URL, see the Azure documentation.
    - Method: GET 
    - Authentication: System Assigned Managed Identity
    - Resource: https://vault.azure.net  
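The Web activity's GET call returns JSON from Key Vault, and the secret itself sits in the response's value field, which is why the next activity references output.value. A Python sketch of pulling the secret out of a Key Vault response body (the JSON shown is a trimmed, invented example; real responses carry more fields):

```python
import json

# Trimmed, invented example of the JSON a Key Vault "get secret"
# call returns. The real response includes additional fields.
response_body = (
    '{"value": "pgmt-example-api-key", '
    '"id": "https://myvault.vault.azure.net/secrets/PigmentAPIKey/abc123"}'
)

# The Set variable activity reads this same field via
# @activity(...).output.value in Data Factory's expression language.
secret = json.loads(response_body)["value"]
```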

Set export API key as a variable

 

This activity takes the value from the export API key and sets it as the value in a variable. That variable is used in the next activity. 

  • Activity Type: Set variable
  • General:
    - Name: SetExportAPIKey 
    - Secure Output: True 
    - Secure Input: True 

    It’s recommended that you assign a True value to both the Secure Input and Secure Output fields. 

  • Settings: 
    1. Click +New located next to the name field, and enter the following values: 
    - Name: PigmentExportKey
    - Type: String 
    2. Click Save, and then enter the following values: 
    - Name: PigmentExportKey
    - Value: @activity('GetAPIKey').output.value

Move data from Pigment to ADLS Gen2

 

This activity uses the export API key variable to make the API call to Pigment. The call returns a CSV file for your specified Block, which the activity then writes to the assigned location in ADLS Gen2.

  • Activity Type: Copy data
  • General:
    - Name: LoadCSVToADLS 
  • Source: 
    - Source dataset: PigmentExport 
    - Dataset properties: These properties need to match the Block you are exporting from Pigment.  
    - Request method: GET 
    Additional headers:
     
    @{concat('Authorization: Bearer ',variables('PigmentExportKey'))}
  • Sink: 
    - Sink dataset: ADLSGen2
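The concat expression in the additional headers is plain string concatenation: it glues the literal 'Authorization: Bearer ' onto the secret held in PigmentExportKey. Note that the space after Bearer matters; without it the header value becomes Bearer<key> and the call is rejected. The equivalent in Python (the key value is invented):

```python
# Equivalent of the Data Factory expression
# @{concat('Authorization: Bearer ', variables('PigmentExportKey'))}.
pigment_export_key = "pgmt-example-api-key"  # invented example value

# The trailing space after "Bearer" is required for a valid header.
additional_header = "Authorization: Bearer " + pigment_export_key
```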

4. Connect your activities

 

When you have successfully created these activities, use the On Success option in Data Factory Studio to connect them and to complete your pipeline. It looks similar to the image below: 

[Image: Successful Pigment to Azure data pipeline]

 And you’re done! A CSV file with your Pigment data is available in your ADLS Gen2 storage. 

 

5. Next steps

 

We recommend the following steps to enhance and maximize your Pigment export: 

  • Verify and debug your pipeline to correct any possible errors in your workflow. 
  • Update the pipeline or your datasets to align with your use case. 
  • Observe any updated or new data in the downstream systems that consume the data exported from Pigment. 
  • Currently, the pipeline runs only when you start it manually. Consider adding a Trigger to your export so you can schedule automatic exports.
  • Consider adding Pipeline Activities to observe error handling, and using monitoring and logging options available in Azure. 
  • Explore Data Factory’s other options for data transformation and movement. This allows you to expand your Pigment export data even further within the platform. 