You can export data from your Pigment Views, Lists, Metrics, and Tables to an object in Salesforce. To do this, you transfer a CSV file to Salesforce by performing an upsert (update and insert) operation with the Salesforce connector in Azure Data Factory.
You use Data Factory Studio to create three linked services that establish a connection to your Pigment workspace. After you set up the linked services, you create two datasets to provide a structured representation of your exported data. These datasets connect the Pigment data to the linked services you just created. Next, you create activities in Data Factory Studio to perform specific data transfer operations in the data pipeline. When you connect the three activities, the end result is the successful transfer of a CSV file, containing your Pigment data, to a specified Salesforce Object.
Azure’s Salesforce connector uses the External ID field type on whichever Salesforce object you want to interact with. If you decide you don’t want to use the External ID field, then you need to use the Salesforce REST API. More information on this is available in the Salesforce documentation.
Recommended reading
In case you need it, we’ve compiled some useful background information for you:
- Copy data from an HTTP endpoint by using Azure Data Factory or Azure Synapse Analytics
- Copy data from and to Salesforce using Azure Data Factory or Azure Synapse Analytics
- How to trigger an import with APIs
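The pipeline you build below automates a call to Pigment's export API. As a reference point, here is a minimal sketch of that call in Python; the view ID and API key values are placeholders you replace with your own, and the `view` path segment changes to `list`, `metric`, or `table` as appropriate.

```python
# Minimal sketch of the Pigment export call that the pipeline automates.
# "VIEW_ID" and "PIGMENT_API_KEY" are placeholders -- substitute your own values.
BASE_URL = "https://pigment.app/api/export/view"

def build_export_request(view_id: str, api_key: str) -> tuple[str, dict]:
    """Return the URL and headers for a Pigment view export."""
    url = f"{BASE_URL}/{view_id}"
    # The space after 'Bearer' is required for a valid Authorization header.
    headers = {"Authorization": f"Bearer {api_key}"}
    return url, headers

url, headers = build_export_request("VIEW_ID", "PIGMENT_API_KEY")
print(url)  # https://pigment.app/api/export/view/VIEW_ID
```

With a real key, you could pass this URL and these headers to an HTTP GET and receive your block's data back as semicolon-delimited CSV.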
Before you begin
We recommend that you complete the following tasks in Azure Services and Pigment before you begin your export.
Azure services
- Consult with your Azure and Salesforce admins to ensure a successful export workflow. In some organizations, services such as Key Vault and Salesforce are controlled by different teams, so you may need to obtain permissions and information from those teams to complete your export.
- Create an Azure Key Vault: This stores your Pigment API key and the Salesforce key. Ensure that this vault has the correct permissions to complete the export. To comply with security best practices, we recommend that you use a vault and that you don’t hardcode any secrets.
- Assign permissions in Azure Identity and Access Management (IAM): The linked services in our example use a system-assigned managed identity. Ensure that you assign the correct role to that identity within Azure’s IAM.
Pigment
Only Pigment members with the account type Workspace Admin or Security Admin can access the Access API key management page and manage API keys.
- Obtain your export API key. This is explained in Manage API Keys.
- Obtain your View ID. This is explained in How to export data from Pigment with APIs.
- Obtain your List, Metric, or Table IDs. This is explained in Export raw data from a Pigment Block.
Salesforce
You need the following Salesforce information:
- API name of the field that has the required External ID property
- Salesforce object API name
This is the Salesforce object you want to update. In this example we use the Opportunity object.
- Account username
This account must have read and update access to the required records and objects.
- Account password
- (Optional) Account security token
If you need more information on whether this is optional for you, take a look at the Azure documentation.
- Environment URL
For example: https://example.my.salesforce.com
1. Create linked services
Here you create three linked services in Data Factory Studio. These link to your Pigment API, Salesforce, and the Azure Key Vault.
- Open Data Factory Studio.
- Click Manage and then click Linked Services.
- Click +New.
- For the HTTP linked service, do the following:
a. Search for HTTP, and then click Continue.
b. Enter the following values:
- Name: PigmentExportAPI
- Base URL: https://pigment.app/api/export/view
In this Base URL, we use view; however, you can use list, metric, or table as appropriate.
- Authentication Type: Anonymous
c. Keep the remaining default values.
d. Click Save.
- For the Salesforce linked service, do the following:
a. Search for Salesforce, and then click Continue.
b. Complete the details for your Salesforce setup.
If you need help to configure this linked service, take a look at the Azure documentation.
- For the Azure Key Vault linked service, do the following:
a. Search for Azure Key Vault, and then click Continue.
b. Complete the details for your Azure Key Vault setup.
If you need help to configure your Azure Key Vault linked service, take a look at the Azure documentation.
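For reference, the definition behind the HTTP linked service looks roughly like the JSON below, shown here as a Python dict. The field names follow the Data Factory HttpServer schema; treat this as an approximation and verify it against the JSON view in your own instance.

```python
import json

# Approximate definition of the PigmentExportAPI linked service.
# The exact schema may vary by Data Factory version; this is a sketch only.
linked_service = {
    "name": "PigmentExportAPI",
    "properties": {
        "type": "HttpServer",
        "typeProperties": {
            "url": "https://pigment.app/api/export/view",
            "authenticationType": "Anonymous",
        },
    },
}
print(json.dumps(linked_service, indent=2))
```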
When you’ve created all three linked services, your Linked Services tab looks similar to this:
2. Create datasets
Datasets provide a structured representation of the exported data, and connect the data to the linked services you just created. For this export, you need two datasets:
- CSV export from HTTP
- Salesforce object
Information on how to create each dataset is provided in detail below.
CSV Export from HTTP
In the following example, we use view_ID; however, you can use listID, metricID, or tableID as appropriate.
For the first dataset, do the following:
- Search for HTTP, and then click Continue.
- Select DelimitedText, and then click Continue.
You need to select DelimitedText because Pigment’s API returns CSV data in a semi-colon delimited format.
- On the Set properties page, enter the following values:
- Name: PigmentExport
- Linked service: PigmentExportAPI
- First row as header: Checked
- Keep the remaining default values.
- Click OK.
- On the Parameters tab, create a parameter with the following values:
- Name: view_id
- Type: String
- Default value: VIEW_ID
This is the View ID of the block you want to export from Pigment. If you need information on how to get this ID, see here.
- On the Connection tab, enter the following values:
- Relative URL: @dataset().view_id
- Column delimiter: Semicolon (;)
- Keep the remaining default values.
- Click Save.
- On the Schema tab, define your Block schema by doing the following:
a. In Pigment, download a CSV copy of your view. If you need help with this, see here.
b. Open your CSV file and remove any data you don’t need for the schema in Data Factory.
c. Save your file.
d. Click Import Schema to upload your modified CSV file to Data Factory Studio.
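Before you import the schema, it can help to sanity-check the semicolon-delimited format the DelimitedText dataset expects by parsing your downloaded export yourself. The column names in this sketch are purely illustrative:

```python
import csv
import io

# Illustrative sample of Pigment's semicolon-delimited export format.
# Replace with a few lines from your own downloaded CSV.
sample = "Opportunity Name;Amount;Stage\nAcme Renewal;1200;Prospecting\n"

reader = csv.DictReader(io.StringIO(sample), delimiter=";")
rows = list(reader)
print(rows[0]["Amount"])  # 1200
```

The first row becomes the header, matching the First row as header setting above.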
Salesforce Object
For the second dataset, do the following:
- Search for Salesforce, and then click Continue.
- In the Name field, enter the name of your new dataset.
- Set the Object API Name to the required Salesforce API name.
This API name represents the Salesforce object used in the upsert operation. In the example below we use the Opportunity object.
- Click OK.
3. Create activities for your pipeline
In Data Factory Studio, you need to create these three activities:
- Get the Pigment API key using the Azure Key Vault
- Set the export API key as a variable
- Move data from Pigment to Salesforce
Information on how to create each activity is provided in detail below.
Get the Pigment API key using the Azure Key Vault
This activity takes the Pigment API Key stored in Azure Key Vault and passes it into the API call as a variable. Create this activity with the following values:
- Activity Type: Web
- General:
- Name: GetAPIKey
- Secure Output: True
- Secure Input: True
We recommend that you assign a True value to both the Secure Input and Secure Output fields.
- Settings:
- URL: This is your Azure Key Vault URL for the secret. Append ?api-version=7.0 to the end of this URL.
If you need help obtaining this, check out this Azure documentation.
- Method: GET
- Authentication: System Assigned Managed Identity
- Resource: https://vault.azure.net
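The Web activity above issues a plain HTTPS GET against the Key Vault secrets endpoint. The sketch below builds that URL; the vault name and secret name are placeholders for your own values.

```python
# Sketch of the URL the GetAPIKey Web activity calls.
# "my-vault" and "pigment-api-key" are placeholder names.
def key_vault_secret_url(vault_name: str, secret_name: str) -> str:
    """Build the Key Vault get-secret URL, including the required api-version."""
    return (
        f"https://{vault_name}.vault.azure.net/secrets/{secret_name}"
        "?api-version=7.0"
    )

print(key_vault_secret_url("my-vault", "pigment-api-key"))
# https://my-vault.vault.azure.net/secrets/pigment-api-key?api-version=7.0
```

The managed identity assigned in IAM authenticates this request, so no credential appears in the pipeline itself.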
Set export API key as a variable
This activity takes the value from the export API key and sets it as the value in a variable. That variable is used in the next activity. Create this activity with the following values:
- Activity Type: Set variable
- General:
- Name: SetExportAPIKey
- Secure Output: True
- Secure Input: True
We recommend that you assign a True value to both the Secure Input and Secure Output fields.
- Settings:
1. Click +New located next to the name field, and complete the following:
- Name: PigmentExportKey
- Type: String
2. Click Save, and then complete the remaining fields:
- Name: PigmentExportKey
- Value: @activity('GetAPIKey').output.value
Move data from Pigment to Salesforce
This activity uses the export API key variable to make the API call to Pigment. The call returns a CSV file for your specified block, and the activity then calls Salesforce’s API to apply the updates to the selected Salesforce object.
Create this activity with the following values:
- Activity Type: Copy data
- General:
- Name: UpdateSalesforce
- Source:
- Source dataset: PigmentExport
- Dataset properties: Enter your View ID as the view_id value.
- Request method: GET
- Additional headers: @{concat('Authorization: Bearer ', variables('PigmentExportKey'))}
- Sink:
- Sink dataset: Salesforce
- Write behavior: Upsert
- External ID field: This is the Salesforce API name of the field on the Salesforce object that you’re updating. It needs to contain the External ID property, so you need to ensure that this field is exported as a column in your Pigment dataset. If the values don’t match an existing record, a new record is created. If you need more information on the External ID field, more details are available in Salesforce documentation.
- Review and complete the remaining fields in the Sink tab as you need for your setup.
- Mapping:
- Source: Select the columns in your Pigment export that you require in Salesforce.
- Destination: Select the target Salesforce API field name where you want to insert your Pigment data. External ID is a required entry on this page.
In the example below, the Salesforce object Opportunity is updated; however, the connector supports several fields.
- Review and complete the remaining fields in the Mapping tab as you need for your setup.
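To make the Upsert write behavior concrete, here is a minimal sketch of the matching logic in Python: records are keyed on the External ID value, existing matches are updated, and unmatched records are inserted. The field names are hypothetical, and Salesforce performs this matching server-side.

```python
# Minimal sketch of upsert semantics keyed on an External ID field.
# Field names are hypothetical; Salesforce does this matching server-side.
def upsert(existing: dict, incoming: list, key_field: str) -> dict:
    """Update records whose External ID matches; insert the rest."""
    for record in incoming:
        ext_id = record[key_field]
        existing[ext_id] = {k: v for k, v in record.items() if k != key_field}
    return existing

records = {"EXT-1": {"Amount": 100}}
updates = [
    {"ExternalId": "EXT-1", "Amount": 150},  # matches -> updated
    {"ExternalId": "EXT-2", "Amount": 300},  # no match -> inserted
]
print(upsert(records, updates, "ExternalId"))
# {'EXT-1': {'Amount': 150}, 'EXT-2': {'Amount': 300}}
```

This is why the External ID column must be present in your Pigment export: without it, Salesforce has no key to match records on.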
4. Connect your activities
When you have successfully created these activities, use the On Success option in Data Factory Studio to connect them and to complete your pipeline. It looks similar to the image below:
And you’re done! A CSV file with your Pigment data is successfully upserted to your selected Salesforce Object.
5. Next steps
We recommend the following steps to enhance and maximize your Pigment export:
- Verify and debug your pipeline to correct any possible errors in your workflow.
- Update the pipeline or your datasets to align with your use case.
- Observe any updated or new Salesforce data based on the data exported from Pigment.
- Currently, the workflow only runs when you trigger it manually. Consider adding a Trigger to your export so you can schedule automatic exports.
- Consider adding Pipeline Activities for error handling, and use the monitoring and logging options available in Azure.
- Explore Data Factory’s other options for data transformation and movement. This allows you to expand your Pigment export data even further within the platform.