
Hello Pigment Community,

Our organisation is currently migrating to SQL data warehouses on Databricks, and we're looking to move our connection from Redshift to Databricks. However, we haven't found a straightforward, out-of-the-box way to establish this connection.

Could anyone from the Pigment team or community members who have faced similar challenges provide guidance on the proposed solution for connecting directly to Databricks? We currently have a workaround involving copying our Data Lake models into S3 and then connecting from there, but we're eager to explore a more direct integration.

Thank you in advance!

At present we don't have a Databricks connector, so the only option is to use our Import API to 'push' data into Pigment. This can be done from a Databricks notebook: write a Python script that pulls the data from Databricks and pushes it into Pigment. There are Pigment import scripts shared in the Community that make a good starting point (shared below).
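As a rough illustration of that notebook approach, here is a minimal sketch: serialise a query result to CSV in memory, then POST it to Pigment's Import API. The endpoint URL, auth header format, and secret names below are placeholders, not the real Pigment API details — check the Import API docs and the Community scripts linked in this thread for the actual values.

```python
# Sketch: push Databricks query results into Pigment via the Import API.
# PIGMENT_IMPORT_URL and the Authorization header are PLACEHOLDERS --
# substitute the real endpoint and auth scheme from the Pigment docs.
import csv
import io
import urllib.request

PIGMENT_IMPORT_URL = "https://example.invalid/pigment-import"  # placeholder


def rows_to_csv(header, rows):
    """Serialise a header and an iterable of rows to an in-memory CSV string."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()


def push_to_pigment(csv_payload, api_key):
    """POST the CSV payload to the (placeholder) Pigment Import API endpoint."""
    req = urllib.request.Request(
        PIGMENT_IMPORT_URL,
        data=csv_payload.encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",  # auth scheme is an assumption
            "Content-Type": "text/csv",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return resp.status


# In a Databricks notebook you would pull the rows with Spark, e.g.:
#   df = spark.sql("SELECT region, amount FROM sales")
#   payload = rows_to_csv(df.columns, [list(r) for r in df.collect()])
#   push_to_pigment(payload, dbutils.secrets.get("pigment", "api_key"))
```

The Spark/`dbutils` lines are shown as comments because they only run inside Databricks; the CSV helper itself is plain stdlib Python and can be tested anywhere.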

Alternatively, you could look at an iPaaS solution (Workato, Informatica, Boomi, MuleSoft, et al.) to manage this workflow.

Pigment Python API scripts: