Solved

Migrating Connection from Redshift to Databricks

  • March 17, 2025
  • 2 replies
  • 123 views


Hello Pigment Community,

Our organisation is currently undergoing a migration to utilise SQL data warehouses on Databricks, and we're looking to transition our connection from Redshift to Databricks. However, we haven't found a straightforward, out-of-the-box solution to establish this connection.

Could anyone from the Pigment team or community members who have faced similar challenges provide guidance on the proposed solution for connecting directly to Databricks? We currently have a workaround involving copying our Data Lake models into S3 and then connecting from there, but we're eager to explore a more direct integration.

Thank you in advance!

Best answer by Pierre


2 replies


At present we don't have a Databricks connector, so the only option is to use our Import API to 'push' data into Pigment. This can be done from a Databricks notebook with a Python script that pulls data from Databricks and pushes it into Pigment. There are Pigment import scripts shared in the Community that make a good starting point (shared below).
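To make the notebook approach concrete, here is a minimal sketch of the push step. The import URL, authorization header scheme, and expected CSV shape are assumptions for illustration — check your Pigment workspace's import configuration and the Community scripts linked below for the actual endpoint and headers.

```python
# Hedged sketch: push Databricks query results into Pigment via its Import API.
# IMPORT_URL, the Bearer-token auth scheme, and the CSV layout are assumptions;
# use the values from your own Pigment import configuration.
import csv
import io
import urllib.request


def rows_to_csv(headers, rows):
    """Serialize query results to UTF-8 CSV (a common format for Pigment imports)."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(headers)
    writer.writerows(rows)
    return buf.getvalue().encode("utf-8")


def push_to_pigment(csv_body, import_url, api_key):
    """POST the CSV body to a Pigment push-import URL (hypothetical endpoint)."""
    req = urllib.request.Request(
        import_url,
        data=csv_body,
        headers={
            "Authorization": f"Bearer {api_key}",  # assumed auth scheme
            "Content-Type": "text/csv",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


# In a Databricks notebook you would typically fetch the rows with spark.sql(...)
# or the databricks-sql-connector, then call:
#   push_to_pigment(rows_to_csv(headers, rows), IMPORT_URL, API_KEY)
```

Scheduling this notebook as a Databricks job would give you a recurring push into Pigment until a native connector is available.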

Alternatively, you could look at an iPaaS solution (Workato, Informatica, Boomi, Mule, et al.) to manage this workflow.

Pigment Python API scripts:

 


Pierre
Community Manager
  • Answer
  • October 2, 2025

Hello Patryk,

For your information, we are currently actively working on a native integration with Databricks.
The connector should be available in production by the end of October.

It should hopefully make the integration with Databricks simpler to configure and maintain.

 

Pierre