
Hi all,


I'm importing an employee transaction list with historical employment data using a direct connector to my HRIS. I’m creating a new unique ID property from Employee ID + Effective Date.


I ran into the following challenge: my source data sometimes contains two events for the same employee on the same day (e.g., a hire and a no-show termination).


My question is: does Pigment offer an automated way to identify and remove all rows that share a non-unique key during an import? I want these conflicting pairs of rows to be excluded entirely, and I need the monthly import from my HRIS to run automatically without failing.


Any ideas on how to set this up?


Thanks!


Hi @andyzilla,

Thanks for your question!

At the moment, there isn’t a Pigment setting that will automatically drop duplicates and let the import continue. The best approach is either to remove duplicates before import or use a staging area in Pigment to identify and manage them.


Here’s how it works today:

If you enable Unique item values on a property, Pigment will stop the import if duplicates are found. However, it won’t filter or remove those duplicate rows on its own.

What you can do instead:

Clean up before importing
The smoothest option is to remove duplicates in your source system (HRIS) or by using a data-prep/ETL step before the data reaches Pigment. That way, your imports run successfully every time.
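As an illustration of what such a data-prep step would do, here is a minimal Python sketch that drops every row whose composite key appears more than once, so a same-day hire/termination pair is excluded entirely rather than one row being kept. The field names are assumptions; adapt them to your HRIS export.

```python
from collections import Counter

def drop_conflicting_rows(rows, key_fields):
    """Keep only rows whose composite key occurs exactly once.

    Rows sharing a key (e.g. Employee ID + Effective Date) are
    removed entirely, so no arbitrary 'winner' is kept.
    """
    key = lambda row: tuple(row[f] for f in key_fields)
    counts = Counter(key(r) for r in rows)
    return [r for r in rows if counts[key(r)] == 1]

# Example: a hire and a same-day no-show termination for employee 101
events = [
    {"employee_id": "101", "effective_date": "2024-05-01", "event": "Hire"},
    {"employee_id": "101", "effective_date": "2024-05-01", "event": "Termination"},
    {"employee_id": "102", "effective_date": "2024-05-01", "event": "Hire"},
]
clean = drop_conflicting_rows(events, ["employee_id", "effective_date"])
# Only employee 102's row survives
```

The same logic is a one-liner in most ETL tools (e.g. in pandas, `df[~df.duplicated(subset=[...], keep=False)]`).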

Handle duplicates inside Pigment
Another approach is to bring the data into a staging Transaction List (without unique constraints). From there, you can create a Metric or formula to flag rows with duplicate keys (e.g., Employee ID + Effective Date) so you can review and resolve them before moving the data into your main model.
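The flag metric in the staging list would implement logic along these lines; since the exact Pigment formula depends on your model, here is the idea sketched in Python, with placeholder field names, where each staged row is annotated rather than dropped so you can review the conflicts first.

```python
from collections import Counter

def flag_duplicate_keys(rows, key_fields):
    """Annotate each row with a boolean flag marking non-unique keys,
    instead of removing anything, so conflicts can be reviewed."""
    key = lambda r: tuple(r[f] for f in key_fields)
    counts = Counter(key(r) for r in rows)
    return [{**r, "duplicate_key": counts[key(r)] > 1} for r in rows]

staging = [
    {"employee_id": "101", "effective_date": "2024-05-01", "event": "Hire"},
    {"employee_id": "101", "effective_date": "2024-05-01", "event": "Termination"},
    {"employee_id": "102", "effective_date": "2024-06-01", "event": "Hire"},
]
flagged = flag_duplicate_keys(staging, ["employee_id", "effective_date"])
# Both employee 101 rows are flagged; employee 102's row is not
```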

Hope this helps! Let us know if you’d like more detail on setting up a staging list or duplicate flagging; we’re happy to walk you through it! 😊


Hi Tobias!

Thank you, this is helpful.

I’ll try handling the duplicates inside Pigment with a staging list, as I want to leverage the automation from my direct connectors. 

