
Our team is planning for future Pigment updates, primarily the upcoming AI capabilities. We are looking at bringing more dimensions into our modeling to better isolate actionable insights and plan at a more granular level. As a result, our source datasets (which we will bring into Pigment as Transaction Lists) are going to grow significantly, from millions of records to perhaps as many as 100 million. Are there any limits to the size of a Transaction List? Will overall performance suffer? We just want to get a sense of what's possible.

Thank you!

Pigment is built to handle very large datasets at enterprise scale. In our internal performance tests, we’ve successfully managed Transaction Lists with hundreds of millions of rows and models with billions of cells. On top of that, we already support customers today running Pigment with data volumes in this range.

That said, performance always depends on factors like the number of columns, data constraints, and the type of calculations being applied. Maintaining optimal performance at this scale may require a dedicated Enterprise plan and, in some cases, a migration to our more advanced storage solutions designed for these types of loads.

We recommend reaching out to your CSM, who can guide you on best practices and make sure your workspace is set up to fully support this level of data volume.


Thank you, Julien. This is great news for our organization.
