Our team is planning for future Pigment updates, particularly the upcoming AI capabilities. We're looking at bringing more dimensions into our modeling to better isolate actionable insights and plan at a more granular level. As a result, our source datasets (which we'll bring into Pigment as transaction lists) are going to grow significantly, from millions to perhaps as many as 100 million records. Are there any limits on the size of a transaction list? Will overall performance suffer at that scale? Just want to get a sense of what's possible.
Thank you!