Data Quality pipelines

Pipelines define and apply transforms to data loaded by a dataset.

Organizations often gather data from multiple sources and in multiple formats. When that data is loaded into the Data Integrity Suite, destination data stores may require a different format, or the data may be used in new ways that demand higher quality and consistency. A Data Quality pipeline defines a series of steps that transform or clean data before a job loads it into its final destination. Data Quality pipelines help ensure accuracy, consistency, uniqueness, integrity, and validity.
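
To make the idea of an ordered series of transform steps concrete, here is a minimal sketch in Python. It is purely illustrative and does not use the Data Integrity Suite API; the step names, the record structure, and the run_pipeline helper are all assumptions chosen for the example.

```python
# Hypothetical sketch: a quality pipeline as an ordered list of transform
# steps applied to each record before a job loads it into its destination.
# None of these names come from the Data Integrity Suite.
from typing import Callable

Record = dict
Step = Callable[[Record], Record]

def trim_whitespace(record: Record) -> Record:
    """Improve consistency by stripping surrounding whitespace from strings."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}

def normalize_country(record: Record) -> Record:
    """Map common country spellings to a single standard value."""
    aliases = {"usa": "US", "united states": "US", "u.s.": "US"}
    country = record.get("country", "")
    record["country"] = aliases.get(country.lower(), country)
    return record

def validate_email(record: Record) -> Record:
    """Flag records whose email field fails a basic validity check."""
    record["email_valid"] = "@" in record.get("email", "")
    return record

# The pipeline itself: each step runs in order on every record.
pipeline: list[Step] = [trim_whitespace, normalize_country, validate_email]

def run_pipeline(records: list[Record]) -> list[Record]:
    cleaned = []
    for record in records:
        for step in pipeline:
            record = step(record)
        cleaned.append(record)
    return cleaned

if __name__ == "__main__":
    raw = [{"email": " jane@example.com ", "country": "United States"}]
    print(run_pipeline(raw))
    # [{'email': 'jane@example.com', 'country': 'US', 'email_valid': True}]
```

The design point this sketch illustrates is that each step is a small, single-purpose transform, and the pipeline is simply the ordered list of those steps, so steps can be added, removed, or reordered without rewriting the cleaning logic.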