Last updated: Oct 26, 2022
NOT ON THE CURRENT EDITION
This blip is not on the current edition of the Radar. If it was on one of the last few editions, it is likely that it is still relevant. If the blip is older, it might no longer be relevant and our assessment might be different today. Unfortunately, we simply don't have the bandwidth to continuously review blips from previous editions of the Radar.
Oct 2022
Adopt

Great Expectations has become a sensible default for our teams in the data quality space, which is why we recommend adopting it: not only because of the lack of better alternatives but also because our teams have reported great results in several client projects. Great Expectations is a framework that allows you to craft built-in controls that flag anomalies or quality issues in data pipelines. Just as unit tests run in a build pipeline, Great Expectations makes assertions during the execution of a data pipeline. We like its simplicity and ease of use; because the rules are stored as JSON, they can be modified by our data domain experts without necessarily needing data engineering skills.
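
To make the unit-test analogy concrete, here is a minimal sketch using the framework's pandas convenience API, as available in the classic releases current at the time of this blip; entry points differ between versions, and the data set, column names and thresholds are invented for the example.

    import pandas as pd
    import great_expectations as ge

    # Wrap an ordinary DataFrame so that expectation methods become available.
    orders = ge.from_pandas(pd.DataFrame({
        "order_id": [1, 2, 3, 4],
        "quantity": [2, 5, 1, 3],
    }))

    # Assertions analogous to unit tests, evaluated against the data itself.
    orders.expect_column_values_to_not_be_null("order_id")
    orders.expect_column_values_to_be_between("quantity", min_value=1, max_value=100)

    # Running the accumulated expectations yields a structured result that a
    # pipeline step can inspect and, for example, fail the run on.
    results = orders.validate()
    print(results["success"])

    # The expectation suite serializes to JSON, which is what allows domain
    # experts to review and edit the rules without data engineering skills.
    orders.save_expectation_suite("orders_suite.json")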

Apr 2021
Trial

We wrote about Great Expectations in the previous edition of the Radar. We continue to like it and have moved it to Trial in this edition. Great Expectations is a framework that enables you to craft built-in controls that flag anomalies or quality issues in data pipelines. Just as unit tests run in a build pipeline, Great Expectations makes assertions during execution of a data pipeline. We like its simplicity and ease of use — the rules stored in JSON can be modified by our data domain experts without necessarily needing data engineering skills.

Oct 2020
Assess

With the rise of CD4ML, operational aspects of data engineering and data science have received more attention. Automated data governance is one aspect of this development. Great Expectations is a framework that enables you to craft built-in controls that flag anomalies or quality issues in data pipelines. Just as unit tests run in a build pipeline, Great Expectations makes assertions during execution of a data pipeline. This is useful not only for implementing a sort of Andon for data pipelines but also for ensuring that model-based algorithms remain within the operating range determined by their training data. Automated controls like these can help distribute and democratize data access and custodianship. Great Expectations also ships with a profiler tool to help understand the qualities of a particular data set and to set appropriate limits.
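
As a sketch of how the profiler can bootstrap such limits, the snippet below relies on the BasicDatasetProfiler class shipped with the classic releases current at the time of this blip; class locations and return shapes have shifted across versions, and the data set is again invented.

    import pandas as pd
    import great_expectations as ge
    from great_expectations.profile.basic_dataset_profiler import BasicDatasetProfiler

    # Profile an (illustrative) data set to get a first-cut expectation suite
    # plus the observed metrics that informed it.
    orders = ge.from_pandas(pd.DataFrame({
        "order_id": [1, 2, 3, 4],
        "quantity": [2, 5, 1, 3],
    }))
    suite, validation_result = BasicDatasetProfiler.profile(orders)

    # The generated suite is a starting point: review the proposed limits and
    # tighten them before wiring the suite into a pipeline.
    for expectation in suite.expectations:
        print(expectation.expectation_type, expectation.kwargs)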

Published: Oct 28, 2020
