
Data Quality Assessment


Mike Dwell
Jan 29, 2022, 12:37:15 AM

Hi there,

I have been banging my head against the wall trying to find a good real-world solution to a challenging problem a friend posed to me.

Let's say we want to assess the data quality of Company A's big data. Due to security, privacy, and workload concerns, it is impossible to view or access the whole of A's data repository (data lake or data ocean).

We can only request a sample of Company A's data and then, hopefully, apply some quality-assessment toolkit to analyze it.

My question is: how should we draw such a sample, and what requirements should we set for it?

Moreover, Company A may "optimize" or "decorate" the sample it hands out. What scheme or mechanism design would prevent such "optimization" or "decoration"?

Could anybody please give some pointers?

Thanks a lot!

David Duffy
Jan 29, 2022, 4:11:45 AM

Mike Dwell <mike.dwel...@gmail.com> wrote:
>
> Let's say we want to assess the data quality of Company A's big
> data. Due to security, privacy, and workload concerns, it is
> impossible to view or access the whole of A's data repository (data
> lake or data ocean).
>
> We can only request a sample of Company A's data and then,
> hopefully, apply some quality-assessment toolkit to analyze it.
>
> My question is: how should we draw such a sample, and what
> requirements should we set for it?
>
> Moreover, Company A may "optimize" or "decorate" the sample it hands
> out. What scheme or mechanism design would prevent such
> "optimization" or "decoration"?

Hi. I don't think there is a generic quality-assessment approach,
since so much depends on the nature of the work the company does,
aside from obvious things like range checks on dates (which they
usually run on the fly anyway) or batch-to-batch comparisons. For
example, can you validate data points against external data sources?
If it really is "big data", they can easily provide a far larger
sample than you will ever need for statistical power.
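
Back-of-envelope, in Python: estimating an error rate to within
plus/minus 1% at 95% confidence takes on the order of 10^4 rows,
however big the lake is. A rough sketch, taking the worst case p = 0.5:

import math

# Sample size for a proportion to within half-width e at 95% confidence.
# Worst case p = 0.5; the answer does not depend on population size.
z, p, e = 1.96, 0.5, 0.01
n = math.ceil(z**2 * p * (1 - p) / e**2)
print(n)  # 9604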
Personally, I would first do an "eyeball" examination of a relatively
small contiguous data series, just to get a feel for what is going on.
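
Something along these lines is what I have in mind. Just a sketch;
the file and column names ("sample.csv", "event_date", "amount") are
made up, so adjust to the actual schema:

import pandas as pd

# Load the sample the company provided (hypothetical file and columns).
df = pd.read_csv("sample.csv", parse_dates=["event_date"])

# Eyeball a small contiguous slice first.
print(df.sort_values("event_date").head(50))

# Range check on dates: flag anything outside a plausible window.
bad = df[(df["event_date"] < "2000-01-01") |
         (df["event_date"] > pd.Timestamp.today())]
print(len(bad), "rows with out-of-range dates")

# Completeness: fraction of missing values per column.
print(df.isna().mean().sort_values(ascending=False))

# Crude batch-to-batch comparison: monthly row counts and means.
print(df.groupby(df["event_date"].dt.to_period("M"))["amount"]
        .agg(["size", "mean"]))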

Just 2c, David Duffy.