We all know the tell-tale signs that something is about to go horribly awry in a horror movie. Every entrance in the mansion is locked aside from the back door. The basement light flickers and the room goes silent.

While moviegoers can hide behind their buckets of popcorn or yell at the protagonist to "get away from the door!", data engineers are not so lucky when horror strikes. And for data engineers, that "horror" is more often than not bad data.

According to a recent study from Wakefield Research, data teams spend 40 percent or more of their time tackling poor data quality, which impacts 26 percent of their company's total revenue.

What constitutes a data horror story, you might ask? Here are a few examples.

On a dark and stormy night in 2022 (just kidding, we don't know what time of day it was), gaming software company Unity Technologies' Audience Pinpointer tool, designed to help game developers with targeted player acquisition and advertising, ingested bad data from a large customer. The bad data introduced major inaccuracies into the training sets for its predictive ML algorithms, and performance dipped as a result. The data downtime incident sent the company's stock plummeting by 36% and cost the company upwards of $110 million in lost revenue.

Or take Equifax, which issued inaccurate credit scores to millions of its customers in the summer of 2022, all due to a problem with bad data on a legacy on-prem server.

So, how can you evade your own data quality horror stories? Below, we share three common causes of data downtime and walk through how you can escape them.