“Dirty Data” adds costs and hampers decision-making

by Erin McCune on March 16, 2007

in BI and Performance Management, Research Round Up, Technology

Over at the EyeOnBI blog, Mike Leano summarizes a recent Gartner presentation on the cost of "dirty data" – data that is inaccurate, incomplete, or duplicated. Gartner estimates that more than 25% of the data within Fortune 1000 companies is flawed, and that at most companies it will remain flawed for the foreseeable future. Gartner also observes that bad data isn't just an IT problem, and suggests that companies appoint 'data stewards' within their business groups who are responsible for the quality of information.

You've got to have clean data if you are going to compete successfully on analytics. Speaking of competing on analytics – Tom Davenport's book by that name was recently published (expanding on the HBR article mentioned in my previous post) and arrived from Amazon this week. I'll be reading it soon and will share my thoughts in an upcoming post.

Competing on Analytics: The New Science of Winning

by Thomas H. Davenport and Jeanne G. Harris
Harvard Business School Press
March 2007
