It sounds obvious to most of us. For those with three decades or more in the industry, the words “garbage in, garbage out” are forever ingrained in our minds. It was true then and will always be true.
These words bear repeating in the age of data analytics. Today, there is no shortage of marketing material adorned with colorful graphics and every imaginable chart type. Let’s face it, a picture is worth a thousand words, especially if it helps a company find ways to improve operational efficiency, increase profits, or discover and broadcast a new competitive edge.
The pictures are usually true representations of the data sets that feed them. But what if those data sets contain errors introduced by manual keying? The pictures will not only mislead but could steer leadership toward disastrous decisions, with severe negative impacts on corporate financials, customer satisfaction, inventory management, or future budgets. In today’s world, many systems are integrated, so a single data entry error may ripple through multiple systems, affecting dashboards across many different departments.
The $1-$10-$100 Rule
A common business concept is the 1-10-100 rule. This rule of thumb illustrates the hard costs of an organization chasing data errors and how those costs can quickly escalate until the mistakes are discovered.
The rule is this: it costs $1 to verify the accuracy of data at the point of entry, $10 to correct or clean up data in batch form, and $100 (or more) per record if nothing is done – including the costs associated with low customer retention and inefficiencies. (Source: totalqualitymanagement.wordpress.com)
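To make the escalation concrete, here is a minimal illustration of the rule applied to a hypothetical batch of 1,000 erroneous records (the record count is an assumption for illustration only):

```python
# Illustrative cost escalation under the 1-10-100 rule for a
# hypothetical batch of 1,000 erroneous records.
records = 1_000

cost_at_entry = records * 1     # verified at the point of entry
cost_in_batch = records * 10    # corrected later in batch form
cost_if_ignored = records * 100 # nothing done (a floor; can be higher)

print(f"Verify at entry:  ${cost_at_entry:,}")    # $1,000
print(f"Batch cleanup:    ${cost_in_batch:,}")    # $10,000
print(f"Left uncorrected: ${cost_if_ignored:,}+") # $100,000+
```

The same thousand mistakes cost two orders of magnitude more when they are allowed to propagate than when they are caught as they are keyed.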
The costliest data entry error in recent history is known as the “Flash Crash” of May 2010, when the stock market plunged nearly 1,000 points for no apparent reason. The flash crash wiped out $1.1 trillion in investments. It left the market badly shaken, even though most of the money was quickly recovered.
What happened? It appears that a single keystroke error was to blame. The letter “B” was entered in a sell order instead of the letter “M” — billion was input where million should have been. In your business, your balance sheet may not have as many zeroes, but keystroke errors can be just as devastating.
Flawless data capture enables flawless analytics and delivers the knowledge needed to develop a sound business vision. As researcher Jonathan Litman put it, we find knowledge rewarding because it “dispels undesirable states of ignorance and uncertainty.” If the data driving a decision is inaccurate, the resulting vision will be equally flawed. Good business leaders not only want to see the health of the business at any given moment but also want to feel like they have been an integral part of the discovery process: determining the WHY behind what the data is communicating. This is how leadership learns to avoid the negatives, replicate the positives, and uncover new opportunities along the way.
The New Analytical Landscape Demands Data Quality
Without ensuring data quality at the point of capture — using validation rules embedded directly in the data collection form itself — the results can be costly and even catastrophic: more people looking at more bad data while making more misinformed decisions.
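The idea of enforcing validation at the point of entry can be sketched in a few lines. This is a minimal, hypothetical example — the field names (`quantity`, `unit`) and the allowed-unit whitelist are assumptions for illustration, not taken from any specific product:

```python
# A minimal sketch of point-of-entry validation for a hypothetical
# order-entry form. Field names and rules are illustrative only.

def validate_order_entry(record):
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []

    quantity = record.get("quantity")
    if not isinstance(quantity, (int, float)) or quantity <= 0:
        errors.append("quantity must be a positive number")

    # Guard against magnitude mistakes like the 'B' vs 'M' keystroke error:
    # only accept units from a known whitelist (assumed business limits).
    allowed_units = {"K", "M"}  # thousands, millions
    unit = record.get("unit")
    if unit not in allowed_units:
        errors.append(f"unit must be one of {sorted(allowed_units)}, got {unit!r}")

    return errors

# A 'B' (billion) entry is rejected before it can reach any downstream dashboard.
print(validate_order_entry({"quantity": 16, "unit": "B"}))
print(validate_order_entry({"quantity": 16, "unit": "M"}))  # clean record: []
```

Because the check runs inside the form itself, the error is caught at the $1 stage of the 1-10-100 rule rather than propagating into integrated systems.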
By allowing employees to explore data within a trusted and secure environment, organizations empower managers and leaders who know the business, have the context, and can realize the full potential of the insights. Business teams can investigate and collaborate using the data and make smart decisions as a result of that analysis. In the new culture, everyone has access to the appropriate and correct data based on credentials and is encouraged to explore. Anyone can uncover actionable insights, share findings, and improve outcomes.
Want to make the business case for mobility? Learn how to get your budget approved, and more, in our new eBook.