Data Quality: Accuracy or Precision?

For many decision-making scenarios, analytics are moving away from the traditional, slower-moving BI of previous years. New kinds of analytics are emerging to handle the accelerated challenges facing many enterprises, where decisions must be made faster and where a greater variety of data, including complex data formats, impacts the business. In response to this emerging world of continuous, real-time decision-making, what constitutes data quality is also changing for certain problem-solving scenarios. Data quality is becoming a ‘sliding scale’: expectations for quality are relative to the purpose and type of analytics and the business need. ‘Good enough’ data quality is becoming pervasive – but it comes with many qualifications and caveats.
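To make the ‘sliding scale’ idea concrete, here is a minimal Python sketch – the purposes and thresholds are hypothetical, not taken from this post – in which the bar for ‘good enough’ data shifts with the decision it supports:

# Illustrative only: per-purpose quality thresholds for the "sliding scale" idea.
QUALITY_THRESHOLDS = {
    "regulatory_reporting": {"min_completeness": 0.99, "max_age_hours": 24},
    "campaign_targeting":   {"min_completeness": 0.90, "max_age_hours": 72},
    "real_time_operations": {"min_completeness": 0.75, "max_age_hours": 1},
}

def is_good_enough(purpose, completeness, age_hours):
    """Return True if the data clears the quality bar for this decision-making purpose."""
    bar = QUALITY_THRESHOLDS[purpose]
    return completeness >= bar["min_completeness"] and age_hours <= bar["max_age_hours"]

# The same data set can be "good enough" for one purpose and not for another.
print(is_good_enough("real_time_operations", completeness=0.80, age_hours=0.5))    # True
print(is_good_enough("regulatory_reporting", completeness=0.80, age_hours=0.5))    # False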

Pearl Zhu writes about data quality for ‘big data’ in the Future of CIO blog – the points she makes are significant for today’s imperative for faster decision-making:

Therefore, the goal of data quality management is not about pursuit of “a single version of truth”, but how do we enumerate, rationalize and perceive all different versions of truth, and the diagnosis & analysis effort need focus on specific problems of relationship between data & what it represents, and what kind of business puzzles can be untangled via such data inter-relationships.  

The Internet generates a never-ending avalanche of all kinds of data; many of the sources are related to digital marketing and social media. But keep in mind that Internet data (big or otherwise) impacts the decision-making needs of many areas of an enterprise, not just Marketing. Web analytics and digital marketing expert Avinash Kaushik has this to say about Internet data quality:

Ditch the old mental model of Accuracy, go for Precision (more here: Accuracy, Precision & Predictive Analytics). It might seem astonishing but your analysis will actually get more accurate if you go for precision.

I am not saying accept bad data…What I am saying is that your job does not depend on data with 100% integrity on the web. Your job depends on helping your company Move Fast and Think Smart.

Kaushik further sees the direction of data quality morphing from the ‘old school’ BI need for data Accuracy to future insights driven by predictive analytics relying on data Precision.

Source: Avinash Kaushik
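As a rough numeric illustration of the distinction (the figures below are invented and not Kaushik’s), accuracy can be read as low bias against a known true value, while precision is how tightly repeated measurements cluster:

from statistics import mean, stdev

TRUE_DAILY_VISITS = 100.0

# Hypothetical daily visit counts reported by two web analytics tools.
tool_a = [92, 108, 85, 115, 100]   # centered on the truth, but widely scattered
tool_b = [94, 95, 93, 96, 94]      # consistently a few percent low, but very tight

for name, sample in [("tool_a", tool_a), ("tool_b", tool_b)]:
    bias = mean(sample) - TRUE_DAILY_VISITS   # small bias   -> accurate
    spread = stdev(sample)                    # small spread -> precise
    print(f"{name}: bias={bias:+.1f}, spread={spread:.1f}")

# tool_b is less accurate in absolute terms but far more precise, so the trends and
# segment comparisons built on it are more dependable for fast decisions.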

Data Quality for Situational Intelligence

Visual Business Intelligence (VBI) has been gaining interest for several years. Drawing on complex data, VBI gives users the ability to interact with data to make decisions and take actions with greater precision. VBI encompasses contextual analysis and predictive modeling to help optimize the selection of the ‘best’ actions to take. Organizations with requirements for situational intelligence or situation awareness find VBI particularly valuable when decisions have to be made now, based on whatever data is available now. Typical use cases include disaster response and military operations. This sort of analytics requires fast, easy access to existing data sources, without immediate concern about the freshness or quality of the data.

A significant challenge for VBI is validating data sources and the quality of the data in situ: do the end users understand what they are doing, and are they intelligently combining the appropriate data? Is the intelligence that is pulled together relevant and accurate, and how do end users test that intelligence to validate it? One essential ‘must have’ is the involvement of top-notch subject matter experts (SMEs) who can identify data and analysis problems quickly, whenever they might affect the situational intelligence being sought.
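As one hedged sketch of what ‘validation in situ’ might look like, the Python below runs lightweight checks on whatever data is available now and routes failures to an SME for review rather than blocking the analysis; the field names and rules are hypothetical:

from datetime import datetime, timedelta, timezone

def validate_source(record, max_age=timedelta(minutes=30)):
    """Return a list of human-readable warnings for an SME to review."""
    warnings = []
    required = {"source_id", "timestamp", "latitude", "longitude"}
    missing = required - record.keys()
    if missing:
        warnings.append(f"missing fields: {sorted(missing)}")
    ts = record.get("timestamp")
    if ts is not None and datetime.now(timezone.utc) - ts > max_age:
        warnings.append(f"stale: older than {max_age}")
    lat, lon = record.get("latitude"), record.get("longitude")
    if lat is not None and not -90 <= lat <= 90:
        warnings.append(f"latitude out of range: {lat}")
    if lon is not None and not -180 <= lon <= 180:
        warnings.append(f"longitude out of range: {lon}")
    return warnings

report = validate_source({
    "source_id": "field-sensor-17",
    "timestamp": datetime.now(timezone.utc) - timedelta(hours=2),
    "latitude": 30.27,
    "longitude": -97.74,
})
print(report)   # e.g. ['stale: older than 0:30:00'] -- flagged for the SME, not silently dropped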
