The (Re)Emergence of the Data Strategist

In his blog post, “Why analytics platforms are failing your Data Scientists,” Pyramid Analytics co-founder and CEO Omri Kohl asked, “With a saturated analytics and business intelligence (A&BI) market, why are we still struggling to make analytics platforms work for Data Scientists? And perhaps more importantly, why are we failing to see a return on our expensive Data Science initiatives?” He continues, “44 percent of analytics teams spend more than half their time accessing and preparing data rather than doing actual analysis. That’s a dramatic level of investment for very little return.”

Kohl notes five ways in which analytics platforms are failing data scientists. Among them: data analytics platforms generally cater to casual users, the 80% of analytics personnel who are light information consumers. Another common occurrence is a mismatch between an organization’s analytics strategy and the business’s day-to-day analytics and data workflows. It is also not unusual for a data analytics platform to be evaluated by non-technical users or by personnel who are not thinking about business outcomes. For example, a business executive may purchase a data-analysis platform even though they do not perform the data analysis themselves. Conversely, a platform may be too limited for a trained data scientist to meet the organization’s analysis needs.

Another key problem is inconsistent data. Consider what happens when the goal is centralized data storage but the data is instead allowed to reside in separate business units, causing consternation among data scientists because they lack ready access to it. Add to that the frustrating, time-wasting reality that data scientists, instead of working on analytics assignments, have to prep data through tasks like cleansing and normalization.

Given this, the analytics and business intelligence market is struggling to make analytics platforms work effectively for data scientists. There may be more data than ever, but the functionality of data platforms is not sufficient to meet industry needs. Below is one explanation.

Many of you are familiar with what is referred to as the ‘Big Data Fallacy’: the assumption that greater amounts of data lead to greater amounts of actionable insights. There is some bad news for anyone who has bought into that assumption.

In fact, the opposite is true. Massive amounts of data tend to result in paralysis, not analysis, for research customers. Data, by itself, does not generate insights. Data is the raw material; it is inert, like raw minerals in the ground that have not yet been mined. Data ‘mining’ and insights processing, the prepping and merging of data from various sources, cannot be performed by analytics platforms alone. It is a painstaking and tedious task best performed in programs such as Python, SPSS, R, or Excel, as sketched below. Moreover, it requires data programming skills far beyond those of a casual user.
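To make the point concrete, here is a minimal sketch of that kind of prep work in Python using pandas. The file names, column names, and join key (survey_responses.csv, crm_export.csv, respondent_id) are hypothetical stand-ins for whatever sources a real project would use.

```python
import pandas as pd

# Hypothetical sources: a survey export and a CRM export
# (file and column names are illustrative only).
survey = pd.read_csv("survey_responses.csv")
crm = pd.read_csv("crm_export.csv")

# Basic cleansing: drop duplicate respondents and rows missing the join key.
survey = survey.drop_duplicates(subset="respondent_id")
survey = survey.dropna(subset=["respondent_id"])
crm = crm.drop_duplicates(subset="respondent_id")

# Normalize an inconsistently entered text field so the two sources agree.
survey["region"] = survey["region"].str.strip().str.title()

# Merge the sources on the shared key; keep every survey respondent
# even if no CRM record exists for them.
merged = survey.merge(crm, on="respondent_id", how="left")

# Simple normalization of a numeric column to a 0-1 scale for later modeling.
spend = merged["annual_spend"]
merged["annual_spend_scaled"] = (spend - spend.min()) / (spend.max() - spend.min())

print(merged.head())
```

Even this toy version hints at why the work cannot be left to a point-and-click platform: every decision about keys, duplicates, and scaling is project-specific and requires someone who can both write the code and judge what the data means.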

Research buyers are slowly learning that the paradigm of ‘better, faster, cheaper’ is simply not delivering the kind of insights sought by corporate research clients. Yes, platforms are very good at producing descriptive statistics and PowerPoint slides for topline reports. But they do not replace statistical programming and research analysis.

That realization has led to the rise of the terms ‘data strategist’ and ‘insights strategist’. 

The data strategist straddles the worlds of the data scientist, the statistician, and the insights professional. The data strategist has the statistical and analytical tools to pull findings out of vast amounts of data, including big social media data, existing client data, and survey research data. Moreover, a data strategist presents findings in the form of bullet-point actionable insights, a skill that is, unfortunately, rarely exercised.

The reality in today’s research marketplace is that a statistician tends not to have serious survey research experience, and a researcher usually does not have the requisite statistical expertise. A data strategist has both.

Despite the higher-than-average cost of a statistical research partner, the efficiency with which they deliver clear insights far outperforms the shotgun approach of research platforms. The bottom line is that the amount of data gathered does not determine the quality of an insight. Each project should be treated as custom research.

For the benefit of your clients and to buttress your research reputation, think things through, use the statistical firepower that is available, and generate the insights that really make a difference.
