“Big Data” Is OK, But Intuition Rules, OK?

I tend to be an early riser, and this weekend was no exception.  In my usual quiet time in the morning, with coffee and MacBook in hand, I settle down in the kitchen, read the BBC News site, and generally see what’s going on in the world.

This weekend I stumbled upon a blog post titled “Why Big Data Will Never Beat Business Intuition” and I have to say that, quite simply, I agree.

I won’t recap the article for you, but needless to say, it makes a series of points to argue that we should take a little time to really think about how we use Big Data and it cautions us against blind interpretations without human intuition.

Let me add another example I’ve used in presentations over the last few weeks.  Analytics aren’t reserved for businesses or data scientists; we all use analytics every day. My friend Donald MacCormick blogged about a great example last year, using the BBC weather website as an everyday analytic, and again I agree.

But here’s the point – when I read the BBC weather website and it tells me the forecast for the day ahead, I don’t simply take it for granted. The first thing I’ll do is look out the window and ask myself whether the weather looks the way it’s supposed to. Quite often I’ll even open the door and really check how warm it is. That’s my human intuition telling me not to rely on the data alone.

And that in a nutshell is why I think Tim Leberecht’s blog makes sense. And that’s why, when it comes to Big Data, I believe two things are critical – the discussion on the use case and application of the data, and the education of the people using it.

Of course, flexible, self-service, and real-time analytics are then needed to allow people to use their intuition in a natural way, as opposed to a machine-driven way, but it’s that intuition and education that really make the application work.

This blog was first posted on the SAP Analytics Blog

It’s June…Time for Tennis

It’s the end of June, and here in south-west London, where I live, that means only one thing…Wimbledon.  So, along with the unseasonal weather, it means that British tennis fans are typically now reaching the peak of their inflated expectations regarding the prospect of a British winner of this historic major championship.

What normally happens next is that in about 10 days’ time they enter the trough of disillusionment as all the British players are knocked out, the champagne is warm and the strawberries are running out.

However, my own interest in tennis was revived when I had the opportunity to co-present with former professional tennis player, broadcaster and SAP Ambassador Justin Gimelstob at Sapphire Now in Orlando.

Justin explained how analytics are now playing a huge part in the game, in his analysis of it, and in the fan experience.  He made me see a whole new tactical side to the game that frankly had been lost on me amid the hype and fixation around a British winner of Wimbledon. That appeals to my analytical nature, and the result is that this year I’m watching and enjoying Wimbledon once again. Although the sight of expectant British tennis fans draped in Union Jacks still turns me off a little…


You can see our presentation on the Sapphire Now website.  If you don’t want to listen to me, then start the video at about 12 minutes 30 seconds, which is when Justin joins me.

http://events.sap.com/sapphirenow/en/session/4653

A Question of Analytical Education Versus Analytical Simplification

At the Gartner BI and Analytics Summit in Barcelona this week, I found myself participating in an interesting debate on whether the growing army of business users, in this new age of pervasive business intelligence and analytics, should receive education and training in analytics or whether the tools should just be easier to use.

Gartner directly posed the question during a panel discussion – should we improve user skills rather than simplify the tools? The premise was that if analytics were easier to use (with features like guided discovery and intuitive, easy-to-use interfaces), users would need to know absolutely nothing before they apply statistical packages, visualization templates, and predictive algorithms to uncover the gems of insight buried in the data. This feels a little like we’re suggesting that even though we’re living in a world with a burgeoning surfeit of data, as long as users have the right analytics tools, the data will simply tell its own story.

Me – I’m not so sure. I’ve always found that if I just wade into a plethora of information without any prior hypotheses about the types of relationships and trends I might find, it’s difficult to separate the useful data from the misleading or confusing data that’s simply noise. I need to go through a process similar to classical market research: first, develop testable hypotheses through qualitative focus groups and face-to-face interviews to surface the issues, then collect and analyse quantitative data to measure which ones are true and important enough to act on.  The parallel in business is having a dialogue with colleagues and customers before diving into the data.
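
To make that hypothesis-first workflow concrete, here is a minimal sketch in Python. The scenario and the figures (renewal rates for two hypothetical groups of customers) are invented purely for illustration: the hypothesis is stated before the data is touched, and a simple two-proportion test then checks whether the observed difference is more than noise.

# A minimal sketch of testing a pre-stated hypothesis. The customer figures
# below are invented purely for illustration.
from math import sqrt
from statistics import NormalDist

# Hypothesis (stated before looking at the data): customers who received a
# follow-up call renew at a higher rate than those who did not.
renewed_called, total_called = 230, 400
renewed_uncalled, total_uncalled = 180, 380

p1 = renewed_called / total_called
p2 = renewed_uncalled / total_uncalled

# Two-proportion z-test under the null hypothesis of equal renewal rates.
p_pooled = (renewed_called + renewed_uncalled) / (total_called + total_uncalled)
se = sqrt(p_pooled * (1 - p_pooled) * (1 / total_called + 1 / total_uncalled))
z = (p1 - p2) / se
p_value = 1 - NormalDist().cdf(z)  # one-sided

print(f"renewal rate (called):   {p1:.1%}")
print(f"renewal rate (uncalled): {p2:.1%}")
print(f"z = {z:.2f}, one-sided p-value = {p_value:.4f}")
# A small p-value supports the hypothesis we stated up front; without that
# prior hypothesis, the same gap could easily be one of many spurious
# patterns dredged out of the data.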

Is the Traditional Scientific Method Obsolete?

Chris Anderson challenged this accepted approach in an article he wrote for Wired back in 2008 called ‘The End of Theory: The Data Deluge Makes the Scientific Method Obsolete’. He suggested that in the petabyte world of big data, the traditional approach of hypothesize – model – test is obsolete and “the new availability of huge amounts of data, along with the statistical tools to crunch these numbers, offers a whole new way of understanding the world.”  Many people have criticized such optimism, pointing to planetary weather as an example: there is so much ‘noise’ in that data that trends in global warming would have been unlikely to be uncovered without researchers having a prior hypothesis.

It seems to me that Anderson’s bold statement also denies us – the humans – any role in analytics. While I don’t doubt that some data-driven predictions can succeed, I’m convinced that a questioning mind, a better knowledge of the math, and an appreciation of the common misconceptions people typically hold will always result in better assessments and reduce some of the inherent risk of poor business decisions.

Analytical Education Still Relevant and Increasingly Vital

So the point I was trying to make, albeit in 140 characters on Twitter, is that although we do need to make tools easier to use, we (vendors, organizations and, increasingly, our educational systems) also have a responsibility to ensure that anyone using analytic tools is given a general grounding in simple statistics and research methods.  These users can then develop a questioning approach to using data and, by extension, analytics to support the business decisions they make in their working lives. This approach could entail:

  • Questioning the validity of the data they’re working with. For example, we know that cancer patients can be misdiagnosed.  Some who are told they have the disease don’t (a false positive), while others who are told they don’t have the disease in fact do (a false negative). You can bet we encounter the same issue in business all the time and never even think about it! A small worked example follows this list.
  • Teaching about probability and significance so we appreciate that findings and predictions are generally range-based rather than a single data point – and how this affects results such as elections or sales wins, which have binary (win-all-or-lose-all) outcomes.
  • Helping us to become ‘diligent skeptics’ when it comes to data. A colleague tells a great story about an experience in insurance: a business analyst from the special lines division merged some previously siloed data sets, found considerable crossover between customers taking out one or more of their policies (cover for pets, musical instruments, extended warranties), and was adamant this was evidence of “significant customer loyalty”. The reality was that the insurer was the market leader in special lines, all of which were sold under different brand names through different channels, so the crossover was inevitable rather than evidence that a concerted cross-selling campaign would succeed.
  • Showing students how best to present and visualize data so it’s easy to understand and can be quickly digested.
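
Here is the worked example promised above, in Python. The test accuracy and prevalence figures are invented purely for illustration; the point is that even an apparently accurate test (or business rule) produces a surprising number of false positives when the condition it looks for is rare.

# Hypothetical diagnostic test, for illustration only.
prevalence = 0.01    # 1% of the population actually has the condition
sensitivity = 0.95   # chance the test flags someone who has it
specificity = 0.95   # chance the test clears someone who doesn't

population = 100_000
has_condition = population * prevalence
no_condition = population - has_condition

true_positives = has_condition * sensitivity
false_positives = no_condition * (1 - specificity)

# Of everyone the test flags, how many actually have the condition?
ppv = true_positives / (true_positives + false_positives)
print(f"flagged cases:             {true_positives + false_positives:,.0f}")
print(f"of which genuinely true:   {true_positives:,.0f}")
print(f"positive predictive value: {ppv:.0%}")  # roughly 16% here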

If we don’t focus on analytical education and training, we risk entering an age of pervasive analytics that could rapidly become dysfunctional as people act on insights that are beautifully presented but entirely misplaced.  Analytics, forecasting, and predictions will always have some degree of error. Everything we can do to minimize that will improve the decisions we make and benefit our businesses.

Even the black belts get it wrong occasionally. Nate Silver, the statistician who became famous for predicting the voting outcome of every state in the 2008 and 2012 US Presidential elections through the use of sophisticated statistical analysis, fouled up on predicting the result of the Super Bowl last weekend. If the experts can get it wrong, where does that leave the rest of us?

Is the world ready for Integrated Reporting? The technology sure is.

Over the last couple of weeks I’ve had an increasing number of conversations about the future of finance, the role of analytics within it, and the changing regulatory landscape that underpins it.  This follows a summit event SAP held in Walldorf last month, which gathered about 60 CFOs together with the International Integrated Reporting Committee (IIRC) to debate the topic, and all this prompted me to cast my mind back to 2006 and a call for “real-time reporting”.

It was in November of that year that the world’s biggest accounting firms joined forces in calling for a radical overhaul of how companies report performance, saying that the current way they communicate with investors is “broken” and “redundant”.  The reason they put on a united front was that what they were proposing was radical and would result, if implemented, in the most significant shake-up of the type of information that companies share with the markets since accounting standards and independent auditing were introduced in the 1930s.

Frustrated by the perceived irrelevance of today’s purely financial reporting and no doubt badgered by clients who baulk at the cost involved in producing quarterly financial statements, the Big 6 (PwC, Deloitte, KPMG, Ernst & Young, Grant Thornton and BDO) called for regulators and policy makers to move towards real-time, internet-based reporting, encompassing a wider range of performance measures.

Given the raft of regulatory changes, such as the Sarbanes-Oxley Act, that had been introduced to strengthen corporate reporting in preceding years, their call was unsurprisingly met with some entrenched resistance and, in a number of letters to the FT at the time, ridicule.  However, at the time I found it difficult to disagree with their basic premise that “Current systems of reporting and auditing company information will need to change – toward the public release of more non-financial information customised to the user, and accessed far more frequently than is currently done.” I’ve essentially been a proponent of this ever since.

They pointed out the accepted wisdom that the increasing discrepancy between the “book” and “market” values of many listed companies means that the content of traditional financial statements is of questionable use, and that incorporating non-financial measures, such as customer satisfaction, product or service defects, employee turnover and patent awards, would provide more valuable indications of a company’s future prospects.

Six years later, beyond the introduction of XBRL filing requirements from a number of regulatory bodies, including the SEC, not much has really changed. From the outside, one might have thought that the current economic uncertainty and the fragility of many once-powerful global companies would have been an impetus for change. But one suspects that senior business leaders on advisory panels are pushing back on any suggested changes in regulation for the moment. Indeed, my own conversations with finance leaders show little appetite for broader regulatory disclosure; most would do the minimum needed to comply.

But while progress against their original vision may have been limited, the desire of many like-minded individuals and a few visionary organizations (who are voluntarily disclosing more data) to drive a change in the way we report has not. As the workshop in Walldorf demonstrated, there is renewed interest in seeing finance lead the way and help create a framework by which companies share more insightful information with more people.  It’s not about more reporting; it’s about creating strategies that lead to longer-term sustainable value and demonstrating that to the stakeholders who care about it.  I believe the IIRC would recognize, as do the pilot companies involved in the program, that there is much to be done to develop and implement the concepts around integrated reporting, but one thing that has not stood still in the last six years is the enabling technology, and I believe we are now ready to make this a reality for the enlightened.

By combining solutions such as Analytics from SAP for analysis and visualization with in-memory engines such as SAP HANA, which enable both rapid retrieval of transactional ERP data and real-time monitoring and compliance, analysts, investors and other interested parties could have immediate access to the latest financial and non-financial data and address the “big data” challenges that are inherent in integrated reporting. As a result, these solutions also provide a framework that allows organizations to develop and manage their strategies in a way that leads to long-term value. What’s more, they could present the data any way they wanted and, with the help of XBRL tagging, make easy comparisons between competitors and peer groups. With SAP’s Mobility solutions, they could even do it on a handheld device.
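
As an illustration of the kind of peer comparison that tagged data makes straightforward, here is a minimal sketch in Python. The companies, tag names and figures are entirely invented; in practice the values would be pulled from XBRL-tagged filings rather than typed in by hand.

# Each filing is modelled as a flat dict of concept -> value, mimicking a
# tagged report that mixes financial and non-financial measures.
filings = {
    "Company A": {"Revenue": 4_200, "NetIncome": 310, "EmployeeTurnoverRate": 0.11},
    "Company B": {"Revenue": 3_800, "NetIncome": 420, "EmployeeTurnoverRate": 0.07},
    "Company C": {"Revenue": 5_100, "NetIncome": 270, "EmployeeTurnoverRate": 0.16},
}

# Because every filing uses the same tags, like-for-like comparison is trivial.
print(f"{'Company':<10} {'Net margin':>12} {'Staff turnover':>15}")
for name, f in filings.items():
    net_margin = f["NetIncome"] / f["Revenue"]
    print(f"{name:<10} {net_margin:>12.1%} {f['EmployeeTurnoverRate']:>15.1%}")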

As many companies are already using these technologies to manage their business, no one can credibly argue that integrated reporting cannot happen until the enabling technologies arrive on the scene. For me, the only remaining obstacle is global businesses’ appetite for change, and attitudes there may be somewhat intransigent. But events such as the recent Facebook stock offering, with prospective P/E ratios around 90, have helped fan the flames and will surely keep the momentum for change alive.