The slow evolution of predictive data analytics
- 01 June, 2017 10:16
Back in the early days, when forward-thinking executives pioneered the use of predictive analytics for business gain, they tended to recruit specialists who knew a lot about maths and economics but not necessarily very much about the business.
Today, of course, the art of predictive analytics is mainstream. Better still, it is starting to become readily available to anyone in the organisation, as needed.
But it has been a slow evolution. It began with line managers using spreadsheets to track activity after the event, and then – no more scientifically than gauging the wind with a wet finger – trying to guess what might happen next.
We then had gifted amateurs trying to help by toying with simple databases and IT being goaded by line managers to at least ‘look into’ analytics.
Slowly, dashboards, disparate data silos and incompatible software tools accumulated but insights were still hard to uncover. Management staff knew more about their business but were no smarter.
Then board members saw the light. Directors were persuaded that data was one of their most valuable assets and that analysing it could lead to business gain.
The evolution gathered pace. Organisations made significant investments in predictive analytics, and the people with oversight of these initiatives – whether in marketing, IT or elsewhere in the company structure – started finding themselves at the C-level, reporting directly to the board. And rightly so.
But if this is where we are today, there is still something missing.
Firstly and most visibly, there are simply not enough data scientists to meet employers’ needs. Opinions about the scale of the shortage vary, but a report by McKinsey & Company estimates that next year in the US there will be a shortage of 140,000 to 190,000 people with deep analytical skills – as well as 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions.
Proportionally, the situation is no different in Australia and New Zealand. Our teaching institutions, with support from industry and government, are working to close this gap, but as data volumes increase and ever more organisations seek to take advantage of them, the problem will persist. Supply is nowhere near catching up with demand.
In addition to the skills gap, the other things holding organisations and productivity back are variety and complexity – the need to keep pace with rapid technology advances and changing market conditions.
While useful insights could be found in customer and other internal data alone just a few years ago, organisations now need to include data from many other sources and of many different types if they want optimum answers. Add to this the new need to include machine learning in modelling, and even well-resourced analytics infrastructures are stretched.
Stretched organisations will look for others with an ingrained analytics culture to help with the heavy lifting by offering a low-cost, low-risk service. The same will be true of any organisation that understands that data is probably its most important asset but doesn’t have the time or capability to realise that value. They will want a cloud-based, fast point of entry to analytics best practice. Think of it as analytics on-tap.
The analytics evolution is approaching the point where management, seeking pointers to competitive and other forms of business gain, will concentrate on learning – or seeking to be taught – how to define the questions that need to be asked of their data.
This will ensure that others with an agile analytics culture can be engaged to answer those questions and help them drive the desired outcomes and speed up ‘time-to-value.’
This shift will start slowly, with big organisations’ overloaded analyst teams needing additional resources for short-term projects. Analytics-as-a-service in the cloud will also be adopted by smaller organisations, allowing them to leapfrog to insights at a scale they could not manage or afford on their own.
Smaller companies in particular will want more than just the results of modelling. They will want providers to show them how revealed insights can be operationalised.
Companies adopting this approach will, in effect, be spending their time realising the value of their data, rather than suffering the chore of managing it. As service providers develop their offerings, they will build suites of proven analytics models that can be fine-tuned for individual buyers’ unique requirements, rather than reinventing the wheel – thereby minimising cost and fast-tracking the benefits of analytics.
Eventually, customers will buy such services not only to overcome the shortage of employable analysts, avoid fixed costs, lean on expert data management and analytics assistance, and accelerate the delivery of actionable insights – they will also look for flexibility.
Service providers should expect that their customers may subsequently want to transfer some or all of this outsourced work in-house. So they will need to adhere to open technology standards, and offer pricing models and agreement terms that cater to that.
Only a few years ago, many enterprises said they would never trust their data to the cloud. Now, it’s common practice because it makes sense in so many different ways and is seen as secure. Similarly, we will soon wonder what we did before we could trust our analytics to outside services dedicated to that purpose – and get our analytics outcomes on-tap.
David Bowie is vice president of SAS Australia and New Zealand.