On a recent trip to Silicon Valley, I quickly realised that the term ‘big data’ was not just a hyped-up buzzword but a key focus of tech giants and startups alike.
Google, for instance, has long analysed huge amounts of data from almost any source; its MapReduce and Google File System papers inspired the open-source Hadoop engine that much of the industry now uses for the same task. You may ask, for what purpose? There is no sinister motive: it does this because it can and because it wants to. It’s clear that the Internet of Things is real and is coming soon.
Let’s take a reality check for a moment. Enterprises are showing early signs of analysing these big slabs of data to gain more insight into, among other things, the buying habits of their customers.
Perhaps their progress has been slowed by competing groups within their businesses with their own agendas about how to use business insights gained through this analysis.
But what’s really hampering the uptake of big data analysis? There’s a history of failed data warehouse implementations at play, which have given IT a black eye.
There was also the era of knowledge management when ‘chief knowledge officers’ emerged in the business and spruiked the value of unstructured data. This type of role has now largely disappeared.
More recently, we have seen different groups in organisations take more than a passing interest in this area. The chief marketing officer sees this as critical to the customer experience; and the chief financial officer will assume that this is best done in his or her domain.
So what’s required to do proper big data analysis and why haven’t many organisations made big gains in understanding the value of their information? Is there a lack of skills or is it a question of structure?
Silicon Valley experts I asked believe that many existing business analyst skills are of value when doing this type of analysis.
The skills needed to understand a business problem and map data back to a model are all critical. What I didn’t hear from these experts was that there is a shortage of data scientists or other people with deep technical skills.
This then leaves ‘structure’ as the probable root cause. Who will be held accountable if the business lacks the customer information it needs to gain the advantage it is seeking?
There’s a lack of clarity in many organisations, despite the general understanding of the benefits that customer insights may provide.
Perhaps a single ‘centre of excellence’ is the answer? Possibly not, but it could be the right place to start. It is probably unrealistic to assume that a single group can take on this entire domain and succeed.
But it would be a mistake to confuse true ‘forward-looking’ analytics with business intelligence and lump them into the same part of the organisation.
Information is created by multiple divisions of most organisations. It is also likely that there are disparate and multiple data sources; it would not be uncommon to hear about a company with half a dozen customer masterfiles.
Making sense from a mess
The promise of big data is that you feed all your raw data, with duplicated information, into one system. The data processing grunt work is then done by cloud systems with the smarts to figure out relationships between data sets and make sense of the mess.
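At a small scale, the ‘making sense of the mess’ that such systems perform often starts with record linkage: deciding which entries in different customer masterfiles refer to the same person. The sketch below is illustrative only; the masterfile records, field names and matching threshold are all invented for the example.

```python
from difflib import SequenceMatcher

# Hypothetical customer records from two of an enterprise's masterfiles;
# names, emails and fields are invented for illustration.
masterfile_a = [
    {"name": "Jane Citizen", "email": "jane.citizen@example.com"},
    {"name": "John Smith",   "email": "jsmith@example.com"},
]
masterfile_b = [
    {"name": "Jane Citzen",  "email": "jane.citizen@example.com"},  # typo in name
    {"name": "Mary Jones",   "email": "mjones@example.com"},
]

def similarity(a: str, b: str) -> float:
    """Crude string similarity in [0, 1], case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_records(src, dst, threshold=0.8):
    """Pair records that look like the same customer across two files."""
    matches = []
    for r1 in src:
        for r2 in dst:
            # An exact email match, or a near-identical name, counts as a duplicate.
            if r1["email"] == r2["email"] or similarity(r1["name"], r2["name"]) >= threshold:
                matches.append((r1["name"], r2["name"]))
    return matches

print(match_records(masterfile_a, masterfile_b))
# The misspelt "Jane Citzen" is linked back to "Jane Citizen" via the shared email.
```

Real entity-resolution engines use far more sophisticated probabilistic matching, but the principle is the same: duplicated, inconsistent records go in, and the system infers which ones belong together.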
That’s the promise, but for those who come from a traditional information engineering background it’s almost sacrilegious: it implies that all the effort to classify and codify data has been unnecessary.
I believe that ‘garbage in garbage out’ is a truism and it takes a leap of faith to make this shift. What is clear is that as a profession, we have failed to build robust information stores when the focus has been primarily around internally sourced data.
We now have multiple external data sources and their number will grow exponentially. At the same time, the volume and velocity of data generated through internal and external channels is increasing dramatically.
For example, we do business with our bank through multiple channels – websites, apps, call centres and perhaps even a branch. The bank can analyse this data to determine whether or not we will purchase a particular product that is tailored to our needs.
Targeting the right customers with the right products, without offending or annoying them, is the holy grail of big data analysis.
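The bank scenario above can be caricatured as a propensity score that combines signals from every channel, including the negative ones. The weights and signal names below are invented for illustration; a real model would be learned from data, not hand-tuned.

```python
# Toy cross-channel propensity model. The signals and weights are
# assumptions made up for this sketch, not any real bank's model.
WEIGHTS = {
    "visited_product_page": 0.4,   # website channel
    "called_about_product": 0.3,   # call centre channel
    "recent_branch_visit":  0.1,   # branch channel
    "active_complaint":    -0.6,   # an unhappy customer is a poor target
}

def propensity(signals: dict) -> float:
    """Combine channel signals into a 0-1 purchase-likelihood score."""
    score = sum(WEIGHTS[k] for k, v in signals.items() if v)
    return round(max(0.0, min(1.0, score)), 2)

happy = {"visited_product_page": True, "called_about_product": True,
         "recent_branch_visit": False, "active_complaint": False}
unhappy = dict(happy, active_complaint=True)

print(propensity(happy))    # 0.7 -- worth making an offer
print(propensity(unhappy))  # 0.1 -- hold the marketing email
```

The point of the negative weight is exactly the holy grail described above: the same system that identifies a good prospect should also recognise when a customer is mid-complaint and suppress the offer.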
This is something the airline SAS can’t do. I was recently travelling with SAS from St Petersburg to Paris when the airline lost my luggage for 48 hours.
This was frustrating and I complained to airline staff and shared my experience with my followers on Twitter. Despite my obvious distress, SAS was still sending me online offers to fly in the next few days.
Clearly such offers were never going to be well received and although sending them is fairly cost effective for the airline, they impact negatively on the customer experience.
It was clear in this case that the airline’s systems weren’t talking to each other: locating the suitcase was problematic, with different staff telling me the culprit item was in Frankfurt, Copenhagen or Paris.
Of course, had I purchased a , I may have been able to track my suitcase better than SAS could.
What I learnt is that only when disparate sources of information are combined can organisations truly provide the experience their customers desire.
Are you up for the challenge?
My negative experience with SAS (the airline) was the result of the business and IT not working together effectively.
This is something that can’t be achieved by setting up a new standalone unit or moving core services to the cloud. The cloud can certainly help but there are privacy and compliance issues that often need to be addressed.
Organisations also need to maintain a very strict focus to ensure that standards are not compromised. That being said, it is also likely that most large enterprises are not squeaky clean when it comes to storing data and managing internal systems.
The challenge will be to make ‘information management’ do exactly what the name suggests. We don’t control all the source systems and never will, and there will always be pressure to deliver big data solutions quickly.
But let’s be careful about assuming that we can take shortcuts and the technology will resolve everything. In time, machine learning will help, but that’s for another discussion.
CIOs who step up to the mark will position IT for the future. Those who don’t will be taking the safe path but this leads to a dead end where IT is just a commodity vendor.
David Gee is the former CIO of CUA where he recently completed a core banking transformation. He has more than 18 years' experience as a CIO, and was also previously director at KPMG Consulting.