I'm a numbers guy - you have to be if you're an analyst - and I'm a firm believer that analytics, used properly, can change not only your company but the world.
However, I've spent the last few weeks studying some of the bigger market failures of the past decade, and in many cases executive management had the information it needed to avoid the mistake but simply chose not to use it.
Years ago, I did a study at IBM to understand how the company lost some of the markets it dominated. IBM was extremely numbers-focused, yet I found intentional corruption of study results designed to give executives a false sense of comfort. At one point it was so bad that it cost the IBM CEO his job and nearly put one of America's oldest and most successful companies out of business.
Analytics can give executives unprecedented insight into whether their decisions are correct, but many who pay to deploy these tools will likely find them used instead to showcase, after the fact, what idiots those executives were. That would be like a car safety system that, rather than preventing an accident, merely reported on the driver's failure afterwards.
I'm not arguing that analytics are bad. But if corporate culture favours positive information over accurate information, and if the winning choice comes from the most powerful executive rather than the best-informed one, then analytics will only make a bad decision look worse, because it will show that the decision could have been avoided.
If that bad behaviour isn't fixed first, the tool will force whoever uses it either to falsify the results or to become a constant source of career-ending information about bad decisions - which, in and of itself, is a great path to unemployment.
Microsoft's experience: Make a decision, then run the numbers
Watching Microsoft about five years ago, I couldn't reconcile the fact that CEO Steve Ballmer - even more of a numbers guy than I am - seemed to be making horrid mistakes that should have been avoidable.
I figured either the numbers from Microsoft's internal market research organisation weren't getting to Ballmer or the folks in market research were incompetent. After an interview with the head of market research, I concluded that both assumptions were wrong.
Rather, this organisation clearly had a mission to give executives results that made the decisions they had already made look smarter - apparently so they were better protected when a decision didn't work out.
I assumed Ballmer would eventually figure this out, since having numbers that said one thing and results that clearly said another should have clued him in to the problem, even if he ignored my annoying emails.
This was also about the time I first ran into argumentation theory, which suggests that we are hard-wired to hold in high esteem people who win arguments. What's really screwy about this, in the context of human nature, is that we don't seem to care much whether someone is right, only that he prevails. To that end, we'll follow executives who win arguments, no matter what.
Now think of Ballmer's position. (I'm picking on Microsoft because I really tried to fix this problem and was catastrophically unsuccessful, and argumentation theory may explain why.) Ballmer's a business guy in a company formed around a brilliant software developer, Bill Gates.
This would be like putting a hockey coach in charge of a tennis team. No matter how long you've been there, chances are you won't survive unless you figure out a way to fix the game.
My working theory: Ballmer inadvertently tasked market research with making his positions look right after the fact, to offset the problem of running a company of experts in an area where he wasn't an expert himself.
The sad thing is that, had he approached analytics a bit differently, he could have made better decisions, held his position of power and made Microsoft a better company. I still hope he'll eventually see this path before his time there is up.