"Just give me the numbers!"
Falling firmly into the "I just can't make this stuff up" category, the preceding statement was made by the head of an engineering department. He wanted the performance figures on a series of database lookups so that he could determine if the database code was performing up to specifications. This would be a perfectly reasonable request, except for one minor problem: The database code was not producing the correct results in the first place. Performance was sort of irrelevant given that getting the wrong answers quickly is not necessarily all that helpful, although it may be less irritating than having to wait for the wrong answers. It's rather like driving at 75 mph when lost; you may not know where you are or where you are going, but at least you'll get there quickly. Or something.
In another example, the engineers developing a bioinformatics data analysis package spent all their time arguing about the correct way to set up the GUI elements on each page. The problem was that when they actually ran one of the calculations, the program appeared to hang. In fact, I was assured by everyone, it just "took a long time to run." How long? The answer was, "Maybe a few weeks."
This may come as a shock to those few people who have never used a PC, but a few weeks is generally longer than most computers will run before crashing. Besides, the complete lack of response from the program regularly convinced users that the program had crashed. The engineers did not want to put in some visual indicator of progress because they felt it wouldn't look good. They refused to remove that calculation from the product because "someone might want to try it." Eventually, they grudgingly agreed to warn the user that it "might take a very long time to run."
In both of these cases, the team was solving the wrong problem. Although there were definitely complaints about the speed of the database, speed was very much a secondary issue so long as the database wasn't producing correct results. And while the user interface decisions were certainly important, designing an elegant interface for a feature that will convince the user that the product is not working is not particularly useful. At least rearranging the deck chairs on the Titanic was only a waste of time. It didn't contribute to the ship sinking.
So why were these teams so insistent upon solving the wrong problems? Give someone two problems, one they can solve comfortably and one they have no idea how to approach, and they will work on the former. At that point, once goals are set, those goals become the focus of everyone's attention and a lot of work goes into accomplishing them. That is, after all, the best thing about goals; unfortunately, it can also be the worst thing about goals.
While clear, specific goals are certainly good things, goals also have to make sense. You need to have the right goals. It can be a very valuable exercise to look at the goals assigned to each person and each team in the company. Do those goals make sense? What problems or challenges are they addressing? Are the goals complementary, or are there significant gaps? If the engineering team is being evaluated on how many bugs they can fix and the QA team on how many new bugs they can find, what happens to the step where fixed bugs get verified? If no one is responsible for that happening, it won't get done (and didn't, in several software companies!). If the team focuses on the wrong problems, they'll spend their time fighting symptoms or revisiting solved problems, and never deal with the real issues.
Therefore, even before you can set goals, you have to know what problem you are trying to solve. That means first separating the symptoms of the problem from the problem itself. Symptoms are only symptoms; frequently, they can point to many possible problems. It's important to look at the symptoms and brainstorm which problems they could be indicating. When you start developing possible solutions, ask what the final product will look like if you go ahead with each one, and decide what success would look like. Make sure that your proposed solution will actually address at least some of the potential problems you've identified, and develop some way of testing whether you are solving the correct problem. In other words, build in checkpoints along the way so you can verify that you're actually improving things. Only then can you start to set goals that will effectively guide you to producing the results you actually need.
Once goals are set, they have a way of taking over. What are you doing to make sure you don't set goals before you know where you're going?
Stephen Balzac is an expert on leadership and organizational development. A consultant, author and professional speaker, he is president of 7 Steps Ahead, an organizational development firm focused on helping businesses get unstuck. Steve is the author of The McGraw-Hill 36-Hour Course in Organizational Development and Organizational Psychology for Managers. He is also a contributing author to Volume 1 of Ethics and Game Design: Teaching Values Through Play. For more information, or to sign up for Steve's monthly newsletter, visit 7stepsahead.com. You can also contact Steve at 978-298-5189 or email@example.com.