I recently wrote about the importance of measuring performance to get funding, resources and support for security initiatives. Executives, who ultimately decide how company resources are rationed out to various departments, are particularly focused on key metrics. These metrics are what differentiate projects and convince executives to commit money and time and to increase head count.
So, I've been working on refining the things I'm measuring, to produce the most compelling story. I'm trying to use metrics to explain what we're doing well (how much value the company is getting from what they've already invested), what we need to improve (how to spend future dollars) and what our risks are (keeping our focus on using our resources in the right places).
I've also mentioned that metrics can be a double-edged sword. If they show weaknesses in a security program, it's important to show substantial progress in those areas at a pace that will convince the company's leadership that appropriate steps are being taken. I had a lot of red on my first set of dashboards, so I've been putting a lot of effort into improving those numbers. Now, they are mostly green with a few yellows. I've been keeping the executives up to date on this progress, and the feedback I'm getting is positive. A trend toward improvement is always important for demonstrating value. I feel as if I'm playing with fire by highlighting our problems, so I'm balancing that by demonstrating accomplishments. There are many areas of my security program that have good stories to tell, so I'm taking advantage of the metric reporting process to get those messages out.
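The red/yellow/green roll-up described above can be sketched in a few lines. This is a minimal illustration, not the author's actual dashboard: the metric names, values and thresholds are invented placeholders, and real programs would tune the cutoffs per metric.

```python
# Illustrative red/yellow/green status roll-up for a metrics dashboard.
# Metric names, values and thresholds below are hypothetical examples.
# Each metric is a percentage where higher is better.

THRESHOLDS = (("green", 90), ("yellow", 75))  # anything below 75 is red

def status(value):
    """Map a percentage score to a dashboard color."""
    for color, floor in THRESHOLDS:
        if value >= floor:
            return color
    return "red"

metrics = {
    "patch compliance": 93,
    "antivirus coverage": 88,
    "awareness training completion": 70,
}

for name, value in metrics.items():
    print(f"{name}: {value}% -> {status(value)}")
```

Tracking the same roll-up period over period is what makes the trend toward improvement visible to executives.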
Now that I've been measuring some basic security control performance for a while, I'm looking into adding some new subjects to my dashboard. Ideally, I want to focus executive attention on those areas I think are most important and need the most improvement at this time. But I don't want to base that on just my opinion, so I'm looking at including a more comprehensive risk analysis into my process.
I'm also thinking about comparing my company to others, an approach that executives seem to find very compelling. They often ask what other companies are doing, and almost every day one of them will tell me about their experience with security at their last company. It's funny, but there seems to be a basic assumption that every other company is doing security right and we need to catch up -- even though that's entirely untrue in my experience. (I have yet to see a company that does a really good job of protecting its information assets.) I find that most companies do some things well, but other things not so well. Most people I talk to have worked at a company that made an impression on them with how well it did something, such as disk encryption or security awareness. So I'm thinking of looking at industry averages for various security categories, with Capability Maturity Model Integration (CMMI) scoring to show how mature we are in various areas versus how mature the industry as a whole is.
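The maturity-gap comparison described above might look something like this sketch. The categories, scores and industry averages are hypothetical placeholders, not real benchmark data; CMMI maturity is conventionally scored on a 1-to-5 scale.

```python
# Illustrative CMMI maturity-gap comparison against industry averages.
# All figures below are invented for demonstration, not real benchmarks.

CATEGORIES = {
    # category: (our_score, industry_average), both on the 1-5 CMMI scale
    "disk encryption":    (2.0, 3.1),
    "security awareness": (3.5, 2.8),
    "incident response":  (2.5, 3.0),
    "access management":  (4.0, 3.2),
}

def maturity_gaps(categories):
    """Return (category, gap) pairs, largest shortfall first.

    A positive gap means we trail the industry average in that area.
    """
    gaps = {name: industry - ours
            for name, (ours, industry) in categories.items()}
    return sorted(gaps.items(), key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    for name, gap in maturity_gaps(CATEGORIES):
        standing = "behind" if gap > 0 else "ahead"
        print(f"{name}: {standing} by {abs(gap):.1f}")
```

Sorting by shortfall puts the weakest categories at the top, which is exactly the prioritization story a dashboard like this needs to tell.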
Ultimately, I want to derive meaning from the numbers I'm presenting. I'm trying to make specific points such as "we need to focus more resources in these areas" or "the time and money we've invested in certain efforts has paid off." That's a lot harder to do than just presenting a bunch of raw numbers -- it's information, instead of just data, and it's going to take some work on my part to figure out the best way to do that. I feel as if I'm breaking new ground, in a way, because I haven't seen any good models for reporting security metrics. I hope to come up with something good. Wish me luck.
This week's journal is written by a real security manager, "J.F. Rice," whose name and employer have been disguised for obvious reasons. Contact him at firstname.lastname@example.org.