Poor data quality is costing insurers and financial institutions between 15 and 25 per cent of operating profits, making it low-hanging fruit for the growth-starved financial services industry.
“Over the next three years the biggest opportunity for financial services industries in a steady as she goes market is extracting this value by fixing data quality,” said David Howard-Jones, a partner at Oliver Wyman.
Howard-Jones addressed the Institute of Actuaries Enterprise Risk Management Seminar this week in Sydney's central business district.
The costs of poor data management manifest themselves in excess regulatory capital, poor decision making, misallocation of resources, and personnel issues, Howard-Jones said.
Fixing poor data quality is a relatively simple process, he said, but it requires focus and buy-in from all levels of the business. “It needs tools and time, as well as good processes, because good processes equals good data,” he said. “Culture also matters, because data has a network value within the organisation.”
“There are not too many ways to get a 10-20 per cent uplift in the bottom line,” he said. “You need to understand the context of data quality, and then create opportunities to fix it.”
Joshua Gliddon is a journalist at Filtered Media.