The unexpected benefits of data analytics
- 22 January, 2018 22:00
Successful IT projects require clear goals, and data analytics is no exception. When conducting analysis, data teams seek to discover useful information about customers, to support decision-making on a project, to enhance productivity, and a host of other outcomes.
But given the exploratory nature of data analytics, the benefits or insights gleaned are sometimes completely unexpected and not part of the original business plan. These happy accidents support the idea that data analytics is worth the effort, because you never know what might turn up.
Here are real-world examples of organizations that have seen unexpected benefits from their analytics efforts.
A pulse on current operations
Allegis Global Solutions developed its analytics program, ACUMEN Workforce Intelligence, with the goal of understanding three things: How are programs performing? How are they performing against other programs? What should we do next?
“While the analytics platform was created to develop a historical perspective that can be used for planning, we discovered that the analytics used to answer these three questions had a domino effect in our organization,” says Tim Johnson, executive director of business intelligence at Allegis.
“Because data is pulled from all of our programs and [we] update it on a daily basis, we have a very fresh view of operations,” Johnson says. “This resulted in the recent rollout of a new data application that helps analyze day-to-day activity, both at the program and enterprise levels.” Operators no longer need to pull a report and analyze it to gather program insights; instead, they can act directly on information that is already structured and presented for them.
“As an added benefit to using data operationally, analytics adoption rates are much higher than initially projected,” Johnson says. “Now we have set our sights on a goal of 100 percent internal adoption on a weekly basis. Our end users, because they are using the data on a day-to-day basis to improve performance, are also keeping a closer eye on the information, driving data quality to a new level of excellence.”
For the company’s initial analysis purposes, data only needed to be 90 percent to 95 percent accurate. But for operations it needs to be 98 percent to 99 percent accurate. “With the buy-in across the organization, we’re there,” Johnson says. “It’s funny to think those three little questions started this organization-wide data transformation.”
Long-term benefit of short-term losses
A data analytics effort helped online real estate resource provider Trulia fine-tune the email strategy for one of its campaigns, ultimately increasing traffic.
The company had been sending multiple emails a day, which increased the number of customers who unsubscribed, says Deep Varma, vice president of engineering. “Then we shifted our approach and aggregated emails so that we were only sending one email per day,” he says.
At first, Trulia saw a decline in engagement from its users, so it pulled back on the new approach. But the company then decided to test the new format for a longer period and saw better results, which it didn’t initially expect, Varma says.
“In this example, initially, data proved something was going wrong because we didn’t broaden our scope in the testing,” Varma says. “Once we did, we experienced an unexpected benefit in the sense that we never thought the traffic would increase because the data in one week showed traffic declining. However, by running it longer, analytics proved the opposite was true.”
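The lesson about test windows can be sketched with made-up numbers: a change that looks like a loss in its first week can show a net gain over a longer horizon. The figures below are illustrative only, not Trulia’s data.

```python
# Illustrative weekly traffic after switching to one digest email per day.
# Week 1 alone suggests the change failed; the full window says the opposite.
baseline = [100, 100, 100, 100]  # traffic under the old multi-email approach
digest   = [ 90, 105, 115, 120]  # traffic under the new single-daily-email approach

week1_change = digest[0] - baseline[0]      # -10: looks like a failure
month_change = sum(digest) - sum(baseline)  # +30: the opposite conclusion

print(week1_change, month_change)
```

The design point is simply that the evaluation window is part of the experiment’s design, not an afterthought.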
In another example, data analytics inspired a new product, which was not the original intent.
“We saw consumers dropping after submitting an inquiry to an agent, so we created a post-lead experience where we started showing them the recommended properties that are similar to the listing that they had just inquired about,” Varma says. “As a result, consumers began returning and continue to return more frequently due to this new recommender profile.”
Warranty issues give way to IoT solution
At Rockwell Automation, a project came to the analytics team through the company’s product quality group.
“We were challenged with a warranty management issue,” says Sangeeta Edwin, director of business intelligence at Rockwell Automation. “Instead of just looking at the problem presented, we stretched our data analytics team to focus on identifying the root cause for the returns.”
By tracing the data back to the machine level, the team found a manufacturing defect, correlating assembly failures with warranty returns. “This helped evolve our strategy and platform to include IoT [internet of things] machine data analytics,” Edwin says.
This simple data analytics scenario turned into a surprising solution for the business, Edwin says. “We took our learnings from the quality group and branded our own device-level data analytics platform for our customers,” she says. “Through data analytics, we transformed a business problem into a helpful tool for our customers, creating new revenue streams.”
Variable error uncovers hidden causes
Over the years, Decision Point Healthcare Solutions, a provider of software products for the healthcare industry, has seen its health plan clients significantly reduce hospital readmission rates by targeting individuals at high risk for multiple admissions with their field and telephonic care management programs.
In these cases, Decision Point uses a specialized predictive modeling algorithm to identify, before the initial “index” admission occurs, members/patients who are predicted to have two admissions within 30 days of each other. “In short, health plans are getting ahead of their readmissions by targeting the right individuals to potentially avoid the initial admission, the readmission, or both,” says Saeed Aminzadeh, founder and CEO of Decision Point.
In hindsight, it’s a great approach, but Decision Point came across the methodology unexpectedly, Aminzadeh says. For care management, Decision Point’s traditional approach included predicting individuals who are at high risk for avoidable admissions, emergency room (ER) visits and costs.
The new approach of predicting multiple, clustered admissions came about when Decision Point’s data scientists erroneously replaced admissions with readmissions as the dependent variable — the variable that is being predicted — in the predictive model development process.
After careful analysis, the new model was found to be quite different from Decision Point’s traditional predictive models. While traditional models predicting admissions, costs and ER visits identified high-risk individuals with a significant mix of clinical and utilization issues, predicting multiple, clustered admissions highlighted the subset of those individuals that have both emerging clinical and socio-economic issues.
“For example, while traditional models identified high-risk individuals with multiple chronic conditions, [the] new model identified high-risk individuals with both chronic conditions and other issues such as living alone, lack of a relationship with their doctor, no credit cards, health literacy, behavioral health,” or other concerns, Aminzadeh says. This was an important finding because it shows that in order to lower readmission rates, healthcare organizations must not only be able to address an individual’s clinical issues, but also remove socio-economic barriers that exacerbate those clinical issues, he says.
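The effect of the accidental variable swap can be illustrated with a toy calculation. In this hedged sketch, the data and field names are invented; the point is only that the same attribute can look unremarkable against one dependent variable and strongly predictive against another.

```python
# Toy member records (illustrative, not Decision Point's schema or data):
# lives_alone is a socio-economic flag; any_adm = had any admission;
# clustered = had two admissions within 30 days of each other.
members = [
    {"lives_alone": 1, "any_adm": 1, "clustered": 1},
    {"lives_alone": 1, "any_adm": 1, "clustered": 1},
    {"lives_alone": 1, "any_adm": 1, "clustered": 1},
    {"lives_alone": 1, "any_adm": 0, "clustered": 0},
    {"lives_alone": 0, "any_adm": 1, "clustered": 0},
    {"lives_alone": 0, "any_adm": 1, "clustered": 1},
    {"lives_alone": 0, "any_adm": 1, "clustered": 0},
    {"lives_alone": 0, "any_adm": 0, "clustered": 0},
]

def lift(feature, target):
    """Rate of `target` among members with `feature` set, divided by the
    rate among members without it (>1 means the feature carries signal)."""
    with_f = [m[target] for m in members if m[feature]]
    without = [m[target] for m in members if not m[feature]]
    return (sum(with_f) / len(with_f)) / (sum(without) / len(without))

print(lift("lives_alone", "any_adm"))    # 1.0 — no signal for the old target
print(lift("lives_alone", "clustered"))  # 3.0 — strong signal for the new one
```

Here, swapping the dependent variable from any admission to clustered admissions is what makes the socio-economic flag visible, mirroring the accident described above.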
Rediscovering the importance of primary care
Healthcare insurance provider Health Care Service Corp. has seen similar unexpected gains from data analytics.
“We discovered a hidden gem when initially trying to pull data to help us identify members who were going to the emergency room for avoidable causes — non-emergencies — and the reasons for doing so,” says Himanshu Arora, executive director of enterprise analytics and governance at Health Care Service.
If 10 percent of the firm’s members were enrolled in HMO plans, it would expect less than 10 percent of avoidable ER services to be attributed to this group, given their access to a primary care physician (PCP). “What we found was a trend in the opposite direction by almost four times as much,” Arora says. “This 10 percent of members was incurring 40 percent of all avoidable services in the ER.”
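The disproportion Arora describes reduces to a simple ratio of observed to expected share, using the round figures quoted above:

```python
# HMO members' share of membership vs. their share of avoidable ER visits.
hmo_share_of_members = 0.10       # 10 percent of members are in HMO plans
hmo_share_of_avoidable_er = 0.40  # yet they incur 40 percent of avoidable ER visits

# Under the expected pattern this ratio would be at most 1.0.
overrepresentation = hmo_share_of_avoidable_er / hmo_share_of_members
print(overrepresentation)  # 4.0 — roughly four times the expected share
```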
The high cost of an ER visit, plus the sheer inconvenience for members of going to the ER without benefiting from services that could prevent or mitigate such health episodes, drove the firm to find ways to help members beyond providing access to an in-network care provider.
“We went back to the drawing board to better understand what determinants of care, such as language barriers, access to transportation, and scheduling issues, were causing our HMO members to visit the ER instead of their PCP and find solutions,” Arora says. “We engaged our provider network to help them proactively reach out to members to ensure they’re getting the care they need, [and] re-evaluated our product and network design to strengthen our risk sharing models with providers — so they have more incentive to help identify and act on these analytics.”
Redefining customer success — and analytics methodologies
Zeta Global, which provides a software platform for marketing applications, uses analytics to support client and internal initiatives, such as the development of algorithms for making predictions within client data, traffic log analysis for network flow management, or signal analysis for security management.
“Predictive power in data often lies where you don’t expect it,” says Jeffry Nimeroff, CIO at Zeta Global. “Unsupervised techniques exist because information exists in data beyond what individuals — even smart ones — can target and excise. Unexpected results are part of the magic, and we’ve had many instances where we have stared in amazement” at findings.
One example involves technology cost reduction. “Most organizations that deploy technology and incur technology costs end up with some form of shadow technology,” Nimeroff says. “Individuals, in the search to be more productive, find the tools that work for them.”
As part of a recent security operations center partnership, the implementation focused on threat intelligence. The expected results showed continued progress on security maturation, Nimeroff says, but the unexpected value came from uncovering hidden costs of security technology.
“With this visualization, we were able to talk with specific people about specific tech, some of which they totally forget about, and disconnect from services to the tune of a six-figure cost reduction,” Nimeroff says.
Another example comes from the firm’s efforts to determine its clients’ best performing customers.
“A lot of work in client analytics focuses on audience extension,” Nimeroff says. “In this instance, Zeta would leverage our proprietary data to match profiles to those who share properties with those profiles provided by the client. Having a larger audience of best performing customers will yield greater results.”
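Audience extension of the kind Nimeroff describes can be sketched as attribute matching between client-supplied seed profiles and a vendor’s own candidate pool. The code below is a hypothetical simplification: the profile fields and the matching threshold are invented, and Zeta’s actual matching is proprietary.

```python
# Seed profiles: the client's "best customers" (illustrative attributes).
seed = [
    {"region": "west", "category": "outdoor", "tier": "high"},
    {"region": "west", "category": "outdoor", "tier": "mid"},
]
# Candidate profiles drawn from the vendor's own (proprietary) data.
candidates = [
    {"id": "a", "region": "west", "category": "outdoor", "tier": "high"},
    {"id": "b", "region": "east", "category": "kitchen", "tier": "low"},
]

def overlap(profile, candidate):
    """Number of attributes the candidate shares with a seed profile."""
    return sum(profile[k] == candidate.get(k) for k in profile)

def best_matches(min_shared=2):
    """Candidates sharing at least `min_shared` attributes with any seed."""
    return [c["id"] for c in candidates
            if any(overlap(s, c) >= min_shared for s in seed)]

print(best_matches())  # ['a']
```

The meta-analysis finding that follows amounts to questioning the seed list itself: if the client’s definition of “best customer” used too few attributes, even a correct matcher extends the wrong audience.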
After generating models to predict what marketing messaging its clients should provide to their best customers in order to maximize results, Zeta saw audience performance hovering around the control group, Nimeroff says. “It turns out, via meta-analysis, that the original data provided by the client wasn’t their best performing customer set because the client limited the attributes they considered important for defining customer success,” he says.
When Zeta added the insights and extended the analytics back into the client database, leveraging more customer attributes, it was able to provide better audience data. By starting with the client’s database and messaging objectives, but having no preconceived notion of what attributes were important, Zeta was able to help extract the best client datasets. “This extended methodology has led to new practice areas at Zeta,” Nimeroff says.