As healthcare prepares for the Sept. 23 compliance deadline for the HIPAA Omnibus Rule, the industry finds itself at a crossroads.
On the one hand, the rule - published in January 2013 and effective March 26 - effectively brings HIPAA (enacted in 1996) into the 21st century and finalizes the new security and privacy safeguards required by the HITECH Act of 2009.
On the other hand, hardly a day goes by without a report of a (largely preventable) patient data breach from a hospital, contractor or other organization handling sensitive personal health information.
Why does healthcare struggle so much with data security? And what will it take for the industry to turn the corner?
Small Practices Especially Susceptible to Breaches
Since the fall of 2009, the U.S. Department of Health & Human Services (HHS) has, per the HITECH Act, published a list of data breaches affecting 500 or more patients. As of mid-September 2013, about 660 breaches had been reported.
A healthcare data breach analysis published by the Health Information Trust Alliance (HITRUST) at the end of last year notes that data theft outnumbers all other causes of data breaches combined - loss, unauthorized data access or disclosure, incorrect mailing, improper record disposal and hacking. Since 2009, hospitals and health insurers have reported fewer breaches, which suggests that they are getting better at preventing data loss, but academic institutions and especially physician practices struggle to address the issue, HITRUST says.
Small, independent practices typically lack the expertise and resources to handle their own security. What further complicates the matter, HITRUST points out, is the additional need to ensure that HIPAA business associates - those consultants, contractors, cloud service providers and other entities that handle a practice's patient data - also comply with privacy and security rules.
"Where we believe many organizations falter is not identifying and restricting access to what is actually required at a data, application and network level," HITRUST says. "This leads to information leakage and, ultimately, high-profile breaches when they do occur."
Take HIPAA Security Risk Analysis Seriously
It's for this reason that the federal meaningful use incentive program and the HIPAA Security Rule require healthcare organizations to conduct a risk analysis that examines the "confidentiality, integrity, and availability of electronic protected health information" (ePHI) that the organization holds. (Having such an analysis in place also tends to lessen the severity of the penalties levied by the HHS Office for Civil Rights if a breach does occur.)
At the outset, a risk analysis should cover the basics, says Christopher Hourihan, principal research analyst with HITRUST. This includes establishing policies, setting up a basic firewall, installing antivirus software, encrypting data and hardware, and training employees.
Organizations can't be afraid to restrict access to data, applications or auxiliary devices, adds Robb S. Harvey, a partner with the law firm Waller Lansden Dortch & Davis LLP. This isn't always easy, as doctors and executives alike often want access to all data, all the time. Meanwhile, training must encompass data security, of course, but it also must cover how to behave in the event of a breach - who to contact, what types of services to make available to patients, and so on.
"Have a plan and follow it when you have to," Harvey says. "Getting a good plan in place is something every company should have, both from the standpoint of making sure patient information is being protected and also to protect the healthcare organization."
Policies can't be created by lawyers and human resources working in a back room, says Will Hinde, healthcare practice director for the technology and management consulting firm West Monroe Partners. Doing so creates a disconnect between IT and an organization's business units, he says. Those drafting the policies need to know the pain, and the risk, of implementing them.
The risk analysis process hasn't changed much under the HIPAA Omnibus Rule, Hinde says. People still play a pivotal role, so it's important to evaluate the competency and integrity of the people who handle your data.
That list of people needs to include folks outside your four walls. HIPAA risk analysis must involve business associates - any vendor that creates, receives, stores, transmits or otherwise possesses your organization's patient data. Why? When a breach occurs at a healthcare organization's business associate, the entities share responsibility, and the Office for Civil Rights is bound to investigate the relationship between the entities. Simply put, a HIPAA covered entity needs to know how its business associates handle its data.
"It's been a problem, and it will continue to be a problem, because businesses sign a contract and then don't do anything else," Hourihan says.
Healthcare data analytics initiatives further complicate matters, as the data that's being examined and the conclusions drawn from it often get passed from one entity to another. Organizations must therefore consider the merits of health data innovation and corresponding privacy concerns carefully. "The information's pretty powerful," Hinde says, "and if it falls into the wrong hands, it can be costly."
Encryption Would Help, But Few Healthcare Providers Do It
Encryption technology - properly applied both to healthcare data in transit and to the hardware that stores the data once it reaches its destination - would help alleviate these privacy concerns. It would also give organizations so-called "safe harbor" under new HIPAA regulations: If healthcare data is lost or stolen, but it's encrypted, then that event is not considered a data breach.
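To make the safe-harbor idea concrete, here is a minimal sketch of what encryption at rest can look like, using the symmetric Fernet interface from Python's third-party cryptography library. The record content, file handling and key management shown are illustrative assumptions, not anything described in the article; a real deployment would store keys in a hardware security module or key vault, never alongside the data.

```python
# Illustrative sketch only: encrypting a patient record before it is
# written to disk, so a stolen drive or laptop holds only ciphertext.
from cryptography.fernet import Fernet

# In practice the key lives in a key vault or HSM, not next to the data.
key = Fernet.generate_key()
cipher = Fernet(key)

# Hypothetical ePHI payload for demonstration purposes.
record = b"patient_id=12345; diagnosis=hypertension"

token = cipher.encrypt(record)   # ciphertext safe to store at rest

# Without the key, the token reveals nothing; with it, the record
# round-trips intact. This property underpins the HIPAA safe harbor:
# losing `token` alone is not a reportable breach.
assert cipher.decrypt(token) == record
assert token != record
```

The administrative burden the article mentions - storing keys, rescuing locked-out users - comes precisely from managing that `key` value across an organization, which is why encryption is rarely a drop-in fix.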
Yet healthcare organizations often overlook encryption when drafting security policies. Some assume that, if data isn't leaving the building, then it's not at risk and encryption is unnecessary, Harvey says. If that's the case, organizations need to be even more vigilant about strengthening firewalls and maintaining software patches, he adds. (Most aren't.)
Other practical matters inhibit the adoption of encryption technology, Hinde says. Encryption comes with administrative and performance impacts. The former involves storing encryption keys and assisting end users who invariably lock themselves out, Hinde says. The latter is especially true of electronic health record (EHR) systems and their associated databases, and it's something most organizations only realize after the fact. Neither helps justify the cost.
Put another way: Healthcare organizations don't see encryption as a pain point, Hinde says. "No one's complaining, 'Man, I don't have encryption.'"
Healthcare Data Breaches Getting Harder to Detect, Track
Increasingly, though, data insecurity is becoming a pain point. Insurer WellPoint was fined $1.7 million after its online application database exposed information on more than 600,000 patients, while Illinois provider Advocate Health Care faces lawsuits in the wake of a massive breach of more than 4 million patient records after thieves snagged four password-protected but unencrypted laptops in a July burglary.
While the number of healthcare data breach victims dropped in 2012 compared to 2011, 2013 is shaping up to be a bad year for breaches - the Advocate Health Care incident alone involves more patients than all breaches reported in 2012.
In addition, though the HITRUST report found hacking to be the root cause of less than 10 percent of breaches, Hourihan says the industry should expect a "pretty significant rise" in such breaches. Personal health information - up to 50 times more valuable to a crook than a Social Security number or credit card number - frequently pops up on underground message boards. It's difficult to trace back to a reported breach, Hourihan says. That, combined with a general lack of sophisticated security technology in the healthcare industry, leads HITRUST to believe those incidents are either unreported or as-yet-undiscovered breaches.
Law enforcement officials are even starting to see hackers use healthcare organizations as a way to gain access to military contractors, Harvey says, with compromised records quickly ending up offshore. "Only an especially vigilant CIO is going to catch encrypted data going offshore," he says. "It's not something that people are going to look for unless they know."
It's also no longer far-fetched to suggest that the hacked pacemaker - part of a recent plot twist on the TV series Homeland - could happen, Harvey says. Most medical devices operate on legacy software that isn't supported by current patches and is much too expensive to replace with newer software that is supported.
Efforts by organizations such as the Center for Internet Security (CIS) to establish medical device security benchmarks that can then be matched to an individual organization's security policy will help, and they could even be extended to broader clinical applications or devices used as part of the patient-centered medical home, but retrofitting safeguards on existing technology will be a challenge.
"We have to figure out how much of this we can tackle," says Rick Comeau, executive director of the CIS Security Benchmarks Division. That's where partnerships with device manufacturers, hospitals and industry groups such as the Medical Device Innovation, Safety and Security Consortium (MDISS) and the National Healthcare and Public Health Information Sharing and Analysis Center (NH-ISAC) will come into play, he adds.
Meaningful Use Making Healthcare Orgs Address Security
The CIS benchmarks, currently in the request for interest phase, are one reason to believe that there may, in fact, be light at the end of the tunnel.
HITRUST also suggests that the meaningful use incentive program may be casting a light on the need for electronic data security. Conducting a security risk analysis is one of the core, or mandatory, requirements for completing Stage 1 of meaningful use.
The corresponding breakdown of which types of facilities have completed meaningful use attestation also offers some clues as to why physician practices - "eligible providers" in meaningful use parlance - remain more susceptible to data breaches than hospitals. More than 80 percent of hospitals have attested for meaningful use, and therefore conducted a security risk analysis, compared to 50 percent of eligible providers. (To be fair, these figures are in line with the federal government's expectations.)
In addition, Stage 2 of meaningful use, which is slated to begin in 2014, places additional emphasis on the use of encryption technology when data is at rest. This, HITRUST says, should help address those breaches that result from the loss or theft of laptops, mobile media and other hardware containing patient data. "This helps make a case for the expanded scope of encryption to include all endpoints, particularly those in public or easily accessible sources."
Organizations are also starting to take the demands of mobile health seriously, Hinde says, weighing the threats that each type of smartphone or tablet can pose against its benefit to improving care quality, patient engagement or operational efficiency. Employee training takes time, he says, as does making a BYOD policy "administratable," but if it can be done without stifling innovation, then it will be worth it in the long run.
Brian Eastwood is a senior editor for CIO.com. He primarily covers healthcare IT.