Tom Walsh, founder and managing partner of tw-Security, a health care privacy and information security firm in Overland Park, Kansas, related an incident that shows how the best intentions can go wrong. A small health care clinic that had been acquired by a larger group had not yet switched to the purchaser's medical records system. The clinic suffered a ransomware attack on a Friday afternoon, and over the weekend the IT support person "blew away" the server and loaded a new operating system. He did this to avoid paying the ransom and to give doctors access to the system first thing Monday morning. Later, when the group's IT security officer asked for the audit logs to prove a breach had not occurred, it turned out the logs had been erased along with the rest of the system. With no way to prove a breach had not occurred, the group had to report the breach to patients and the media, because more than 500 patient records were involved.
“The burden of proof showing there hadn’t been a breach is on the organization,” Walsh said. “Making one misstep can get someone into a great deal of legal trouble.”
A systematic process must be followed to prove an incident did not result in a breach, but many organizations fail to take incident response seriously. According to a 2016 report by the US Department of Health and Human Services (HHS), nearly half of providers queried in a survey did not have formal incident response plans, and 55% had no incident response team. This is problematic considering that 61% of the respondents had experienced a data breach in the past 2 years.
HIPAA regulations require providers to have an incident response plan that is documented and regularly updated. Like most aspects of HIPAA, this is typically only checked in the case of an audit after a breach. That does not mean providers should not heed the requirement, however. Having this paperwork helps prove providers took the required steps after a breach and may mitigate fines.
The purpose of an incident report is to identify and track problems, contain and reduce the impact of incidents, correct the causal issues, prevent recurrences, and demonstrate compliance.
If the Office for Civil Rights (OCR) conducts an audit, it will want to see an incident log. If providers do not have one, "it means they are perfect or they are clueless," Walsh said. "And nobody is perfect; there are always going to be some things that happen."
Providers should have incident reports as far back as 6 years. Over that period, nearly every office has had a misdirected fax, handed the wrong form to a patient, or had a system outage, Walsh noted. “It happens all of the time, and OCR knows that,” Walsh said.
What to include
The good news about incident breach reports is that they are relatively straightforward for small practices, according to Julian Jacobsen, co-founder of J.J. Micro IT Consulting of St. Louis, Missouri. Anything that could possibly be a breach should be listed on an incident report. If a laptop was stolen, for instance, that should be noted even if a practice is not sure protected health information was on it. In the report, a practice needs to demonstrate that it understands the incident and can document the process the organization follows when one occurs.
According to Jacobsen, the following areas should be covered in the plan:
- What constitutes an incident?
- What steps should be taken to respond to an incident?
- How should each step be documented?
- How are an incident and a breach differentiated?
- Who makes the final decision of whether an incident should be considered a breach?
- How and when should affected patients be contacted?
- How and when should the media be contacted in case of a breach?
- How and when should HHS be contacted?
A good way to ensure a practice has everything that HHS might want in an incident report is to check the department’s website, Walsh said. The HHS site offers a sample form that lists the information required when reporting a breach in the portal.
Another good method is a walk-through. Create a fictitious scenario and see how the practice's staff responds and whether the plan meets the requirements. "There is real value in this," Walsh said. "You don't want to wait until you are knee deep in an incident to see you have holes in the process."
There should also be documentation that these plans have been reviewed and updated on an annual basis. Updates are typically only necessary if there have been changes to the law or the workplace (e.g., an employee listed as responsible for an aspect of the plan is no longer employed), Jacobsen said.
Who should be involved?
Depending upon the size of the organization, different people may take part in the incident response team. In smaller practices, it is often the practice owner, IT person, and security officer who work together to create the plan. Someone like the security officer needs to be a point of contact responsible for getting it written and followed.
In a larger organization, HIPAA privacy and security officers might coordinate to create the plan and have the practice's decision makers sign off on it, Jacobsen said. These officers might work with the practice owner, IT staff and their director, the office manager, and human resources staff and their director. A point person still needs to be in charge. Generally, Jacobsen recommends looking outside of the practice for help with HIPAA compliance.
An inexpensive option that "doesn't do a great job, but they would at least have something" is an online compliance program, Jacobsen said. These can cost around $500, and an office gets a binder with a checklist of things it should be doing for compliance. Audit and remediation work by a consultant for a small practice, he said, would more likely fall into the $3,000 to $5,000 range.