RisknCompliance Blog

Thoughts On Delivering Meaningful Outcomes in Security and Privacy

Tag: HITECH

Can we change the tune on Health Information Security and Privacy please?

Notice that the title doesn’t say HIPAA Security and Privacy, nor does it include any of the words HITECH, Omnibus Rule, or Meaningful Use. That is the point of this post.

Let us start with a question… I am sure many of you, like me, routinely visit the blogosphere and social media sites (especially LinkedIn group discussions) to get a pulse of what is happening in Information Security and Privacy. How often do you see posts or discussions about compliance, versus discussions focused squarely on risk, meaning risk to the organization or to patients if their health information were compromised by one means or another?

Compliance (or rather, the risk of non-compliance) is only one of those risks and, in our view, should not be the primary driver for any Information Security or Privacy program. In fact, we often like to say that compliance should be a natural consequence of good risk management practices.

Having watched Health Information Security and Privacy closely for nearly ten years, I am not surprised by this trend at all. Still, I look forward to a day when we talk more about safeguarding the security and privacy of patient data and less about preparing for an OCR audit. I am not suggesting that you shouldn’t worry about the latter. In fact, if your real intent is to safeguard the security and privacy of patient information, you will very likely not have to worry about the OCR or any other audit. Real intent and objectives matter because they shape our thinking and how we go about executing our efforts.

I think Security and Privacy programs in healthcare would be a lot more effective (and likely even more cost-efficient) if they prioritized their objectives in the following order:

  • Patient Care and Safety – In most discussions of security, we tend to focus solely on the confidentiality of patient information and less on its integrity and availability. When we consider all three security components in equal measure, it is easier to appreciate how a security incident or breach could impact patient care and safety. With the increasing adoption of EHRs, many healthcare providers likely rely solely on electronic versions of patient records in one or more EHRs. A security incident or breach could leave a patient record unavailable to physicians who need to review the patient’s treatment history before providing urgent or emergency care. In another scenario, a breach could compromise the integrity of the patient record itself, in which case physicians could misdiagnose the patient’s condition and fail to provide the right treatment. Such cases were probably unlikely in a world of paper records, but they are not inconceivable in a world of electronic records. These issues can result from both malicious and unintentional circumstances.
  • Patient Privacy and Loss of Trust – The impact of a healthcare privacy breach doesn’t need much discussion. The affected individuals can face severe and lasting financial and reputational harm, which can make for a very painful experience. This, in turn, could cost the provider the valuable trust of its customers.
  • Business Risk – Healthcare businesses could face tort or class action lawsuits arising from either of the two previous scenarios. And then, of course, there is the possibility of patients turning to competitors, especially when they have access to multiple providers where they live. In effect, healthcare organizations could face substantial losses to their bottom lines, and given the increasingly competitive nature of the industry, this could put their business sustainability at risk.
  • Risks of Non-Compliance – Finally, of course, there is the risk of non-compliance with industry or government regulations. Non-compliance could leave healthcare organizations facing considerable civil and possibly criminal fines, as well as recurring expenses from having to comply with OCR resolution agreements, for example. In most instances, however, the impact of non-compliance fines and expenses is only temporary, lasting a few years or so. The impact of the previous three risks, on the other hand, could be much more significant and longer lasting.

Until we treat security and privacy as central to patient care and safety and to the business and clinical culture, it is our view that many programs will falter and fail to deliver the intended results. The new era of digital healthcare requires healthcare organizations to treat security and privacy as a business or customer issue, not something they address only for compliance purposes.

In a follow-up post, we’ll discuss specific examples of why thinking compliance first will not get us very far in managing health information security risks.

Do we have a wake-up call in the OIG HHS Report on HIPAA Security Rule Compliance & Enforcement?

In case you didn’t notice already, the Office of Inspector General (OIG) in the Department of Health and Human Services (HHS) published a report on the Centers for Medicare & Medicaid Services’ (CMS) oversight and enforcement of the HIPAA Security Rule. The report is available to the public here. As we know, CMS was responsible for enforcing the HIPAA Security Rule until the HHS Secretary transferred that responsibility to the Office for Civil Rights (OCR) in 2009.

According to the report, the OIG conducted audits at seven covered entities (hospitals) in California, Georgia, Illinois, Massachusetts, Missouri, New York, and Texas, in addition to an audit of CMS oversight and enforcement actions. These audits focused primarily on the hospitals’ implementation of the following:

  • The wireless electronic communications network or security measures the security management staff implemented in its computerized information systems (technical safeguards);
  • The physical access to electronic information systems and the facilities in which they are housed (physical safeguards); and,
  • The policies and procedures developed and implemented for the security measures to protect the confidentiality, integrity, and availability of ePHI (administrative safeguards).

These audits were spread over three years (2008, 2009 and 2010), with the last couple of audits taking place in March 2010. The report doesn’t mention the criteria by which these hospitals were selected for audit, except to say that they were not selected because they had suffered a breach of Protected Health Information (PHI).

It wouldn’t necessarily be wise to extrapolate the findings in the report to the larger healthcare space without knowing how these hospitals were selected for audit. All one can say is that the findings paint a worrisome picture if the hospitals were truly selected at random. For example, looking at technical vulnerabilities categorized as “High Impact,” all 7 audited hospitals had vulnerabilities related to Access and Integrity Controls, 5 out of 7 had vulnerabilities related to Wireless and Audit Controls, and 4 out of 7 had vulnerabilities related to Authentication and Transmission Security Controls.

What is particularly concerning is that the highest numbers of vulnerabilities were in the Access and Integrity Controls categories. These are typically the vulnerabilities most exploited by hackers, as evidenced (for instance) by this quote from the 2011 Verizon Data Breach Investigations Report – “The top three threat action categories were Hacking, Malware, and Social. The most common types of hacking actions used were the use of stolen login credentials, exploiting backdoors, and man-in-the-middle attacks”.

Wake-up call or not, healthcare entities should perhaps take a cue from these findings and look to implement robust security and privacy controls. A diligent effort should help protect organizations from the well-publicized consequences of a potential data breach.

Providers – Is HIPAA Security Risk Analysis in your plan over the next few months?

Security Risk Analysis is something that we recommend all organizations conduct periodically, or before a significant process or technology change. After all, threats, vulnerabilities and impact (the three components of risk, see my other post here) often change or evolve over time, which means that risk analysis results can soon become outdated.
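As a rough illustration of how those three components can be combined, here is a minimal scoring sketch. The 1–5 scales, the formula and the thresholds are assumptions chosen purely for illustration and not a prescribed methodology; the point is that the inputs need revisiting whenever threats, vulnerabilities or impact change.

```python
# Illustrative only: a simple qualitative risk-scoring sketch.
# The 1-5 rating scales, formula and thresholds are assumptions for
# illustration, not a prescribed or authoritative methodology.

def risk_score(threat_likelihood: int, vulnerability_severity: int, impact: int) -> int:
    """Combine the three components of risk, each rated on a 1-5 scale.

    threat_likelihood      - how likely the threat is to act (1 = rare, 5 = frequent)
    vulnerability_severity - how exposed the organization is (1 = well controlled, 5 = wide open)
    impact                 - consequence if the threat succeeds (1 = negligible, 5 = severe)
    """
    likelihood = threat_likelihood * vulnerability_severity  # overall exposure to the threat
    return likelihood * impact                               # classic likelihood x impact

def risk_level(score: int) -> str:
    # Thresholds are arbitrary illustration values.
    if score >= 75:
        return "High"
    if score >= 25:
        return "Medium"
    return "Low"

# Example: a phishing threat (4) against weakly controlled email (4)
# that could expose an entire EHR database (5).
score = risk_score(4, 4, 5)
print(score, risk_level(score))  # 80 High
```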

In the context of healthcare, a Security Risk Analysis is also mandatory, for two reasons.

The first reason is that it is required for compliance with the HIPAA Security Rule, which, by way of the HITECH Act, now applies to Business Associates in addition to Covered Entities. Risk Analysis is a “Required” Implementation Specification in the “Security Management Process” standard under the Administrative Safeguards of the HIPAA Security Rule, as highlighted in the table below.

[Table: Administrative Safeguards of the HIPAA Security Rule – “Security Management Process” standard, showing Risk Analysis as a Required implementation specification]

The second (and more urgent) reason to conduct a Security Risk Analysis is that it is a core requirement for providers to demonstrate Meaningful Use of certified Electronic Health Record (EHR) technology and thereby become eligible for Medicare/Medicaid incentives beginning in April 2011, or risk Medicare reimbursement penalties beginning in 2015 (see below).

[Figures: CMS Medicare/Medicaid EHR incentive payment and penalty timelines]

Source: Centers for Medicare & Medicaid Services (CMS)


So, it is important that providers plan on conducting a security risk analysis within the next few months unless they have conducted one recently. If you have already implemented an EHR system, you will need to ensure that the risk analysis covered the EHR system and the related processes or practice workflows. If you plan to implement an EHR system in the next few months, we recommend conducting the risk analysis before the implementation so that any discovered risks can be mitigated through proper design of the system and the associated workflows or processes. Any change to the system or processes after implementation is going to be hard, not to mention the disruption to the practice and other costs.

The Final Guidance from OCR on Risk Analysis can be a useful reference in planning and conducting risk analysis efforts.

Finally, I would like to go back to what I said at the beginning. We recommend that organizations focus on managing all information risks, not just the risk of non-compliance with regulations such as HIPAA. It is therefore critical that the personnel performing the risk analysis are up to date on the current threat environment. Once the threats are determined, one must be able to clearly identify the organization’s vulnerabilities to those threats, and then the impact resulting from any exploits, including the impact on legal or compliance obligations such as HIPAA. Last but not least, risk analysis must be conducted at appropriate intervals, and certainly whenever there is a significant change in processes or technologies.

—————————————-

Important Disclaimer

The guidance and content we provide in our blogs including this one is based on our experience and understanding of best practices. Readers must always exercise due diligence and obtain professional advice before applying the guidance within their environments.

You don’t know what you don’t know – Do we have a "detection" problem with the healthcare data breach numbers?

Like some of you perhaps, I have been reading a few recent articles on Healthcare data breaches, especially the one from Dark Reading and a detailed analysis of the 2010-to-date breaches from HITRUST Alliance.

What stood out for me is something not necessarily highlighted in the articles themselves: the very low number of breaches involving failures of technology, people or process controls, as opposed to physical losses.

These articles focused on the 119 or so breaches that have been reported to the Department of Health and Human Services (HHS) or made public to date in 2010. From the HITRUST Alliance analysis, it is clear that an overwhelming majority of the breaches resulted from physical loss or theft of paper or electronic media, laptops and the like. Only two breaches resulted from hacking incidents.

I then went back and did a bit of my own analysis of the 2010 data breach incidents covered in the Identity Theft Resource Center report available here. I came up with the following numbers for breaches other than those involving physical loss, theft, burglary, improper disposal and the like:

  • Malware infection – 1
  • Unauthorized access to file share – 1
  • Database misconfiguration or vulnerability – 2
  • Website vulnerability – 1
  • Improper access or misuse by internal personnel – 6

As you can see, these account for less than 10% of the healthcare breaches known or reported so far this year. Contrast this with the findings of the 2010 Verizon Data Breach Investigations Report, which attributes 38% of breaches to malware, 40% to hacking and 48% to misuse. The Verizon report focused on 141 confirmed breaches from 2009 across a variety of industries, but I think it is still useful for a high-level comparison to determine whether we may be missing something in the healthcare breach data.
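For what it is worth, here is the quick arithmetic behind the “less than 10%” observation, using the counts listed above and the roughly 119 reported breaches mentioned earlier:

```python
# Quick arithmetic check of the "less than 10%" observation.
# Counts come from the list above; 119 is the approximate number of
# healthcare breaches reported to HHS or made public in 2010 to date.
non_physical_breaches = {
    "Malware infection": 1,
    "Unauthorized access to file share": 1,
    "Database misconfiguration or vulnerability": 2,
    "Website vulnerability": 1,
    "Improper access or misuse by internal personnel": 6,
}

total_reported = 119
non_physical_total = sum(non_physical_breaches.values())  # 11

share = non_physical_total / total_reported
print(f"{non_physical_total} of {total_reported} breaches = {share:.1%}")  # ~9.2%
```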

The comparison seems to suggest that the healthcare industry probably has much stronger safeguards against malware, hacking, improper logical access and the like. I know from my own experience working with healthcare entities that this is not necessarily the case. For further corroboration, I reviewed two Ponemon Institute survey reports – Electronic Health Information at Risk: A Study of IT Practitioners and Are You Ready for HITECH? – A benchmark study of healthcare covered entities & business associates, both from Q4 2009. The following sample numbers from these reports further confirm that the state of Information Security and Privacy among HIPAA Covered Entities (CEs) and Business Associates (BAs) is far from perfect:

Electronic Health Information at Risk: A Study of IT Practitioners

% of respondents answering “Yes” to each statement:

  • My organization’s senior management does not view privacy and data security as a top priority – 70%
  • My organization does not have ample resources to ensure privacy and data security requirements are met – 61%
  • My organization does not have adequate policies and procedures to protect health information – 54%
  • My organization does not take appropriate steps to comply with the requirements of HIPAA and other related healthcare regulations – 53%

Are You Ready for HITECH? – A benchmark study of healthcare covered entities & business associates

% of respondents reporting each HIPAA compliance requirement as not formally implemented:

  • Risk-based assessment of PHI handling practices – 49%
  • Access governance and an access management policy – 47%
  • Staff training – 47%
  • Detailed risk analysis – 45%

All this leads me to suspect that some HIPAA CEs and BAs may simply not be detecting potential breaches. If you study the healthcare breaches reported so far, almost all of them came to light through physical losses of computers or media (which are easy to notice) or through reporting by third parties (victims, law enforcement, someone finding improperly disposed PHI paper records in trash bins, and so on). I don’t know of any healthcare data breach this year that was detected through proactive monitoring of information systems.

As I covered in a related post on breach reports and what they tell us, I would recommend that CEs and BAs focus on the following key controls and related activities in order to improve their breach prevention and detection capabilities:

1. Secure Configuration and Lockdown

  • Review the configuration of information systems (network devices, servers, applications, databases etc.) periodically and ensure they are locked down from a security configuration standpoint

2. Web Application Security

  • Scan web applications periodically for OWASP Top 10 vulnerabilities and fix any discovered vulnerabilities
  • For new applications under development, perform code reviews and/or vulnerability scans and fix any security vulnerabilities before the applications are put into production use (studies show that it is far more cost-effective to fix vulnerabilities before applications go into production than after)
  • Use Web Application Firewalls as appropriate

3. Strong Access Credentials

  • Configure PHI systems and applications to enforce a strong password policy (password complexity, periodic password changes etc.)
  • Implement multi-factor authentication on PHI systems and applications wherever possible

  (Note: According to the 2010 Verizon Data Breach Investigations Report, stolen access credentials lead to the largest number of breaches from hacking incidents)

4. Access Assurance or Governance

  • Conduct access certifications periodically, preferably at least every quarter, for PHI systems and applications
  • Review access privileges within PHI systems and applications to ensure all access conforms to the “Least Privilege” principle; in other words, no user, application or service should have any more privileges than its job function or role requires
  • Remediate any excess privileges promptly
  • Revoke access to PHI systems and applications promptly when a person leaves the organization or no longer requires access due to a change in job role

5. Logging, Monitoring and Reporting (a minimal alerting sketch follows this list)

  • Identify “risky” events within PHI systems
  • Configure the systems to generate logs for the identified events
  • Tamper-proof the logs
  • Implement appropriate technologies and/or processes for monitoring the events (refer to our related posts here and here for examples)
  • Identify and monitor high-risk events through near-real-time alerts
  • Assign responsibility for daily review of log reports and alerts to specific personnel

6. Encryption (data at rest, media) and Physical Security of Media

  • Maintain an inventory of the locations and systems where PHI exists
  • Implement suitable encryption of PHI on laptops and removable media
  • Implement appropriate physical security safeguards to prevent theft of devices or systems containing PHI

7. Security Incident Response

  • Implement and operationalize an effective Security Incident Response program, including clear assignment of responsibilities, response steps/workflows etc.
  • Test the incident response process periodically as required

8. Security Awareness and Training

  • Implement a formal security awareness and training program so the workforce is aware of their responsibilities, security/privacy best practices, and the actions to take in the event of a suspected incident
  • Require personnel to complete security awareness and/or training refreshers periodically as appropriate
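To make the “Logging, Monitoring and Reporting” item above a bit more concrete, here is a minimal sketch of what near-real-time alerting on “risky” events might look like. The log format, file path, event keywords and thresholds are all assumptions chosen for illustration; in practice you would typically rely on a Log Management or SIEM product rather than a hand-rolled script.

```python
# Minimal sketch of near-real-time alerting on "risky" events in a PHI system log.
# Assumptions (illustration only): log lines are plain text containing a username
# and an event keyword such as "LOGIN_FAILURE" or "RECORD_EXPORT"; the log path
# and thresholds below are hypothetical.
import time
from collections import defaultdict

LOG_FILE = "/var/log/ehr/access.log"    # hypothetical log location
FAILED_LOGIN_THRESHOLD = 5              # failed logins per user per window
WINDOW_SECONDS = 300

HIGH_RISK_KEYWORDS = ("RECORD_EXPORT", "PRIVILEGE_CHANGE", "AUDIT_LOG_CLEARED")

def alert(message: str) -> None:
    # In practice this would page on-call staff or feed a SIEM; here we just print.
    print(f"ALERT: {message}")

def monitor(log_path: str = LOG_FILE) -> None:
    failures = defaultdict(list)        # user -> timestamps of recent failed logins
    with open(log_path, "r") as handle:
        handle.seek(0, 2)               # start at end of file ("tail -f" style)
        while True:
            line = handle.readline()
            if not line:
                time.sleep(1)
                continue
            now = time.time()
            if "LOGIN_FAILURE" in line:
                user = line.split()[-1]  # assumes the username is the last field
                failures[user] = [t for t in failures[user] if now - t < WINDOW_SECONDS]
                failures[user].append(now)
                if len(failures[user]) >= FAILED_LOGIN_THRESHOLD:
                    alert(f"{len(failures[user])} failed logins for {user} in {WINDOW_SECONDS}s")
            if any(keyword in line for keyword in HIGH_RISK_KEYWORDS):
                alert(f"High-risk event observed: {line.strip()}")

if __name__ == "__main__":
    monitor()
```

Even a simple filter like this makes the point: the value comes from deciding up front which events are “risky” and who reviews the alerts, not from the tooling itself.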

If you are familiar with the HIPAA Security Rule, you will notice that not all of the above controls are “Required” (as opposed to “Addressable”) under the Security Rule or in the proposed amendments to the rule under the HITECH Act. One may argue, however, that the above controls should be identified as required based on “risk analysis”, which of course is a Required implementation specification in the HIPAA Security Rule. In any event, CEs and BAs need to look beyond HIPAA compliance risk and focus on the risk to their business or brand reputation if a breach were to occur.

Hope this is useful! As always, we welcome your thoughts and comments.

RisknCompliance Services Note

We at RisknCompliance maintain an up-to-date, detailed database of current security threats and vulnerabilities. We leverage this knowledge to provide our clients with high-quality risk analysis.

Please contact us here if you would like to discuss your HIPAA security or privacy needs. We will be glad to talk to you about how we could be of assistance.

Proposed updates to HIPAA Security and Privacy Rules – What is new?

It was good to see the Office for Civil Rights (OCR) publish the long-awaited proposed updates to the HIPAA Security and Privacy Rules last Thursday. Note that OCR is the division of the Department of Health and Human Services (HHS) responsible for enforcing both the HIPAA Security and Privacy Rules.

I want to emphasize that these are proposed updates, issued as a Notice of Proposed Rulemaking (NPRM) in Federal Government parlance. There is a 60-day period for the public to submit comments on the NPRM after its publication yesterday in the Federal Register; comments are due by 09/13/2010.

The NPRM includes updates to the following HIPAA rules or areas:

1. Privacy Rule

2. Security Rule

3. Rules pertaining to Compliance and Investigations

4. Imposition of Civil Money Penalties, and

5. Procedures for Hearings (Enforcement Rule)

As noted in the NPRM, these updates are being made to “implement recent statutory amendments under the Health Information Technology for Economic and Clinical Health Act (HITECH) and to strengthen the privacy and security protection of health information, and to improve the workability and effectiveness of these HIPAA Rules”.

For those who don’t have much history on HIPAA, the current Privacy Rule was issued on December 28, 2000, and amended on August 14, 2002 while the Security Rule was issued on February 20, 2003. So, the proposed updates are long overdue in any case given that Information Security and Privacy risk landscapes have changed substantially since these rules were issued.

I’ll focus on updates to just the Security and Privacy Rules in this post. I’ll have two more posts over the next week or so, one with an in-depth coverage on what to expect from proposed updates to the Security Rule and the other one with a similar coverage of the Privacy Rule.

So, here are notable proposed updates:

1. Replace “individually identifiable health information” with “protected health information” to better reflect the scope of the Privacy and Security Rules.

2. The definition of “Business Associate” (BA) is being expanded to include the following new constituents:

a. Patient Safety Organizations (PSO)

b. Health Information Organizations (HIO)

c. E-Prescribing Gateways

d. Other Persons that facilitate PHI data transmissions for Covered Entities or other BAs and require routine access to such PHI

e. Vendors of Personal Health Records (like Google Health and Microsoft HealthVault)

f. Subcontractors – i.e., those persons that perform functions for or provide services to a BA of a Covered Entity (CE), other than in the capacity of a member of the business associate’s workforce.

3. As provided in section 13401 of the HITECH Act, the Security Rule’s administrative, physical, and technical safeguards requirements in §§ 164.308, 164.310, and 164.312, as well as its policies and procedures and documentation requirements in § 164.316, shall apply to BAs in the same manner as these requirements apply to CEs.

4. BAs will be civilly and criminally liable for violations of the provisions in #3 above.

5. Requirements of BA contracts (or other arrangements) between CEs and BAs will now apply to contracts (or other arrangements) between BAs and their subcontractors. It is important to note here that the burden of obtaining assurances (through contracts) from subcontractors regarding safety of PHI falls on the BA rather than the CE.

6. A subcontractor will be required to notify the BA of any breaches of unsecured PHI, and the BA in turn will be required to notify the CE. The CE then notifies the affected individuals, HHS, and, if applicable, the media, unless it has delegated those responsibilities to a BA.

7. BAs, like CEs, may not use or disclose PHI except as permitted or required by the Privacy Rule or their contracts with CEs or as required by law. If a CE and its BA have failed to enter into a BA contract or other arrangement, then the BA may use or disclose PHI only as necessary to perform its obligations for the CE.

8. Other proposed changes to the Privacy Rule include:

a. Certain material changes to the Notice of Privacy Practices (NPP) issued by a CE, or by a BA if so delegated by a CE through contract

b. A number of changes to the definition of “marketing” in the Privacy Rule at § 164.501

c. Provisions for individuals to request restriction of disclosure of certain PHI to a health plan under certain circumstances

d. New restrictions on sale of PHI by CEs and BAs

e. Strengthening the right of “access” to apply more uniformly to all protected health information maintained electronically in one or more designated record sets, regardless of whether the designated record set is an electronic health record

OCR has also proposed that the compliance deadline for all new and updated requirements in the Security and Privacy Rules will be 180 days after the final rule, which I believe can be expected in Q4 of this year. OCR is also proposing an additional one-year transition period for modifying certain BA agreements. The NPRM qualifies the one-year transition period as follows: “The additional transition period would be available to a covered entity or business associate if, prior to the publication date of the modified Rules, the covered entity or business associate had an existing contract or other written arrangement with a business associate or subcontractor, respectively, that complied with the prior provisions of the HIPAA Rules and such contract or arrangement was not renewed or modified between the effective date and the compliance date of the modifications to the Rules.”

Assuming these timelines don’t change in the final rule, all CEs and BAs need to plan for full compliance with the final rules by Q2 of 2011 and for revision of existing BA agreements no later than Q2 of 2012. I want to emphasize that current BAs (as defined in § 160.103 of 45 CFR Part 160) must already be in compliance with the current Privacy Rule and certain provisions of the current Security Rule as of February 18, 2010, as required by the HITECH Act. The new deadlines apply only to the new BAs (see 2.a–f above) and to all CEs and current BAs for any new or updated requirements in the final rules.

So, what are the highlights of this NPRM? We have known all along (from the HITECH Act) that BAs need to comply with the Privacy Rule and certain provisions of the Security Rule. The real highlight for me in this NPRM is the expansion of the definition of a BA: pretty much everyone (including subcontractors and others) who has custody of PHI will now have to comply with both the Security and Privacy Rules. The other highlight for me is the expected compliance deadlines discussed in the previous paragraph.

As I mentioned earlier in this post, I’ll provide in-depth coverage of the updates to the Security and Privacy Rules in two upcoming posts.

As always, we welcome your thoughts and comments. We would also, of course, like to hear from you if you need any consulting support to prepare for the anticipated HIPAA changes.

Logging for Effective SIEM and PCI DSS Compliance … UNIX, Network Devices and Databases

In one of my previous posts, I covered the importance of logging the “right” events for an effective Log Management or Security Information and Event Management (SIEM) deployment (see here or here for a discussion of the two technologies). That post also provided a suggested list of the Windows or Active Directory events you might want to log from a PCI DSS compliance standpoint.

Clearly, no amount of investment in your Log Management or SIEM solution is going to do much good unless you have been able to generate all the right logs to begin with … see a related discussion with recognized PCI expert and author Dr. Anton Chuvakin here.

I would like to extend the suggested list from the previous post to cover a few other systems here, specifically UNIX/Linux, network devices and databases. Note that these lists are only a starting point; work with the respective system specialists or administrators in your organization to generate these events.
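As an illustration of the kind of pre-filtering that helps get the “right” UNIX/Linux events into a Log Management or SIEM tool, here is a minimal sketch. The log path and event patterns below are assumptions for illustration only; the events actually worth capturing depend on your distribution and your PCI DSS scope, so confirm them with your administrators.

```python
# Illustrative sketch: pre-filter UNIX/Linux syslog lines for security-relevant
# events before forwarding them to a Log Management or SIEM tool.
# The regex patterns and log path are assumptions for illustration; the actual
# events worth capturing depend on your distribution and PCI DSS scope.
import re

SYSLOG_FILE = "/var/log/auth.log"   # hypothetical; often /var/log/secure on RHEL-based systems

EVENT_PATTERNS = {
    "failed_login":     re.compile(r"Failed password for (invalid user )?(?P<user>\S+)"),
    "successful_login": re.compile(r"Accepted (password|publickey) for (?P<user>\S+)"),
    "sudo_command":     re.compile(r"sudo: +(?P<user>\S+) : .*COMMAND=(?P<cmd>.+)"),
    "account_change":   re.compile(r"(useradd|usermod|userdel|groupadd|groupmod)"),
}

def extract_events(lines):
    """Yield (event_type, line) pairs for lines that look security-relevant."""
    for line in lines:
        for event_type, pattern in EVENT_PATTERNS.items():
            if pattern.search(line):
                yield event_type, line.rstrip()
                break

if __name__ == "__main__":
    with open(SYSLOG_FILE, "r") as handle:
        for event_type, line in extract_events(handle):
            # In practice, forward to the SIEM collector instead of printing.
            print(f"[{event_type}] {line}")
```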

UNIX/LINUX Logging for Effective SIEM and PCI DSS Compliance

 

Logging of Network Devices for Effective SIEM and PCI DSS Compliance

Database Logging for Effective SIEM and PCI DSS Compliance

Identity Theft Red Flags Rule – Is the 06/01/10 deadline looking good?

Frankly, I have lost count of how many times the FTC has moved the deadline already (see my related post from 2009). This time, however, the deadline is so close (about a week out at the time of this blog post) that I think the rule is finally going to take effect. Then again, I may be proved wrong… let us wait and see!

Aside from the rule taking effect, enforcement of the rule is going to be interesting to watch! Just this past Thursday, the AMA and two other physician groups filed suit contending that the rule shouldn’t apply to physicians. The rule had already been contested by lawyers and accountants.

The AMA’s suit comes after several back-and-forth discussions with the FTC over the last year or so. It looks like the AMA simply wasn’t convinced that the rule should apply to physicians, despite what I thought was a compelling argument by the FTC.

The AMA’s main contention has been that hospitals and physicians are already subject to the HIPAA Security and Privacy Rules and that the Red Flags Rule therefore shouldn’t apply to them. From my experience, however, I believe that most HIPAA security/privacy programs may not be effective against today’s identity theft tricksters. I would recommend that healthcare providers implement a risk-based, written Identity Theft Prevention Program to supplement the Administrative Requirements (§ 164.530) of the HIPAA Privacy Rule and the Administrative Safeguards (§ 164.308) of the HIPAA Security Rule.

I think the below quote from FTC’s letter sums it up well:

“The Rule is designed to prevent identity theft primarily by ensuring that organizations are alert to signs that an identity thief is using someone else’s identifying information fraudulently to obtain products or services, including services such as medical care. Thus, the Red Flags Rule generally complements rather than duplicates the HIPAA data security requirements.”