RisknCompliance Blog

Thoughts On Delivering Meaningful Outcomes in Security and Privacy

Category: Privacy

No, Security-Privacy Is Not A Hindrance To TeleHealth Adoption

Since I follow the telehealth space rather closely from a security/privacy perspective, I was drawn yesterday to this article titled “How Health Privacy Regulations Hinder Telehealth Adoption”. From my experience, I know telehealth has many obstacles to overcome, but I have never thought of security and privacy as prominent among them. I have certainly not thought of security-privacy as a hindrance to its adoption, as the article’s title says.

I read the article and then downloaded the original AHA paper (pdf) the article is based on.

It didn’t take me long to conclude that the title of the article was misplaced, in my opinion.

The AHA paper is nicely written and very objective in my view. It covers a number of areas that are true challenges to telehealth adoption, but it doesn’t portray security-privacy as a hindrance, contrary to the title of the article. Rather, it discusses specific security-privacy considerations for planning and implementation (see page 10 of the pdf). These considerations are no different from what one would need to address when deploying any new type of technology.

The considerations are the right things to do if you were to have any confidence in your ability to safeguard patient privacy and safety. Sure, there are some regulatory aspects (discussed on page 11) but these are no different from what we need for protecting Protected Health Information (PHI) in any form.

In conclusion, I think the author should perhaps consider changing the title, lest anyone think it adds to the FUD, of which there is no shortage in security, as we know.

Can we change the tune on Health Information Security and Privacy please?

Notice the title doesn’t say HIPAA Security and Privacy. Nor does it have any of the words – HITECH, Omnibus Rule, Meaningful Use etc. That is the point of this post.

Let us start with a question… I am sure many of you, like me, routinely visit the blogosphere and social media sites (especially LinkedIn group discussions) to get a pulse of the happenings in Information Security and Privacy. How often do you see posts or discussions around compliance, versus discussions focused squarely on risk – meaning risk to the organization or to the patients if their health information were compromised by one means or another?

Compliance (the risk of non-compliance) is only one of the risks and, in our view, should not be the primary driver for any Information Security or Privacy program. In fact, we often like to say that compliance should be a natural consequence of good risk management practices.

Having lived and watched Health Information Security and Privacy for nearly ten years, I am not surprised by this trend at all. Rather, I am looking forward to a day where we talk more about safeguarding the security and privacy of patient data and less about preparing for an OCR Audit. I am not suggesting that you shouldn’t worry about the latter. In fact, I’ll say that one will very likely not have to worry about the OCR or any audit for that matter if one’s real intent is to safeguard security and privacy of patient information. The real intent and objective are extremely important because they shape our thinking and how we go about executing our efforts.

I think  Security and Privacy programs in Healthcare can be a lot more effective (and likely even cost efficient) if they were to prioritize the objectives in the following order:

  • Patient Care and Safety – In most discussions on security, we tend to focus solely on the confidentiality of patient information and less so on its integrity and availability. When we think of all three security components in equal measure, it is easier to appreciate how a security incident or breach could impact patient care and safety. With the increasing adoption of EHRs, many healthcare providers likely rely solely on electronic versions of patient records in one or more EHRs. A security incident or breach could leave a patient record not “available” to physicians who may need to review the patient’s treatment history before providing urgent or emergency care. In another scenario, a breach could compromise the integrity of the patient record itself, in which case physicians could misdiagnose the patient’s condition and not provide the right treatment. Such cases were probably unlikely in a world of paper records, but they are not inconceivable in a world of electronic records. These issues can result from both malicious and unintentional circumstances.
  • Patient Privacy and Loss of Trust – The impact of a healthcare privacy breach doesn’t need much discussion. The impacted individuals can face severe and lasting financial and reputational harm which can make for a very painful experience. This in turn could result in the provider losing the valuable trust of its customers. 
  • Business Risk – Healthcare businesses could face tort or class action lawsuits arising from either of the two previous scenarios. And then, of course, there is the possibility of patients turning to competitors, especially where they have access to multiple providers. In effect, healthcare organizations could face substantial losses to their bottom lines, and given the increasingly competitive nature of the industry, this could put their business sustainability at risk.
  • Risks of Non-Compliance – Finally, of course, there is the risk of non-compliance with industry or government regulations. Non-compliance could leave healthcare organizations facing considerable civil and possibly criminal fines, as well as recurring expenses from having to comply with OCR resolution agreements, for example. In most instances, however, the impact of non-compliance fines and expenses is temporary, lasting a few years. The impact of the previous three risks, on the other hand, could be much more significant and longer lasting.

Until we think of security and privacy as being central to patient care/safety and the business/clinical culture, it is our view that many programs will likely falter and not deliver the intended results. The new era of digital healthcare requires healthcare organizations to think of security and privacy as a business or customer issue and not something that they need to address only for compliance purposes.

In a following post, we’ll specifically discuss some examples of why thinking compliance first will not get us very far in managing health information security risks.

Next time you do a Risk Assessment or Analysis, make sure you have Risk Intelligence on board

I was prompted to write this quick post this morning when I read this article.

I think it is a good example of what some (actually many, in my experience) risk management programs may be lacking: good-quality Risk Intelligence. In this particular case, I think the original article failed to emphasize that vulnerabilities by themselves may not mean much unless there is a good likelihood of their being exploited, resulting in real risk. We discussed the quality of risk assessments in some detail in a previous post.

A good understanding of information risks and their prioritization needs to be the first and arguably the most important step in any information risk management program. Yet, we often see risk assessment initiatives that are not done right or at the right level of quality. We think it is critical that a risk analysis or assessment is led by someone or performed by a team that has or does the following:

  1. Has a very good understanding of your environment from people, process and technology perspectives
  2. Has very good, up-to-date intelligence on current threats (both internal and external) and is able to define those threats objectively
  3. Is able to clearly list and define the vulnerabilities in your environment (this will often require process or technology specialists to do well)
  4. Is able to make an unbiased, objective determination of the likelihood that the vulnerabilities (from Step 3) can be exploited by one or more threats (from Step 2)
  5. Has a very good understanding of the impact to the business if each vulnerability were exploited by one or more threats – impact is largely a function of the organization’s characteristics, including various business and technical factors, so it is important to involve your relevant business and technology Subject Matter Experts
  6. Based on the likelihoods (Step 4) and impacts (Step 5), estimates risks and ranks them by magnitude
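
The scoring and ranking in steps 4-6 can be sketched in a few lines. This is a hypothetical illustration only – the threat/vulnerability pairs and the 1-5 scales below are made-up examples, not guidance on how to score any real environment.

```python
# Minimal sketch of steps 4-6: score risks and rank by magnitude.
# All findings and scores below are hypothetical examples.

# Each entry: (vulnerability, threat, likelihood 1-5, impact 1-5)
findings = [
    ("Unpatched web server", "External attacker", 4, 4),
    ("Shared admin passwords", "Malicious insider", 3, 5),
    ("Unencrypted backup tapes", "Physical theft", 2, 5),
    ("Verbose error messages", "External attacker", 3, 2),
]

# Step 6: estimate risk as likelihood x impact, then rank descending
risks = sorted(
    ((vuln, threat, likelihood * impact)
     for vuln, threat, likelihood, impact in findings),
    key=lambda r: r[2],
    reverse=True,
)

for vuln, threat, score in risks:
    print(f"{score:2d}  {vuln} / {threat}")
```

A simple likelihood-times-impact product is only one of many scoring schemes; the larger point is that the ranking is only as good as the likelihood and impact judgments feeding it – which is exactly where Risk Intelligence comes in.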

We just can’t stress the importance of steps 1-5 enough. We think it takes “Risk Intelligence” to do these steps well. Without good Risk Intelligence on your team, you may well be wasting precious time, money and resources on your risk assessments.  More importantly, you may not be protecting your business to the extent that you should, with the same budget and resources.

======================================

Important Disclaimer

The guidance and content we provide in our blogs including this one is based on our experience and understanding of best practices. Readers must always exercise due diligence and obtain professional advice before applying the guidance within their environments.

You don’t know what you don’t know – Do we have a "detection" problem with the healthcare data breach numbers?

Like some of you perhaps, I have been reading a few recent articles on Healthcare data breaches, especially the one from Dark Reading and a detailed analysis of the 2010-to-date breaches from HITRUST Alliance.

What stood out for me in these articles is something they don’t necessarily highlight: the very low number of breaches involving failures of technology, people or process controls, as opposed to physical losses.

These articles focused on the 119 or so breaches that have been reported to the Department of Health and Human Services (HHS) or made public to date in 2010. From the HITRUST Alliance analysis, it is clear that an overwhelming majority of the breaches resulted from physical loss/theft of paper or electronic media, laptops etc. Only two breaches resulted from hacking incidents.

I then went back and did a little analysis of my own of the 2010 data breach incidents covered in the Identity Theft Resource Center report available here. I came up with the following numbers for breaches other than those involving physical loss, theft, burglary, improper disposal etc.:

  • Malware infection – 1
  • Unauthorized access to file share – 1
  • Database misconfiguration or vulnerability – 2
  • Website vulnerability – 1
  • Improper access or misuse by internal personnel – 6

As you can see, these account for less than 10% of the healthcare breaches known or reported so far this year.  Contrast this with the findings in 2010 Verizon Data Breach Investigation Report which attributes 38% of breaches to malware, 40% to hacking and 48% to misuse. It is pertinent to note that the Verizon report focused on 141 confirmed breaches from 2009 covering  a variety of industries,  but I think it is still good for a high level comparison to determine if we may be missing something in the healthcare breach data.
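
For transparency, here is the simple arithmetic behind the “less than 10%” figure, using the counts listed above:

```python
# Share of 2010 healthcare breaches NOT involving physical
# loss/theft, from the counts in the list above.

non_physical = {
    "Malware infection": 1,
    "Unauthorized access to file share": 1,
    "Database misconfiguration or vulnerability": 2,
    "Website vulnerability": 1,
    "Improper access or misuse by internal personnel": 6,
}

total_breaches = 119  # breaches reported to HHS or made public in 2010
share = sum(non_physical.values()) / total_breaches

print(f"{sum(non_physical.values())} of {total_breaches} breaches "
      f"({share:.0%}) did not involve physical loss or theft")
```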

The comparison seems to suggest that the healthcare industry probably has much stronger safeguards against malware, hacking, improper logical access etc. I know from my own experience working with healthcare entities that this is not necessarily the case. For further corroboration, I reviewed two Ponemon Institute survey reports – Electronic Health Information at Risk: A Study of IT Practitioners and Are You Ready for HITECH? – A benchmark study of healthcare covered entities & business associates, both from Q4 2009. The following sample numbers from these reports further confirm that the state of Information Security and Privacy among HIPAA Covered Entities (CEs) and Business Associates (BAs) is far from perfect:

Electronic Health Information at Risk: A Study of IT Practitioners

(% of respondents answering “Yes” to each statement)

  1. My organization’s senior management does not view privacy and data security as a top priority – 70%
  2. My organization does not have ample resources to ensure privacy and data security requirements are met – 61%
  3. My organization does not have adequate policies and procedures to protect health information – 54%
  4. My organization does not take appropriate steps to comply with the requirements of HIPAA and other related healthcare regulations – 53%

Are You Ready for HITECH? – A benchmark study of healthcare covered entities & business associates

HIPAA compliance requirements that are not formally implemented (% of respondents answering “Yes”):

  1. Risk-based assessment of PHI handling practices – 49%
  2. Access governance and an access management policy – 47%
  3. Staff training – 47%
  4. Detailed risk analysis – 45%

All this leads me to suspect that some HIPAA CEs and BAs may not be detecting potential breaches. If you study the healthcare breaches reported so far, almost all of them came to light through physical losses of computers or media (which are easy to know and detect) or through reporting by third parties (victims, law enforcement, someone finding improperly disposed PHI paper records in trash bins etc.). I don’t know of any healthcare data breach this year that was detected through proactive monitoring of information systems.

As I covered in a related post on breach reports and what they tell us, I would recommend that CEs and BAs focus on certain key controls and related activities (see table below) in order to improve their breach prevention and detection capabilities:

Key Controls and Recommended Activities:

  1. Secure Configuration and Lockdown
     · Review the configuration of information systems (network devices, servers, applications, databases etc.) periodically and ensure that they are locked down from a security configuration standpoint
  2. Web Application Security
     · Scan web applications periodically for OWASP Top 10 vulnerabilities and fix any discovered vulnerabilities
     · For new applications under development, perform code reviews and/or vulnerability scans to fix any security vulnerabilities before the applications are put into production use (studies show it is far more cost effective to fix vulnerabilities before production than after)
     · Use Web Application Firewalls as appropriate
  3. Strong Access Credentials
     · Configure PHI systems and applications to enforce a strong password policy (password complexity, periodic password changes etc.)
     · Implement multi-factor authentication on PHI systems and applications wherever possible
     (Note: According to the 2010 Verizon Data Breach Investigations Report, stolen access credentials led to the largest number of breaches from hacking incidents)
  4. Access Assurance or Governance
     · Conduct access certifications periodically, preferably at least every quarter, for PHI systems and applications
     · Review access privileges within PHI systems and applications to ensure all access conforms to the “Least Privilege” principle – no user, application or service should have more privileges than its job function or role requires
     · If any excess privileges are found, remediate them promptly
     · Revoke access to PHI systems and applications promptly when a person leaves the organization or no longer requires access due to a change in job role
  5. Logging, Monitoring and Reporting
     · Identify “risky” events within PHI systems
     · Configure the systems to generate logs for the identified events
     · Tamper-proof the logs
     · Implement appropriate technologies and/or processes for monitoring the events (refer to our related posts here and here for examples)
     · Identify and monitor high-risk events through near-real-time alerts
     · Assign responsibility for daily review of log reports and alerts to specific personnel
  6. Encryption (Data at Rest, Media) and Physical Security of Media
     · Maintain an inventory of locations and systems where PHI exists
     · Implement suitable encryption of PHI on laptops and removable media
     · Implement appropriate physical security safeguards to prevent theft of devices or systems containing PHI
  7. Security Incident Response
     · Implement and operationalize an effective Security Incident Response program, including clear assignment of responsibilities, response steps/workflows etc.
     · Test the Incident Response process periodically as required
  8. Security Awareness and Training
     · Implement a formal security awareness and training program so the workforce is aware of their responsibilities, security/privacy best practices and actions to take in the event of suspected incidents
     · Require personnel to go through security awareness and/or training periodically as appropriate
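
As a small illustration of the “Logging, Monitoring and Reporting” control above, the triage step – separating pre-identified high-risk events for near-real-time alerting from events left for routine daily review – can be sketched as follows. The event names and log format are hypothetical, not taken from any particular product:

```python
# Sketch of log-event triage: route pre-identified "risky" events
# to near-real-time alerts, everything else to daily review.
# Event names and log structure are hypothetical examples.

HIGH_RISK_EVENTS = {
    "failed_login_burst",
    "phi_bulk_export",
    "audit_log_cleared",
    "privilege_escalation",
}

def triage(log_entries):
    """Split log entries into alerts (high risk) and routine review."""
    alerts, routine = [], []
    for entry in log_entries:
        (alerts if entry["event"] in HIGH_RISK_EVENTS else routine).append(entry)
    return alerts, routine

sample_log = [
    {"event": "login_success", "user": "jdoe"},
    {"event": "phi_bulk_export", "user": "svc_report"},
    {"event": "audit_log_cleared", "user": "admin2"},
]

alerts, routine = triage(sample_log)
for entry in alerts:
    print("ALERT:", entry["event"], entry["user"])
```

In practice this logic would live in a SIEM or log management tool rather than custom code; the sketch only shows the idea of deciding alert-versus-review up front, based on the risky events you identified.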

If you are familiar with the HIPAA Security Rule, you will notice that not all of the above controls are “Required” (as opposed to “Addressable”) under the HIPAA Security Rule or in the proposed amendments to the rule under the HITECH Act. One may argue, however, that the above controls must be identified as required based on “risk analysis”, which of course is a required implementation specification in the HIPAA Security Rule. In any event, CEs and BAs need to look beyond the HIPAA compliance risk and focus on the risk to their business or brand reputation if a breach were to occur.

Hope this is useful! As always, we welcome your thoughts and comments.

RisknCompliance Services Note

We at RisknCompliance maintain an up-to-date database of current security threats and vulnerabilities at a detailed level. We are able to leverage this knowledge to provide our clients with high-quality risk analyses.

Please contact us here if you would like to discuss your HIPAA security or privacy needs. We will be glad to talk to you about how we could be of assistance.

Proposed updates to HIPAA Security and Privacy Rules – What is new?

It was good to see the Office for Civil Rights (OCR) publish the long-awaited proposed updates to the HIPAA Security and Privacy Rules Thursday last week. Note that OCR is the division of the Department of Health and Human Services (HHS) responsible for enforcing both the HIPAA Security and Privacy Rules.

I want to emphasize that these are proposed updates, also called a Notice of Proposed Rulemaking (NPRM) in federal government parlance. There is a 60-day period for the public to submit comments on the NPRM following its publication in the Federal Register yesterday; the comments are due by 09/13/2010.

The NPRM includes updates to the following HIPAA rules or areas:

1. Privacy Rule
2. Security Rule
3. Rules pertaining to Compliance and Investigations
4. Imposition of Civil Money Penalties, and
5. Procedures for Hearings (Enforcement Rule)

As noted in the NPRM, these updates are being made to “implement recent statutory amendments under the Health Information Technology for Economic and Clinical Health Act (HITECH) and to strengthen the privacy and security protection of health information, and to improve the workability and effectiveness of these HIPAA Rules”.

For those who don’t have much history on HIPAA, the current Privacy Rule was issued on December 28, 2000, and amended on August 14, 2002 while the Security Rule was issued on February 20, 2003. So, the proposed updates are long overdue in any case given that Information Security and Privacy risk landscapes have changed substantially since these rules were issued.

I’ll focus on updates to just the Security and Privacy Rules in this post. I’ll have two more posts over the next week or so, one with an in-depth coverage on what to expect from proposed updates to the Security Rule and the other one with a similar coverage of the Privacy Rule.

So, here are notable proposed updates:

1. Replace “individually identifiable health information” with “protected health information” to better reflect the scope of the Privacy and Security Rules.

2. Definition of “Business Associate”(BA) being expanded to include the following new constituents:

a. Patient Safety Organizations (PSO)

b. Health Information Organizations (HIO)

c. E-Prescribing Gateways

d. Other Persons that facilitate PHI data transmissions for Covered Entities or other BAs and require routine access to such PHI

e. Vendors of Personal Health Records (like Google Health and Microsoft HealthVault)

f. Subcontractors of a Covered Entity (CE) – i.e., those persons that perform functions for or provide services to a BA, other than in the capacity as a member of the business associate’s workforce.

3. As provided in section 13401 of the HITECH Act, the Security Rule’s administrative, physical, and technical safeguards requirements in §§ 164.308, 164.310, and 164.312, as well as its policies and procedures and documentation requirements in § 164.316, shall apply to BAs in the same manner as these requirements apply to CEs.

4. BAs shall be civilly and criminally liable for penalties for violations of the provisions in #3 above.

5. Requirements of BA contracts (or other arrangements) between CEs and BAs will now apply to contracts (or other arrangements) between BAs and their subcontractors. It is important to note here that the burden of obtaining assurances (through contracts) from subcontractors regarding safety of PHI falls on the BA rather than the CE.

6. A subcontractor will be required to notify the BA of any breaches of unsecured PHI; the BA in turn would be required to notify the CE. The CE then notifies the affected individuals, HHS and, if applicable, the media of the breach, unless it has delegated such responsibilities to a BA.

7. BAs, like CEs, may not use or disclose PHI except as permitted or required by the Privacy Rule or their contracts with CEs or as required by law. If a CE and its BA have failed to enter into a BA contract or other arrangement, then the BA may use or disclose PHI only as necessary to perform its obligations for the CE.

8. Other proposed changes to the Privacy Rule include:

a. Certain material changes to the Notice of Privacy Practices (NPP) issued by a CE or by a BA, if delegated so by a CE through contract

b. A number of changes to the definition of “marketing” in the Privacy Rule at § 164.501

c. Provisions for individuals to request restriction of disclosure of certain PHI to a health plan under certain circumstances

d. New restrictions on sale of PHI by CEs and BAs

e. Strengthen the right of “access” more uniformly to cover all protected health information maintained in one or more designated record sets electronically, regardless of whether the designated record set is an electronic health record

OCR has also proposed that the compliance deadline for all new and updated requirements in the Security and Privacy Rules will be 180 days after the final update, which I believe can be expected in Q4 this year. OCR is also proposing an additional one-year transition period to modify certain BA agreements. The NPRM further qualifies the one-year transition period as follows: “The additional transition period would be available to a covered entity or business associate if, prior to the publication date of the modified Rules, the covered entity or business associate had an existing contract or other written arrangement with a business associate or subcontractor, respectively, that complied with the prior provisions of the HIPAA Rules and such contract or arrangement was not renewed or modified between the effective date and the compliance date of the modifications to the Rules.”

Assuming these timelines don’t change in the final rule, all CEs and BAs need to plan for full compliance with the final rules by Q2 of 2011 and for revision of existing BA agreements no later than Q2 of 2012. I want to emphasize here that current BAs (as defined in § 160.103 of 45 CFR Part 160) must already be in compliance with the current Privacy Rule and certain provisions of the current Security Rule as of February 18, 2010, as required by the HITECH Act. The new deadlines will apply only to the new BAs (see 2. a-f above) and to all CEs and current BAs for any new or updated requirements in the final rules.

So, what are the highlights in this NPRM? We have known all along (from the HITECH Act) that the BAs need to comply with the Privacy Rule and certain provisions of the Security Rule. The real highlight to me in this NPRM is the expansion of the definition of a BA. Pretty much everyone (including all subcontractors and others) that has the custody of PHI will now have to comply with both the Security and Privacy Rules. Another highlight to me is the expected compliance deadlines as discussed in the previous paragraph.

As I mentioned earlier in this post, I’ll provide an in-depth coverage of the updates to Security and Privacy Rules in two of my upcoming posts.

As always, we welcome your thoughts and comments. We would also obviously like to hear if you need any consulting support in order to prepare for the anticipated HIPAA changes.

FTC delays enforcement of Identity Theft Red Flags Rule to 12/31/10

FTC announced earlier this morning that it is delaying enforcement of the Red Flags Rule to 12/31/10 pending expected legislation by Congress that would affect the scope of entities covered by the Rule. As I wrote in my blog just a few days ago, organizations representing physicians, lawyers and  accountants have already contested that the Rule shouldn’t apply to them. I wrote that the previous deadline of 06/01/10 was probably too close for FTC to move it again. I guess it is never too close!

Let us wait and watch for next steps from the Congress now!

Identity Theft Red Flags Rule – Is the 06/01/10 deadline looking good?

Frankly, I have lost count of how many times FTC has moved the deadline already (see my related post from 2009). This time, however, the deadline is so close (about a week out at the time of this blog post) that I think the rule is finally going to take effect. Again, I may be proved wrong… let us wait and see!

Aside from the rule taking effect, enforcement of the rule is going to be interesting to watch! Just this past Thursday,  AMA and two other physician groups filed a suit contending that the rule shouldn’t apply to physicians.  The rule had already been contested by Lawyers and Accountants.

AMA’s suit comes after several rounds of back-and-forth with FTC over the last year or so. The AMA clearly wasn’t convinced that the rule should apply to physicians, despite what I thought was a compelling argument by FTC.

AMA’s main contention has been that hospitals and physicians are already subject to HIPAA Security and Privacy Rules and therefore the Red Flags Rule shouldn’t apply to them. From my experience, however, I believe that most HIPAA Security/Privacy Programs may not be effective against Identity Theft tricksters of today. I would recommend that health care providers implement a risk-based, written Identity Theft Prevention Program to supplement the Administrative Requirements (§ 164.530) of the HIPAA Privacy Rule and Administrative Safeguards (§ 164.308) of the HIPAA Security Rule.

I think the below quote from FTC’s letter sums it up well:

“The Rule is designed to prevent identity theft primarily by ensuring that organizations are alert to signs that an identity thief is using someone else’s identifying information fraudulently to obtain products or services, including services such as medical care. Thus, the Red Flags Rule generally complements rather than duplicates the HIPAA data security requirements.”

Privacy Statements, Notices, Policies …

How often do we care to read the privacy statements we receive from any number of sources these days? As a consumer, I must admit I’m not a regular reader either. As a Privacy Professional, however, I am always interested (and sometimes fascinated) in reading them.

Take the following extract of the Web Privacy Statement of a very prominent institution, for example:

“Thanks for visiting the XXXXXXX website and reviewing our privacy policy! Our privacy policy is plain and simple. We collect NO personal information like names or addresses when you visit our website. If you choose to provide that information to us it is only used to fulfill your request for information. We do collect some technical information when you visit to make your visit seamless. The section below explains how we handle and collect technical information when you visit our website…”

This is actually a website that lets you e-file, in addition to providing a whole lot of information on its services. As part of e-filing, however, it does collect all kinds of personal information, including Date of Birth, Social Security Number, Credit Card information etc. In this particular case, I suspect they forgot to update their Web Privacy Statement when they introduced the e-filing feature.

Consider another example (and certainly a better one) of a Web Privacy Statement. This one is from Amazon.com.

I am sure you can see the difference and what a good privacy statement or notice should look like.

Privacy Policies, Statements or Notices are often the face of an organization’s Privacy Program. If a privacy policy lacks detail, it is highly likely that the organization hasn’t gotten its act together on privacy and data protection. A good privacy policy should address most, if not all, of the privacy principles at a reasonable level of detail.

Finally, as consumers, it is always a good practice to read through the privacy statements we receive from time to time, whether by mail or while registering on the Internet for any number of reasons. Given the growing incidence of data breaches, theft and loss, we would be better advised to be safe than sorry.