>> It's good that I'm going last because I get the benefit of working with all of you.
I work with policymakers.
I've worked with hospitals and clinicians.
And I do work both in the ambulatory space --
so working in very small physician practices all the way
up to very large hospitals and health systems.
And I think, you know, one thing that stands out to me as I'm listening to the other comments is
that we're talking about privacy and we're talking about security,
and those are two different things.
And a lot of times we actually commingle those terms.
And I think it's important to just sort of get an understanding,
a baseline, that privacy is all about, you know, what we're trying to protect,
and security is how we do it.
So from the clinician's point of view, information needs to flow for all kinds of,
you know, treatment purposes, and in a billing office it needs to flow for business purposes.
And I would argue that nobody who is setting policy wants to impede any work flows
that actually need to occur in order to achieve business purposes.
But when you start tying in security regs and you start looking at the technology and looking
at how you have to actually implement certain things to actually make that happen,
that's where we come up with problems because sometimes, you know,
you can put the best security controls in place but then it breaks something.
It breaks a process.
It breaks a work flow.
And so a lot of what I end up doing with clients is advising them on how to find that sort
of middle ground, how to find the reasonable approach to balancing what they're trying
to achieve from a regulatory compliance perspective and what they need to be able to do
to show that they're truly maintaining a secure environment.
And, you know, we secure data and information in these systems not only to protect privacy
but also to make sure it's reliable and to make sure it's accurate.
So security has three real components.
It has the privacy piece but the accuracy and the integrity of that data is just as important
for patient safety issues and all kinds of other reasons.
So we can't forget that piece too.
So if you've got a virus that infects your network
and all of a sudden it corrupts your data, that's to me way worse than finding out,
you know, that *** Cheney was at GW again last week.
So, you know, those are the kinds of concerns that security professionals have
when they are trying to "lock down" the system.
I think, you know, it's really achieving that balance.
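The integrity concern above, detecting that data has been silently corrupted, can be sketched with a simple checksum. This is just an illustration of the general technique; the record contents are hypothetical and real systems use much richer integrity controls.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest used to detect tampering or corruption."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical record contents, just for illustration.
stored = b"patient: Jane Doe; allergy: penicillin"
baseline = fingerprint(stored)

# Later, re-hash and compare: an unchanged record matches the baseline...
assert fingerprint(stored) == baseline

# ...while a corrupted copy does not, so the damage is detectable.
corrupted = b"patient: Jane Doe; allergy: none"
assert fingerprint(corrupted) != baseline
```

The point of the sketch is that corruption from something like a virus is only catastrophic if it goes unnoticed; an integrity check makes it detectable.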
The other, you know, point that I wanted to touch on is that we do need
to consider the culture, the environment that we have in this country.
We're talking about protecting information that is truly sensitive.
There are negative consequences to data breaches and so we, you know,
have to think about what patients do want, what they expect from us.
Just like when you go and do your online banking, you know, what do you expect the bank
to be doing to protect your financial data?
Well, we as citizens and people who all use the health care system, I'm sure,
have different ideas about what we expect of that information and how we expect
it to be protected within the health care setting.
And obviously, you know, if I'm seeing a physician I want that doctor
to know everything he or she needs to know about me.
But I may not want, you know, nurses who are unrelated, who are not treating me,
to have access, especially if, you know, there's no reason for them to have access.
And I don't necessarily want other administrative staff to have access if they don't have a reason to.
And I think that comes into the reasonableness approach, and that's sort of the tack
that I take: you know, let's look at what's reasonable.
And I think that's what policy makers have taken when they talk about things
like the Minimum Necessary Rule which is, you know, getting the minimum amount
of information necessary in order to do your job.
So not limiting it so you can't do your job but making sure
that it's the appropriate level of information.
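The minimum-necessary idea can be sketched as a simple role-based filter over a record. The roles, field names, and mapping below are hypothetical illustrations, not any specific EHR's access model:

```python
# A minimal sketch of a "minimum necessary" access check:
# each role sees only the fields needed to do its job.
RECORD_FIELDS = {"demographics", "medications", "psych_notes", "billing_codes"}

ROLE_VIEW = {
    "treating_physician": RECORD_FIELDS,                 # full clinical view
    "billing_clerk": {"demographics", "billing_codes"},  # no clinical detail
    "front_desk": {"demographics"},                      # scheduling only
}

def visible_fields(role: str, record: dict) -> dict:
    """Return only the parts of the record this role may see."""
    allowed = ROLE_VIEW.get(role, set())  # unknown roles see nothing
    return {k: v for k, v in record.items() if k in allowed}

record = {
    "demographics": "Jane Doe, 1980-01-01",
    "medications": ["lisinopril"],
    "psych_notes": "...",
    "billing_codes": ["99213"],
}

print(sorted(visible_fields("billing_clerk", record)))  # → ['billing_codes', 'demographics']
```

Note that the filter doesn't block anyone from doing their job; the treating physician still sees everything, which matches the "not limiting it so you can't do your job" framing above.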
And I think it's easy when there are common problems or things that are not sensitive
in nature, and people are willing to give up their "privacy"
because it doesn't really impact them.
But if you have situations where you've got psychiatric illness or you've got abuse going
on, you know, those are obvious conditions
that would require a little bit more sensitivity.
But we -- Sumit might see this operationally -- I mean, we have issues with employees.
You know, we all know the policies that we put in place
for, like, VIPs -- well, what about employees?
Do I want necessarily everybody in my office knowing that I was
in the emergency room for some bizarre situation?
Not necessarily.
And so, you know, we come up with, you know, policies and things all the time.
Actually, on the way up here I was on a call with a current client
with whom I've probably spent 20 or more hours in conversation,
just around how they're going to implement certain break-the-glass technology,
sequestering records, making parts of the note confidential
versus the entire patient confidential.
And even when the technology application can do something and it has
that functionality built in, you still have to create the policy internally and operationally
and decide how you're going to apply that universally
so that it can actually work with your work flow.
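The break-the-glass pattern mentioned above can be sketched roughly: access to a sequestered record is denied by default, but an explicit emergency override grants it while always leaving an audit trail. Everything here (function names, the log structure) is a hypothetical illustration, not any vendor's actual API:

```python
import datetime

AUDIT_LOG = []  # every break-the-glass event is recorded for later review

def access_sequestered_record(user: str, patient_id: str,
                              break_glass: bool = False,
                              reason: str = "") -> bool:
    """Sequestered records are hidden by default; an explicit
    break-the-glass override grants access but is always audited."""
    if not break_glass:
        return False  # normal path: the sequestered record stays hidden
    AUDIT_LOG.append({
        "user": user,
        "patient": patient_id,
        "reason": reason,
        "time": datetime.datetime.now().isoformat(),
    })
    return True

# Routine access is refused; an emergency override succeeds but leaves a trail.
assert access_sequestered_record("nurse_a", "pt-123") is False
assert access_sequestered_record("nurse_a", "pt-123",
                                 break_glass=True,
                                 reason="ER coverage") is True
print(len(AUDIT_LOG))  # → 1
```

The policy questions the speaker describes, who may break the glass, for which patients, and who reviews the log afterward, are exactly the parts the technology cannot decide for you.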
So it is really challenging.
I think it's not a simple issue.
I deal with these issues every single week with clients.
So it keeps us on our toes.