Data Access Protection: A Critical Part of Security

Listen to learn about why it’s critical to know who is accessing patient data and how to know who is accessing critical data.

SecurityMetrics Podcast | 69


Early detection of unauthorized access to electronic Protected Health Information (ePHI) is critical to preventing breaches and meeting HIPAA requirements. The co-founders of SPHER, Inc., Raymond Ribble (CEO) and Robert Pruter (Chief Revenue Officer), sit down with Host and Principal Security Analyst Jen Stone (MCIS, CISSP, CISA, QSA) to discuss:

  • Why it’s critical to know who is accessing patient data
  • How to know who is accessing critical data
  • Real-world stories of unauthorized access and what to do about it

Resources:

Download our Guide to PCI Compliance! - https://www.securitymetrics.com/lp/pci/pci-guide

Download our Guide to HIPAA Compliance! - https://www.securitymetrics.com/lp/hipaa/hipaa-guide

[Disclaimer] Before implementing any policies or procedures you hear about on this or any other episodes, make sure to talk to your legal department, IT department, and any other department assisting with your data security and compliance efforts.

Transcript of Data Access Protection: A Critical Part of Security

Hi, and welcome back to the SecurityMetrics podcast. My name is Jen Stone. I'm one of the principal security analysts here at SecurityMetrics. I'm very excited about the conversation I'm going to have today because, if you're in the healthcare industry specifically, you're going to want to know about it.


But it applies to security in a lot of other places, so stay tuned, because you'll figure out how it applies to you. Welcome to the guys from SPHER. Oh, I might have mentioned, you guys have seen the Help Me With HIPAA podcast.


Donna Grindle's been on this show. I told you about the PriSec boot camp, which I went and helped participate in. And the gentlemen from SPHER were there as well. It was a great training, exercise, episode.


I don't know what you call it. Like, it was not really a conference because it was three and a half days of really kind of intensive training. Right?


Boot camp. Right?


Yeah. It was a boot camp. And if you weren't there, you missed out. Plan on it next year. But let's focus on you guys for a minute. Please introduce yourselves to, the audience so they know who you are and where you come from.


Alright. I guess I'll go first. Hi, everybody. My name is Ray Ribble. I am one of the cofounders of SPHER, Inc.


A little bit of my background, since we're gonna talk a little bit about SPHER, Inc. later: I started my career in the aeronautical engineering industry as a designer of flight control systems for fighter jets.


Oh, cool.


How's that for kinda interesting? And it actually plays into SPHER, by the way.


And then I spent a large portion of my career living overseas, doing business in the technology industry across Asia, before coming back to the United States and diving head first into the health care industry. So I've been involved in technology for well over thirty years.


Wow. Rob.


Everybody, Rob Pruter, from Southern California. Born and raised. One of the few.


Very proud of that, by the way.


I'm an Aries. That's also important. Okay.


Been in health care IT for about twenty something years.


Been in identity management and privacy specifically for the last ten years.


It's an interesting story how I met Ray. I was at a conference probably seven, eight years ago where the headline speaker was talking about users and systems and breaches of PHI, and the fact that seventy percent of theft of patient data was by insiders. And as he looked out at the crowd, he said, nobody in this room knows how to capture it, identify it, or even know it's happening.


And there's no technology out there today. And I was sitting next to somebody who was actually an old friend of Ray's. At that moment, I said, well, I think Ray and the team have something going on there; I should probably get in contact. So that's how we met, at a conference seven, eight years ago. And then we brought SPHER into the marketplace; about two thousand sixteen was when we started to actually formulate a plan to get help to everybody.


Starbucks made a lot of money off us as we courted each other, because we kept meeting at local Starbucks locations.


Well, that's awesome.


I think the reason that I was so interested in having you on to talk is, in health care, you have to know who accessed specific data and when they did it. And the company that you've put together really helps with that. But I think there are a lot of other places where knowing who's accessing what data, and whether they should be, is less sophisticated, or maybe less focused on, than in the health care industry. And even in the health care industry, a lot of people are not doing this super well, judging from the assessments that I do, a lot of risk assessments in the health care space.


And this topic specifically is not well understood. And certainly, the tooling behind it is not as great as what I saw you put together, and that's why I wanted to talk about it. Well, maybe let's look at the problem set. Rob, you touched on it briefly, but what are people trying to do in terms of understanding who's accessing data, generally, but in the health care space specifically?


I think one of the things that people underestimate is the value of medical data and patient data.


Many people understand identity theft and theft of credit card information and that sort of thing, but you can cancel that. You can cancel credit cards. You can cancel bank accounts. You can, you know, rejigger your financial life and kinda start over. You can't cancel your patient medical record.


Right.


It's what makes it so valuable. So when people are looking at why things are happening, why there's groups and bad actors accessing patient data, it really emanates from that because there's so many different things they can do with it.


And so as we were getting into this originally, I think there was an underappreciation of why there are folks out there looking to get into these systems and get the data.


And that's kinda where we came in originally. When we were first talking about this, we spent most of our time educating people: not just on what was happening, but on why it was happening.


Right.


Even today, I don't think people understand. Like, why do people want my medical data? Why do they want my patient record?


Yeah.


I think that's how we really got into this and why we have a passion for protecting patient data.


When people ask us what we do, that's what we say. We we help protect patient data.


And people ask me that all the time. Well, why would anyone want my patient record? What are they gonna do with it? How do they use it in bad ways? Why is it even valuable to them? Can you speak to that maybe a little bit, Raymond?


Yeah. Yeah. We've got a whole slide that talks about that.


It starts with, and this is something that happened to me personally, they can use that information to file your tax returns Oh. As an example. They can use that information to create phony IDs and go sell them in Echo Park in the middle of Los Angeles.


They can use that data, as Rob just indicated a moment ago, to create credit cards Right.


To do loans, to set up secondary identities for an individual or a group of individuals, and then go online and make purchases in the name of that person.


Right.


And then a subset of that, and I could just go on and on about this, Rob and I talk about it a lot with people we speak with, is that there are two immensely vulnerable groups within that. Why do they use those medical records? The under eighteen and the over sixty five.


Oh, interesting. Can you go into that a little more?


Just briefly. We don't wanna beat everybody up on it. But what's interesting is now we know kids today are very tech savvy. Right?


And tech dangerous as well. Because they're using it everywhere all the time with no sense of security whatsoever.


Mhmm.


Right? They'll log in to anything and take a look at it. But they have an identity nonetheless. So if I can steal that individual's identity, get a hold of their Social Security number, and go and create an entity based on that, I can do a lot of purchases. So what we're seeing is that kids in the United States start to apply to universities at seventeen years old, and one of the first things they'll do is look at financial aid.


Well, they find out that they're not qualified for financial aid because there's already a whole bunch of loans out against their identity Yep.


That exist in the market that they should be paying on. Yeah. And they didn't even know the debt was there. So that would be one example.


The flip side of that is that our population in the United States is getting older, older than sixty five years old.


They're using less technology.


Mhmm.


And they're not aware of what's happening with their identity if somebody has accessed it inappropriately and is using that. And those people maybe aren't paying as much attention to what's happening to themselves online to use an expression.


And guess what? It's being used in a bad way. So they're vulnerable because they're not looking or they're not paying attention, and somebody else is. They've got a hold of that record. And like Rob said, your medical record lasts forever. Yeah. I can take two years with the government and get a new Social Security number.


It's a pain in the butt.


I can do it. I can't get a new medical record.


Right.


Right? It exists. And the bad guys, let's use the dark web as an example. Well, they can use that record again and again and again. They can sell it to groups in the United States, in Canada, in Mexico. They can sell it to people over in Europe, so on and so on. Is it okay if I give you a little bit of background with why we're here?


Oh, yes. Please. Tell me about why you actually created it, because we kinda glossed over that, but I would love to hear why the company exists in the first place. It sounds like the company itself exists because you guys went, oh, wait, this is an amazing idea. But tell me a little more about that.


It'd be cool if that was actually what it was.


Oh, no.


It was all about the coffee? Is that what you're saying? It was just you wanted to continue drinking coffee?


It was great. Let's be honest. Yeah. Starbucks made a lot of money off of Ray and Rob.


That that's for sure.


But the best part of that story is, Rob and I, when we first met, we never talked about where we lived, and it ended up that we lived around the corner from each other Oh, wow.


And didn't know each other. In the same community, literally, what, less than a mile from each other?


Oh, like like a couple hundred yards.


That's wild.


We never met. We never met. So, you know, life is life is funny how it brings people together.


I guess the catalyst that created what we now call SPHER begins in the HITECH program. So from a health care perspective, when we got involved, I was running a company that won the contracts here in Southern California to do the EMR consulting and implementations under the regional extension centers. Okay. We did about twenty five hundred EMR consultations and implementations, and then helped them go through the meaningful use process as a component of that so that they could get their funds.


What we found in twenty five hundred completions was that zero organizations implemented any policies and procedures to protect the data now that it was in a digital format.


Oh, wow.


Zero. Not one.


That's Right?


That's a that's a painful number.


And it was scary. It was scary because we were running a business, and we thought, wow, we're really exposed here, because we're the organization, as a consultant, helping them to implement this. So if there is a breach, and the word "breach" had already started to come out, was being discussed Right.


What's our liability? Because we're the ones to set it up.


Yeah.


And, you know, everybody wants to pass the buck, so are we gonna be the guys that they pass the buck to?


So that became part of the catalyst that led to us saying, well, how do we solve this problem of protecting digital patient data, now that we've moved it from on the wall and behind a locked door Mhmm.


To on premise in a server or in the cloud? Right. Right?


Because obviously cloud computing became a big thing for a lot of doctors who probably never even knew about cloud computing prior to Sure.


Moving to HITECH. Fair enough?


Yeah.


And, by the way, they were really not interested in hearing all this techno babble. No. Right? Yeah.


We needed to just give them a solution Mhmm.


To a potential problem.


Right.


And, again, I'll go back to what Rob said. With most of the clients, when we spoke to them in the early years, it was "HIPAA, schmipa."


Mhmm.


You know, HIPAA police really don't exist. They're not out there.


And by the way, Ray, in the history of my business, I've never been breached.


And, you know, we still hear that. We still hear, well, it's never been a problem. We still hear, is this really something that I have to care about? Why are you trying to make me afraid? What are you trying to sell me this time? You know?


Oh, absolutely. Absolutely. And and they truly believe that because look. I love the doctors, and and I have a lot of respect for them.


But, unfortunately, they don't have a lot of respect for me because I'm not the guy they're golfing with. And the guy they're golfing with is another doctor. And, unfortunately, that doctor told him, yeah, we've never had a problem with that either.


Oh, okay. I've got a consensus already. Yeah. All the guys I golf with have never had a breach.


So why am I gonna go spend any kind of money? Forget high or low. Any kind of money to implement a solution that really has no value to me because I have nothing to worry about.


Right? It's hard to overcome that hurdle.


And what's frustrating is that in the health care space, and maybe you've seen this as well, there is not a central reporting mechanism that tells people. A lot of my listeners know PCI. They're very familiar with this: if you have a breach of cardholder data, the payment brands know about it. Your acquirer knows about it. And a lot of times, the merchant is the one that is told by the acquirer, hey.


We believe you have a breach, just because of patterns that they're seeing in the data and the fraudulent actions in the data. And so there is a kind of centralized way to say, look, we know that this data is out there, and we know that it probably comes from your systems. Let's go fix it.


HIPAA health care data doesn't have a central view on things. So the responsibility becomes distributed, and the visibility in a distributed system is not there in the same way. So people might think that they've never had a breach, but they could be wrong, because they haven't even been looking for it.


Well, I think you hit on one of the challenges in health care, and we talk about this a lot with the patient ecosystem. Yeah. So you have a hospital, you have the general practitioner, you've got the specialists that hang off the general practitioners in different offices, you've got pharmacies, labs, imaging centers, hospitals, surgery centers, and that's all just for one patient, for one procedure Yeah. That got referred out between all of those different entities, with that same exact patient data flowing through different systems all across that. They wouldn't even know if they were breached. So when we talk about providers who say, oh, it's never happened to me, they have no idea. The average discovery time on breaches is over two hundred days, and it's usually by a third party.


Yeah.


And that's the thing that the physicians and some of these provider groups and clinics just don't even get. They probably have already been breached. They just don't know it.


Yeah. And let's add to that what Rob was saying. The average number of people that will look at your medical record in a small organization is five. When you get into a large hospital group, it's over fifty. Yeah.


When you take the extended ecosystem that you described, Jen, it gets to well over a hundred people.


Yeah.


And then go back to what Rob said at the beginning. Seventy percent of all of the breaches that occur in the United States are from people within the organizations.


Right.


So if you have a hundred people looking at it, you have a seventy percent chance that one individual might do something nefarious.


Right.


And you only need one.


Exactly. And you hate to say to the people in health care, hey, the people you're working with are probably gonna cause you a problem, because no one wants to believe that of the people that they're working with.


But the more people have to have access, a lot of times, especially when we're talking like you just described, the doctor and the specialist and then their assistants and then their admin. And you look at that, and the health care world says, hey. We need all these people to have access to look at this, and it's hard to, you know, kinda say who needs access. So we're just gonna open it up so people can do their jobs.


Right? Yep. And I get that. People need to be able to have access, especially in health care.


But then that that adds that complexity of, well, who's actually looking at it, and how do we know?


And what's their state of mind? I think one of the things we hear a lot is, well, I know my people. They're good people. Yeah. And they might be. You know, your experience has been good. When you hired them, they were good people.


Yeah.


But you never understand sometimes the life changes that happen behind the scenes. And we have multiple stories of folks that have had hardship in their life, or financial problems, or bad influences around them. And all of a sudden, they're grabbing that record to be a fake patient. They'll impersonate patients.


Yeah. They'll go to multiple doctors to get, you know, oxycodone prescriptions Yep. So they can sell them. They can sell them on the open market.


And you as a patient all of a sudden one day have a collection agency coming down on you because you haven't paid for your prescriptions, your office visits, and all these things. And you're like, I don't even know these providers. What are you talking about? Yeah.


You have a collection agency, you know, ready to turn your life upside down. So Yeah.


And I think it gets extended there too.


We heard, when we were at the conference, at the boot camp, the one individual who talked about the business associate's business associate who was responsible for the breach. Right? So it's not always the actor that you might think it is. It may not be your primary care physician's office or even the hospital you're at.


It could be one of those ancillary organizations who has access to your medical record that actually stole the data. So as the patient, how am I gonna know? How would I find out? And who's looking to make sure that it is safe?


You know, LifeLock isn't gonna solve that problem for you, by the way.


No. It's not. No. And then you add to that another avenue, which is Medicare fraud. You know, people are taking these, and the government is just spending the money on something that's not even being supplied to someone, because somebody's taken records and set up a fake account: oh, hey, give us this money for treating these people, using a fake doctor and fake patients. And all the money is going places it shouldn't be going.


We're very popular in Florida because Florida has a lot of that.


Yeah. I mean, I think this is why we got into the business. Right? Why are we, you know, out there in the marketplace working with groups?


It's because we're passionate about this. You just need to keep a watch, look for patterns, look for things, and surface them early. Right? I mean, the biggest thing about all this is the quicker you find out about these anomalies, these things that are happening, the better it is for the group, for the practice, for the hospital.


And that's the reason we were born, if you will: to basically surface these things very quickly. And it's not always that the people within the group are, you know, doing nefarious things. Sometimes it's pure boredom. But also it's bad actors that have now come in and are impersonating users and causing trouble that way. That's probably an area where we see a lot of activity these days.


And so, you know, we we cover all of that.


Let me go back full circle. I don't think I finished it for our listeners.


So why did we build SPHER? Yeah. How did we build this product? Okay.


Rob and I took a look at this when we started talking about it, and we realized that there was a necessity to know who was looking at your medical records.


Right.


Not just from a covered entity, physician's office perspective, but from a patient perspective. And then if you look at what HIPAA says: you know, we want to have user activity monitoring and system activity monitoring, and we want a process in place that tells us, is that data being kept safely? Yeah.


I'm I'm not quoting.


I'm giving you a rough idea of what it says here.


Right?


So we started to look at, well, how are you gonna do that? Is there any technology that would make that easy for those twenty five hundred clients that we had originally? And the answer was no. There was nothing Yeah. That existed in the market that even remotely came close to looking at, and to get a little technical here, not the network, but the actual application.


Because that's where the data sits Mhmm.


Is in the application. There's a database attached to that app. It's pulling that data. And that data, the way we look at it, it's the audit log data.


Mhmm. Because the audit log tells us who logged in, who logged out, how long they were there, what records they looked at, what they were working on in a day. And from that audit log, I can pretty much tell you what the performance of an individual is on a day-to-day basis. Right?


So we said, okay. How do we get that data? Yeah. Hundreds of thousands, if not millions of lines of data every day that are generated.


Do all the analytics on that to determine whether or not the individual who's logged in is who they say they are. Right.
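As a rough illustration of the audit-log analytics Ray is describing, here is a minimal sketch in Python. The log format, field names, and action codes are all invented for the example; real EHR audit logs vary widely by vendor, and this is not SPHER's actual implementation.

```python
from collections import defaultdict

# Each hypothetical audit-log entry: (user, action, patient_record_id)
def summarize_day(entries):
    """Count the distinct patient records each user viewed in one day's log."""
    records_seen = defaultdict(set)
    for user, action, record_id in entries:
        if action == "VIEW_RECORD":
            records_seen[user].add(record_id)
    return {user: len(ids) for user, ids in records_seen.items()}

log = [
    ("jen", "LOGIN", None),
    ("jen", "VIEW_RECORD", "P-1001"),
    ("jen", "VIEW_RECORD", "P-1002"),
    ("ray", "VIEW_RECORD", "P-1001"),
    ("jen", "LOGOUT", None),
]
print(summarize_day(log))  # {'jen': 2, 'ray': 1}
```

A daily summary like this is the raw material for the per-user baselines discussed later in the episode.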


It's a big question.


And then that they're looking at the data they should be looking at in order to provide care to the patient. Right. So as Rob said, the goal is just to protect patient data so that when somebody's in your office visiting you in what I like to say their most vulnerable state, we don't go to them because we're really happy and healthy. We go to them because we don't feel well and we need care. Yeah. And while I'm there, I'm worried about whatever is causing me pain.


Right? And I want you to ease that pain. So prescribe something, give me a specialist, do whatever you need to do. I'm not thinking about my digital data, and is it gonna be safe or not safe? Right. And if you shove some forms in front of me and say, sign these or we won't provide care to you, guess what I'm gonna do?


Sign them.


I'm gonna sign Yeah. Yeah. Right? I'm gonna sign them. But that doesn't mean that it's protected.


Right.


That's you know, that came up. I think Donna mentioned that at the conference. It doesn't mean it's protected. It just means that they've checked the box and given you a form to sign. Yeah. So we looked at this and we said, what could we do to solve that problem?


Mhmm.


And what we determined was that there were technologies available that could be coupled together, and this is artificial intelligence and machine learning. And we could build a model that would read the data coming from the audit log every day.


Mhmm.


It understands who has credentials to see that data and starts to look at Jen, Rob, and Ray and say, okay, they all log in at eight o'clock. They all log off at five o'clock. Jen's in the back office.


She's doing billing. So she looks at no less than a hundred records a day. Rob's in the front office. He looks at a hundred.


But Ray is a specialist over on the side, and he sees maybe ten people a day. So if Ray looks at a hundred records in one day, that should be a signal to somebody that Ray's probably doing something very different. It could be an assignment he was given, or it could be that Ray's up to something not good. Sure.


We just flag that so that we can investigate and determine whether it is good or bad. I'm keeping these things simple.
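The "Jen looks at a hundred records a day, Ray looks at ten" baseline idea can be sketched as a simple statistical check. This is a hypothetical toy, not SPHER's model; a real system would also factor in role, schedule, and how users move through the application.

```python
import statistics

def flag_anomalies(history, today, z_threshold=3.0):
    """Flag users whose record count today deviates sharply from their baseline.

    history: {user: [daily record counts over past days]}
    today:   {user: today's record count}
    A z-score threshold is one simple choice; richer models are possible.
    """
    flagged = []
    for user, count in today.items():
        past = history.get(user, [])
        if len(past) < 2:
            continue  # not enough baseline data yet
        mean = statistics.mean(past)
        stdev = statistics.stdev(past) or 1.0  # avoid dividing by zero
        if (count - mean) / stdev > z_threshold:
            flagged.append(user)
    return flagged

history = {"jen": [100, 105, 98], "rob": [100, 102, 99], "ray": [10, 9, 11]}
today = {"jen": 103, "rob": 101, "ray": 100}  # Ray suddenly views 100 records
print(flag_anomalies(history, today))  # ['ray']
```

As in Ray's example, a flag is only a signal to investigate: it might be an assignment Ray was given, or it might be something worse.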


And that's what SPHER does. SPHER looks at one hundred percent of all of the data every day for every user, or, looked at the other way, for every patient that comes into your office, and tells you whether or not there is any anomaly associated with the way they moved through the process in the office, and brings it to your attention so that you can do an investigation and determine if there actually was any breach. Nine times out of ten, actually ninety nine times out of a hundred, the answer is no.


But it's that one that you wanna catch, and that's what we do. Since we started our product, we've captured over fourteen hundred breaches that have occurred in our customer base.


That's amazing. Right? You've been really good about keeping this nontechnical, which I really appreciate because, I mean, although we do have quite a few technical people who listen, we also have people who are like, oh, I don't understand what you're saying about the network traffic right now.


Right. But what I would love to hear is maybe some more examples of how people use this to know that the right people are looking at information, and the wrong people are stopped from looking at information because it's detected.


Yeah. So in our dashboard of events, what the user goes through is that they'll take a look at some of the activity that we flag.


Now we're capturing all of the logs and all the data, and so you can run a lot of different types of looks in general as well.


But there are a couple of things that we flag the best. So every time we go live, two things flag immediately. One is self-examination, which is people in their own records, and two is what we call relative snooping.


And it just happens one hundred percent of the time; we flag this immediately.
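To make "self-examination" and "relative snooping" concrete, here is a toy check. The event and staff structures are invented for the example, and matching on a shared last name is a crude stand-in for the record linkage a real product would need.

```python
def flag_snooping(access_events, staff):
    """Flag self-examination and possible relative snooping.

    access_events: list of (employee_id, patient_full_name)
    staff:         {employee_id: employee_full_name}
    """
    alerts = []
    for emp_id, patient_name in access_events:
        emp_name = staff.get(emp_id)
        if emp_name is None:
            continue
        if patient_name == emp_name:
            alerts.append((emp_id, "self-examination"))
        elif patient_name.split()[-1] == emp_name.split()[-1]:
            alerts.append((emp_id, "possible relative snooping"))
    return alerts

events = [
    ("e1", "Ray Ribble"),    # employee viewing their own record
    ("e2", "Alice Pruter"),  # patient shares e2's last name
    ("e3", "Jane Doe"),      # unrelated patient
]
staff = {"e1": "Ray Ribble", "e2": "Rob Pruter", "e3": "Jen Stone"}
print(flag_snooping(events, staff))
# [('e1', 'self-examination'), ('e2', 'possible relative snooping')]
```

Simple name matching produces false positives, which is exactly why, as the guests say, a flag leads to an investigation rather than an automatic accusation.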


People don't believe me when I tell them that. So let me give you an example. We had a case out of Arizona, the office manager of a practice, and it was a smaller cardiology practice. I think six doctors, thirty five people.


Okay.


Not huge. But, by the way, these are the targets of the bad actors now. These smaller entities are actually where they're going first, because there are fewer, you know, hardened surfaces on the outside of these entities. Right?


Right.


So they're going after the softer targets. But the person who ran the practice knew intuitively. She's like, I know there's something going on here. I can't quite figure it out. When we went live with them, they found out that one of the people within their practice was in their own record prescribing themselves drugs.


They not only had a problem; they had three problems. One, how did this person get authentication capability to prescribe themselves drugs?


Right.


Two, they had a criminal issue, because they were doing this in multiple physician practices all over Arizona.


Oh, wow.


And they had a HIPAA problem as well. And that's the sort of thing we get a lot: they know stuff's going on, you know. And typically, in a practice that size, there are probably fifty thousand lines of user activity per day.


And that is impossible for anyone to review manually Right.


No one can read through the zeros and ones in an audit log the way we do with our technology and have any clue that there's something going on. And so we like to talk about how, when SPHER comes in, we help create a culture of compliance.


If people know they're being watched, they tend to do the right thing, because I don't think everybody is always just out there to cause trouble. I think a lot of times they're just bored. And we have multiple examples, like when we talk about what happened in Chicago with the actor who went into Northwestern Hospital and was claiming all sorts of stuff. When the actor went into the emergency department, thirty six people looked at his record.


They didn't know about it for weeks and weeks. It was only after TMZ or somebody published some data out of his emergency visit that they actually went back through and looked at the logs.


Well, they had to fire thirty six people. Now, most of these people were just bored. Right? They saw an actor come in, a celebrity.


They saw an opportunity to kinda get some inside dirt. But if they had known their activity was being watched, they may not have done it, and they may not have been fired, and the hospital wouldn't have had to hire a bunch of new people. It's very expensive.


Sure.


You know, just from that alone. And so that's kind of what we talk about.


And there are so many instances where we come into a situation. We had a group in Los Angeles where a former employee was trying to access the system. We flagged it. And what we helped them identify was the fact that their termination processes were all screwed up. They terminated an employee.


They denied him access to the EMR. But when he tried to get into the EMR and we flagged it for failed login attempts, they also surfaced the fact that they hadn't deleted him out of their network.


Yeah.


So through VPN access from a whole different state, they were getting into the network to gain access to the EMR. And it's just things like that that are pretty interesting.


And that was an employee who had been dismissed three years prior.


Oh, no. And they still had access.


And I think the other interesting part was that it was from another state. And a lot of times, people don't realize, I mean, we're so interconnected now that getting access regardless of where you are can be a problem.


But I would say most health care organizations don't have the right tooling in place to know when somebody's logging in from someplace else and alert them on that. Let them know that this person doesn't even go here. Right?
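The terminated-employee story boils down to a check that many organizations skip: compare every login attempt against the list of deactivated accounts. A minimal sketch, with invented field names, assuming the monitoring system already collects login events:

```python
from datetime import date

def flag_terminated_logins(login_events, terminated):
    """Flag any login attempt (failed or successful) by a terminated account.

    login_events: list of (username, success: bool, source_ip)
    terminated:   {username: termination_date}
    A real monitor would also correlate VPN logs and geolocation;
    this shows only the core check.
    """
    alerts = []
    for user, success, source_ip in login_events:
        if user in terminated:
            alerts.append({
                "user": user,
                "terminated_on": terminated[user],
                "succeeded": success,
                "source_ip": source_ip,
            })
    return alerts

terminated = {"jdoe": date(2019, 5, 1)}  # employee dismissed years earlier
events = [
    ("jdoe", False, "203.0.113.7"),  # failed EMR login over VPN
    ("jen", True, "192.0.2.10"),     # normal staff login
]
alerts = flag_terminated_logins(events, terminated)
print(len(alerts))  # 1
```

Even a failed attempt is worth an alert here, since it can surface the broken offboarding process the guests describe.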


Well, think about that example that Rob just gave you, and why this person could do this. I think it's interesting. Maybe I'm morbid in this sense, but they knew the convention that was being used for passwords.


Mhmm.


They knew what the emails of the employees looked like. First name dot last name, at such and such. Okay. So they probably could take a wild guess.


And this is what we always tell people. Sphere isn't assuming that your employees are bad. What we're worried about, what I worry about for my clients, is that somebody's gonna get a hold of your credentials and use your credentials to get in your system. Because as great as technology is, it doesn't know if you are or aren't you based on the login and password.


Yeah. Right? So if they can log in as Ray Ribble, then they have all the access that Ray Ribble has. Mhmm.


And if I'm your executive in a a large health care organization, I have really good access.


Right.


Right? And so I can go look at a lot of different things.


That's what we're looking for. And the abnormality will be that the person who's looking to steal won't know how you use the system. The way they move through the application to get to that pot of gold will be very different from the way the person who normally uses the application moves, and that'll get flagged through our technology. We bring that to your attention so that you can take action within twenty-four hours and shut that down and stop that. Because as Rob said, two hundred, three hundred days later, you don't wanna be told by some third party, oh, by the way, you got breached.
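One simple way to picture this kind of behavioral flagging is a per-user baseline: compare how many patient records a user touches in a session against their own history and flag large deviations. This is a toy sketch under stated assumptions; a real product like the one described would model navigation paths, time of day, and roles, not just record counts, and the z-score cutoff here is arbitrary:

```python
from statistics import mean, pstdev

def flag_abnormal_sessions(history, sessions, z_cutoff=3.0):
    """history: {user: [records accessed in each past session]}
    sessions: [(user, records_this_session)]
    Returns the (user, count) pairs that deviate far above baseline."""
    flagged = []
    for user, count in sessions:
        past = history.get(user, [])
        if len(past) < 2:
            continue  # not enough baseline to judge this user
        mu, sigma = mean(past), pstdev(past)
        if sigma == 0:
            if count > mu:
                flagged.append((user, count))
        elif (count - mu) / sigma > z_cutoff:
            flagged.append((user, count))
    return flagged
```

A user who normally opens about ten charts per shift and suddenly opens sixty would be surfaced for review within that twenty-four-hour window, rather than discovered hundreds of days later.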


Right.


And now you have fifteen or sixty days to go solve this problem.


So what Rob was saying earlier about somebody having a feeling something wasn't right. Do you see that that's why people come to you? They kinda feel like something's not right and they want the tooling? Or how do people engage with you?


Yeah. Obviously, there are three reasons why people come to us, one of them or all three of them. One, compliance issues.


Right? They have to do this as a statute. So, you know, user activity review, documentation, all the stuff that's required under HIPAA. It's in the security risk assessment that they do every year.


And so we get a lot of the kinda compliance stuff. We also have the kind of risk mitigation strategy. There are a lot of civil lawsuits happening now that the attorneys have figured out a way to use personal privacy protection laws nationwide, which extend beyond HIPAA, and it's allowing them to file class action lawsuits. So there are a lot of risk managers who are saying you gotta engage in best practices and set your system up to catch things like this. And I think even cyber liability insurance companies are starting to require this as well. But the third reason, and probably what brings folks to us in general, is employee snooping.


There's just an issue out there today where people are concerned that people within these entities are looking at stuff they shouldn't look at: neighbors, friends, family. And I think that for a long time, it was kind of accepted that it was okay to look at these sorts of things. And not really, again, trying to do it because they have bad intentions.


But a lot of times, they do. I mean, we had a situation in Ohio where a physician looked at his ex-wife's new husband's record so he could exfiltrate data about his behavioral health issues and use it in a custody filing in order to gain custody of his children and a financial advantage. Oh, no. And so it's things like that that sometimes people don't really think about, and that's why the information is so valuable.


We see things, and the FBI sees things, on a daily basis that people just don't understand are happening.


And it's not to scare everybody. It's just to say, if you set up basic best practices from a technical standpoint, you can prevent most of this from ever happening.


And that's the thing. That's what the OCR investigator said at the boot camp. You know, we're not here to destroy you. We want you to just do the right thing so that you can protect patient data as best you can.


Another thing that I hear a lot is, well, I don't care who sees my data. It doesn't matter to me who sees it. But they don't understand that seeing it can mean affecting it. Let's say you have an allergy to a common medication, and a person who has malicious intent against you takes out that information about your allergy.


What if you're hospitalized and then get that medication, and you have some pretty serious or critical health effects because that information is missing, because of someone else's malicious intent? We don't like to think that other people would do these things to us. And yet, this is something that we see over and over again.


I heard just yesterday that there's legislation being brought forth in the United States to improve the technology around medical devices Yep.


Specifically with pacemakers, because the bad guys have figured out a way to interrupt the pacemaker. Yeah. So take that example in the other direction. What if I, as the nefarious individual, wanted to create heart attacks?


Yeah.


All I need to do is break into whatever machine Cedars-Sinai or somebody is using, go in and tamper with that signal, and I can do bad things to everybody who's attached to that signal and cause a lot of problems really fast. Right? And unfortunately, there are just too many strange individuals out there who might find that fun.


Yeah. Well, you have nation-state actors. Right?


If I have an infusion pump in any particular hospital, there'll be hundreds of infusion pumps around the hospital that are connected to that network. They've actually demonstrated this: you can get into an infusion pump and start changing the dosage or the pathway of drugs into people's systems just to cause panic, right, just to cause trouble. You know? And you mentioned something else that's kind of related. If your medical record is corrupted, when you are in the hospital for whatever the procedure is, now they have inaccurate information.


Yeah.


Right? And so how does that affect the quality of care at the point of care? It's potentially a big issue, and that's why it's so important to protect that data.


Absolutely. Yeah.


And, again, what we're promoting isn't to replace technology at the network level. We work hand in hand as part of a much larger cybersecurity framework.


Yes.


And that you're looking at the network and you're looking at the application. And Rob's point is really important: there's not just one system, the EMR. You might have ten, fifteen, twenty. A hospital has over fifty different systems that all contain PHI and can be accessed remotely, and that can be used against a patient or against an organization.


Exactly.


Right? Yeah. And that's what we're trying to protect. We're trying to make sure that they know who has access to that information, and whether that individual is using that system in a way that's appropriate for their roles and responsibilities.


I really appreciate the stories and the conversation, the examples, because we're not trying to scare people. It's not about that. But it is about sometimes we hear, oh, HIPAA, like you said, HIPAA shmippa. We hear HIPAA and people say, oh, it's just this thing I have to abide by. But really understanding that there are real-world consequences to not taking it seriously.


The cybersecurity side of things, the enforcement of that privacy, can really have actual effects. And I know that in the health care world, people truly care about the health and well-being of the people that they serve. And so being aware of this can maybe help people make additional decisions about how to protect that information. Well, thank you so much for being here and joining me today.


Any final comments before we wrap this up?


You know, I was on a call with someone from the boot camp.


They have a kind of remote medical practice, and they move around a lot throughout parts of the United States. And he said, you know, I got to meet so many people and experts at the boot camp. It just gave me peace of mind. It wasn't so much that they were trying to sell me stuff. He goes, having met all these people, I just knew who I'd go to.


Yeah.


And, you know, whether it's Carden or Jen at SecurityMetrics, everybody seemed to have a huge interest in helping with their expertise, not necessarily trying to just jam product down and make a bunch of money. It's really, like, no. We're gonna do just fine from a business standpoint, but we truly wanna help.


And whenever we're talking to people, I always say, look, we need to figure out how to get this into your environment because it's so important. Yeah. And, you know, business is important, but protection is also really important. It's our passion. So, that's all I have to say about that.


Exactly. Yeah.


Well, thank you again so much for joining me. I really appreciate it. And I think what you bring, especially to the healthcare world, is very valuable, and I hope that people find a lot of value in hearing what you had to say today.


Well, thank you for having us, Jen.


Thank you so much.


Thanks for watching. To watch more episodes of SecurityMetrics podcast, click on the box on the left. If you prefer to listen to this podcast, it's available on all your favorite podcast platforms. See you on the slopes.
