The CyberPHIx Roundup: Industry News & Trends, 6/30/22

About the Podcast: The CyberPHIx is a regular audio podcast series that reports and presents expert viewpoints on data security strategy for organizations handling patient health or personal information in the delivery of health-related services. These timely programs cover trends and data security management issues such as cybersecurity risk management, HIPAA and OCR compliance strategy and vendor risk management. Meditology Services, the healthcare industry's leading security and compliance firm, moderates the discussions with leaders in healthcare data security.

The CyberPHIx Roundup is your quick source for keeping up with the latest cybersecurity news, trends, and industry-leading practices, specifically for the healthcare industry. 

In this episode, our host Brian Selfridge highlights the following topics trending in healthcare cybersecurity this week:
  • Bombshell report of hospitals sharing PHI with Facebook
  • HIPAA compliance analysis for covered entities sending PHI to Facebook
  • Legal exposures for sending sensitive information to social media and other website tracking vendors
  • Recommendations for healthcare organizations to assess and respond to patient concerns about unauthorized PHI disclosures to Facebook
  • HHS issues new guidance for healthcare organizations to improve their cyber posture
  • New HIPAA Security Risk Analysis (SRA) tool from OCR
  • New OCR guidance and industry feedback related to “recognized security practices” for healthcare organizations (i.e., safe harbors for OCR enforcement)
  • HHS issues warning to healthcare entities about dangerous Emotet malware proliferation
  • CISA is developing new guidance for helping organizations overcome supply chain risks
  • FBI prevents “despicable” Iranian cyber attack on Boston Children’s Hospital
  • DOJ shuts down SSNDOB dark web marketplace
  • Massive arrests and seizures of social engineering attack infrastructure across 76 countries
  • OCR issues guidance on the upcoming expiration of COVID-19 enforcement exemptions for telehealth HIPAA security mandates

PODCAST TRANSCRIPT

Brian Selfridge: [00:00:11] Good day and welcome to The CyberPHIx Healthcare Security Roundup. Your quick source for keeping up with the latest cybersecurity news, trends, and industry-leading practices, specifically for the healthcare industry. I'm your host, Brian Selfridge. In addition to this roundup, be sure to check out our Resource Center on the Meditology Services website, which includes our CyberPHIx interviews with leading healthcare, security, privacy, and compliance leaders alongside blogs, webinars, articles, and lots of other educational material. We've got a great and pretty lengthy session to cover today, so let's dive into it, shall we? 

Brian Selfridge: [00:00:47] All right. Our first story is a bombshell news report issued by The Markup on June 16th of this year, titled "Facebook Is Receiving Sensitive Medical Information from Hospital Websites." That sounds like a bombshell, doesn't it? Specifically, the report claims that healthcare organizations across the country have installed Meta's (Facebook's) Meta Pixel tracking tool on patient portals and other patient-facing websites. The Meta Pixel platform reportedly sends Facebook protected health information, including patient names, IP addresses, names of doctors, appointment information, prescription details, and the list goes on, right from many of the nation's leading hospitals. And this has been backed up by screenshots and evidence, so I'm not putting "reportedly" in air quotes. Now, hospitals and other healthcare delivery organizations are scrambling to understand what this means for their organizations from, certainly, a HIPAA compliance perspective and a legal perspective, and, I think perhaps most important, the patient trust perspective. 

Brian Selfridge: [00:01:45] Hey, we're sharing your PHI with Facebook. Is that okay? I suspect probably not by the average patient's barometer. So I will make just a legal disclaimer here, right? I'm going to get into some HIPAA regulatory compliance stuff and my interpretation of it, but this podcast is not your legal counsel. Take it as informative for your journey here, but certainly engage legal counsel and other experts in your universe to make sure you are looking at your specific situation. So that's my little disclaimer. The question is: is it a HIPAA violation for hospitals to send PHI to Meta and Facebook? That's a very legit question, right? And as with any situation, the answer is a little more muddled than we would like. The answer is maybe: it could be a violation, it might not be. So let's tease that out a little bit. Hospitals and other healthcare providers, we'll just refer to them in the HIPAA world as covered entities, typically send huge volumes of PHI to third parties. Facebook is a third-party vendor, in a sense, for all kinds of purposes. I think we should not be unrealistic with ourselves: there's a lot of patient information leaving the hospital and going to third-party companies. That's going on en masse and at scale. 

Brian Selfridge: [00:03:02] So while Facebook has a less-than-desirable reputation on their ability to manage the privacy and security of that information, that aside, there are many vendors in that same boat that don't get quite the press. So I just want us to be mindful of that. If an organization, a hospital, has signed a business associate agreement with any vendor, in this case Facebook, it's technically legal and okay for the hospital to send that information to Facebook. And I'll put another disclaimer here: if you don't have a business associate agreement in place, and we'll come to that in a minute, but you did have prior patient consent and authorization explicitly in writing that says, hey, we can share your information with Facebook, that's another way to get legal protection. But I cannot imagine a single healthcare provider handing a sheet of paper out to their patients saying, hey, would you mind if we share your patient information, your protected health information, with Facebook? So I don't think that's really happening; we'll just discount that for now. But basically, you've got to perform a review of your business associate inventory if you are faced with this question, like, oh no, are we sending stuff to Facebook? Do you have a BAA, a business associate agreement, with Facebook? If there is one, then you may not have a legal HIPAA violation situation to worry about at the moment, but you very likely would have some public relations and reputational fallout with respect to patient trust. 

Brian Selfridge: [00:04:32] Right. So even if you're protected, I think it may make sense to discontinue the use of these types of tracking tools. And maybe it's worth backing up a little bit to what Facebook's trying to do here. Basically, the tool, my understanding of it, is a website analytics tool. Facebook will provide a hospital, for example, free use of their analytics tool. You put this script onto the server that's running your website, and it's going to collect a bunch of data about what your users are doing, both on your website as well as on other websites, and provide you with insights and intelligence that might help you from a marketing perspective, or might help you better serve your patients, or understand what communities your patients are logging in from by looking at their IP addresses, all stuff that, in the hands of good people doing good things, could be useful, right? The trick is that in exchange for Facebook providing this free tool, they also get to scrape all the information and send it back to Facebook for whatever Facebook wants to use it for. And that is a little murkier as to what that purpose is and how that data is being used. 

Brian Selfridge: [00:05:39] Even some of the folks internal to Facebook who were anonymously interviewed during this exposé were saying, look, nobody inside Facebook really knows where the stuff goes, and sometimes we're not even sure it has a direct purpose right away. So that's disconcerting for a lot of reasons, but that's generally how this technology works; I just want to be clear about setting that up. So if you have a BAA in place with Facebook, you've got some legal protections there. If you do not have a business associate agreement in place with Facebook, then covered entities may have some HIPAA regulatory compliance exposure if they, quote-unquote, knowingly procured and implemented this website tracking application. You've acquired a tool, you're using it, it's tracking stuff, it's sending it outside. You do have an obligation to vet that organization, make sure it's minimum necessary information supporting a specific business need, and not just, well, we sent our information out to wherever because they gave us a free thing. That's not a great reason. You also have to vet the privacy and security of that organization, make sure that information is going to be protected all the way downstream, certainly from a HIPAA perspective, but if nothing else, just from a duty to your patient population that you're going to safeguard and be the custodians of this super-sensitive information, and understand that everybody down the chain needs to be making decisions with the patient's trust in mind. 

Brian Selfridge: [00:07:06] And I'm not always sure that, intentionally or unintentionally, that's the first motive, and that's something that can get folks into challenging situations. Okay. So let's say your organization is using Facebook's Meta Pixel application, and the question is, what do we do? How do we assess whether we've got a problem, or whether we need to take any action? Certainly, you do have a problem one way or another that you can choose to deal with in a variety of ways. I'm going to spend a lot of time on this story because I think there's a lot that can be extrapolated here, not just in this case, but for other third-party vendors. So roll with me on this a little bit. If you have deployed it, I highly recommend commissioning some sort of analysis alongside your legal counsel and compliance teams, bringing all the experts to the table, and understanding the scope and scale of the implementation. Where are we using this tool? Is it on patient-facing, externally facing applications like your web portals or patient portals or other things along those lines? Or maybe it's used somewhere else internally, or on a marketing site that doesn't take any PHI but is just an informational site. All that stuff matters in your analysis. 

Brian Selfridge: [00:08:18] I think that's the first step: understanding where it's used. And then, of course, as I mentioned earlier, determining whether or not you have a business associate agreement in place. And even if you don't, that doesn't mean the analysis stops there and you're like, okay, well, we've got a problem. Or even if you do, it doesn't mean it stops there. But you need to understand any legal protections you have in place from a HIPAA perspective. And then the results of that analysis should dictate your next steps, which may include things like public communications with the organization's analysis and position on the situation. Especially since many of the organizations that are responding to this story were actually named outright as part of this exposé that The Markup put out, you kind of have to respond if you're one of the hundreds of hospitals that were called on the carpet here. So that analysis should include, again, maybe some public communications, and maybe removing the tracking tools like Meta Pixel and others like it. 
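
That first scoping step, finding out where a tracker is actually deployed, can be approximated with a simple scan of your public pages for known tracker script hosts. This is a hypothetical sketch, not a definitive detection method: the page contents and the tracker hostnames listed are illustrative assumptions, and in practice you would fetch the live HTML of each page.

```python
# Hypothetical sketch: flag pages that embed known third-party tracker scripts.
# Tracker hostnames and page contents below are illustrative assumptions.
TRACKER_HOSTS = [
    "connect.facebook.net",      # host that serves the Meta Pixel loader
    "www.googletagmanager.com",  # another common analytics loader
]

def find_trackers(pages: dict) -> dict:
    """Map each page URL to the tracker hosts found in its HTML."""
    findings = {}
    for url, html in pages.items():
        hits = [host for host in TRACKER_HOSTS if host in html]
        if hits:
            findings[url] = hits
    return findings

# Example inventory; real code would crawl your portals and marketing sites.
pages = {
    "https://hospital.example/portal/login":
        '<script src="https://connect.facebook.net/en_US/fbevents.js"></script>',
    "https://hospital.example/visiting-hours":
        "<p>Visiting hours are 9am to 8pm.</p>",
}
print(find_trackers(pages))
# {'https://hospital.example/portal/login': ['connect.facebook.net']}
```

A scan like this only tells you where a script tag appears; confirming what data it actually transmits still requires inspecting the network traffic, as The Markup did.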

Brian Selfridge: [00:09:21] You want to expand your scope to see: are we doing anything else like this with other applications? You may want to change or augment policies, privacy notifications, or your notice of privacy practices if there's some implication that you are doing a practice you want to continue doing, and just make sure that's clear to your patient population. And there may be some breach reporting to HHS in the worst case, if you've had PHI scraped by the Meta Pixel application and sent out to Facebook without a BAA and those types of things. So that's all stuff that could be considered as possible actions here. Now, the Meta Pixel and Facebook tracking tools may only be the tip of the iceberg. We talk about whether you have a BAA in place or not; I think it's really important that organizations, if you haven't done this as part of either your HITECH refresh when those policy provisions changed or more recently, do a business associate inventory analysis. This is something we do all the time at Meditology; it's actually a discrete service. I'm not saying that to plug it, I'm just saying there's a desire and a need for this, where essentially organizations will look at their existing business associate agreements. 

Brian Selfridge: [00:10:33] They'll look at their procurement: who have we paid? Who have we bought products and services from? Do they match up? Do we have business associate agreements? Does it make sense for everybody we've paid to have a BAA? Some of them perhaps not, like your janitorial services; you don't need a business associate agreement with them. Yes, they may be throwing out the trash cans with some PHI in them, but you're probably not going to need a BAA in most cases. Also, your PHI shouldn't be in the trash cans, right? It should be in the shredder bins, so maybe that's the appropriate place to spend energy. But then identify the gaps: where are we missing business associate agreements that we need? Facebook might be an example if you did such an analysis. Then get with your vendors to get the latest versions of your business associate agreement template signed and up to date. Even if they're evergreen, they need to be updated when major shifts happen, like HITECH, which I mentioned a little while back. So definitely want to do that. And also look at any other patient portals you have, or externally facing websites, for any kind of third-party add-ons or trackers to see if they may present a similar risk to the Facebook Meta Pixel tool. So HIPAA is one important piece of the puzzle. 
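
The inventory exercise described above boils down to a set comparison: vendors you've paid, minus vendors with a signed BAA on file, minus vendors that never touch PHI. A minimal sketch, with made-up vendor names:

```python
# Minimal sketch of a BAA gap analysis; all vendor names are made up.
paid_vendors = {"Meta/Facebook", "CloudEHR Corp", "Janitorial Co", "Labs Inc"}
signed_baas  = {"CloudEHR Corp", "Labs Inc"}   # BAAs on file
no_phi       = {"Janitorial Co"}               # no PHI access, no BAA needed

# Vendors that touch PHI but have no business associate agreement on file.
gaps = paid_vendors - signed_baas - no_phi
print(sorted(gaps))   # ['Meta/Facebook']
```

The hard part in real life is not the set math, it's building accurate source lists from procurement records and contract repositories; the sketch just shows the shape of the comparison.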

Brian Selfridge: [00:11:43] Are there any other legal exposures besides OCR requirements? I think this is a common question that I'm hearing crop up since this story broke. One really important area to harp on here, as I have on the podcast over the last couple of years (loyal listeners will bear with me, but here it comes up again), is class-action lawsuits. Healthcare organizations could face legal damages and costs related to the Meta Pixel Facebook thing even if HIPAA violations and OCR enforcement don't come into play for whatever reason; it could be just that OCR is not going to get to everybody, or that you actually have some coverage from a protection standpoint. But the exposure is really due to these class-action lawsuits. For example, and this is not a hypothetical, if you think I'm just saying the sky is falling: Partners HealthCare system in Boston, which is now called Mass General Brigham, settled a class-action lawsuit related to this in 2019, just a couple of years ago. They paid $18.4 million in a settlement, which, as settlements go, is "we admit no wrongdoing, we didn't do anything wrong, but we're going to pay $18.4 million to the class parties," the Jane and John Does and all that. And it was related to Partners' use of Facebook's Meta Pixel and other tracking tools on their website. 

Brian Selfridge: [00:13:04] So this is very real stuff. And there's also a more recent lawsuit against Facebook, filed after this specific publication from The Markup, this exposé. This lawsuit was filed in the US Northern District of California, and the class plaintiffs claim that they, along with millions of other patients, had their rights to privacy violated by Meta, which is Facebook's new name. The lawsuit alleged violations of the federal Electronic Communications Privacy Act, the California Invasion of Privacy Act, unfair competition law, and a breach of Facebook's duty of good faith and fair dealing. So one case there resulted in a multimillion-dollar settlement, and the other one's just starting out, and that's just related to this particular issue. So if you're looking to brush this one aside and say, you know, no big deal, we had it, we used it, we turned it off, no problem, you could have this class-action stuff sneaking up behind you. You may want to do some analysis there with your legal team to understand the scope and scale of your implementation of Meta Pixel and other platforms, just to prepare for any legal challenges that could come down the line. You don't have to overanalyze it if you're not being sued, but get some preparation in place. So an important question here is: how have hospitals, covered entities, and vendors responded to this big publication? 

Brian Selfridge: [00:14:24] In this situation, I always find it fascinating to see the variance in how organizations deal with responding to cybersecurity and privacy incidents. With any of this, it often matters more how an organization responds than the substance of what they are saying. They both matter, of course, but I think organizations tend to get themselves in trouble by responding, in my view, in an inappropriate way that does more to erode patient trust than to build it, even when you may have misstepped knowingly or unknowingly. In a lot of these cases, you put this tool in place and maybe you didn't know what was going on, but don't hide it from your patients. We've seen it over and over again: organizations that respond with transparency, saying, look, here's what happened, here's what we're doing, and then showing some concrete action, some actual, tangible steps you're taking to rectify it now and going forward. That, I think, is the right way to do this. Conversely, organizations respond with obfuscation, defensive language, maybe legalese to say we're legally in the right, and it's ridiculous that you're even asking us this, and all that. That stuff just doesn't play well, I don't think, with the general public and patients and business partners, who are just like, okay, maybe you're in the clear, or maybe you have some merit to your argument. 

Brian Selfridge: [00:15:49] But I don't think it does a service to the organization and the brand generally. Like, there was, was it Kaseya, that had the breach last year, the IT company, the third-party vendor, and the CEO came out lashing out the first couple of weeks, talking about how an intern did this and it's the intern's fault that he had access to administrative code, and all that stuff. I'm not sure that was the best approach. But anyway, when you get into what organizations are actually saying in their responses, I'm going to lump them into three, or maybe four, categories, because there were a lot of organizations implicated. Response type number one has been: we're reviewing and assessing the situation. This is pretty typical, a noncommittal response used to buy time. Organizations that say "we're reviewing it, we'll get back to you" likely have to issue another response sometime in the coming days or weeks that provides some more transparency around the situation, but it does buy you a little bit of time. It's certainly a valid response if there's a lot of external pressure and you're just not sure enough about the facts to make a statement. I think it's better to be quiet until you know what's going on than to say something silly. 

Brian Selfridge: [00:17:04] So organizations that issued responses like this where Community Health Network and Sanford US Medical Center, so let's all keep them honest and follow up with what they have to say in a couple of days. Hopefully, response number two type let's just put them in buckets is hey, Facebook's megapixel has been removed from a website we're no longer using it. This response, I think is a step in the right direction. It demonstrates as I mentioned earlier, that the organizations take this seriously. We are taking concrete actions to protect patient privacy and security. I like that sort of. We are doing something now, but I also recommend that kind of statement includes some commentary about any reviews and analysis that are going on and what plans you have for corrective action. How do you prevent this from happening going forward? How do you how are you going to clean up any data that's been transmitted to Facebook, sort of knowingly, unknowingly, whatever? So I think this is sort of the right ballpark of response in my view, but it needs a little bit more meat to it. So organizations that have issued responses in this category include Froedtert Hospital. Houston Methodist and Novant Health, just to name a few. The third category of what I'll say for categories really is the more legalistic kind of approach, the more defensive kind of approach. So some quotes where no is disclosed through this portal. 

Brian Selfridge: [00:18:21] The use of this type of code was vetted and is referenced in our terms and conditions. So it's sort of this kind of like hit you with the lawyers stuff. We did nothing wrong here. These responses, these types of responses may be technically accurate and legally accurate, but I think may be perceived by many patients as dodging the issue of their perceived violations of trust and privacy. So the report from the market markup, sorry, indicates that the pixel tool may transmit PHI in hashed formats. For example, I mentioned that earlier, so some organizations may the could be scraped, then it gets hashed, which is just sort of a summary of that information that then can be linked back to the patient later. It's not like the actual, but it sort of it's enough to say the patient was seen here for this purposes and you can match it back to their Facebook profile and other stuff. So it's not it's still high. It still has their IP address, which is an identifier, official identifier under HIPAA. So it's still legit PHI But there is some legal folks saying, well, it's not actually the PHI, it's just this hash stuff and they try to obfuscate it with technical language like that and kind of technology that that really doesn't solve the problem. There are also cases where the transmission of this type of information, even if it is PHI, is permissible. 

Brian Selfridge: [00:19:36] If business associate agreement is place, you know, those types of things may very well be legit for a hospital to say, hey, we followed all our terms and conditions. Nothing wrong with this, but I think being legally correct and within the boundaries of HIPAA does not necessarily mean that patients will let you off the hook for sending their sensitive information to Facebook, of all places. So I recommend the organizations that claim this sort of no foul. Hey, no problem here. Nothing to see here. You know, do some more analysis and consider discontinuing the use of these types of tools. I mean, unless they're mission-critical to your business operations, which I can't imagine, they really are, they might be nice to know information that help you in some pieces. But just for the PR piece of this, I would get I would just discontinue this stuff and, you know, rely on the common sense of patients that are going to snuff out when you're trying to pull the legal wool over their eyes. So organizations that have issued these types of, I guess, sort of combative, legalistic kind of responses have been Henry Ford Sharp Memorial, Northwestern Memorial, and the University Hospitals of Cleveland Medical Center, just to name a few. And I'm not I'm not shaming anybody here. I'm just. Calling the facts like they are those organizations, those their statements. You can read them. 

Brian Selfridge: [00:20:46] There's a summary that's been put out. I actually put out a blog on Meditology Services that covers all this stuff too. So if you want to go to the link there and find out all of the responses, you can read them and interpret them for yourselves. I will note one other type of response from one of their kind of response I saw in there that kind of cracked me up. It said essentially a paraphrase here, but they said, look, we had a third-party vendor and they advised us to install this Facebook thing. So it's really the third-party vendors fault because they told us to whom we did. I don't think that that really is an appropriate response. It may very well be true and it may be emotionally true and feel right. But, you know, it's not enough to point to another vendor and say it's their fault. You've got to be aware of what your vendors are doing and be complicit in these types of decisions. Also, for what it's worth, Epic, the electronic health record vendor, the biggest electronic health record vendor, also responded. They said that they specifically recommend heightened caution around the use of custom analytic scripts. So in other words, they're saying this is not our fault, it's not our problem. If you put another tool on top of our MyChart Web portal that was scraping by and sending it somewhere else, that's kind of your fault. 

Brian Selfridge: [00:21:58] It's hard to disagree with them on that. I mean, what are they supposed to do about it? And I think a lot of other patient portal website vendors are going to take that position. So don't expect that there'll be other vendors that will fall on the sword here. I don't think that's going to happen. Okay. So just wrapping up this Facebook story and I'll get into my sort of editorial mode here. So ultimately, I think, you know, the culpability and responsibility for the implementation and usage of patient portal web tracking tools and scrapers like Facebook's meta pixel rests with the covered entities. Really? So basically covered entities that engage with relationships with any third-party vendors. In this case, it's hundreds and thousands for a typical health system. Right. You've got to do some form of security and privacy risk assessment, right. Prior either prior to procurement and purchasing immediately after, and an ongoing basis as things change and tools change and organizations change. So that may sound really straightforward, but unfortunately, it's just not happening in the volumes and depth that it needs to be happening across the industry, which is why Meditology spends a lot of time in this space. We have a whole separate company called CORL Technologies that does nothing but third-party risk because there's such a need to figure this out. And I think part of it is more than just the security cybersecurity vetting, but also what I'll call kind of the data governance aspect of this. 

Brian Selfridge: [00:23:16] Like what data are we sending? Do we need to send the entire database? Can it be redacted? Can it be just given a minimum necessary sense? And I think that work isn't happening and that focus isn't happening enough. So that I'll get off my soapbox there. But I think that's how we address root cause here with this type of stuff. How do these tools show up? How do we make sure they're being used for the right purposes? And that vetting that goes on with the vendors is really critical. So with this story, we'll continue to monitor it. I'm sure it's going to play out and have ripple effects and ramifications. Right. This is a pretty consequential, I think, privacy situation that we can't allow ourselves to move on to the next breach right away, even though we may have the instinct to do so as the next couple of ones crop up here. But let's move on to other news, because you probably want to hear about more than just that story, although I think that was important. One, let's talk about HHS, Health and Human Services and OCR stuff. There are a bunch of updates in that world this week, which is interesting and new and fun to talk about. So HHS issued a new publication with guidance for healthcare organizations to improve their cyber posture, which is a term that I really like a lot. 

Brian Selfridge: [00:24:24] I like the word cyber hygiene for the regular stuff. I like the word cyber posture, just understanding that it's an ongoing thing and no one wants bad posture, right? I don't have the greatest posture physically, but I'll try to get the best cyber posture I can. So recommended steps from HHS to strengthen your cybersecurity posture include conducting regular security posture assessments, not to be confused with risk assessments and risk analysis actually should be confused with them. It's the same thing. I'm consistently monitoring networks and software for vulnerabilities and patch management. Great defining which departments own risk and assign managers to specific risks. I like that one a lot. So it's you mean it's not just the security officer in the team that manages risk? No, no, no, no. We are the custodians, the informants, the helpers, the analytics. But which departments and which individuals, business leaders own risk and make decisions around risk and leadership teams. Really critical. I love that. They've emphasized that as one of their six bullet points here regularly analyze gaps in your security controls. So that's back to risk assessments. Again, we'll include your vendor risk assessments in that. Just harping on the prior story, define a few key security metrics, a few key security metrics. I wouldn't limit it to a few, but they say a few. I think as you mature the program getting KPIs, key performance indicators, key risk indicators, Chris in place really at every core control level, at least at the major pieces of it I think is really critical to running a successful cybersecurity business, if you will. 

Brian Selfridge: [00:25:55] It's important to running any business, but in your world of the cybersecurity program within the company (not a little domain, it's very big and important), it's really important to be able to measure where you are. That which is measured can be improved and tracked, and you can understand the value of what you're doing. So that's a good one. And the last one they list is: create an incident response plan and a disaster recovery plan. So nothing completely earth-shattering there, but I like how they've brought into focus the key areas to spend time on. In their report they also went into some specific recommendations to reduce the likelihood of cyber intrusion, including implementing multifactor authentication (I love that one; I keep harping on it), patching, of course, implementing strong cloud security controls (a little bit vague in how they've defined that, but I do agree it's super important), implementing anti-malware across the entire ecosystem (I think the second half of that sentence is most important: we still see a lot of organizations with incomplete deployments of their AV and anti-malware software), and disabling unnecessary ports and protocols. 

Brian Selfridge: [00:27:01] That last one is something we just don't take the time to do enough: assess our firewall rules and our network-facing capabilities. We want everything to be running and open and clean and working, of course. But, you know, we've got to turn off the stuff that we're not using. So pretty fundamental recommendations there, but all really good stuff. And then for incident response, HHS recommends designating a crisis response team, planning for surge capacity for the team during incidents (that's a big one that most organizations don't quite have nailed down), and conducting tabletop exercises. Right. If we don't practice, we can't get good at this stuff. There was an astronaut, Chris Hadfield, that I like a lot, who talked about this. When he dealt with incidents, like when his visor fogged up and he couldn't see anything while he was trying to fix the space station, he didn't go into those situations without having practiced a lot of different scenarios like that and gotten ready for them. And while the details may be different in the situation itself, you've practiced and practiced and practiced all kinds of things like it. I think it's the same analogy here. There's also some guidance in the HHS document about testing backup procedures and conducting security risk assessments. Again, all really, really fundamental stuff.
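On the disabling-unnecessary-ports point above, the basic check is simple to sketch: compare what a scan actually finds listening against an approved baseline and flag the rest. The port lists below are invented for illustration; in practice the observed list would come from something like an nmap scan or a firewall rule export.

```python
def unexpected_ports(observed, approved):
    """Return observed listening ports that are not in the approved baseline."""
    return sorted(set(observed) - set(approved))

approved = {22, 443}            # what the baseline says should be exposed
observed = [22, 443, 3389, 23]  # what a scan actually found listening

# Anything in the result is a candidate to investigate and disable.
print(unexpected_ports(observed, approved))  # [23, 3389]
```

Running this kind of diff on a schedule, rather than once, is what turns a one-time firewall review into the ongoing hygiene HHS is describing.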

Brian Selfridge: [00:28:14] But that's where we are right now. We've got to be doing the basics, we've got to be doing the hygiene, we've got to be improving our posture, all those things. So I appreciate HHS putting out that publication. In related news on the security risk assessment front, HHS and the Office for Civil Rights have released a new version of the HHS Security Risk Assessment (SRA) tool. This tool was originally released in 2014. Some of you may be familiar with it as a freely available, downloadable tool, kind of an executable tool to support the security risk assessment process. This new version, some eight years later (check my math on that), has some bug fixes as well as an Excel version for organizations that do not have access to Microsoft Windows. So I'm going to lay it to you straight here on this one. I work for OCR at HHS as a HIPAA expert witness. I do. I love them to death and I love everything they do. So I'll say that first. But here's my critique. My experience has been that this tool is mostly applicable for very small organizations that really don't have any resources to either hire cybersecurity specialists in-house or engage consulting firms to conduct their security risk assessments. And frankly, the tool doesn't really scale very well for most organizations and can be super clunky to use for any kind of meaningful risk analysis on an enterprise level.

Brian Selfridge: [00:29:34] So I'm not entirely knocking it. I think it's great for super small organizations that really don't have anything. And I am very appreciative of the efforts of HHS and OCR to develop and promulgate free resources to help the industry in these critical areas like security risk analysis. Please, please, please keep doing that. I think it's wonderful. Every little bit helps. But for those listening who are going to go out and download the Excel version or the executable here, don't get your hopes too high on its ability to replace your current methodology for security risk analysis. That's my story and I'm sticking to it on that front. In other updates from HHS and OCR, OCR made an announcement recently that they plan to issue a video with updated guidance for cybersecurity, quote unquote, "recognized security practices." This is a follow-up to their prior announcement of the HIPAA safe harbors of sorts, as they were called earlier, that OCR told us about last year or thereabouts. They said, look, if you deploy industry-recognized security practices, we're going to consider that in the reduction of your fines and enforcement activities if you do experience one or more breaches that result in a subsequent investigation. So it's kind of this incentive to say, look, there's a carrot here for you to make proactive investments in cybersecurity.

Brian Selfridge: [00:30:53] But to do that, the expectation is that OCR and HHS have to, and will, provide some guidance as to what they mean by recognized security practices. So that's what this is all about. That was the initial announcement, and now we're starting to see some of the detail come out, and they're going to put a video out on it. We don't have the video yet, but there are a couple of ways we can get some clues into what these recognized security practices might look like. Since this started, in all the conversations I've had with everybody in this whole ecosystem, the general consensus is that recognized security practices are going to rally around cybersecurity frameworks like the NIST Cybersecurity Framework (CSF) and the HITRUST Common Security Framework (also CSF, but a different CSF), at a minimum. So set your mind frame around that; that's probably where this is already headed in general. OCR says the new video will answer questions about how OCR is requesting evidence of recognized security practices. So, how are they going to audit you on whether you're doing the things, all the things, or not? The video is also going to include resources for more information and guidance about recognized security practices. And then they're also going to provide a summary of OCR's request for information that they put out in April of this year, asking the industry about their perspective on recognized security practices.

Brian Selfridge: [00:32:10] So they're not doing this in a vacuum, which is good. We're going to keep an eye out for that video. But in the meantime, we do have some insights into some of the feedback that's been provided to OCR as part of that April request for information from some industry groups. I want to highlight those for you, just to give you a flavor of what a handful of organizations are saying. This is by no means a comprehensive representation, but, you know, it'll give us some insights. So the first group is HIMSS, the Healthcare Information and Management Systems Society. I think that's what HIMSS stands for; I've been working with them all these years, and it's something like that, but you guys probably know them. So HIMSS weighed in and recommends the recognition of the NIST and HITRUST frameworks rather than trying to define a list of discrete cybersecurity controls that can be audited or reviewed by OCR directly. So basically: OCR, please don't create another framework for us. I think that's a reaction, in a lot of ways, to the OCR audit protocol, which was kind of this deeper dive into the HIPAA Security Rule and what type of evidence they'd be looking for when they come in on it. And that's still valuable and necessary. But I think the ask here from HIMSS is, hey, can we all just rally behind NIST and HITRUST and not try to reinvent this thing over and over again? Because then it just becomes another framework we have to manage to.

Brian Selfridge: [00:33:25] And that new framework may or may not be the right thing for us, whereas we know NIST and HITRUST work, right? They've been proven in the course of history and time here in our space. HIMSS also would like OCR to validate that a control is in place but allow organizations to determine for themselves how best to implement that control. Like, don't tell us that we have to implement multifactor authentication with 256-bit this and eight-character passwords and phone and SMS that. Don't tell us that level of detail, but tell us that we've got to have multifactor in place for external connections, or point back to NIST or HITRUST frameworks that say something like that. So I think they're asking, again, don't be too prescriptive here. There are probably a lot of reasons they're asking that. Why? Because these things change, right? Whatever OCR puts out in 2022 could, a year or two from now, be totally valid or invalid depending on how the technology shifts and how systems work. So you want to be careful about being too prescriptive. HIMSS also suggests that OCR should earmark some of the fines they levy on business associates and covered entities for violations, and use those funds to distribute educational materials and other resources to HIPAA-regulated entities and ensure that all organizations have the knowledge and resources to prevent or mitigate attacks.

Brian Selfridge: [00:34:47] So I love that. That's something that's been kicked around, and OCR has been asking about it: hey, should we reapply some of these funds, and where should we put them? I think that's appropriate and necessary and really great stuff. Another group that responded was MGMA. I am familiar with this organization, but I am not recalling what the acronym stands for. You'd think I could have looked it up, but I didn't. Sorry about that. But MGMA represents a bunch of medical groups and physician groups, and they're very much advocates for the healthcare delivery space. Similar to HIMSS, MGMA also asked HHS to avoid prescriptive control requirements, saying, look, HHS should provide HIPAA-regulated entities with the flexibility to choose which recognized security practices to adopt, as there are vast differences in the size and financial capabilities of medical groups. I think they're really alluding to a lot of the smaller physician practice groups out in rural areas versus large regional and national health systems, and understanding that there's this spectrum of healthcare delivery organizations. Just be careful about saying everybody has to deploy the latest and greatest endpoint protection, advanced this, zero trust that, pick your new acronym of the day, artificial intelligence this and that. You know, your single-doc practice is just not going to do that, or be able to do that, financially or logistically or otherwise.

Brian Selfridge: [00:36:09] So I think they're saying, look, just be mindful that there's a range here and give us a menu to choose from that's appropriate for our organization and our size. And I think there should be some clarity too, because what you don't want is everybody working to the lowest common denominator, with large multinational, multibillion-dollar health systems with endowments spending a dollar on security and saying, well, you know, we use the HIPAA SRA tool and we're good. Like, that's not a recognized security practice, right? So in my view, I think it has to scale up and down the spectrum. The third organization, and the last one I'll mention, that weighed in on this request is the sci-fi organization. They added their thoughts to the mix and asked HHS to make sure that emerging technologies are considered in the new requirements. They cited prior challenges with trying to apply HIPAA models to iPhones when they first came out. I was a security officer during that time, and it was challenging. Everybody was talking to us like, well, what's the HIPAA interpretation of iPhones? And initially it was like, you can't use these things. They're not encrypted, PHI goes on them directly and then gets stored in iCloud.

Brian Selfridge: [00:37:15] It was pretty bad, right, right off the market when they first came out. And then, of course, it got much better, and now we're all like, oh, of course it's secure. But in 2008, 2009? Not so much. So they would like to see, and I would like to see, more education and ongoing guidance as these new technologies arrive, and kind of have that be an ongoing thing from HHS and OCR. I think that's fantastic. So some good stuff there. We'll keep you posted once OCR releases their video on these updates and provides more insights into recognized security practices. And let me know if you have your own feedback on what you would like to hear and see, and we can help get that over to the appropriate authorities, to the man. We'll tell him what you want. All right. We have lots of HHS updates today. Apparently it's a coincidence; I'm not hunting this stuff down. There's just a lot of movement, which is good to see. The HHS Cybersecurity Program issued a warning on June 2nd about a dangerous malware called Emotet. I probably said it wrong, but that's the spelling, so you can say it however you want. They've said it's responsible for a majority of malware infections in healthcare organizations, which is a pretty strong statement. Emotet was first detected in 2014 and is primarily delivered via email phishing campaigns.

Brian Selfridge: [00:38:30] Those types of things. It's considered to be one of the most dangerous malware variants, infecting one in five organizations worldwide. Holy crap, that's a lot. Forgive my French; it could have been a lot worse, but I'm going to leave that one in there. So, four things to know about Emotet. The malware includes a dropper for delivering other malware variants (oh, great) and is offered to other cybercriminal groups under the infrastructure-as-a-service model. Oh, I love that. It's like ransomware as a service; now we have this malware as a service. So, so efficient. 80% of malware infections in healthcare organizations involve Trojans, and Emotet was the most common Trojan deployed in attacks on the healthcare sector. Emotet is operated by the Mummy Spider threat group. I love these threat groups, man. Their names are better than band names in so many ways. So there you go. Keep an eye out for Mummy Spider. The US, Canada, and Europe had successfully taken down the Emotet infrastructure in January of 2021 and removed and disabled the malware from infected devices in April 2021. But, as we know how this stuff goes, the cyber group began rebuilding Emotet in November 2021, just months later, and they now have 246 servers online that are doing their thing and racking up the points of infection. All that is to say, HHS says you should be worried about this.

Brian Selfridge: [00:39:50] I think their stats are pretty compelling, and I think you should be too. So do your homework on that one. Talk to your SecOps teams and your operational folks to see if you've got a handle on variants. Run some scans, just make sure you're covered there. Nasty Trojan. All right. Another really quick update from the federal government involves an announcement from the Cybersecurity and Infrastructure Security Agency, or CISA, that they are developing a guide for helping organizations overcome the challenges of supply chain risks. So it all comes back to supply chain risks. We started with Facebook; we'll come back to it. We've covered this evolving story previously at length over the last year or two, if you've sort of been watching this, so I don't want to rehash it. But this is a follow-up to the catalyst supply chain breaches with SolarWinds and Kaseya (as I mentioned earlier, they blamed the intern), which all sparked President Biden's executive orders on supply chain risk. There's two of them that we're now seeing play out in execution. Right. So the executive order starts with a big directive saying we shall do this, and here's the direction we're going to go, and you have to produce X, Y, Z, but it doesn't tell you exactly how to get there. It tells you what needs to be done. So CISA's stated goal here is to turn what needs to be done about supply chain risk into how to do it.

Brian Selfridge: [00:41:04] And I'm excited to see this work product when it gets released. No indication yet of when CISA is going to put this one out, but I'm pleased that they're working on it. And you heard it here first: it's going to come out at some point, and we'll tell you all about it when it does. That's good progress; we like incremental progress here. All right. I'm going to wrap up with a few lightning-round updates on threats and enforcement activities in our space to keep you in the loop. First is a report about the FBI's successful thwarting of a nasty cyber attack from Iran on Boston Children's Hospital. You heard that right: Iran's attacking children. Come on. Where are the lines drawn? FBI Director Christopher Wray said Iranian government-backed hackers were behind an attempted cyber attack on Boston Children's Hospital in 2021. FBI agents learned about the planned attack from a tip provided by some unspecified intelligence partner, some secret FBI source. The agency was then able to warn Boston Children's Hospital, which allowed them to stop the attack before the hackers had a chance to do damage. So that is an awesome story of cyber intelligence, threat intelligence, secret FBI stuff. Mr. Wray called the attack "one of the most despicable cyber attacks I've seen." He did not detail the motive behind the attempted hack, but he said that Iran and other countries have been hiring hackers to conduct attacks on their behalf.

Brian Selfridge: [00:42:25] So this might be one of those things where, you know, you let your mercenaries loose on your enemies, and they may choose tactics that are a little beyond the pale of what you would expect. Let's hope that's the case here, but who knows. The FBI director also emphasized that private companies need to build relationships with the FBI in order to thwart and stop nation-state hackers and ransomware gangs. I think that's really important. I remember when I was in a security officer role, we had a local FBI field agent that, unfortunately, I did have to call and bring in when we had a large-scale malware event. And they were helpful, largely just validating stuff we had seen already. It was a zero-day type thing, so we were kind of on the front end of providing them with intel, which was interesting, but they had great resources in helping us respond quickly and brought some folks out on-site to help us through. So definitely build those relationships. Look up who that person is, reach out, and say, hey, I want to make sure you know who I am and that I'm responsible for this role, so that when the stuff hits the fan, you have some help from the FBI and don't have to scramble to figure that out.

Brian Selfridge: [00:43:32] And that should be part of any standard incident response plan these days. Not just the FBI, but you've got CISA, you've got local law enforcement; you may need physical security support and other things. So get that Rolodex going of all the entities that can help you there. Scary stuff there on the attack against Boston Children's. I'm very pleased that it didn't come to fruition. Kudos to the FBI for their work to protect Boston Children's and the patients and families they serve, and thank you for all the stuff you folks do out there for us every day. All right. We like good news, right? That was a good news story, sort of scary, but good news. So I'm going to keep going down that path with our last couple of updates. A US Justice Department operation shut down a massive dark web marketplace recently: the SSNDOB marketplace. Not a creative descriptor, but one that I think hits home, right? Social Security number, date of birth marketplace. It was a series of websites that listed more than 20 million Social Security numbers for sale, and it was dismantled and seized by the DOJ. The SSNDOB administrators created advertisements on dark web criminal forums for the marketplace's services, provided customer support functions, and regularly monitored the activities of the sites, including monitoring when purchasers deposited money into their accounts.

Brian Selfridge: [00:44:51] So this was an international operation to dismantle and seize the infrastructure, a result of close cooperation with law enforcement authorities in Cyprus and Latvia. Man, great work there by the DOJ. Congratulations. Also on the theme of the dark web, this has been a big trend that we've been seeing, so here at Meditology we've been partnering with organizations to do dark web searches. I don't want to say it's easy, because this is an area that these companies specialize in, but they've got hooks into all the different dark web forums and marketplaces like SSNDOB and many more, where they can see which organizations have been compromised and which information about your company is being shuffled around. They can do some queries and really, really cool stuff. We work with a company called DarkOwl and have incorporated a lot of their dark web research into our risk assessments and other things. So if you're not doing that today, get in touch with us; we can help you do that. It's relatively cost-effective and really important to understand whether your information is out there on the dark web markets and what to do about it if so. It gets you ahead of the big headline-grabbing breaches, if you don't want to wait for that to happen in the public sphere.

Brian Selfridge: [00:46:02] All right. In other good news, an international law enforcement operation codenamed First Light 2022 has seized $50 million and arrested thousands of people involved in social engineering scams worldwide. I mean, this is huge. This operation was led by Interpol. Interpol is the international policing agency that sort of chases people down when they try to run away to other countries, right? So if you plan on pulling off the great heist and then moving to some other country, Interpol's probably going to be there to gobble you up. The operation was led by Interpol with the assistance of police in 76 countries. Wow. It focused on social engineering crimes involving telephone deception, romance scams, business email compromise scams, and related money laundering, as you might expect. The operation, which lasted two months between March and May of this year, produced the following: 1,770 locations raided worldwide. Wow, I can't imagine the cost to do that. Some 3,000 suspects identified. Some 2,000 operators, fraudsters, and money launderers arrested. 2,000 people! It's amazing. 4,000 bank accounts frozen, and $50 million US worth of illicit funds intercepted. I'm sure there's more of it; I'm sure they try to get as much as they can. There's one case they highlighted where a Chinese national (I was going to say guy, but I don't really know if it's a guy or not) defrauded 24,000 victims out of $35 million via social engineering attacks and tactics.

Brian Selfridge: [00:47:36] Wow. I mean, we know social engineering is a big deal, right? I do this as a pen tester; we do all this stuff and we try to prove the point. But $35 million, 24,000 victims, from one person? Incredible. So awesome work from Interpol on this operation, First Light 2022. Great code name. Absolutely amazing. Got to keep that stuff up. All right. I'm going to close this episode with a quick reminder that the OCR HIPAA exemptions for telehealth are set to expire soon. These are the ones from the beginning of the pandemic, if you remember, when Zoom was cropping up and all this other stuff, and we were like, well, we don't know if we can do remote telehealth using these platforms; we haven't vetted them for security, all this stuff. So this was the exception where OCR said, hey, we're not going to pursue HIPAA Security Rule enforcement on telehealth for a little while here. You get a break; let's get through this pandemic. Now OCR is signaling that, hey, we're kind of getting through this pandemic. Good news for everybody. They haven't said exactly when the expiration is going to happen, but they're issuing guidance on how to get prepared for it, including reminding organizations about the need for business associate agreements (here we come back to the first story, the Facebook one, right?) and conducting security risk analysis of telehealth products and vendors.

Brian Selfridge: [00:48:48] Again, back to the first story. Oh, and by the way, NIST put out a telehealth security standard, a control standard. If you haven't seen that, it's a great time to, I was going to say dust it off, but it's barely broken in. It's very green; it was just put out last year, I think, or earlier this year. So, you know, conduct an assessment of your telehealth products. Leverage the NIST telehealth security framework if you can, or just the NIST CSF or HITRUST or whatever. No big deal, right, which one you use, but do those assessments. And then, of course, OCR says don't forget all the good HIPAA Security Rule stuff that you were supposed to be doing before the pandemic kicked off relative to your telehealth and other third-party vendors. This is nothing new. So the holiday is coming to a close, folks, and it's time to get back to the business of securing these platforms. We appreciate the heads up, and you've now been passed the heads up as well. So make sure we get our game together.

Brian Selfridge: [00:49:39] So that's all for this session of The CyberPHIx Healthcare Security Roundup. We hope this has been informative for you, and we'd love to hear from you. If you want to talk about any of this, just reach out to us at [email protected]. That's all for this session. So long, and thank you for everything you do to keep our healthcare systems and organizations safe.