Listen to learn how to identify and measure risk across the organization.
Many organizations struggle to translate cyber risk to business risk. When organizations understand how to identify, quantify, and communicate risk, they give senior leadership the tools they need to apply resources to mitigate that risk.
Ryan Leirvik, Founder and CEO of Neuvik Solutions and author of Understand, Manage, and Measure Cyber Risk: Practical Solutions for Creating a Sustainable Cyber Program, sits down with Host and Principal Security Analyst Jen Stone (MCIS, CISSP, CISA, QSA) to discuss:
Resources:
Download our Guide to PCI Compliance! - https://www.securitymetrics.com/lp/pci/pci-guide
Download our Guide to HIPAA Compliance! - https://www.securitymetrics.com/lp/hipaa/hipaa-guide
[Disclaimer] Before implementing any policies or procedures you hear about on this or any other episodes, make sure to talk to your legal department, IT department, and any other department assisting with your data security and compliance efforts.
Hello, and welcome back to the SecurityMetrics podcast. My name is Jen Stone, and I'm one of the principal security analysts here at SecurityMetrics. If you're new, welcome. We have tons of back catalog.
Go and check it out. We have so many topics about so many things. What we do is we like to talk to people who have knowledge, experience, or maybe an interesting product. So it kind of runs the gamut of security and compliance.
Hopefully, you find something here that is of value to you today. I'm very excited to talk to our guest. And let me tell you a little bit about him. I'm gonna read the bio so that I don't miss anything.
Ryan Leirvik is the founder and CEO of Neuvik Solutions, a cybersecurity services company that solves uniquely complex technical and talent vulnerability problems through advanced assessments, cybersecurity training and education, and integrative risk management. He has over two decades of experience in offensive and defensive cyber operations, computer security, and executive communications.
That's the big part for today. Ryan formerly served as chief of staff and associate director of cyber for the US Department of Defense, a cybersecurity strategist with McKinsey and Company, and a technologist at IBM. He's the author of Understand, Manage, and Measure Cyber Risk: Practical Solutions for Creating a Sustainable Cyber Program. Ryan, thank you for joining me.
Jen, thanks so much for having me. Really good to be here, so I appreciate it.
I think the most important question I have first is, did I pronounce your name right?
That's it. Perfectly.
Perfectly. Yeah. Even for me already.
Okay. Yeah.
That's it's a hard one.
I think the reason that I wanted to talk to you so specifically was we're all talking about risk. Most of the people that I work with, their primary concern is risk, cyber risk.
But a lot of them really struggle with how to put that in context and communicate it as business risk. And actually, let's back up from there. A lot of them don't even know what they mean by risk.
Can you kind of help us understand what risk is?
Yeah. Happy to give a point of view on this because it's actually one of the things, Jen, to your point. Surprisingly, a lot of organizations, especially when it comes to cybersecurity, haven't really laid out specifically, in an organized way, what we mean by risk. Right? Which seems sort of odd, right, when we have insurance organizations and banking and finance organizations that are very clear on what the risk, the impact to their business and business model, is.
Mhmm.
But for some reason, you know, there doesn't seem to be that clarity in most companies Yeah. That aren't either banking, finance, or insurance. Right? Sort of all the others, where it seems that the definition seems to be missing.
Yeah. So, you know, I'm certainly not the expert on risk definitions, but I'll tell you a couple that I like and I've seen that work well in cyber. So there's a number. Right?
So the ISO has a good one. You know, FAIR has a has a good one as well. There's a number of different sort of models out there that work. The one I gravitate to mostly, because largely we do a lot of work in the States.
Right? And the CSF has picked up. Sure. NIST has a good definition sort of in one of its IRs.
It's the NIST IR 7621, I think, and they define it categorically as a threat plus a vulnerability, right, plus likelihood, plus impact equals your risk. Mhmm. Right? In this way, at least, you know, from a definitional standpoint, that's the one I like largely as a professional because it gives, you know, executives or managers or even risk managers a categorical way to present data Right.
Relative to the cyber risk. Right? And then at least it helps us from a from a framing standpoint.
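As a rough sketch of the categorical model Ryan describes (threat, vulnerability, likelihood, impact), here is one illustrative way to put it in code. The one-to-three scales and the multiplicative score are assumptions for the sake of example, not a formula taken from the NIST publication.

```python
from dataclasses import dataclass

# Assumed categorical scale for the example; not from the NIST text.
LEVELS = {"low": 1, "medium": 2, "high": 3}

@dataclass
class RiskEntry:
    threat: str         # e.g., "ransomware operator"
    vulnerability: str  # e.g., "unpatched VPN appliance"
    likelihood: str     # "low" | "medium" | "high"
    impact: str         # "low" | "medium" | "high"

    def score(self) -> int:
        """Simple likelihood-times-impact score for ranking entries."""
        return LEVELS[self.likelihood] * LEVELS[self.impact]

entry = RiskEntry("ransomware operator", "unpatched VPN appliance",
                  "medium", "high")
print(entry.score())  # prints 6
```

The point of the structure is the categorical framing: every entry names its threat and vulnerability separately, so the data can be presented to executives in a consistent shape.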
Sure. It's that quantifiable piece that that also helps prioritize. Right? But but, I mean, even stepping backwards, I wanna give a shout out to my HIPAA folks that are listening.
In the HIPAA world, there are only three recognized security practices from HHS OCR. One of them happens to be the NIST CSF. So I'm glad that you mentioned that because what we're going to talk about today has a direct applicability to people in the HIPAA world that have to start with that risk analysis. How do we know what risk is? And so, hopefully, this will speak to that as well. But it's such a broad category for anyone, not just, you know, compliance or regulatory needs.
It seems to be all consuming. Right?
Absolutely. But that compliance and regulatory need is part of that impact.
Yeah.
And these are some of the things we don't always think through. Right? I mean, kinda look through that model, just to continue to use this one for, you know, discussion's sake. It drives you into what's the impact to the business, and it categorically puts things, you know, in different categories, sets them aside.
So a threat is a threat, a vulnerability is a vulnerability. It needs those two to come together to have impact on the business Mhmm. And leave likelihood to the side for now. And that impact, like, in the HIPAA world, yeah, if you lose a certain type of data or give access to a certain type of data, right, that being data protected by HIPAA in the US, right, there's a specific impact to the business in terms of fines.
Right? So that's one category or one element of risk that is really fairly easy, right, to get a number around. Right? Back in the day, we used to struggle with, like, you know, how much is a breach worth, right, or what's the impact of the breach.
And it was relatively difficult to calculate because there weren't sort of defined rules around, alright, well, what do we mean by impact? But fines, and regulatory fines especially, are fairly straightforward. And depending on the volume of data that you lose, you've got a pretty straight calculation saying, hey.
If this data were to be breached or handled in a certain way with unauthorized individuals, here's the fine we'll be seeing. And that's a fairly, you know, straightforward way of defining risk, if you will.
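The volume-based calculation described here can be sketched in a couple of lines. The record count and per-record figure below are invented purely for illustration; real regulatory penalties (HIPAA's included) are tiered and case-specific, not flat per-record fees.

```python
# Hypothetical figures for illustration only -- not real fine tariffs.
records_exposed = 50_000
assumed_fine_per_record = 150  # assumed dollar figure

fine_exposure = records_exposed * assumed_fine_per_record
print(f"Estimated fine exposure: ${fine_exposure:,}")
```

Even with rough inputs, this gives the business a defensible dollar figure to anchor the impact conversation.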
I think this is a really important nuance to this conversation. It's not even nuance. It's like groundwork, foundational, because a lot of times when I talk to organizations about risk, the IT or technology or security side of the house talks about risk in what almost feels like a completely disconnected way from the way the business discusses it. And so they get confused about what are they, how are they actually talking about impact, for one.
People rarely talk about the impact related to regulatory compliance issues. And then, you know, the technology side of the house will say, well, our impact is we don't have this system and these people will be calling us because they can't access it, or it'll cost us this much to replace it, without additional insight into what the business sees. And so I don't see technology and business sides often. I don't see them at the same table in the same room doing risk analysis.
Yeah, Jen. That is the center of the problem right there. Right? The inability to bridge the gap between the technological problem, right, and the business problem. Which is largely interesting because if you take a big step back, you're in business to do something.
You're organized to do something, whether you're a nonprofit or for profit, right, or a government entity or a commercial organization. You're organized to do something. And that lack of understanding, or let's put it this way, actually, it's probably a better way of thinking about it.
Right? The inability to connect the systems risk, or the data housing risks, or, basically, effectively, the assets that you rely on on a day-to-day basis.
What is the impact to the business, the whole reason you're together? Mhmm. Right? Should one of those things, you know, be compromised, unavailable, right, or lost, right, or manipulated in a way that you didn't expect.
And that's the real crux of the issue. Right? And the ability to sort of sit down and bring business leaders to the table, or at least those managers that have an understanding of what the business does Mhmm. And then make that connection between the asset that is effectively at risk and the business impact is where the risk sits, right, and the quantification of risk.
And what's really interesting is we talked about, what was it, the NIST IR 7621 for the definition of risk? What I love about that particular publication is it also takes a next step in defining, alright, well, here are some categorical ways, depending on your asset, of identifying, either by high, medium, low, right, or some sort of dollar figure, what the actual impact is. And now if you go through that type of an exercise, right, lost work, lost access, incident response, client fees, right, legal fees, right, reputation loss, all the things that we know to be true, a simple exercise now boils it down to, alright, what asset, what is the, you know, dollar amount or at least high, medium, low categorical view of risk?
And now we're at least talking about the same thing, and you've got an apples-to-apples comparison in the organization. And back to the top of your question, like, when we talk about risk, now we have a way of talking about it. Right? Right.
Somehow that's gotten lost somewhere. Yeah. You know? Just with all the noise and the busyness that, you know, is cybersecurity these days.
Right?
And I don't know if you're seeing this, but one of the real challenges there is that people get lost in who is supposed to decide how we mitigate risk. What are we mitigating? What are we accepting? Who makes the decisions on how we're handling the risk? Do you see that?
Oh my gosh. This is the most challenging management function I think we have in the cybersecurity world. And look, we're in this place because, let's be honest, like, cyber was some sort of mysterious place where stuff happened, and we weren't really sure, other than the technologists, who were like, oh, yeah.
No. This is really easy. Right? But the business gets lost because it's like, wait.
I don't understand the difference between a load balancer, and why is that important with a DDoS attack. It's like, okay. Well, let me explain it to you. Right?
But it's sort of mysterious. But what's interesting is, like, to your point, the real challenge, and probably this would resonate with a lot of folks that are in the business of actually trying to manage the risk, is when you have two responsibilities of being transparent about what the risk is. That is, identifying the risk, pulling up, representing data in an organized way for business leaders to make these decisions.
If you also have the decision authority and you own, quote, the risk or are responsible for it, you have two conflicting priorities, right, in your business. And this is trouble because nobody is gonna be transparent about the risk if you're also, you know, also responsible for it. Now I won't say nobody, but very few. It's, you know, sort of against our nature.
Like, oh, well, this isn't an important or this is because, you know, maybe you you have big implications on the line. Maybe your job's on the line. Maybe your raise is on the line. Maybe something else is on the line.
So, yeah, I see this all the time, and the challenge is separating those two out to say, look. We need an organization led by a person, you know, usually a CISO or Director of Information Security, to say, we need to be transparent about what the risk is because we're closest to it.
Mhmm.
Right? We see, you know, how many privileged access accounts we don't have. Right? We see how many employees are demonstrating poor security behavior.
Right? We need to be able to present that data to somebody in an organized way for them to make the decision on what's an accepted level of risk and what isn't. Right? Because you can't do both simultaneously.
I I see that all the time, and it can really, you know, short circuit an organization from truly looking at what the risk is. Right?
But I think you [inaudible] decision.
The way you said it sounds so sensible and it sounds so kinda easy to do. Yeah. We we need to quantify it and separate it out and be able to to communicate it so that they can make a decision. Are you kidding me? That is one of the hardest things for people to do. So how?
How? Who is supposed to? So I think not only do we have the challenge of quantifying it and communicating it, we also have the challenge that these people's perspectives and what drives them are gonna be different from those people's. They might not understand why the others find something more important. And then you've got the jargon that people toss in where, like, if your technology guys start getting a little defensive and kinda get their backs up, they just make the language more difficult instead of quantifying it in an easily accessible way for the business side.
How do we get how do we get past this?
Yeah. This is the hard part. It's a little bit, look. The tech-management divide is alive and real.
It always has been, and it probably always will be. Largely because, you know, one, management can really abstract things really well without thinking through the engineering behind it. Right? You know?
Right? Which would drive an engineer or a technology focused individual crazy. Mhmm. The opposite is without a set of requirements or real understood concrete levels Right.
Sometimes we as technologists can get really wrapped around the axle around, like, tell me exactly what it is that you mean so I know what to build towards.
Yeah.
In the gap between the two is where we've been living for a while. Mhmm. And there doesn't seem to be any sort of reference guide to bridge that gap.
Right.
Right? And that's what's hard because if you play on both sides of it, right, it's like, okay. This is what the real advanced threat looks like right now. Here's exactly how it works on your, you know, on your system, exactly the TTPs that are being used, right, the methodologies inside of that, and, you know, how quickly we can pivot to get sensitive information or sensitive systems offline or, you know, out of the organization. Right? Great. That's super fun.
Why does it matter?
And that's when it starts to break down. It's like, well, you know, great. You penetrate this one cloud instance. Right?
You swept all the credentials, and you got access to all this marketing data. Mhmm. Big deal. Right?
Zero to relatively low impact to the organization. You know, assuming for a second you don't pivot from there into something slightly more critical.
Sure.
Right? But on the other side of that is a management organization that would think, wow. We just had a breach. Well, you didn't really.
Right? Like, somebody was able to do it. So in the part between the two, there just wasn't this sort of guide to say, look. Let's identify what's critical.
Understand what the real problem is we're solving for. Right? What's real critical? Alright. That's critical assets.
Right? Okay. Well, how do we know we've identified those and we have some sort of management piece in place? Well, okay.
These are where frameworks come in. Right? Any of them. ISO 27001, NIST CSF.
Right? COBIT. You know, it doesn't matter, you know, but what matters most is you have a framework in place. And then the real piece, Jen, that I've seen missed out in all this is the next piece of, like, how do you then measure it?
Yeah.
How do you know you're doing a good job?
Right? Right. And this gets into the measurement conversation quite a bit. But the reality is, if you don't know the problem you're solving for, right, what assets are actually at risk and why they're critical, if you will, right, and you don't have a way of managing it, well, then you have no categorical way to understand, you know, those pieces.
And largely, I'd just tell you, like, that's why, you know, the book was written. It's like, this is missing. Somebody needs a guide just to sit down and be like, what's at risk? Why is it important?
Right? And how do I know I'm doing a good job? And it's really fundamental because you sit through that, like, one-to-one-to-one relationship between identifying critical assets to, you know, knowing where they are and understanding they're under management, and then setting a risk threshold around, okay, well, what's an acceptable, you know, loss rate, if you will, or loss, you know, dollar amount?
And so, yeah, we see this all the time, and this is where the center of the problem is. Right?
This is definitely true. And to make matters even more complex, different organizations have to protect different types of data at different thresholds.
And it doesn't just stop with, are we protecting the data, but are we protecting the systems? Because without the systems, can you do business or not? So I think what I see, fairly often is in the world of protecting credit card data, so PCI.
The PCI standard was written with very specific requirements because the payment brands and the groups that developed it did all the prework on what is the risk and what must you have in place to deal with the risk that we perceive. And so there's very little wiggle room. A lot of times, people get very frustrated with the prescriptive nature of PCI. Some people love the prescriptive nature of PCI.
But, basically, it is a group of, you know, people in the industry who said, we're going to shortcut risk for you and tell you what you need. Well, that's just one item that maybe an organization has to deal with. And so, great. You're PCI compliant.
And we hear all the time, well, just because you're PCI compliant doesn't mean you're secure.
True. Compliance does not equal security because you have to put it in context of the larger set of risks within the organization. And I think that's what people often get caught up in, their primary focus, whether it's PCI or HIPAA or whatever that is. They look at that risk and then they fail to recognize and address others in the organization. How do you see people successfully putting those different competing, or maybe conflicting, or just complex pieces together in a risk profile?
Yeah. It's not easy, but it's simple. Right? The not-easy part is we get distracted all the time.
Right? We're constantly resource constrained, you know. We're being pulled in different directions. Some, you know, management has a question on something that we have a simple answer for, but we still have to answer it.
Right? Our tech folks have, like, very complex questions that we need to think through what the right applications are. So the distraction level is way high. The simple answer is typically, you know, four things.
Right? Do we understand what the risk really is? Right? That starts with, do we have a risk profile or sorry.
Not risk profile. Do we have a definition of risk that we talked about at the top here? Right? That at least categorically allows us to explain what risk we're dealing with.
Mhmm. Two, do we have a way of calculating what that risk is? At least giving a prioritized list of risks relative to either each other or prioritized against, you know, let's just say, either budget or loss. Right?
Let's just put those two together. So, again, a way of understanding what the risk is in a palatable way. But then the next piece is, like, how do we manage it? But what sits in between, right, is the piece that we largely don't get right, and that's what assets are really critical to the organization.
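The "prioritized list of risks" idea can be sketched as a simple sort over a risk register, so that budget conversations start from the same ordering. The assets and dollar figures below are invented for illustration.

```python
# A toy risk register; entries and loss estimates are made up.
register = [
    {"asset": "payment gateway", "est_loss": 900_000},
    {"asset": "marketing CMS", "est_loss": 40_000},
    {"asset": "HR database", "est_loss": 300_000},
]

# Rank by estimated loss so risks are prioritized relative to each other.
prioritized = sorted(register, key=lambda e: e["est_loss"], reverse=True)

for entry in prioritized:
    print(entry["asset"], entry["est_loss"])
```

The same ordering could just as easily be done against budget or a categorical score; what matters is that everyone argues from one list.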
Right? So playing off of the PCI stuff. Yeah. Absolutely. They did a lot of hard work trying to identify what particular data types, even metadata are important.
Right? And if something were to happen to those or the CIA model, right, around those, what the particular fines would be and what the impact to either the underwriters, the holders, or the account holders, that is Right. Or the bank providers. Right?
Exactly.
Pretty clear. But what's happening now, and one of the biggest challenges I see in a lot of organizations, and I'm sure you're seeing it too, and your team is probably seeing the same thing, is when we're looking at assets, you know, we sort of broaden the view about what our real critical assets are because, depending on the business, they may not be data. It may be devices.
Yeah.
It may be networks. It might be users.
Mhmm.
Right? Sounil Yu came out with the Cyber Defense Matrix a little while ago. I think it's about two years now. And one of the critical pieces in there was this ability to sort of look at our asset classes, right, across five different asset types, data being one.
So I think it's devices, applications, networks, data, and users. Right? I mean, I think I got that right. And what's interesting is you start as an organization, it becomes critical to understand that we're not just talking about data.
We're talking about the assets because the systems that can get to other systems, the networks that can get to other networks. Right? You know, think OT. It's not necessarily the data.
I mean, it is at the end of the day about, like, what signals you're sending over the HMIs, whatever the you know, whatever you're actually sending to the system. I mean, as an electronic node, technically, it could be considered data. But the application and the system that gets there that could shut down a power plant. Right?
That's that's systems and networks. Right? Or applications devices and networks, really. Network devices, if you will.
And so that's what I've seen start to work, as organizations have started really taking, you know, an appreciative look into the assets. Right? And categorizing them in a way where we say, alright, these are devices, applications, networks, data, and users.
What is most important at what part of the business from an impact standpoint and start there. And that's the simpler answer. It's not easy. It's really hard.
I mean, tell me one organization that really does asset management really well. Right? Almost none.
It is not easy.
Oh, and then try to build a risk register off of that. Right? You get incomplete data off of incomplete data. But at least if you have a categorical view Right. Right, of, you know, it's like the silverware drawer. Right? If you just have the three slots in there, right, the knife, the spoon, the fork, what happens when the spork shows up?
You don't know where to put it. Right? It throws everything off.
So having a categorical view, you'd have a place for the spork or those types of items. Right? I don't know exactly what category that falls into, but, right, it is a utensil.
Sure.
So, yeah, it's things like that. And I think from an asset management standpoint, the the more prescriptive we can be about what type of assets we have and the assets we have fitting in there, then it makes categorizing those assets a lot easier. Right? And then prioritizing those based on criticality even easier. Again, not not easy, but it's it can be simplified. Right?
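The silverware-drawer idea can be sketched as a tiny inventory structure keyed by the five Cyber Defense Matrix asset classes, plus a catch-all bucket for items that don't cleanly fit yet. The asset names below are invented for illustration.

```python
# The five asset classes from the Cyber Defense Matrix.
ASSET_CLASSES = ("devices", "applications", "networks", "data", "users")

inventory = {cls: [] for cls in ASSET_CLASSES}
inventory["other"] = []  # the catch-all drawer for the odd utensil

def add_asset(name, asset_class):
    """File an asset under its class, or under 'other' if unrecognized."""
    bucket = asset_class if asset_class in inventory else "other"
    inventory[bucket].append(name)

add_asset("plant HMI", "devices")
add_asset("cardholder database", "data")
add_asset("underwriting model", "intellectual property")  # no such class

print(inventory["other"])  # prints: ['underwriting model']
```

The catch-all bucket is the design point: nothing gets dropped on the floor just because it doesn't match an existing category, and the "other" pile can be triaged later.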
Right. And some of the real benefit to this work is, as you were saying earlier, the management side almost black-boxes what's going on. You know, they just need to know a thing does a thing and this is the impact. These are the things that I need to care about. And then it's for the engineering team to be okay with finding a way to really create the black box for them. You know, this is the stuff that we know, and here's how we are going to take and give you the information you need. So we're not requiring the business to understand what a load balancer is and how it works.
Exactly. And this is where frameworks, you know, cybersecurity frameworks, to be clear, come in handy. Right? I mean, again, like, no one framework fits any one organization perfectly, full stop.
And they're not good at prescribing everything. Right? ISO 27001 has really good controls but, you know, kinda breaks apart at the higher-level categorical view. NIST CSF, really darn good at the, you know, high-level categorical view.
They're adding a sixth one here at, you know, the end of the year. But it starts to break down on the control side because, from what I understand from some of the authors, like, they don't wanna go that far in prescribing.
But the frameworks themselves, then, can act as a really important, again, framing or guideline of how to run a program.
And now you have that sort of description of the black box, if you will. Right. And then, from there, that's the starting point. What I've seen work really well in organizations is they start with sort of that, you know, let's just use the NIST CSF for it.
They start with the mutually exclusive categories of a framework. Right? So you've got identify, which means identify your most, you know, most critical assets. Right?
Mhmm. Then how do we protect them? How do we detect them? How do we respond and recover?
Right? Okay. Pretty straightforward categories.
Mhmm.
Right? To then put, hey, if my IR team is having a hard time, you know, with business continuity plans, connecting the business continuity plan. Okay.
Great. That goes in recovery. Right? You've got a place to put it to start having a conversation about the organization with the organization.
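The mapping Ryan describes, giving each program activity a home under one of the NIST CSF functions so both sides use the same nomenclature, can be sketched like this. The example activities are invented for illustration.

```python
# The five core NIST CSF functions (pre-2.0, which adds Govern).
CSF_FUNCTIONS = ("identify", "protect", "detect", "respond", "recover")

program = {fn: [] for fn in CSF_FUNCTIONS}

# Each activity gets filed under exactly one function.
program["identify"].append("critical asset inventory")
program["detect"].append("anomalies and events monitoring")
program["recover"].append("business continuity plan")

for fn in CSF_FUNCTIONS:
    print(f"{fn}: {program[fn]}")
```

With this in place, "that goes in recover" becomes a statement both a business leader and an engineer can verify against the same structure.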
Right.
Right? And if you could bring in business leaders to agree, hey. This is the framework we're gonna use and the nomenclature around it and have the the tech, you know, agree that, yep, these are the categories things gonna fall into. Now, you know, let's just take detect, for example.
You get into a world where, you know, with when you have certain anomalies and events that typically don't fit into something, well, good. They go into detect under anomalies events, and you can identify, like, certain user behaviors that you didn't expect. You now have a categorical way to represent the data back up to the business that you can then say, given this example, let's just say, alright. Listen.
We're having a difficult time detecting these types of anomalies and events, which means an attacker is sitting on the wire, or whatever the case might be, you know, in this dataset doing x y z. We won't get into that piece here. And the impact to the business is, this is our most critical asset category, and, you know, our privileged access management has broken down here. So they basically got, you know, access to x y z.
Okay. Now it's in context.
Yep. Right?
And now we can communicate with the business, all because we just applied it over a simple framework that gets us started. Right? And so I've seen that work really well over time to start to bridge that gap or, like, start to define that black box. Right?
Like, what are we actually doing? Well, you know, let's take this as a framework. Start here and improve upon it over time. Right?
And I was just gonna say something about context because that is so important to be able to even begin the conversations. There are technology groups that don't know what the business does. I mean, they have some sort of vague idea that the business does a business thing, but they don't know the business processes. They don't know how the business does what it does.
And so they have no way to put into context the underlying technologies that support these things. There are certain parts of a business that can maybe not be accessible for a day or two or a week, but other parts that they have to have immediately in order to do the regular, you know, day-to-day business that they do. And without really understanding that, it's hard for a group on the technology side of things to really inform what their understanding is of the risk to the various items. And so when we're talking about how do we classify, you know, assets, how do we classify data, without understanding the business perspective, it's going to be almost impossible for the technology team to get it right.
It has to be, you know, both sides working together to bring that full understanding.
Yeah. That's exactly right.
You you almost have to pull or invite Yeah.
Right, business managers, business unit leaders who are very tied into, effectively, the economics of the business or the organization. Like, again, whatever it is you're doing. You may be a government entity where you're not necessarily in business, but you have a primary function to, you know, follow or do. Right? Nonprofits are the same way.
You might not be in it for profit, but you're in it to do something. Mhmm. Right? Businesses, clearly, we're in it to you to sell a service or sell a good, right, or perform something.
Right?
And the challenge is, you know, to invite those business leaders and say, what is it that we do? Great. What are the systems that you rely on from an IT standpoint Mhmm. That make that work?
And what happens when those particular systems or data or, you know, assets in the asset category list, what happens when those are no longer functioning the way you'd like them to function Right. Or they were maliciously used in a way that we either didn't expect or in some unauthorized way? Great.
Now you've got the conversation about where a real cyber risk lives. Right? Because, you know, what's really interesting, the pitfall, Jen, that I've seen that, you know, should be avoided, or that we try to identify quickly so that you can avoid it, is what individuals think is important to the business Mhmm.
May not be important to the business, but may be important to them. Like, we talked about insurance earlier. Perfect example. Yeah.
Underwriting algorithms, right, based on empirical data, right, are really, really important for the business to get the risk right for whatever it is they're looking at. Right? But that's not necessarily the data that's at risk. Right?
So think about this for a second. Because the reality is, what can an attacker do with that other than modify it and get it wrong? Which, other than, you know, being a competitor, they really don't get any advantage out of. Or maybe they do in certain, like, malicious ways, but it's not usually an attack objective for an attacker. Right? They're usually going to either, you know, try to shut down the business in a way, which this might do, one, but more importantly, extract something from the organization that they find valuable on the outside.
Those algorithms aren't valuable on the outside because the only organization that can really use them is another insurance organization. Well, guess what? If you're doing work here in Canada or somewhere else, like, you can't use it because that's called illegal.
So it's not really valuable. Right? But what is valuable? Oh, I don't know. Who are you insuring, and what do they have in their houses?
Right? And, like, what's the codes of the alarm system? Because guess what? That's sitting on your data.
Mhmm.
Right? That's in there somewhere. And so the interesting thing is to tease that out. Alright.
What's really important to the business? Mhmm. So for example, like, works of art or very expensive jewelry or whatever. Because if an attacker, in this case probably a physical one, because you've gotta go pick it up.
Right? Were to buy that information somewhere, and it turns out you, the insurance company, leaked that data, well, guess what? Like, back to the fines. Your fines are gonna go up.
And that's the real cyber risk, not the algorithms themselves. Because, you know, quite frankly, somebody out there in podcast land is gonna prove me wrong, and I'm sure there are examples. But they're not as valuable as other stuff.
Well, and you brought up a good point because, you know, somebody out there might think it's important to them, and in their case, it might be important. But what we're talking about is business and technology within an organization talking these things out together and understanding together what actually is important. Because nobody on the outside is gonna tell you that as well as you, internally, in conjunction with the right people, can figure it out together, and that's really where the juice is for this.
Yeah. Hundred percent. And then you can provide actionable, oh, we're gonna bring up the measures word Yeah. Measures, right, that can measure actual impact to the business.
Yeah. Right? So back to the DDoS example. It's like, alright. So we had fifteen thousand DDoS attacks, you know, over the x period of time.
Okay. Great. So what? Mhmm. Right? Versus something like, oh, we had three outside attackers escalate privileges because our access management wasn't up to where the standard was.
Let's say there wasn't, like, you know, any type of multifactor authentication on it, and they got access to, you know, Acme Company's, like, the way they make the anvil. Mhmm. You know? It's like, oh, well, that's, like, the secret sauce.
Right? Because they have the best anvils for Wile E. Coyote or whatever. Right? The the problem there is, like, that's that's what's complicated.
And if you've identified, you know, at least what the problem looks like, then you're able to put measures against it that say, like, alright. Well, the one I like the best, right, is if you align it to the NIST CSF, right, percentage of assets identified as critical.
Yeah. Shocker. Really hard. Almost impossible to figure out. But if you start there, now you get an understanding of what the real problem looks like, and you get executives hyper focused well, maybe not hyper focused, but, you know, focused on the problem. The problem's at least well defined. Let's put it that way.
Right? And then you at least know what you're talking about to say, okay. Now from there, like, how are we protecting those assets? Right?
Like, how many third party applications have access to those, you know, to those assets? Like, oh, yeah. These are all things that we should be thinking about. Right.
And you can put measures against those that actually speak to the risk. Mhmm. Right? And now we've got, based on what we've been talking about, a one to one relationship, or a one to one to one relationship, between, alright.
Do we know what the risk is? Do we have it under management? How do we know?
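As a concrete illustration of the "percentage of assets identified as critical" measure Ryan describes, here's a minimal Python sketch. The inventory records, field names, and criticality labels are hypothetical, not taken from any particular tool or standard:

```python
# Sketch of the "percentage of assets identified as critical" measure
# discussed above, aligned in spirit to the NIST CSF Identify function.
# A real inventory would come from a CMDB or discovery tooling; this
# illustrative data and schema are assumptions for the example.

def percent_assets_classified_critical(assets):
    """Return (percent of assets that have been triaged for criticality,
    percent of all assets marked business-critical)."""
    total = len(assets)
    if total == 0:
        return 0.0, 0.0
    triaged = [a for a in assets if a.get("criticality") is not None]
    critical = [a for a in triaged if a["criticality"] == "critical"]
    return 100.0 * len(triaged) / total, 100.0 * len(critical) / total

inventory = [
    {"name": "underwriting-db",  "criticality": "critical"},
    {"name": "marketing-site",   "criticality": "low"},
    {"name": "legacy-ftp",       "criticality": None},  # never triaged
    {"name": "policyholder-pii", "criticality": "critical"},
]

triaged_pct, critical_pct = percent_assets_classified_critical(inventory)
print(f"{triaged_pct:.0f}% of assets triaged; {critical_pct:.0f}% critical")
# → 75% of assets triaged; 50% critical
```

The untriaged `legacy-ftp` entry is the point of the measure: until every asset has been through criticality triage, the percentage itself tells leadership how well-defined the problem is.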
Right. You know another place where I've been using risk conversations a lot, one I hadn't really been focusing on before, is the question of how do we train our people.
And so many people go, oh, well, we just send them to their annual cybersecurity training. Really? So I started asking people about their training. What did you get from it?
In what way does it apply to you? And people are like they're trying to be nice about it. People are they're so sweet. Yeah.
It was great. Uh-huh. No. What they mean is it had nothing to do with their job and wasn't going to affect their behaviors in any way.
And so what I've started bringing to the conversation is, if this role has their account in some way compromised, what's the impact to the business?
Is your training addressing those real attacks? Are you putting technology solutions into place in addition to the training, maybe either an automated technology piece or a process shift, that can address that? You know, how are you training in a way that actually deals with the risk that accompanies that person's role and what they do, what exposure they could bring if the risk isn't addressed? Have you seen that at all?
I love that. Yes. Hundred percent. It's hard. It's very hard. What what I've seen work really well, the one that I like the best alright.
I'm gonna stick with the NIST CSF sort of measure here. Right? It's percentage of employees demonstrating poor security behavior as, like, a risk indicator that you can align to, you guessed it, awareness and training in the Protect function of your, you know, NIST CSF for those following along at home. Right?
Yes. And now you've got that one to one to one relationship. And what I love about that, Jen, is, like, the ability to say, alright. Well, we can define that the way we want.
It's employees demonstrating poor security behavior by what? Right? And you can say, alright. The number of employees that have access to critical information.
Mhmm. Right? Are they falling for phishing campaigns? How many, you know, x over y? Right?
How many over how many are actually falling for that? Right? How many are actually failing the security training that we get, which, you know, yes, can be mildly effective, if not, in certain instances and I wanna couch myself and say very rare instances just checking the box.
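The "x over y" indicator described here can be sketched as a small calculation. The campaign data, employee names, and the scoping to employees with critical-data access are illustrative assumptions, not a prescribed formula:

```python
# Sketch of the "employees demonstrating poor security behavior"
# indicator: x over y, where x = employees with critical-data access
# who failed a phishing campaign, and y = all employees with that
# access. The data below is made up for illustration.

def poor_behavior_rate(campaign_results, critical_access):
    """campaign_results: {employee: True if they clicked the phish};
    critical_access: set of employees with access to critical data."""
    in_scope = [e for e in campaign_results if e in critical_access]
    if not in_scope:
        return 0.0
    failed = [e for e in in_scope if campaign_results[e]]
    return 100.0 * len(failed) / len(in_scope)

campaign = {"alice": False, "bob": True, "carol": True, "dave": False}
critical = {"alice", "bob", "carol"}  # dave has no critical access

print(f"{poor_behavior_rate(campaign, critical):.1f}%")  # → 66.7%
```

Note that `dave` clicking or not clicking never moves the number; scoping the denominator to critical-access employees is what ties the measure to business risk rather than raw click counts.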
That's right. Yeah. Yeah. Try doing that to security people. They're like, oh, yeah. No. We can walk around this all day long.
But, you know, having some sort of risk measure against that can then tease these problems out and say, well, we do this annual, you know, security training. Like, oh, great. How effective is it?
Yeah.
Right? How are they actually learning, and what is the impact to the business? Mhmm. And how are you modifying behavior relative to the impact to the business?
Because one of the best things an organization can do, right, from a security standpoint is be very clear with those that have access to critical information, so they know that it's critical information Right.
And that it's protected. Mhmm. Right? And then, of course, there's the monitoring and detection and those pieces.
You know? Yeah. Because, you know, even though insider threat is a thing, it's a low thing. I mean, I think
Kevin Mandia at RSA gave a nice talk. He's like, look. It really winds up being about one percent of your organization. The problem is that one percent can be really big in terms of impact because they know exactly what they're looking for.
But, again, right along the lines of, like, security behavior and that piece of the NIST CSF, you wanna have a risk measure that aligns to that to say, alright. How are we really doing? Not a tactical one. A tactical one is we ran x amount of security training and, you know, x amount passed, x amount failed.
Okay. Great. What you're really driving at is that impact. Like, okay.
Well, how many are demonstrating poor security behavior on a consistent basis, right, that have access to privileged information? Or here's the piece I was waiting for.
Because I hear over and over again, we're just gonna fire the people that fail over and over. Woah. Woah. What?
Woah. Okay. A, it means your activities to train them are ineffective, and you should look at yourself, buddy. And then the second piece is, do we care?
I mean, we care if people I care if people are falling for phishing a lot because it means that in their home lives, they might be vulnerable there too. You don't want people walking around vulnerable to things that you could give them a little bit of training for; it makes them better both at the company and at home. But if they're falling for it over and over again and they don't have access to anything that's gonna hurt anybody, there are guardrails around it because of automated things and policies.
Well, maybe they're just a super helpful person who just doesn't understand the malicious behaviors of others. Do we wanna fire them? How about let's find another way to make them a valuable resource in our organization.
Right?
Exactly. Yeah. Because this is where compensating controls can come in. Yeah. Right? And this is where really understanding what the risk looks like also comes in.
I love your point of view on that because the reality is, like, you know, there may be a benefit to that. But the other side is, are we looking at it closely enough? Like, is your training actually effective? Because some of those people like, again, we go back to the security community. You get training.
Sometimes there are security people that fail at training all the time. It's like, wow. This is so boring. Yeah.
Right? That they're just clicking through, and all of a sudden, like, oh, wow.
Guilty. Guilty. Yes. If I'm bored, I will fail.
And that's it.
And I know that about myself.
So Yeah.
Hundred percent. I'm right there with you. And what's the point of training? Behavior modification. Right?
So if your training isn't focused on what's important and isn't modifying the behavior in a way that the organization needs it to change, well, maybe you gotta look at your training program. Yeah. Right? Maybe it's not the person.
Maybe it's the program. Yeah. Right? I mean, the opposite is true too. Maybe sometimes, you know, there are situations out there where individuals may just not be paying attention closely enough and fall for, you know, the Saudi prince or not Saudi prince.
The Nigerian prince wants to do their Exactly.
Yeah. Because they're written for those that aren't paying attention. That doesn't mean they need to leave. Right? It just means they may be useful in some other way but that's a management decision. We're way off the rails.
Yeah. We can find ways.
That's not my decision.
We don't have to go to the nuclear option right out of the gate.
But Exactly. I think in those cases, it seems to be where people don't understand that there are different levels of risk for different people in the organization, and they need different training based on that risk. They need different guardrails based on that risk. And so the lack of quantification of the risk makes it so that maybe you're not making the best decisions on even training, because you don't understand the risk that varies from person to person.
Yeah. Exactly.
Exactly. In fact, this is a great spot for, you know, your SOC team, right, or just incident responders.
Right? You know, we all know the mean time to detect, mean time to respond, mean time to mitigate. Right? Mitigation is a hard one because it's very long.
Yeah.
But those are wildly important. It's like, alright. So let's just say you have, you know, a large percentage of individuals failing, you know, demonstrating poor security behavior. Alright.
Do they have access to critical information? No. Okay. Well, then, you know, the next step is do they have a pathway, right, to critical information?
Like, so for example, are the credentials also useful for others that have, you know, domain admin or admin somewhere else? Right? Fairly simple connection to make, right, from a, you know, Microsoft AD point of view. Right?
But the broader issue is like, alright.
Well, how quickly can your SOC respond Yeah.
If they do get ransomware, if they do, you know, click on this? Like, so maybe it's a good test. Right?
Mhmm.
I mean, there's a different way to look at it.
Right? Because if your mean time to detect is milliseconds I mean, well, one, congratulations. Yeah. You're actually doing that. That's amazing.
Exactly. You know? I mean, I'd love to see those metrics being measured in, like, literally, you know, some sort of millisecond. Right? Something sub-day would even be great. You know?
But, right, this is another opportunity to say, alright. Well, if you have those employees in the organization, well, how quickly can you identify what the issue is and at least have it identified and be working it?
Yeah.
Right? Not necessarily mitigation because that, for various reasons, could take a long time. But, you know, this is where, like, stacking measures in a way that all tell part of the story can be really helpful. Because no one of them, like employees demonstrating poor security behavior, is, like, the answer. Right? But it is a part of the answer, which is then compensated by, you know, mean time to detect and ability to respond and recover and things like that.
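The idea of stacking measures so each tells part of the story can be sketched as a small dashboard computation. The incident timestamps and the indicator values are hypothetical illustration data:

```python
# Sketch of "stacking" risk measures: compute a mean time to detect
# from incident timestamps and report it alongside a behavior
# indicator. No single number is the answer; together they tell
# part of the story. All values here are made up for illustration.
from datetime import datetime, timedelta

def mean_time_to_detect(incidents):
    """incidents: list of (occurred_at, detected_at) datetime pairs."""
    deltas = [detected - occurred for occurred, detected in incidents]
    return sum(deltas, timedelta()) / len(deltas)

incidents = [
    (datetime(2024, 5, 1, 9, 0),  datetime(2024, 5, 1, 9, 45)),   # 45 min
    (datetime(2024, 5, 3, 14, 0), datetime(2024, 5, 3, 17, 0)),   # 3 hours
]

mttd = mean_time_to_detect(incidents)
dashboard = {
    "poor_security_behavior_pct": 12.5,  # illustrative indicator value
    "mean_time_to_detect_hours": mttd.total_seconds() / 3600,
}
print(dashboard)  # MTTD works out to 1.875 hours here
```

Even this toy version makes the point: a high behavior percentage is less alarming when detection is fast, and vice versa, which is why the measures are read together rather than alone.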
Well, we didn't really talk about your company at all yet. And so I wanted to give you just a few minutes to talk about your book, talk about your company, whatever you wanna talk about as we wrap up.
Great. Yeah. Happy to. So we run a company called Neuvik Solutions. We do three things, you know, really well, we think.
One is we provide advanced assessments, so that's effectively your red teaming. Right? Mhmm. Advanced pen tests. Right? Demonstrating where the holes are and why they're important to fix. Right?
Then we do training. So we do a lot of software development life cycle, right, DevOps training. So that's along the lines of threat modeling. We do a lot of threat modeling. In fact, you know, our team's out there doing some great work on the thinking in terms of, like, teaching threat modeling in the SDLC to sort of push left. Right?
Identify the issues early enough so that the software isn't misused later in life, right, downstream, post-production.
So we do that, and then we tie it all together with risk management, which is to say, you know, the book's kind of a good lead-in on that one, because that's a function of, like, what is actually important. Like, why are these holes important? How would you mitigate these and rack and stack them in your organization, depending on however you're looking at it? And then, of course, from a, you know, DevSecOps standpoint, alright. How are you mitigating your vulnerabilities downstream, post-production, and what's the value of that downstream? That's a risk management function. So that's what we do at Neuvik.
Love it. You guys are doing some really interesting, important work. I love what you said about threat modeling. I think I could probably talk to you for another hour just about threat modeling. But I really appreciate you taking the time to come and talk about this today, and I hope we can talk again in the future.
Yeah. Looking forward to it. Jen, thanks so much for having me. Really appreciate it.
Thanks for watching. To watch more episodes of SecurityMetrics podcast, click on the box on the left. If you prefer to listen to this podcast, it's available on all your favorite podcast platforms. See you on the slopes.