The Essential Skills of Evaluating and Communicating Risk
Our guest is IT risk analyst James Dawson. James provides advice to global organizations on the issues of threat and cyber risk, and has also consulted with many organizations in the financial industry, including Danske Bank and Freddie Mac.
James shares his views on the importance of being able to evaluate risk, and to do so with open eyes and a level head. He emphasizes the value in taking risks in the workplace, especially for younger workers looking to make their mark. He shares his thoughts on threat intelligence, and the challenges organizations face when trying to cut through all of the noise.
This podcast was produced in partnership with the CyberWire.
For those of you who’d prefer to read, here’s the transcript:
This is Recorded Future, inside threat intelligence for cybersecurity.
Dave Bittner:
Hello, everyone, and welcome to episode 160 of the Recorded Future podcast. I'm Dave Bittner from the CyberWire.
Our guest is IT risk analyst James Dawson. James provides advice to global organizations on the issues of threat and cyber risk, and has also consulted with many organizations in the financial industry, including Danske Bank and Freddie Mac.
James shares his views on the importance of being able to evaluate risk, and to do so with open eyes and a level head. He emphasizes the value in taking risks in the workplace, especially for younger workers looking to make their mark. He shares his thoughts on threat intelligence, and the challenges organizations face when trying to cut through all of the noise. Stay with us.
James Dawson:
It's really funny, Dave. I'm kind of an older guy, and I know no one tells their age on the air, but I'm a 60-year-old guy. I got my start doing just regular e-discovery, believe it or not, and actually technology work at Lehman Brothers on Wall Street, where we were writing on SPARCstations in very old programs, trying to figure out how to tranche and how to analyze the risk associated with mortgage-backed securities.
Dave Bittner:
And so where did you go from there?
James Dawson:
I went from there into a different world. It was a little bit more of a technology world. We were doing analysis of bridges and infrastructure in New York City, and I went from being a structural engineer to becoming an IT professional. I like to use the term "technologist," because a lot of young men and women don't realize they're technologists, but they really are. And they're brought into it by their daily lives.
Dave Bittner:
How so? How do you come to that conclusion?
James Dawson:
Well, you don't realize it, but you're sitting there helping the boss, you're typing things, you're using different programs, maybe a little more sophisticated than, for instance, Word and Excel, the Office suite. You might be using an IDE, like Visual Studio .NET. I've taught with that at NYIT here in New York City, many years ago. But you start to do more complicated things, and you think it's just like your front-end, unstructured-data world, but really what you're doing is moving into being a technologist.
Dave Bittner:
And how do you define being a technologist?
James Dawson:
I like to think of myself as someone who can solve problems with the tools at hand. I think of technology as a tool. It's like a screwdriver, or a wrench, or a hammer that you have in the shed. If you're an artist or a creative person who writes, your tool is the pen, or the pencil, or the keyboard. Your tools might also be a paintbrush, or maybe a stage. What I mean is that for a technologist, our tools are the computer, programming languages, structured and unstructured data, and the ability to use applications to get things done.
Dave Bittner:
Now, you've spent a lot of time and have a lot of experience in the financial side of the house. Can you take us through and give us some insights on some of the things you've done and the experiences you've had there?
James Dawson:
Yes, I've been really fortunate in my career. I got really lucky every time I made a change. I did enterprise information governance, and also data governance and data strategy, very early on, when that profession was in its infancy, for companies like Goldman Sachs, MetLife, and JPMorgan Chase. And in those days, to be honest, I was actually winging it a little bit and greenfielding it. I didn't really know all the answers, and I didn't really know the right things to do.
I'm bragging now, but I was brave enough to say, "Hey, this is what I think." So I think that technology professionals, particularly the people in your Recorded Future audience, have got to take a little risk. They've got to get out there and say, "I don't really know everything, but I'm an innovative thinker, and this is the way I recognize risk." I think there's a big void in recognizing risk, and that's what I learned very early on with these financial services companies. I was on Wall Street as early as 1999. I know that dates me; I'm going to sound like an old man to most people. But that was when people took risks. They didn't know their technology, and they had to start saying, this is just the tool in my shed. I'm going to get something done, and I'm going to exhibit some good skill for this company using technology.
Dave Bittner:
What have you seen in terms of the maturation of that, the use of technology within those financial companies, going from when you were pioneering some of those ways to use it to now, where I suppose we have a pretty extensive set of best practices?
James Dawson:
We do. We like to use the term "leading practices," because no one knows what's best. I heard one of your other guests say "best practice," but it's really leading practices. What I mean is, no one really knows what to do, no one really knows the answer, and no one really knows which technology to use. So if you have the skill to explain something very, very complicated, something that is hard to understand, to a person in a high-level position, the CEO or the CISO of an organization; if you can explain some complicated risk, some complicated situation that involves not only technology but also people, their interactions with their environments, and whether they do something for love or money, I always like to say that; if you have the ability to explain that so the decision-maker understands the risk, then you're going to be successful.
It's not whether or not you're a great coder. It's not really whether or not you know great threat intelligence or any other great program. It's whether or not you have the ability to bring it down to the level of someone who's very busy, who has a lot on their mind, who has family and life issues just like you do, so they can absorb what you tell them and make a decision on it. That's what I've learned, Dave. I've learned that that is the skill to have.
Dave Bittner:
Yeah, it's a really interesting insight. I mean, I hear from many people that their advice to folks on the tech side of things, especially students who are coming up, is don't underestimate the value of that communications class, that journalism class, those English classes, of learning to write. You're not going to be able to make your case if you're ineffective at communicating it.
James Dawson:
Yeah. I always tell the students and the people that I mentor, "Look, write a friendly email." Even if you have a very, very bad thing to say, try to make it friendly: have a salutation, have an introduction, make sure you have something friendly to say at the end. And don't be afraid to put something personal up front. Why? Because the person you're writing to has a lot on their mind, they have 150 emails in their inbox, and they don't know whether you are bringing them important information, information that they need to act on. So to lead them in, it's like walking a horse to water, you want to give them something friendly, say something nice. Recently, I've been opening my emails with, "I hope you're safe and sound in this time of crisis with COVID."
There's a way to lead them in, to get them a little more comfortable. And then speak in layman's terms. Don't use acronyms. Don't try to sound technical. Don't try to sound like you know it all. Explain the situation, explain the risk, give a little bit of technology, give a little bit of your subject matter expertise; there's nothing wrong with doing that. And then try to get them to the point where they can adjudicate what you're telling them and make a decision.
I know that's a little far afield from Recorded Future. But I think about it, and I love the title of the program, because the people who have to make decisions need to make decisions about things that are not yet happening. They have to guess the future. That's probably a little far-flung, but you know.
Dave Bittner:
No, it's an interesting thing to consider. I mean, how important is it, and how important was it when you were experimenting with these things, that the organizations themselves gave you the leeway to do that, that they were willing to take the risks to let you try things?
James Dawson:
Well, that's a tough question, Dave. Remediation takes two forms. You're either putting out fires, fixing old things or things that are happening today and trying to figure them out, or you're acting on your knowledge, acting on what you know is likely to happen. Likelihood is probably the most valuable cyber risk analysis tool. You think to yourself, "Is this likely? Is it very likely? Is it unlikely? Is it very unlikely?" If you ask yourself that question first, then you'll know how much work you have to do, what you need to do, and what you need to explain to your managers or your constituents. Is it likely? Make that judgment first, because very often people spend too much time on the unlikely things and not enough time on the very likely things.
So try to make a judgment about likelihood. And that goes back to what we were saying earlier about the human aspect of threat, cyber, and managing APTs: you've got to look at the likelihood and see whether you really think this is going to happen. Do I need to escalate this right now, even though the impact could be very, very large to my boss or my organization, or is it just not that likely? I also like to say, Dave, is it an if of an if of an if? Will this perpetrator do this? Do they have the means? Are they ready, willing, and able, and able is the most important thing, to do the bad thing? Are they likely to do it? Are they able to do it? If you do that analysis first, you save a lot of work, because you're not analyzing and trying to figure out answers for a future so far ahead of time. You've only got a smaller bucket to work on.
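James's "if of an if of an if" test is, at bottom, a chained-likelihood check: each precondition multiplies down the odds that the full attack ever happens. Here's a minimal sketch of that triage in Python; the probabilities, scenario names, and the 0.05 escalation threshold are our hypothetical illustrations, not figures from the interview:

```python
from dataclasses import dataclass

@dataclass
class ThreatScenario:
    """A threat modeled as a chain of 'ifs' that must all come true."""
    name: str
    p_willing: float  # likelihood the actor wants to do this
    p_means: float    # likelihood the actor has the tooling or access
    p_able: float     # likelihood the actor can actually pull it off

    @property
    def joint_likelihood(self) -> float:
        # Treating the preconditions as independent "ifs": each one
        # multiplies down the overall chance the bad thing happens.
        return self.p_willing * self.p_means * self.p_able

def triage(scenario: ThreatScenario, escalate_above: float = 0.05) -> str:
    """Decide whether a scenario is worth escalating before deep analysis."""
    p = scenario.joint_likelihood
    verdict = "escalate" if p > escalate_above else "monitor"
    return f"{scenario.name}: joint likelihood {p:.3f} -> {verdict}"

if __name__ == "__main__":
    # Even individually plausible steps compound into a small joint number.
    print(triage(ThreatScenario("disgruntled insider exfiltration", 0.6, 0.5, 0.4)))   # 0.120 -> escalate
    print(triage(ThreatScenario("nation-state attack on a small business", 0.2, 0.3, 0.3)))  # 0.018 -> monitor
```

The point of the sketch is the shape of the reasoning, not the numbers: judging likelihood first shrinks the bucket of scenarios that deserve expensive analysis.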
Dave Bittner:
It reminds me of that notion that there are a lot of people who are afraid of flying, when statistically, you're more likely to lose your life in your car, or crossing the street in front of your apartment building, than in an airplane. But I think the way people visualize that end-of-life situation —
James Dawson:
Going down in flames or something?
Dave Bittner:
Yeah. I mean, it's horrifying and so it captures your imagination. And I suppose that's one of the things you have to do: make sure the folks you are advising don't let their imaginations run away with them on those highly unlikely but catastrophic events.
James Dawson:
Yeah. That's risk responsibility design, that's what we call it. Responsibility design, when it comes to cyber risk, is really difficult because you have to incorporate that scary aspect of it. Everyone always worries: what if this really happens? Right now we're in the COVID pandemic, and everyone's worrying, what if an entire country is taken out, and all of our critical people can't come in, or can't even work from home? Risk responsibility design has to take in that scare factor. Are we going to go down in flames, are we going to go down in that plane? Whereas if you actually look at the likelihood of it, what is the likelihood? I mean, I travel a lot. I think my statistics show that I've flown around the entire earth four times, whatever that means. I have no idea what that means, but I'm sure it's not good for my carbon footprint. But what is the likelihood that I'm going to go down in flames? Still, we're scared about it and we envision it. We see it in our minds.
But that's really not the way to convey the risk responsibility to the decision-maker. In cyber design and cyber risk, and whenever you're analyzing APTs, don't just put out fires. You've got to do that, you've got to put out the fires. But look at the responsibility design in the sense of: is this really likely to happen? Are you really going to have a plane crash? Is the organization really going to be hit with this threat?
As you and I know, and as you've discussed on many of your podcasts, 90% of all threats come from phishing. They start with phishing. And is it really likely that the perpetrator or the group is going to get beyond the initial phish to actually hook some data and act on it? Are they going to commit the fraud and actually abuse the organization and its data? Well, I don't know. I don't always know with all threats, and that's what I struggle with every day. I think about how many of those threats are really going to happen. What's the likelihood that it's really going to happen? It would be an if and an if and an if, and the bad guy or the bad gal would really have to be lucky to get to that point. There are not a lot of pros in the cyber bad guy world.
Dave Bittner:
Well, when you're being mindful about not letting your imagination run away with you, is there still a place in all these equations for a gut feeling?
James Dawson:
There is. And I think that we need to rely on that. I rely on that every day. I really don't know the answers. And I'm often presented with complex questions and I need to respond to decision-makers. And oftentimes, those decision-makers are compensating me, so I'm really worried about how I respond to them. But I do rely on my gut. I try to lean towards more of a human decision. People do good or bad things for either love or money in the world, and I think that that's very true. I know that sounds very, very basic, but I think it's true. And when I answer a decision-maker, especially one that is my boss or someone who's keeping me employed, I really, really worry about that. I worry about how I'm going to answer and I try to go with my gut feeling.
I know that when you're young and you're a young professional, you don't want to go with your gut feeling. You're afraid to take risks, afraid to give a bad answer. I had a colleague at one of the banks I advise; she didn't want to take the risk on a decision. She was worried about going ahead, using her gut feeling, and giving some advice to a manager or senior leadership, because she didn't want it to look like she did something wrong. But you've got to take those risks. You can't do everything right. You're not a perfect employee, and you're not going to make perfect decisions.
As a young person, I think you've got to go with your gut instincts, and you've got to take a chance and say, "You know what, I think we need to move forward with this. We need to raise this risk. We need to alert all the people." And maybe they'll all say, "Oh, that was a dumb thing to bring up," but that's a better risk to take than letting that actual threat get into the organization and do something really bad that results in reputational damage or multimillion-dollar losses in the end. Take a chance, expose yourself, and don't be afraid to admit that you could do things wrong, that you're vulnerable, and that you're not always right.
Dave Bittner:
I want to touch on threat intelligence and I'd love to get your insights on the role that you think it plays in an organization's defenses.
James Dawson:
I worry about threat intelligence in the sense, and this is just me being James, that it's way too much about putting out fires, responding to APTs, anything that's happening now. Organizations do a lot of very good, very hard work looking at current threats. Threats are very, very in-the-moment, in the sense that they're happening now. What I think is more important is to step back from that. Have your firemen and firewomen right up front, on the front line in the bastion, working on those threats that are pounding on your door and on your critical data.
But then step back and have a larger team, what I like to think of as a blue-purple team, maybe a blue team that's more focused on why these threats are coming. What are people after? Why are they trying to extort us? Why are they trying to extort the organization? Is it a disgruntled employee? Is it just an actor from another country trying to make money? Look at the bigger picture, and then you'll be able to put out your current fires much better.
Dave Bittner:
Yeah, that's interesting. So if you're walking out to get your newspaper every morning and finding that someone's lit your mailbox on fire, rather than just buying a new mailbox every day, maybe it's a good idea to find out who has it in for you?
James Dawson:
Yeah. Why are they after you? What is this? Why are these threats coming? I mean, we look at persistent threats. I would say the organizations I advise look at thousands of threats at any given time. They're coming in, they're very persistent, and they go higher and lower. In any given week you'll see a human aspect to those threats: people don't launch a lot of attacks at the beginning of the week, and then they get a little more aggressive on Wednesday and Thursday.
Look at those human aspects, the men and women, the organizations, and the groups of people who are directing those threats at your organization. That's where you need to focus your analysis. It's just like a fireman or a firewoman in a local town in Oklahoma: they'll be much better at going out and putting out the fire at the local farm if they know the people at the farm, they know what animals they have, they know the locations of the houses. They know what's going to happen when the threat, the fire, happens to the farm.
Be ready for that. Think about your perpetrators, think about their lives and why they're after you. If you do that, you will be much better at handling persistent threats, and all threats. Look at the reason that you have perpetrators, rather than just trying to put out fires and understand things in the moment. What is this? Why did this happen? Instantaneously trying to figure out what's going on in the threats to your organization just doesn't work.
Dave Bittner:
But you bring up an interesting aspect of this, which is, I would imagine as you're traveling around the world and dealing with organizations, doing the work that you do, it must be an important part of it to get that local perspective, to get those ears on the ground and to find out what the individual situation is so that you have those insights to inform what you're doing.
James Dawson:
There are a lot of great organizations and professional vendors out there that provide those services. I think some of them are your sponsors; I'm not trying to plug them or anything. But you can come up with intelligence. Intelligence is not the hard thing. There are a lot of sources for intelligence, and a lot of it is good. I think that 20% of it is bunk. I heard one of your guests from the government say that she didn't really rely upon some of that intelligence. But a lot of it is very, very good. I don't think that's the issue.
You can always get a body of information, and you can buy a technology that will assess it, run AI on it, and figure out whether it's going to give you any information about future threats. But it's very, very difficult to predict the future, and it's very difficult to predict what people are after you for, unless you look at the human aspects. Why are you being threatened? Why are people trying to extort you? Why are countries or organizations going after your bank or your hospital? There's got to be a reason.
And I know it's a little bit of slang, but it's either love or money. Either they loved you and now they hate you, they used to work there and they want to get back at you, or they're a country that wants to go after you for money, going after your critical data assets and your mission-critical assets, holding them for ransom so you'll have to pay to get them back. Those are the most likely threats that happen in the world. It's just a fact. I know it sounds very simplistic, but that's the fact.
Dave Bittner:
Our thanks to James Dawson for joining us.
Don't forget to sign up for the Recorded Future Cyber Daily email, where every day you'll receive the top results for trending technical indicators that are crossing the web, cyber news, targeted industries, threat actors, exploited vulnerabilities, malware, suspicious IP addresses, and much more. You can find that at recordedfuture.com/intel.
We hope you've enjoyed the show and that you'll subscribe and help spread the word among your colleagues and online. The Recorded Future podcast production team includes Coordinating Producer Monica Todros, Executive Producer Greg Barrette. The show is produced by the CyberWire, with Editor John Petrik, Executive Producer Peter Kilpe, and I'm Dave Bittner.
Thanks for listening.