Anna Delaney: Hello, welcome to the ISMG Editors' Panel. I'm Anna Delaney, and this is a roundup and analysis of the week's top stories with some of ISMG's leading journalists. Introducing today's team: Tom Field, senior vice president of editorial; Rashmi Ramesh, senior subeditor for ISMG's global news desk; and Mathew Schwartz, executive editor, DataBreachToday and Europe. Great to see you all. Rashmi, where are you? It looks amazing. I want to jump in.

Rashmi Ramesh: So, I am currently in India's northeastern state of Meghalaya. Behind me is a living root bridge, which is a suspension bridge made entirely of the live roots of the trees around it. Most of them are about 500 years old, and there are many of these in the state. I think this specific one is about 180 years old, so quite young.

Anna Delaney: Yeah, quite young.

Tom Field: By Indian standards.

Anna Delaney: Tom, a very different backdrop to Rashmi's.

Tom Field: Also quite young. Yes, this was taken flying en route from Washington, D.C., to New York City last week. We spent a little time circling around Manhattan, and Manhattan was kind enough to pose.

Anna Delaney: Very good, very good. And Mathew, back in the wild.

Mathew Schwartz: Back in the wilds of Dundee, Scotland, yes, the nation's fourth-largest city. This is the Law, which is Scots for a hill, and it's the largest hill in Dundee. There's a war memorial on the top, but it's a lovely walk to get up, especially when there's a beautiful weekend like we just had.

Anna Delaney: Gorgeous. And I'm back to showcasing the South of France in a marvelous, quaint town called Ramatuelle. It's my favorite time of the day: dusk, Aperol o'clock, of course.

Mathew Schwartz: Magnifique.

Anna Delaney: Magnifique. Tom, how does it feel not to be flying this week and to be grounded?

Tom Field: It actually is pretty good. It's been a busy month or so, between what we had in New York, the Healthcare Summit; a Washington, D.C., Government Summit; roundtables in Chicago, Charlotte and New York; and wherever else I'm forgetting. It's good to be home for a little bit.
But it's a good time to reflect as well on everything that I've been privileged to see over the past month or so.

Anna Delaney: So, talk to us about the Government Summit or the other roundtables. What were some highlights for you?

Tom Field: Oh, well, I think the Government Summit was important. It's the first time we've been back in Washington, D.C., since, I believe, December of 2019. A lot has changed since then; in particular, we've had the 2021 executive order from President Biden that handed down mandates about zero trust and multifactor authentication and public-private partnerships. And those themes were strong at our event last week. I was very pleased to see great representation on our stage from the alphabet agencies: CISA, NSA, FBI, Secret Service, and I'm forgetting some, but we had excellent leadership there. We did talk about the progress that agencies have made in moving forward with zero trust architectures. And it was good to get beyond the conversation of what zero trust is and what's misunderstood about it, to hear how organizations are actually pursuing it and the challenges that they're having with connected devices that we don't think about, including Defense Department weaponry. So, excellent conversations on the stage about that. The theme of public-private partnership went beyond "this is something we need to do." And this is something that government wants to hear: the private sector is much more ready for this and pursuing it now. And there are technologies available not only to ensure real-time data sharing, but to anonymize that data, so no particular organization is exposed. And everybody wants the same thing: we want to see what's coming so we can respond proactively and not just check in and say, "yep, you saw what we saw." So, it was encouraging to see that that was a consistent theme throughout the day. I'd say one of the other topics I found fascinating was seeing the Secret Service and the FBI coming together to talk about business email compromise. This gets overlooked, as Matt will tell you. Ransomware gets the headlines every time. Everyone is attracted to the, you know, ransomware drama.
Phishing within organizations gets the attention of security leaders just because of the volume of phishing attempts, but business email compromise consistently takes billions of dollars from organizations every year. And those are just the cases that are reported; so many aren't reported. Now, the concern from law enforcement is that these cases are continuing to grow. And as Rashmi knows, as the cybercriminals move more toward cryptocurrency, once this money is taken, transferred and gone, it's a lot harder to get back now, and it moves a lot faster. So, there's urgency on the part of law enforcement to find out about these incidents as quickly as possible so they can respond. And if they can respond quickly, they can return a lot of this money. Now, I bring this up because just last evening, I hosted a roundtable event with Abnormal Security on email-borne threats, and we talked a lot about business email compromise. There were significant organizations in the room: JPMorgan Chase, Blue Cross Blue Shield, other financial institutions. And consistently, they said that email-borne threats are a top priority for their organization. As one of the leaders said, "this is a main control for us." And yet, they all have significant concerns about their defenses. There's a feeling that they're trying to respond to 2022 threats with 2012 technology. And they're particularly concerned about third-party relationships. Certainly, once someone gets into a third party, they're already trusted, and then they've got access to your own systems. And then, what I'm seeing and hearing from the security leaders is a change in attitude. Ten years ago, I would go to these roundtables and we would talk about phishing exercises, for instance: you phish your employees and you see who clicks on the suspicious link or opens up the attachment. And the reported click rates are always very high. The discussion then was, "okay, if you have someone that consistently clicks on these links and opens these attachments, what do you do? Do you discipline them? Is termination, ultimately, an objective if you've got someone that's consistently exposing the organization?"
That's changed. It's turning now into: how can we get people to not be scared of us and to report these things sooner, even if they do click on that link or open that attachment? The security leaders are trying to bring the employees more onto their side and have them report these things so they can respond a lot sooner. And that's a significant change in the attitude of organizations. So, there you go: over the past few weeks, those are the types of things that I've been observing.

Anna Delaney: And Matt, at our recent UK Summit, I think there was a lot about that sort of change of attitude and helping our employees, and how psychology plays a big part in that.

Mathew Schwartz: Yes, one of the things that I hate to hear, and we're hearing less of it now, is that humans are the weakest link. And I think that says more about the person saying it than it does about the reality today and the imperative today. In an ideal world, I always say, none of us would have to screw around with password managers or cybersecurity best practices; everything would just work and be secure. If engineers can't build systems that do that, if they're relying on people to help, they need to do much more than meet them halfway. And I think there still needs to be a real attitude shift that posits users not as a hindrance, but as people that you need to work with. And so, yes, we've been hearing a lot more about that, a lot more attempts to study psychology, how we work, how we tick, because certainly, criminals are studying that. They're expert at how to hack the human. So, we all need to do a much better job. And like you said, there's a lot of good motion and thinking and initiatives around that. Although it's still an engineering-first mindset too often, I think. So there's a little way to go.

Anna Delaney: And Rashmi, you recently took part in a summit in Bengaluru. I'm not asking you for an analysis, because I know how much has happened in between then and now, but was there a dominant theme of the day for you?
Rashmi Ramesh: Oh, well, the one thing that stood out to me was that every single person I spoke to after the summit, and I spoke to a lot of people, said, you know, "we attend a lot of summits over the years, and we've attended a lot of summits after the pandemic, because, you know, yay, finally you get to socialize. But the one thing that makes ISMG's summits stand out is that every single session had something that I could implement in my company." So they, you know, spoke to the speakers after the event, they had a chat with each other, and we did a lot of leadership interviews. But yeah, I mean, that really, really made a lot of difference, because people had lots of takeaways, implementable takeaways, after the summit. And I think that's sort of the point of all of this, right?

Anna Delaney: Hat tip to our organizers. That is fantastic to hear. So the next one is in New Delhi, I believe.

Rashmi Ramesh: The 24th, yeah.

Anna Delaney: Right, good. Okay, the 24th of August. Rashmi, you have a fascinating story this week. It's been described as the chaotic, viral Nomad attack. Talk us through this sort of frenzied, free-for-all hack.

Rashmi Ramesh: Yes. So, just this week, hackers drained about $190 million from a cross-chain bridge called Nomad. Like I said, it was one of the biggest hacks of the year in the space. The biggest, I think, was the Ronin Network at around $600 million, followed by Wormhole at around $300 million. But what really stands out to me, you know, as a trend, is that there's been a lot of action in the blockchain bridge space, like, a lot of action: $2 billion worth of action. That's how much has been lost in blockchain bridge attacks just this year. And you know, it's just the first week of August now. So this is the fourth in the last seven months, and get this: 69% of all crypto thefts this year have happened on cross-chain bridges. That's Chainalysis' analysis, not mine. But it's crazy, right? A technology application you probably hadn't even heard of back in 2020 is now causing such significant damage.
So, the obvious questions that came up for me, at least, when I looked at it as a trend, were: why are these bridges such an attractive target for criminals? Why now? And how do we secure them? And this is where I very casually drop a plug for a story that I'm currently working on, where blockchain security experts answer all of these questions. But for you, here's a little bit of a sneak peek. So why are cross-chain bridges a target? To understand that, we need to understand what they are. Cross-chain bridges allow users to exchange digital assets, like, you know, crypto tokens, between otherwise siloed blockchains. So, for instance, if I have a token on blockchain A and I want a token on blockchain B, I would send token A to a bridge protocol, where the funds will be locked into a contract as collateral, and I will be issued a wrapped token B equivalent to the value of token A. So the wrapped token is like a representation of my funds, like a gift card, almost. I can redeem it for token B, or just go through the same process in reverse to get token A back. Now, the funds locked into the contract have to be stored in a reserve, right? That reserve, if not stored with proper security measures, can become a treasure trove for hackers. And this is just one of the many vulnerabilities plaguing bridges right now. So I'll stop here for now, and hope that your interest is piqued enough to read my story.

Anna Delaney: I think you have definitely whetted our appetites. Thank you, Rashmi. But also, what's interesting about this is there's no one culprit, no one group. There are 40 or so exploiters, let's say. And it's a curious case of, well, the better criminals are going to get away with this, but some less experienced ones decided not to conceal their real-life identities. So what's next? Is law enforcement on the case?

Rashmi Ramesh: Apparently. So, we don't know yet. But the company has said that they have contacted law enforcement. They've got, you know, a blockchain forensics team working on it. They've got white hat hackers who have, apparently, you know, taken some of the money for safekeeping. There's no protocol yet on how they can return the funds, but that apparently is in progress.
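[Editor's note: the lock-and-mint flow Rashmi describes above can be sketched in a few lines of code. This is a minimal toy illustration in Python, not Nomad's actual contract logic; the class and method names (ToyBridge, lock_and_mint, burn_and_release) are hypothetical, and a real bridge runs as smart contracts with validators and relayers rather than a single in-memory object.]

```python
# Toy sketch of a lock-and-mint cross-chain bridge, assuming hypothetical names.
# Token A is locked in a reserve as collateral; an equivalent wrapped token is
# minted on the other chain; burning the wrapped token releases the original.

class ToyBridge:
    def __init__(self):
        self.reserve = 0.0          # locked token A; the "treasure trove" if left unsecured
        self.wrapped_supply = {}    # wrapped-token balances on chain B, keyed by user

    def lock_and_mint(self, user: str, amount: float) -> None:
        """User deposits `amount` of token A; bridge mints equivalent wrapped token B."""
        self.reserve += amount
        self.wrapped_supply[user] = self.wrapped_supply.get(user, 0.0) + amount

    def burn_and_release(self, user: str, amount: float) -> float:
        """User burns wrapped tokens; bridge releases the underlying token A.
        Missing or broken validation at this step is the kind of bug bridge exploits abuse."""
        if self.wrapped_supply.get(user, 0.0) < amount:
            raise ValueError("cannot burn more wrapped tokens than were minted")
        self.wrapped_supply[user] -= amount
        self.reserve -= amount
        return amount


bridge = ToyBridge()
bridge.lock_and_mint("alice", 100.0)                 # Alice locks 100 token A, receives 100 wrapped B
released = bridge.burn_and_release("alice", 100.0)   # reverse the process to redeem token A
print(released, bridge.reserve)                      # 100.0 0.0
```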
Anna Delaney: Okay, well, a call to action to our audience there from Rashmi.

Tom Field: Rashmi, could I get you to do a promotion, a teaser, for the next interview I do? I think I need you. Matt, how about you? Let's get Rashmi on our side here.

Mathew Schwartz: Oh, yeah, no, definitely. Definitely. We'll be in touch, Rashmi.

Anna Delaney: So Matt, what's happening on the ransomware scene? I hear it's not such good news for SMEs.

Mathew Schwartz: That's correct. I mean, it's always not-so-good news. And just to amplify what Tom was saying, business email compromise attacks and phishing, when you look at the total volume of damage being caused, are in first and second place. Now, ransomware gets a lot of attention, and I think that's probably because of the disruptive elements of it. I do worry sometimes that we focus on it too much: Russian hackers wielding nasty code against corporate America, corporate Britain, corporate everybody. It's a hard story not to pursue, but it's important to remember that it's just one of the various types of cybercrime that's been happening. And it's interesting as well, because the business model being wielded by ransomware groups, ransomware individuals sometimes, continues to get tweaked in order to cause maximum profit, basically, for the criminals at the expense of the victims. So, like you say, SMBs have been a particular focus of these groups, following on from the disastrous Colonial Pipeline hack in May 2021, which disrupted the pipeline, not directly, but because the operator took its billing system offline. And because you couldn't make money while systems were disrupted, it caused panic, as we know: long lines to buy gasoline, a very unhappy President Biden decrying these attacks, and much more focus on the law enforcement front in particular for disrupting these groups. So, what we've seen is that the big brands appear to be under pressure. One of the big groups that was left, DarkSide, flamed out after it attacked Colonial Pipeline. The REvil group, also known as Sodinokibi, hit a lot of big targets in the middle of last year, appears to have been targeted, possibly by US Cyber Command, and disrupted. It's gone by the wayside.
Conti, however, managed to hang on until this spring, when it disastrously backed Putin's invasion of Ukraine, which, I heard at RSA Conference in San Francisco, could have led to a change in its legal status; before that, it was believed to have been just a cybercriminal operation. I suspect, and nobody has given me proof of this, but I expect that US Cyber Command, or the NSA maybe, said, "they've come out in support of Russia's war, so now they're a target." And what certainly did happen, I heard, is that many, many fewer victims were paying Conti, because Conti had basically said it was an extension of the geopolitical aims of Moscow. So, a lot of damage was caused there by this support. And Conti, unfortunately, appears to have been kind of smart: it started up a number of brands, and then later announced that it was shutting down. So, it's not really clear to what extent Conti has gone away since it shut down. But what we have seen is that brands such as Conti, and some of the other big brands, appear to be a big target. And that's good news, because it means that smaller operations are having to spin up to try and stay under the radar of law enforcement. And smaller is better, because they have a harder time hitting so many victims. One of the things that Conti used to do was to provide centralized services. So, groups such as Conti, not always, and not all of them did all these things, but they would negotiate with all the victims, for example, or they would engage call centers to phone victims and put on pressure, or to phone the victims' customers to add pressure. And so, there was this centralized operation, a lot of bang for the buck, to bring pressure. The fact that these big brands now are under fire means that it's a lot harder to have these large operations, and things are being left in the hands of individual attackers, typically affiliates or business partners of these ransomware operations. That means they're not going to be able to do so many of these things; it's going to be harder for them to pull off attacks that end up with them getting paid. So, all of that is great news. As you said, unfortunately, the criminal groups that hit really big victims tend to get really big law enforcement attention.
So they've been looking at more small and mid-sized targets. And unfortunately, the ransomware incident response firms who track these sorts of attacks say that SMBs are really under fire. And they've got relatively less to spend on their cybersecurity, relatively less expertise. So this is a grim reality for businesses of this size. And it's a good reminder that anybody can, and potentially will, get hit by ransomware. So, everybody should be putting the right defenses in place.

Anna Delaney: Now, I'm not sure if you've come across it yet, but this week, the Atlantic Council published some interesting recommendations on the surge of ransomware attacks, with some sound suggestions, including legislation mandating reporting of all ransomware incidents, but also tax relief programs for SMBs to encourage them to implement security best practices and to employ people with cybersecurity expertise. So, what do we do about the SMEs? Because I don't know about the rest of the world, but apparently in the UK, they comprise 99% of our economy. And as you say, perhaps no CISO, no SOC, no security budget. What do we do?

Mathew Schwartz: Right, and they're a huge part of the critical infrastructure. Often, we think of massive organizations as being critical infrastructure, but it can be as small as a municipal wastewater treatment plant. It's got 30- or 40-year-old equipment that's been hooked up to the internet and is suddenly at risk of somebody remoting in and doing something bad. Great points. I mean, it's all fine and dandy to talk about the threat posed by ransomware, but what we're increasingly seeing is not just a law enforcement focus and increasing diplomatic efforts, but also a focus on resiliency. And that means getting domestic organizations to up their game. And, as you say, that needs to include a lot of things; that needs to include, I think, tax credits, because money is tight, especially at the moment. So, what can be done to create incentives to get better cybersecurity in place? I think incident reporting should also be mandatory.
We simply don't know about so much of this ransomware problem, because the criminals, with their business model, have designed things in a way to get victims to pay quickly and quietly. If the FBI doesn't learn about attacks, it doesn't know, maybe, which groups are the worst or the tactics they're bringing to bear. All of this helps criminals operate from the shadows, which is what they want to do. That is the best way for them to keep making a payday, and ransomware has been really lucrative. So yes, these sorts of initiatives to bring light to the actual problem are essential, because people are not stepping up and doing it on their own.

Anna Delaney: Here we are, voting Mathew Schwartz for president. He has great, great insight on...

Tom Field: Prime Minister, today.

Mathew Schwartz: Britain's shorter Prime Minister.

Anna Delaney: There is a job for you there. So go on then, apply now.

Mathew Schwartz: I'll do both at once. I'll be like the Elon Musk of prime ministerialness.

Anna Delaney: Well, finally, imagine a world where, instead of writing and analyzing cybersecurity stories, you are putting your hand to creating a solution for all the challenges we discuss. You are the founder and/or CEO of the latest cybersecurity company on the market. What would you call it? Oh, Tom, are you ready with the branding?

Mathew Schwartz: Wow. I feel unequal to this.

Tom Field: That would be Initech.

Anna Delaney: Okay.

Tom Field: If you're a fan of the 1990s film Office Space, you'll know exactly what I mean. If you don't know it, I'll send you the memo. Did you receive the memo, Anna?

Anna Delaney: No.

Tom Field: I'll get you the memo.

Anna Delaney: You need to send it.

Mathew Schwartz: Okay!

Anna Delaney: Rashmi?

Rashmi Ramesh: I kind of want to take Tom's word on camera that I won't get fired for what I'm about to do. Do I have it?

Tom Field: I'll send you the memo as well.

Rashmi Ramesh: Right. So I'll give you two clues and two seconds to get the name of my company. The first clue is: I like puns and I cover crypto. And the second clue is: Superman can never be near my company. So what is my company's name?
Mathew Schwartz: Oh! Pun alert! Kryptonite!

Tom Field: Easy.

Rashmi Ramesh: Yes. Perfect.

Anna Delaney: Like it. I'm surprised that's not on the market yet. That's a good one.

Mathew Schwartz: I'm imagining a vivid green logo.

Rashmi Ramesh: Yeah.

Mathew Schwartz: I've got nothing as good as either of those. Mine is going to be the Cyber Bomb. Why? Because it's the Cyber Bomb.

Tom Field: The one that can get funding, you know?

Anna Delaney: Well, I was going to turn to Greek mythology for some inspiration. Cerberus, the vile three-headed dog who guards the gates of the underworld. I mean, if that can't scare hackers away, what can? I look forward to...

Mathew Schwartz: Add a "crypto" before it and I think you've got funding as well.

Anna Delaney: We'll have to work on it, maybe. Well, I look forward to seeing all these products online soon. Thank you very much, everyone: Matt, Rashmi and Tom. It's been a pleasure.

Mathew Schwartz: Yours in branding, Anna.

Tom Field: Yeah, we will send you the memo.

Anna Delaney: Fantastic. Look forward to it, and thank you so much for watching. Until next time.