Anna Delaney: Hi, this is the ISMG Editors' Panel. I'm Anna Delaney, and this is where I am joined by three eminent journalists to review this week's top stories. Introducing the team: Tom Field, senior vice president of editorial; Suparna Goswami, associate editor at ISMG Asia; and Matthew Schwartz, executive editor of DataBreachToday and Europe. Very good to see you all.

Tom Field: Always good to be seen.

Matthew Schwartz: Great to be here.

Anna Delaney: Oh, yes!

Tom Field: Who walked in these eminent journalists?

Anna Delaney: Do you know them? So, Suparna, we've got to start with you, with architecture like that.

Suparna Goswami: Oh, yes. I went to Agra recently, a couple of weeks back—the city of the Taj Mahal—but I thought, let me not put up a picture of the Taj Mahal. Let me put up a picture of the Buland Darwaza - the highest gateway in Asia - and showcase the architecture of the Mughal era.

Tom Field: Is it like a B&B you stay in if you go to visit the Taj?

Suparna Goswami: I visited it again, but I thought, let me just put up a different picture this time.

Anna Delaney: A bit of a history lesson, Suparna. That's great. Tom, I think we're in the same city again this time.

Tom Field: Imagine that! This is Times Square, which is close to the American Taj Mahal.

Anna Delaney: Matthew?

Matthew Schwartz: I am once again in Dundee. Behind me, you can see the RRS Discovery, which sailed Scott and Shackleton to Antarctica in 1901. And, to my side, we have the V&A Museum—the Victoria and Albert Museum—which exists in London and now has an outpost in Dundee.

Anna Delaney: That's a wonderful cultural lesson. This is great. And I promised you a scene from New York again: Central Park, of course. Beautiful spring day, got to happen. So Matthew, starting with you this week: is Monero taking over from Bitcoin as the preferred currency for ransom payments? That's the question.

Matthew Schwartz: That is the question, isn't it? I thought it would be an interesting question to explore.
There's been some really interesting research that's come out lately, from security firm CipherTrace, for one, and also from the US Treasury Department's FinCEN, the bureau that looks at financial crime. And it's been looking at payments for ransoms and what those trends look like. And one of the interesting questions is the use of cryptocurrency, because we continue to see a number of crackdowns against organizations, individuals and criminal syndicates that are using Bitcoin. Everyone thinks Bitcoin makes you anonymous, hard to trace, and so on. And while it might offer some degree of difficulty when it comes to trying to track the people who are using it for illicit purposes, we do see law enforcement having a number of successes when it comes to tracking, and sometimes identifying and arresting, criminals who use Bitcoin. So, open question: given that Bitcoin has a public ledger—the blockchain—which allows these transactions to be traced, and that intelligence from investigations helps police crack down, are we going to see criminals moving to different forms of cryptocurrency, such as Monero, which is known as a privacy coin? It's much more difficult to trace by default than Bitcoin. And we know this in part because a lot of criminals will pay to use Bitcoin mixers, or tumblers, or illicitly use peel chains, where they peel off a little bit from a lot of different transactions. All of this is designed for money laundering, to disguise the flow of funds. With Monero, however, you don't need to do that, or you need to do a lot less of that attempted obfuscation.
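To picture the peel-chain pattern Matthew describes, here is a minimal, purely illustrative Python sketch. The wallet labels, rate and amounts are invented; real chains run through ordinary Bitcoin addresses on the public ledger.

```python
# Illustrative sketch of a "peel chain": a tainted balance hops through
# fresh wallets, and each hop "peels" a small amount off to a cash-out
# point while forwarding the remainder, so no single transaction moves
# a conspicuously large sum. All names and figures here are invented.
from itertools import count

_ids = count(1)

def fresh_wallet():
    # Hypothetical labels; real chains use ordinary Bitcoin addresses.
    return f"wallet_{next(_ids):03d}"

def peel_chain(balance, peel_rate=0.05, stop_below=1.0):
    """Return one (address, peeled_amount, forwarded_amount) per hop."""
    hops = []
    addr = fresh_wallet()
    while balance > stop_below:
        peel = round(balance * peel_rate, 4)   # small cash-out per hop
        balance = round(balance - peel, 4)     # the rest moves onward
        hops.append((addr, peel, balance))
        addr = fresh_wallet()                  # next hop, fresh wallet
    return hops

for addr, peel, fwd in peel_chain(100.0)[:5]:  # show the first five hops
    print(f"{addr}: peel {peel} BTC off, forward {fwd} BTC")
```

The catch, as Matthew notes, is that every one of those hops still sits on the public blockchain, which is exactly what lets investigators reconnect them later.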
Matthew Schwartz: So to cut to the chase, the interesting takeaway I found is that Bitcoin is still widely used - far and away the most common cryptocurrency for crime. You would think that more criminal syndicates would be relying on the likes of Monero, or perhaps some other type of cryptocurrency, but it's a small fraction. When the Feds traced flows of cryptocurrency in the first half of last year - that's when we have the most recent information from them - they found negligible Monero use. They did find a lot of ransomware operations requesting Bitcoin payments or Monero payments. Very few of them only accepted Monero. But what we do often see is that if you pay a ransom, the attackers will charge you a premium, typically between 5 and 20%; 10 to 20% is pretty average, I hear. They'll charge you a premium to pay in Bitcoin, because it costs them money to launder it. Interesting fact, right? For Monero, they charge a little bit less, because they don't need to spend as much to launder it. But I do think we're going to see a lot more use of Monero, because we have seen a huge amount of cracking down on Bitcoin. One challenge for criminals, though, is that there's not a lot of Monero, relatively speaking. So they're continuing to use Bitcoin, despite the fact that it has some downsides, because the liquidity is so good. And they are attempting to, again, obscure their use of it to hopefully stay out of jail.
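Purely as arithmetic, here is the premium range Matthew cites applied to a hypothetical demand; the dollar figure is invented.

```python
# The same hypothetical ransom quoted two ways: at face value in Monero,
# and with a laundering premium (the 5-20% range cited) added for Bitcoin.
ransom_usd = 1_000_000              # invented demand, in US dollars
for premium in (0.05, 0.10, 0.20):  # low, typical and high end of the range
    in_bitcoin = ransom_usd * (1 + premium)
    print(f"{premium:.0%} premium: {ransom_usd:,} USD in Monero, "
          f"or {in_bitcoin:,.0f} USD if paid in Bitcoin")
```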
Anna Delaney: Matt, what's the chatter on the cybercrime forums? What are they saying? What are the criminals talking about?

Matthew Schwartz: They're always looking for the best, easiest, fastest way to do anything. And so what I think you have here is that Bitcoin remains easy to use. The illicit use of cryptocurrency, including Bitcoin, is a very, very, very small fraction of cryptocurrency use, I should emphasize. All the experts I speak to always highlight this. Cryptocurrencies aren't bad; cryptocurrencies get used by bad people. But we have visibility now into cryptocurrency that we never have into cash. For example, if you do some multimillion-dollar illicit transaction, a drug deal, whatever, and you pay with cash, you often can't track that. But if you pay with cryptocurrency, you might not be able to track it right away as law enforcement, but intelligence might come to light as you do other investigations, as you bust people, as you analyze their computers. And so we're seeing much more reliable estimates of the amount of crime that's happening using cryptocurrency coming to light—sometimes months or years later—but fascinating upsides for law enforcement when it comes to cryptocurrency use.

Tom Field: And I think I've landed on a new marketing slogan: cryptocurrency doesn't steal, cybercriminals do.

Anna Delaney: You do have an alternative career in marketing, I think, Tom.

Matthew Schwartz: Don't blame the crypto coin users. Yes.

Anna Delaney: Matt, do you have an insight into how the analysts really assess what types of cryptocurrency the criminals are using and accepting?

Matthew Schwartz: It's fascinating. You have companies like CipherTrace, for example, and you have other ones like Chainalysis, and you've got TRM Labs. You have lots of different blockchain intelligence firms, and the government is working with possibly all of them, using them for different reasons. But they are able to identify wallets that are being used by criminals. The FBI is also seeking this information. Just think of all of this as going into a huge pool of intelligence, and they're tracing which ransomware groups are tied to which wallet addresses. And a lot of times there's crossover between different types of criminality. And the more information that comes to light, the more they can trace this, and they can see ransomware flowing to groups in Russia, for example. There has been some great research that's come out from these intelligence firms looking at just how many hundreds of millions of dollars' worth of cryptocurrency flow every year. So we do have much better insights into the groups involved and their locations.
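In spirit, that attribution work amounts to following flows across a public transaction graph until they touch an address already labeled in the intelligence pool. Here is a toy sketch of the idea; the ledger, addresses and label are all invented.

```python
# Toy sketch of wallet attribution: follow the public ledger outward from
# a ransom payment until the funds reach an address that prior cases have
# already tied to a known group. Everything below is invented for
# illustration; real analysis uses clustering heuristics at huge scale.
from collections import deque

# Public-ledger edges: each address -> addresses it sent funds to.
ledger = {
    "victim_pay": ["hop_a"],
    "hop_a": ["hop_b", "mixer_1"],
    "hop_b": ["cashout_x"],
    "mixer_1": ["cashout_x"],
}
# Intelligence pool built up from earlier investigations and busts.
known_wallets = {"cashout_x": "ransomware group cash-out"}

def trace(start):
    """Breadth-first search until funds hit a labeled wallet."""
    seen, queue = {start}, deque([start])
    while queue:
        addr = queue.popleft()
        if addr in known_wallets:
            return f"{addr}: {known_wallets[addr]}"
        for nxt in ledger.get(addr, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None

print(trace("victim_pay"))  # -> cashout_x: ransomware group cash-out
```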
Anna Delaney: So this time next year, Bitcoin will still be here - will still be used by the criminals, rather.

Matthew Schwartz: Yes. I spoke with a lot of experts about this. And while Monero offers huge upsides, Bitcoin's big upside is ease of use and availability. So you would think that they will migrate to privacy coins, but Bitcoin is much easier for victims to get and to pay with, and so on. And, obviously, criminals want to get paid. They don't want to make it too difficult for victims to pay them.

Anna Delaney: Always fascinating, Matt! Thanks for that insight. So Suparna, you recently conducted a panel on the topic of zero trust. It was excellent, I've got to say. I think it's really useful for organizations across many verticals, but could you just share some highlights?

Suparna Goswami: Thank you for that, Anna. Yes, it was a panel discussion between security practitioners in Australia. I had a vendor from Okta, I had somebody from E&P, which is a financial institution in Australia, and somebody from EY. So, three perspectives in all. The topic was how one decides the right approach to zero trust, and what are some important considerations to keep in mind. As we know, different industry verticals have different regulatory expectations. For example, a higher education university by default has an approach of open trust and open collaboration and sharing. That is what they thrive on. Here, you can't really go full throttle trying to restrict everything. A university will have a very different approach to zero trust than, say, a financial institution, because you need to take a risk-based approach for a financial institution, and there is a heightened expectation of prevention and detection activities. So yes, my entire conversation was about how different industries need to decide what approach to take. Now, you can classify your requirements into four buckets. The first one is users, which has components of, say, identity governance. The second one is identity and access. This is where you have your adaptive access management and all that stuff. The third one is resources, which includes your data and your services. So in order to classify your data, you need to leverage your encryption, your containerization. And the fourth bucket is rationale, where you apply your analytics, your reasoning. These buckets are common for all industries, but an assessment of each of these buckets will give you an idea of where to start from. Suppose, for example, you have a weak domain in terms of detection and response, your cyber risk is mainly around denial of service, and you're in the e-commerce industry. In that case, network will be the one you start from, whereas if malicious insiders are a big problem for you, then you better start with PAM. So that is how you essentially decide which approach to zero trust to take. It essentially depends on the business objectives and goals. That was the entire takeaway from the discussion.
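To make that decision process concrete, here is a small sketch of the logic Suparna outlines: assess the four buckets, then pair your weakest area and your dominant business risk with a starting point. The scores and mappings below are illustrative assumptions, not a formal framework from the panel.

```python
# Sketch of the panel's decision logic: self-assess the four zero trust
# buckets, then let the weakest bucket plus the dominant business risk
# suggest where to start. Scores and mappings are invented examples.

# 1-5 maturity self-assessment per bucket (1 = weak, 5 = mature).
assessment = {
    "users": 4,               # e.g., identity governance
    "identity_and_access": 3, # e.g., adaptive access management
    "resources": 2,           # data/services: encryption, containerization
    "rationale": 3,           # analytics and reasoning
}

# Dominant risk -> suggested first project, per the panel's own examples
# (DoS-heavy e-commerce starts with network; insider risk starts with PAM).
risk_to_start = {
    "denial_of_service": "network segmentation and controls",
    "malicious_insider": "privileged access management (PAM)",
    "data_theft": "data classification, encryption, containerization",
}

weakest = min(assessment, key=assessment.get)
dominant_risk = "denial_of_service"   # e.g., an e-commerce business
print(f"Weakest bucket: {weakest}")
print(f"Start with: {risk_to_start[dominant_risk]}")
```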
Anna Delaney: Very good! And Suparna, you've had so many conversations around zero trust over the past year or so. How do you think this conversation has shifted? Just in the conversations you're having, say, thinking back to this time last year?

Suparna Goswami: The conversation has shifted. Like I said in one of the previous interviews, I think for the first couple of years it was about why is zero trust important and what is zero trust, and now we are into what approach to zero trust we are taking. And now we have gotten even more specific: say, the identity approach, or what are the problems with identity. If you take the identity-centric approach to zero trust, what are the specific challenges that you face? Because people mainly take either the network or the identity approach; the majority of organizations start with one of those. So I asked Brett, who said that he interacts with a lot of CISOs who take this identity approach: where are the challenges? Identity governance is one area where there are massive gaps. PAM continues to be a challenge. And another challenge, he said, is helping people change, or break out of, the mindset that addressing all these problems will increase friction and will impact user experience. So he brought up an interesting point. He said, we have these technologies, like passwordless or device ID authentication. As an industry, we haven't done enough to evangelize these technologies. We need to do that. These have been there - all your web authentication has existed for a long time now - but there has not been much adoption. So we need to evangelize these technologies as an industry.

Anna Delaney: Tom, would love your thoughts too.

Tom Field: We've got a big litmus test coming up. First of all, it was a terrific panel discussion. I enjoyed it thoroughly. It's nice seeing people talk about the roadmap in a mature way. But I think a litmus test is coming up in just over a month. We've got RSA Conference, the first live, in-person RSA Conference since 2020. 2020 was really a launch pad for zero trust in a lot of ways for a lot of organizations, and the pandemic certainly accelerated that conversation. Where are we now? When we get back together as a global security community, I will be very interested to hear the conversations in San Francisco.
Anna Delaney: Next week, we have the founder of zero trust joining us on our Editors' Panel - John Kindervag, of course. So prepare those challenging questions, please.

Tom Field: The Godfather of zero trust.

Suparna Goswami: Always a pleasure to have an interaction with him.

Anna Delaney: Yeah, absolutely. So looking forward to that. Tom, talking of trailblazers, you've interviewed Octavia Howell, CyberEdBoard member. Tell us about it.

Tom Field: I certainly have, and each one of us is privileged in our role that we get the opportunity to speak with these just wonderful CISOs as part of the CyberEdBoard's Profiles in Leadership. Every one of us—Matt, Suparna, Anna—you all record these interviews. They are a brilliant opportunity to get to know leaders—how they shaped their careers, their focus, their passions. I did speak with Octavia Howell. She's the vice president and head of information security and risk for Equifax Canada. We talked a lot about fraud. We talked about privacy. We talked about responding to incidents. But what was most interesting to me was to hear her talk about the ground that she had to break: as an American going into Canada to oversee a large organization; as a woman coming into a management role - and as an African American woman, an extra degree of difficulty; and, as she says, as one who wears four-inch stiletto heels and walks into the room with a big presence. And so I asked her how she impacted the culture of the organizations that she joined. And she shared with me some insights that I want to share with our audience now.

Octavia Howell: I just be myself. I tell a lot of jokes all the time. I am serious about work, the actual business of work. But when things are light and we have a little bit of time, I tell a lot of jokes, and I pick on myself a lot as well. And so I think the way I overcame it really was with self-awareness and understanding, because I have a powerful presence. So understanding, when I walk in the door, what that does to other people, and then also understanding what relationships I can have with other people and how to relate to them. So I didn't come in there busting through the door, right?
I came in, really, what I would call playing nice in the sandbox. Hey, I understood the players, I understood who was making the moves, who the influencers were, and then also started talking to them. We really built the relationships with those people. And so I think, just showing myself, I performed. I am a techie, so I can read a packet capture, I can decrypt, I can encrypt, I could tell you how many rounds are in a cipher, and there was no problem with that. So once they got over the fact that I wore four-inch heels to work, then I think we were over that. And we were just talking technology at that point.

Tom Field: Playing nice in the sandbox. Nice objectives for each of us.

Anna Delaney: Absolutely. I love what she said about humor being a vehicle for change. It reminded me of something our friend, CISO Thom Langford, said: that it's a powerful weapon. Humor can be a powerful weapon. And he often quotes Maya Angelou, saying people won't remember what you said or did, but they will remember how you made them feel. I'm paraphrasing, but it's that powerful impact, I think.

Tom Field: What they're going to remember is Thom Langford, Matt Schwartz, and I posing as Charlie's Angels for a photo at RSA Conference 2020.

Anna Delaney: You're going to have to share a picture, Tom.

Tom Field: Matt has it somewhere.

Matthew Schwartz: It'll cost you, Anna.

Tom Field: Maybe we'll recreate it at our London summit.

Anna Delaney: I think so. With Suparna one day as well.

Suparna Goswami: Oh, yes!

Matthew Schwartz: Everyone's welcome.

Tom Field: We are inclusive. Four-inch heels welcomed.

Anna Delaney: Well, a final question to you all, and it's a bit of a future-gazing question. What's the next big thing in cybersecurity that we haven't seen coming?

Tom Field: Pizza in a cup. That comes from the 1979 Steve Martin movie The Jerk, by the way. But to give you a more serious answer, I think it's here. I think it's the SBOM. Consistently, I've been getting out and having conversations about application security. And it comes back to the notion of: are you providing software bills of materials? Are you asking for software bills of materials?
And because there have been so many issues with zero-days and application security risks and open source code, there is a greater awareness among our constituency now. You're not going to ingest food if you don't know what the ingredients are. You're not going to buy automobiles if you don't know what the parts have been attested to. You're not going to fly in airplanes with unsafe components - there are no open source components in airplanes. There's a greater awareness that we need to protect the software that we use by knowing what the components are within it. There's certainly a regulatory push in this direction, starting with the executive order from President Biden just about a year ago. I think this is really going to start to take root this year, and I believe we're going to have serious conversations about the SBOM over the second half of 2022: what it is, standardized formats, how it's going to be provided, how it's going to be presented, and how it's going to be adhered to. To me, it's a big topic for the second half of the year.
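For readers who haven't seen one, here is what that "ingredients list" can look like in practice: a minimal SBOM sketched in the shape of CycloneDX JSON, one of the standardized formats in use. The single component listed is purely illustrative.

```python
# A minimal, CycloneDX-shaped SBOM: an inventory of the components
# inside a piece of software. The component below is an invented
# example of the kind of open source dependency an SBOM would list.
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "version": 1,
    "components": [
        {
            "type": "library",
            "name": "log4j-core",   # example open source dependency
            "version": "2.17.2",
            "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.17.2",
        }
    ],
}
print(json.dumps(sbom, indent=2))
```

Given an inventory like this, a software consumer can match component names and versions against vulnerability advisories when the next zero-day lands, rather than guessing what is inside the products they run.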
Anna Delaney: Matt?

Matthew Schwartz: I'm going to go really big picture - hopefully not too vague - but just the pervasiveness of cybersecurity in everything around us is going to be my answer. Because think how far we've come in recent years: in terms of Russian interference, for example, in the 2016 US elections; if you think about now the Russia-Ukraine war - I'm hitting that Russia theme again there - but if you think about the war that we have now in Ukraine, and the degree to which cyber hasn't been seen there. Cyber is such a component of so many things in the world now. And it's being discussed like never before. I don't know where we go from here. Obviously, cybersecurity is not going to get any less important. I suspect we're going to have a bit more nuance about how we discuss it. And again, with this war, the Russia-Ukraine war, we're talking about where it's not appearing. Five or 10 years ago, we wouldn't have been having those types of conceptual discussions. It would have been: oh, hey, there's this cybersecurity component everyone should be concerned about. So we've come a long way, and I don't know what happens next. But I think there's going to be a lot of cyber in the discussion. And again, hopefully, it does come with more nuance.

Anna Delaney: Right on! I thought you were going to say AI killer robots, but you didn't.

Matthew Schwartz: We can go there too, if you want. That was my response last week, and you looked all sad when I said it. So I thought I would save the killer robots for later - maybe Halloween.

Anna Delaney: Good idea! Suparna, what are you thinking?

Suparna Goswami: I'll probably continue Matt's theme of the Russia-Ukraine war. We'll likely have a global cybersecurity coalition going ahead - something like the UN or NATO, but specifically for cybersecurity, perhaps. I'm just imagining, but I don't think it's too far away, where we have all these nations collaborating with each other in a coalition.

Anna Delaney: Yeah, that's a good suggestion. What about just another basic error? Maybe that's the next big thing we're going to be reporting on. Unfortunately!

Tom Field: E-R-A or E-R-R-O-R?

Anna Delaney: Keep us thinking, Tom.

Matthew Schwartz: Error as in widespread vulnerability in literally everything in the world, which seems to happen once, twice, three, four times a year sometimes.

Tom Field: Pick a holiday.

Anna Delaney: Pick a holiday. Well, thank you very much, Suparna, Tom, Matthew. Always a pleasure. Thank you so much for watching. Until next time!