WEBVTT 1 00:00:00.000 --> 00:00:02.880 Anna Delaney: Hello, I'm Anna Delaney and welcome to the ISMG 2 00:00:02.880 --> 00:00:05.880 Editors' Panel. And this week, we are joined by none other than 3 00:00:05.880 --> 00:00:09.300 the father of zero trust, and that is, of course, John, 4 00:00:09.300 --> 00:00:12.600 Kindervag - the creator of the zero trust strategy, and also 5 00:00:12.600 --> 00:00:16.200 senior vice president of cybersecurity strategy at ON2IT 6 00:00:16.230 --> 00:00:20.370 Cybersecurity. And we also have, of course, ISMG's brilliant 7 00:00:20.370 --> 00:00:22.980 executive editor of DataBreachToday and Europe, 8 00:00:23.040 --> 00:00:26.700 Matthew Schwartz. Hello, both and welcome, John. Great to have 9 00:00:26.700 --> 00:00:27.600 you join us. 10 00:00:28.170 --> 00:00:30.300 John Kindervag: Hey, it's great to be here. Always fun to talk 11 00:00:30.300 --> 00:00:30.930 to you guys. 12 00:00:31.200 --> 00:00:32.310 Matthew Schwartz: Great to see you again. 13 00:00:32.750 --> 00:00:35.270 John Kindervag: Yeah. It's been a while, hasn't it, Matthew? 14 00:00:35.570 --> 00:00:37.640 Matthew Schwartz: Yeah, there's been a small pandemic in a way, 15 00:00:37.640 --> 00:00:40.400 but hopefully, we're getting through that. 16 00:00:40.970 --> 00:00:43.190 John Kindervag: Oh, yeah. I vote that it's over. 17 00:00:45.860 --> 00:00:46.640 Anna Delaney: I second that. 18 00:00:46.640 --> 00:00:48.830 John Kindervag: There we go. We have a quorum. 19 00:00:50.340 --> 00:00:52.800 Anna Delaney: John, I usually ask guests where they are in 20 00:00:52.800 --> 00:00:57.120 their virtual worlds but you look quite comfortable in your 21 00:00:57.120 --> 00:01:01.140 home studio setup. Is this as a result of the pandemic and home 22 00:01:01.140 --> 00:01:04.170 working? Or were you always so advanced in your setup? 23 00:01:05.410 --> 00:01:10.510 John Kindervag: Yeah, so unlike everybody else who built their 24 00:01:10.810 --> 00:01:14.890 home working setup around their desk, I built it around my 25 00:01:14.890 --> 00:01:18.160 chair, because I figured I was going to be sitting in the 26 00:01:18.160 --> 00:01:21.370 chair. So everybody else has a desk and their chair moves 27 00:01:21.370 --> 00:01:27.550 around. My chair is stationary and special. And then my desk 28 00:01:27.580 --> 00:01:33.760 moves around. So I inverted the whole paradigm so that I could 29 00:01:33.760 --> 00:01:39.310 always be comfortable in. After 30 years of leaning over a 30 00:01:39.310 --> 00:01:44.110 computer, that does a number on your shoulders and your back. So 31 00:01:45.280 --> 00:01:51.250 I lean back when I type now and lots of things to just try to 32 00:01:51.640 --> 00:01:58.330 survive the onslaught of technology illnesses that we 33 00:01:58.360 --> 00:02:05.380 haven't ever thought about. Our ancestors never had carpal 34 00:02:05.440 --> 00:02:09.160 tunnel syndrome. I grew up on a farm. That was not a common 35 00:02:09.520 --> 00:02:13.300 discussion that we had on the farm. I've got carpal tunnel 36 00:02:13.300 --> 00:02:16.720 syndrome. No, I mean, if you had a syndrome, it was from 37 00:02:17.620 --> 00:02:23.380 shoveling too much or throwing hay bales around or something 38 00:02:23.380 --> 00:02:23.830 like that. 39 00:02:23.920 --> 00:02:25.360 Matthew Schwartz: Or getting your hand caught in something 40 00:02:25.360 --> 00:02:28.360 you shouldn't. John, I want to know, is that a mood light 41 00:02:28.390 --> 00:02:31.720 behind you? 
Are you suddenly warning us away here? 42 00:02:32.000 --> 00:02:37.730 John Kindervag: Yeah, I do have a mood light around. And so I 43 00:02:37.730 --> 00:02:39.140 can do all kinds of things with it. 44 00:02:41.970 --> 00:02:42.360 Matthew Schwartz: Look at that. 45 00:02:42.360 --> 00:02:44.610 Anna Delaney: Like a home office disco! Love it! 46 00:02:44.000 --> 00:03:08.780 We like to inject a little chaos into these discussions. So, perfect! 47 00:02:44.130 --> 00:02:48.155 John Kindervag: Yeah, I can play it to music. I can make it do 48 00:02:48.242 --> 00:02:53.842 all kinds of weird things. So I just thought, well, if I'm going 49 00:02:53.930 --> 00:02:59.005 to be here, I might just change it around and do different 50 00:02:59.092 --> 00:03:03.730 things. And so it can get brighter and darker and all 51 00:03:03.817 --> 00:03:05.130 kinds of stuff. 52 00:03:11.330 --> 00:03:15.890 I thought I'm going to get a background so that it sets it 53 00:03:15.890 --> 00:03:19.310 off a little bit. And you can see it and everything. 54 00:03:20.150 --> 00:03:23.270 Anna Delaney: You've got us in the mood, John. Speaking of 55 00:03:23.270 --> 00:03:27.260 chaos, Matt, which corner of Dundee are you showing us this 56 00:03:27.980 --> 00:03:28.070 time? 57 00:03:28.090 --> 00:03:32.680 Matthew Schwartz: Oh, like I always use my local backdrop. 58 00:03:32.710 --> 00:03:35.770 I'm actually just a few miles south this time, in St. Andrews. 59 00:03:35.770 --> 00:03:39.160 This is the St. Andrews Cathedral. I've given it a 60 00:03:39.160 --> 00:03:42.580 little bit of a digital collage kind of look while I was killing 61 00:03:42.580 --> 00:03:46.150 time the other day. So it's static, unfortunately. 62 00:03:46.480 --> 00:03:50.590 Obviously, I didn't get the Kindervag memo here, but it's my 63 00:03:50.590 --> 00:03:52.120 humble contribution to this week. 64 00:03:52.820 --> 00:03:54.740 John Kindervag: Do you live in Scotland? 65 00:03:55.130 --> 00:03:57.590 Matthew Schwartz: I do. I know the accent gives it away. 66 00:03:58.640 --> 00:04:01.310 John Kindervag: I did not know you lived in Scotland. I love 67 00:04:01.310 --> 00:04:06.530 Scotland. I can't understand anybody who went there. But I 68 00:04:06.530 --> 00:04:07.820 still love it. It's great. 69 00:04:08.120 --> 00:04:09.050 Matthew Schwartz: Getting better.
70 00:04:13.220 --> 00:04:20.383 Anna Delaney: I'm also in Europe this week. I'm in sunny 71 00:04:20.553 --> 00:04:30.958 Valencia. So I'm giving you a taste of Valencia. It's amazing 72 00:04:31.128 --> 00:04:41.362 because there's such a great perfume because there's so many 73 00:04:41.532 --> 00:04:51.766 lemon trees and orange trees. I rarely say that about a city 76 00:04:51.936 --> 00:05:02.852 that has a great smell but this one does. So, John, back to zero 80 00:05:03.022 --> 00:05:11.891 trust. It might be worth baselining zero trust for a 83 00:05:12.062 --> 00:05:20.760 moment. Has your working definition of the strategy 86 00:05:20.931 --> 00:05:29.630 changed as more organizations implement zero trust?
74 00:04:45.170 --> 00:04:49.530 John Kindervag: No, the strategy hasn't changed and won't change and doesn't 75 00:04:49.603 --> 00:04:53.819 need to change. There's been some terminology changes. For 77 00:04:53.891 --> 00:04:58.325 example, early on I used to talk about the first step of zero 78 00:04:58.398 --> 00:05:02.395 trust is defining your data, which was still asking the 79 00:05:02.468 --> 00:05:07.047 question - What do you need to protect? Now I ask, define your 81 00:05:07.120 --> 00:05:11.335 protect surface, because we put data, we put assets, we put 82 00:05:11.408 --> 00:05:15.769 applications, we put services, what we call DAAS elements, in 84 00:05:15.842 --> 00:05:19.985 there. So that people who are in, say the OT environment, 85 00:05:20.057 --> 00:05:24.418 manufacturing, oil and gas can understand more easily how to 87 00:05:24.491 --> 00:05:28.707 consume it. And that's probably been the only major change 88 00:05:28.779 --> 00:05:33.068 strategically in there. Of course, zero trust, the strategy 89 00:05:33.140 --> 00:05:37.574 and the tactics, the tooling are decoupled. So the tooling is 90 00:05:37.647 --> 00:05:42.080 getting better in some cases and in some cases getting worse. 91 00:05:42.153 --> 00:05:46.441 Oddly enough, I think we're in this weird time where people 92 00:05:46.514 --> 00:05:50.875 have forgotten the past. And we're going back to a world pre 93 00:05:50.948 --> 00:05:54.800 2000 in a lot of the technologies, especially in the 94 00:05:54.873 --> 00:05:59.161 native cloud technologies, where we, instead of having next 95 00:05:59.234 --> 00:06:03.667 generation firewalls protecting data, we're having stateless 96 00:06:03.740 --> 00:06:07.956 ACLs, like we did in the 90s on routers. So I'm a little 97 00:06:08.028 --> 00:06:11.880 concerned that we're decrementing our security, based 98 00:06:11.953 --> 00:06:16.460 upon the technology that we're using, especially in the cloud.
99 00:06:16.000 --> 00:06:19.810 Matthew Schwartz: That's a really interesting trend you're 100 00:06:19.810 --> 00:06:22.930 calling out, John, just because having covered the space for a 101 00:06:22.930 --> 00:06:26.530 little while, it seems like a lot of the same mistakes keep 102 00:06:26.530 --> 00:06:30.280 getting made, not just on a technical front, but almost on a 103 00:06:30.280 --> 00:06:33.550 soft skills front. I mean, we had a breach involving Okta 104 00:06:33.580 --> 00:06:37.090 recently, where they did a lot of things right, they found out 105 00:06:37.090 --> 00:06:39.550 something was wrong, they alerted their business partner 106 00:06:39.550 --> 00:06:42.610 to investigate it. And then they failed to follow through to make 107 00:06:42.610 --> 00:06:45.730 sure that their business partner had investigated in a timely 108 00:06:45.730 --> 00:06:49.120 manner. And I just wonder, are there some common culprits or 109 00:06:49.120 --> 00:06:53.080 causes you're seeing since, as you say, this does detriment 110 00:06:53.290 --> 00:06:57.880 zero trust? Is this new people coming into the market? Or 111 00:06:57.910 --> 00:07:01.420 new people entering the field? Is this new types of technology? 112 00:07:01.420 --> 00:07:04.960 They're all shiny, and people forget the basics? What do you 113 00:07:04.960 --> 00:07:05.440 think? 114 00:07:06.530 --> 00:07:09.650 John Kindervag: Well, the first culprit is Linux, right? I mean, 115 00:07:09.650 --> 00:07:11.930 if you think about it, Linus Torvalds should be the richest 116 00:07:11.930 --> 00:07:14.960 person in the world. Because we don't have cloud, we don't have 117 00:07:14.960 --> 00:07:18.860 most of the things that we have without Linux.
So Linux has, 118 00:07:19.190 --> 00:07:22.700 what people pretend is a firewall called IP tables, which 119 00:07:22.700 --> 00:07:27.290 is really just a way to turn on an ACL - an access control list 120 00:07:27.290 --> 00:07:30.740 - that doesn't even maintain state. So we go back to the pre 121 00:07:31.010 --> 00:07:35.240 CheckPoint days, right? When CheckPoint invented the stateful 122 00:07:35.240 --> 00:07:41.360 firewall, it was because hackers could bypass very easily access 123 00:07:41.360 --> 00:07:45.410 control lists. Well, now we're saying, hey, hackers, we're 124 00:07:45.410 --> 00:07:51.440 going back to the early 90s, so have at it, compromise every 125 00:07:51.440 --> 00:07:55.850 cloud environment. Because it's easy. Secondly, we have a new 126 00:07:55.850 --> 00:07:58.220 generation of people who haven't been trained in some of the 127 00:07:58.220 --> 00:08:08.780 basics of what is TCP/IP? What is the OSI model? What is a 128 00:08:08.780 --> 00:08:13.010 network? How does a packet flow? What are the basics of our 129 00:08:13.010 --> 00:08:18.500 industry, and they get into the higher level stuff, agile versus 130 00:08:18.500 --> 00:08:23.870 waterfall versus DevSecOps, and all of the things that sound 131 00:08:23.870 --> 00:08:30.260 sexy without understanding the basics. And buildings fall down 132 00:08:30.260 --> 00:08:34.550 if you don't deal with the basics. You can have great 133 00:08:34.760 --> 00:08:38.630 modern architectures but if the foundation isn't there, they're 134 00:08:38.630 --> 00:08:39.500 going to fall down. 135 00:08:41.670 --> 00:08:44.460 Matthew Schwartz: Excellent. Yeah, great analysis there. I 136 00:08:44.460 --> 00:08:48.240 think nothing is such a great educator as failure. And if 137 00:08:48.240 --> 00:08:50.370 you've just not been in the field for that long, maybe you 138 00:08:50.370 --> 00:08:52.920 haven't had these horrible things happen to you to the 139 00:08:52.920 --> 00:08:55.890 point where you internalize what you should have or could have 140 00:08:55.890 --> 00:08:59.790 done in order to deal with them. Well, switching gears just a 141 00:08:59.790 --> 00:09:02.370 little bit, Anna and I are going to tax him a little bit, but 142 00:09:02.940 --> 00:09:06.750 because you and I last saw each other at a notable event in my 143 00:09:06.750 --> 00:09:11.400 life, which was RSA 2020. Notable because it was kind of 144 00:09:11.400 --> 00:09:14.910 the last time I got to go outside and play in terms of the 145 00:09:14.910 --> 00:09:19.170 last really big cybersecurity event I was at, because it was 146 00:09:19.170 --> 00:09:23.760 right on the cusp of there being this, we thought at the time, 147 00:09:23.850 --> 00:09:27.690 maybe a health concern. And lo and behold, within a month or 148 00:09:27.690 --> 00:09:32.160 two we were just totally locked down. So we were having 149 00:09:32.160 --> 00:09:37.980 obviously some discussions about zero trust at RSA 2020. And 150 00:09:38.250 --> 00:09:42.450 we've got this year now, we've had this blink, nearly two years 151 00:09:42.510 --> 00:09:45.660 later. 
Is it worth looking at how some of those discussions 152 00:09:45.660 --> 00:09:48.870 have changed because I feel like with what's happening in the 153 00:09:48.870 --> 00:09:52.860 White House and a number of other different arenas, we've 154 00:09:52.860 --> 00:09:58.440 seen a massive increase in the sophistication of the discussion 155 00:09:58.440 --> 00:10:01.470 we're having and before you said there might be some steps 156 00:10:01.470 --> 00:10:04.830 forward some steps backward. But I would say, on the whole, we're 157 00:10:04.830 --> 00:10:06.600 in a much better place than we were before. 158 00:10:07.230 --> 00:10:13.770 John Kindervag: Oh, absolutely. First of all, the pandemic, 159 00:10:14.370 --> 00:10:19.020 incentivized people to really take remote work seriously. And 160 00:10:19.050 --> 00:10:23.610 in order to seriously work remotely, you'll almost need 161 00:10:23.610 --> 00:10:27.540 some level of zero trust concepts built into your system, 162 00:10:27.540 --> 00:10:34.350 because you don't have that ubiquitous perimeter that you 163 00:10:34.350 --> 00:10:37.290 were dependent upon, even though it wasn't very effective, you 164 00:10:37.290 --> 00:10:41.910 still had confidence in it. And now suddenly, it's completely 165 00:10:41.910 --> 00:10:45.780 gone. So you have to think about what resource are the people 166 00:10:45.780 --> 00:10:52.980 accessing from home. And so that helped create that. And then 167 00:10:53.010 --> 00:10:56.790 some of the technologies that just are really good in enabling 168 00:10:56.790 --> 00:10:59.640 zero trust remote access technologies and things like 169 00:10:59.640 --> 00:11:07.410 that became not nice to have, but an imperative. And I was 170 00:11:07.410 --> 00:11:13.020 talking to a group of people in the government, who were 171 00:11:13.020 --> 00:11:20.400 struggling to do remote access, and they were wearing gas masks 172 00:11:20.550 --> 00:11:23.430 in the office because they didn't know how bad it was, or 173 00:11:23.430 --> 00:11:28.440 how bad is this going to be. So we were in the government, we 174 00:11:28.440 --> 00:11:32.100 have gas masks available. So let's wear gas masks around, 175 00:11:32.400 --> 00:11:39.090 while we figure out how to do remote work. So if you were in 176 00:11:39.090 --> 00:11:44.010 that process of heading down a zero trust path, and you were 177 00:11:44.010 --> 00:11:46.650 thinking about, it doesn't matter where the resource is 178 00:11:46.650 --> 00:11:49.620 located, I just need to have secure access to the resource 179 00:11:49.890 --> 00:11:54.420 with some layer 7 policy in place. It made it much easier to 180 00:11:54.420 --> 00:11:58.380 transition into remote work during the pandemic. 181 00:12:01.300 --> 00:12:03.160 Matthew Schwartz: Sorry to interrupt. People just had to 182 00:12:03.160 --> 00:12:05.830 get it done. They said, just make it happen today. 183 00:12:05.870 --> 00:12:09.320 John Kindervag: Right! I was at Palo Alto Networks at the time. 184 00:12:09.320 --> 00:12:12.620 And because one of our key technologies that every single 185 00:12:12.620 --> 00:12:18.380 person had on their devices was a remote access technology. I 186 00:12:18.380 --> 00:12:23.210 mean, it took us a week, right? I mean, there was just no time 187 00:12:23.210 --> 00:12:26.330 at all. For me, it was just completely seamless, because I'd 188 00:12:26.360 --> 00:12:30.830 always work that way as a remote worker or a traveling worker. 
So 189 00:12:30.830 --> 00:12:33.980 I think that that's a very interesting thing. And we're 190 00:12:33.980 --> 00:12:37.010 seeing people not wanting to go back to the office. I don't know 191 00:12:37.010 --> 00:12:39.920 if you've seen some of these news things, well, you're a 192 00:12:39.920 --> 00:12:44.060 journalist, of course, you have. But people at Apple are 193 00:12:44.060 --> 00:12:47.630 complaining that they don't want to go back to their office. So 194 00:12:47.630 --> 00:12:49.820 many people I know, are just saying, I'm not going to go back 195 00:12:49.820 --> 00:12:54.590 to the office. And executives are saying, well, you have to go 196 00:12:54.590 --> 00:12:57.170 back to the office, because we're paying a lot of money for 197 00:12:57.170 --> 00:13:02.300 this real estate. And that's their justification. So we're 198 00:13:02.300 --> 00:13:07.070 going to see some interesting dynamic shifts in just every 199 00:13:07.070 --> 00:13:08.450 industry here coming up. 200 00:13:09.200 --> 00:13:12.020 Anna Delaney: John, you'vegot me thinking about manufacturing, 201 00:13:12.020 --> 00:13:14.990 because this seems to be one of the most targeted industries at 202 00:13:14.990 --> 00:13:17.240 the moment when it comes to ransomware attacks and other 203 00:13:17.390 --> 00:13:23.360 cyberattacks. How do you think OT is doing when it comes to 204 00:13:23.750 --> 00:13:27.290 implementing zero trust? And how do you think zero trust needs to 205 00:13:27.290 --> 00:13:28.850 be part of that conversation? 206 00:13:28.000 --> 00:13:30.400 John Kindervag: I was just at the big S4 conference in Miami, 207 00:13:30.400 --> 00:13:36.490 which was about OT, two weeks ago, and there's a lot of 208 00:13:38.290 --> 00:13:43.840 pushback of people saying, well, you can't do zero trust for OT. 209 00:13:44.200 --> 00:13:47.680 And I was on a panel with a number of luminaries and we're 210 00:13:47.680 --> 00:13:51.520 like, well, why can't you? Why can't you apply a strategy to 211 00:13:51.520 --> 00:13:55.090 this particular technological problem? And the answer is you 212 00:13:55.090 --> 00:13:59.500 can and I've done it, but that is a business that's very 213 00:13:59.500 --> 00:14:02.800 entrenched in the old ways of doing things. And so it's hard 214 00:14:02.800 --> 00:14:07.150 for them to transition into new thinking, but But it's 215 00:14:07.150 --> 00:14:11.590 happening. It's happening with great frequency. And it's 216 00:14:11.590 --> 00:14:16.570 important, because as I talk to people who are very 217 00:14:16.570 --> 00:14:22.540 knowledgeable about the threats to those environments, things 218 00:14:22.540 --> 00:14:25.900 like Colonial Pipeline, for example. Well, there's a lot of 219 00:14:25.900 --> 00:14:33.970 Colonial Pipelines that are in the process of happening. One of 220 00:14:33.970 --> 00:14:37.300 the one of the governmental people that I talked to said, 221 00:14:39.430 --> 00:14:46.270 the malware and the tools to disrupt these industrial control 222 00:14:46.270 --> 00:14:48.940 systems, these critical infrastructure systems are 223 00:14:48.940 --> 00:14:52.600 already embedded inside of these environments and we're just 224 00:14:52.600 --> 00:14:56.890 waiting for these malicious actors to turn them on. They've 225 00:14:56.890 --> 00:15:01.180 already built in all of the stuff to take it down. 
And the 226 00:15:01.180 --> 00:15:03.880 only thing that's keeping them from taking it down is the 227 00:15:03.880 --> 00:15:09.160 desire to flip the switch. And that's a scary thought. And so 228 00:15:09.160 --> 00:15:17.800 you see, in the news, attacks that have happened where the 229 00:15:17.830 --> 00:15:21.160 attacker was in there eight or nine months, and no one noticed. 230 00:15:21.430 --> 00:15:26.140 And that's unacceptable, right? How can that be? And the answer 231 00:15:26.140 --> 00:15:28.720 is because you don't have the controls in the right place, 232 00:15:28.720 --> 00:15:33.670 looking at the right thing, you don't have enough street lamps. 233 00:15:34.450 --> 00:15:39.040 If you remember the old joke about the drunk guy looking for 234 00:15:39.040 --> 00:15:43.960 his keys, and the cop says, hey, I don't see your keys anywhere 235 00:15:43.960 --> 00:15:46.240 around here. And he says, yeah, I lost them way over there. 236 00:15:46.450 --> 00:15:48.850 Well, why are you crawling around on your hands and knees over here? 237 00:15:48.970 --> 00:15:52.600 Well, the light's so much better. And that's what we're doing. We 238 00:15:52.600 --> 00:15:57.520 were just looking where the illumination is, and not adding 239 00:15:57.520 --> 00:16:00.820 enough streetlights. So you have to have more streetlights.
240 00:16:01.980 --> 00:16:04.620 Matthew Schwartz: Colonial Pipeline, that was the billing 241 00:16:04.650 --> 00:16:08.730 server, I think, of the billing system, right? The OT, I think, 242 00:16:08.730 --> 00:16:12.870 didn't get hit, but they couldn't bill customers for the 243 00:16:14.310 --> 00:16:16.890 product that they were giving them. And so they proactively 244 00:16:16.890 --> 00:16:20.250 said, we're just going to shut it down until we can charge our 245 00:16:20.250 --> 00:16:23.610 customers again. But you were talking about an attack surface, 246 00:16:23.610 --> 00:16:27.090 I think was your term before, not just the data but the 247 00:16:27.090 --> 00:16:30.750 message for OT is to think about this in the bigger picture. 248 00:16:31.410 --> 00:16:33.660 John Kindervag: The protect surface is what I was saying, 249 00:16:33.660 --> 00:16:37.140 Matthew, so we can invert the attack surface down to the protect 250 00:16:37.140 --> 00:16:40.470 surface. So for Colonial Pipeline, the billing systems 251 00:16:40.470 --> 00:16:44.190 should have been protected, the PLCs that run it have to be 252 00:16:44.190 --> 00:16:47.370 protected. So there's multiple things that you need to protect. 253 00:16:47.520 --> 00:16:50.400 And instead of worrying about all the attacks that are in the 254 00:16:50.400 --> 00:16:53.490 world, because that's too big of a problem, worry about the 255 00:16:53.490 --> 00:16:56.490 things that you have that you need to protect. And now you've 256 00:16:56.490 --> 00:16:59.520 taken this massive problem, and chopped it down into small 257 00:16:59.520 --> 00:17:01.650 chunks. And each chunk is solvable. 258 00:17:03.490 --> 00:17:06.490 Matthew Schwartz: Speaking of chopping things down into 259 00:17:06.490 --> 00:17:09.490 smaller pieces, I suspect that might be your answer to my next 260 00:17:09.490 --> 00:17:14.320 question, which is just what are the big missteps you commonly 261 00:17:14.320 --> 00:17:17.590 see when it comes to zero trust?
And I'm thinking about this as 262 00:17:17.590 --> 00:17:21.220 we go into RSA as well, because I think zero trust is going to 263 00:17:21.220 --> 00:17:25.570 be one of the big topics that we discussed there. And what would 264 00:17:25.570 --> 00:17:29.080 you advise organizations that are still pursuing this, that 265 00:17:29.080 --> 00:17:32.560 they should be doing better to make their life and their sanity 267 00:17:33.160 --> 00:17:33.910 nicer? 266 00:17:33.150 --> 00:17:36.357 John Kindervag: People seem to think it's a technological 268 00:17:36.435 --> 00:17:41.129 problem. And it's not, it's a strategic problem. And so they 269 00:17:41.207 --> 00:17:45.980 want to buy a product. So I'm often on the calls with people. 270 00:17:46.058 --> 00:17:50.908 And we bought widget X, Y, or Z, where do we put it? How do we 271 00:17:50.987 --> 00:17:55.759 use it? I don't know. What are you going to protect? Well, we 272 00:17:55.837 --> 00:18:00.844 haven't thought about that yet. Well, then you're going to fail. 273 00:18:00.922 --> 00:18:05.068 Because every zero trust environment has to be custom 274 00:18:05.147 --> 00:18:09.215 made, or I'll use an English word for Anna. Bespoke! 275 00:18:09.293 --> 00:18:14.065 Everything is bespoke. For the protect surface, I often use a 276 00:18:14.143 --> 00:18:18.681 tailor analogy. So you have to figure out what you need to 277 00:18:18.759 --> 00:18:23.766 protect, you design the pattern for it, you cut it out, then you 278 00:18:23.844 --> 00:18:28.773 sew it. It'd be like, here, I've got my sewing machine, and I'm 279 00:18:28.851 --> 00:18:33.545 going to sew something up. And now I have to find the person 280 00:18:33.623 --> 00:18:38.004 who fits the garment. That's what we do in cybersecurity 281 00:18:38.083 --> 00:18:42.855 today, as opposed to say, let's find the person who wants the 282 00:18:42.933 --> 00:18:47.862 garment and tailor make it for them. And so there's a five step 283 00:18:47.940 --> 00:18:52.712 model called the 5-Step Model that you use to do that. And if 284 00:18:52.790 --> 00:18:57.719 you follow that, you're going to be pretty darn successful. And 285 00:18:57.797 --> 00:19:02.335 one of the things that I did since I've last talked to you 286 00:19:02.413 --> 00:19:06.559 Matthew, is I was on the President's NSTAC zero trust 287 00:19:06.637 --> 00:19:10.080 subcommittee. NSTAC is the National Security 288 00:19:10.158 --> 00:19:14.304 Telecommunication Advisory Council. It's got a lot of 289 00:19:14.382 --> 00:19:19.233 leaders of big companies who are on the NSTAC itself, and then 290 00:19:19.311 --> 00:19:24.318 they sponsor research. And so we did want a subcommittee on zero 291 00:19:24.396 --> 00:19:28.777 trust and trusted identity. I was involved, but also the 292 00:19:28.855 --> 00:19:33.471 federal agencies, who are some of the key stakeholders like 293 00:19:33.549 --> 00:19:38.165 CISA and NIST, and DISA and the DOD and the NSA, and we all 294 00:19:38.243 --> 00:19:43.016 synthesized a report that's been delivered to the White House 295 00:19:43.094 --> 00:19:47.631 about this. And I really look at that as the authoritative 296 00:19:47.710 --> 00:19:51.856 document now because if you follow the things in that 297 00:19:51.934 --> 00:19:56.159 document, it has the 5-Step model, it has the maturity 298 00:19:56.237 --> 00:20:01.087 model, it has the Kipling method policy construct, then you're 299 00:20:01.166 --> 00:20:06.016 going to be successful.
But it's usually starting in the wrong 300 00:20:06.094 --> 00:20:09.928 place, starting at the technology, listening to a 301 00:20:10.006 --> 00:20:14.230 vendor who if you buy my product, you're going to have 302 00:20:14.309 --> 00:20:19.159 zero trust. It's not like you have zero trust, zero trust is a 303 00:20:19.237 --> 00:20:23.775 way of doing things. And if I can get people to understand 304 00:20:23.853 --> 00:20:28.860 that strategic value, then they will be more successful. And the 305 00:20:28.938 --> 00:20:33.788 key to it really is creating the right incentive structure. So 306 00:20:33.867 --> 00:20:38.795 I'm much more successful, if I can start by talking to the CEO, 307 00:20:38.874 --> 00:20:43.568 or a member of the board of directors or the CIO. But if I'm 308 00:20:43.646 --> 00:20:48.105 trying to move it from the ground up, it's harder because 309 00:20:48.183 --> 00:20:53.034 people are not incentivized to try new things. They're afraid. 310 00:20:53.112 --> 00:20:58.119 What if I do it, and it doesn't go right, I might get fired. But 311 00:20:58.197 --> 00:21:02.265 if this incentivization structure comes from the top 312 00:21:02.343 --> 00:21:07.350 down, then that worry goes away, and they can be empowered to be 313 00:21:07.428 --> 00:21:11.810 successful. That's the people part of the of the system. 314 00:21:12.800 --> 00:21:15.320 Anna Delaney: John, talking about government and incentives, 315 00:21:15.320 --> 00:21:18.410 it's almost a year to the day that the executive order was 316 00:21:18.410 --> 00:21:21.170 released. Biden's administration, of course, 317 00:21:21.290 --> 00:21:25.160 released the EO on improving the nation's cybersecurity and zero 318 00:21:25.160 --> 00:21:29.720 trust plays an important role in that. How are they doing a year 319 00:21:29.720 --> 00:21:30.050 on? 320 00:21:33.650 --> 00:21:37.700 John Kindervag: I mean, the discussions are happening and 321 00:21:37.700 --> 00:21:42.770 the plans are starting. It's like everything in a big 322 00:21:42.770 --> 00:21:47.720 government, things move much slower than probably everybody 323 00:21:47.720 --> 00:21:53.510 wants to. It's hard to turn a battleship. And that's what's 324 00:21:53.510 --> 00:21:57.470 happening. It's actually an aircraft carrier, right? It's 325 00:21:57.470 --> 00:22:00.140 not even a battleship. It's like, the world's biggest 326 00:22:00.140 --> 00:22:03.440 aircraft carrier, and we're trying to spin it around. So 327 00:22:03.770 --> 00:22:08.690 there's a lot of movement in that world, just like the NSTAC 328 00:22:09.020 --> 00:22:14.270 stuff that we're doing and a lot of thought leadership, and then 329 00:22:14.660 --> 00:22:18.740 a lot of planning that is starting. And that's probably 330 00:22:18.740 --> 00:22:22.100 the right way to do it. We're starting to plan. The government 331 00:22:22.130 --> 00:22:26.990 does have an advantage over private industry and that 332 00:22:27.260 --> 00:22:30.500 they're generally required to know what their high value 333 00:22:30.500 --> 00:22:35.330 assets are, which we can then put inside of a protect surface 334 00:22:35.450 --> 00:22:39.590 and start step one of the journey a little bit more easily 335 00:22:39.800 --> 00:22:44.720 in the federal government. So that's the good news that they 336 00:22:44.720 --> 00:22:50.000 have over private industry, because the private sector often 337 00:22:50.030 --> 00:22:54.830 doesn't know or think about what they need to protect. 
They're so 338 00:22:54.830 --> 00:22:58.970 used to just buying technology, trying to protect everything. 339 00:22:58.970 --> 00:23:02.600 And as Frederick the Great said, if you try to protect 340 00:23:02.600 --> 00:23:04.220 everything, you protect nothing. 341 00:23:06.180 --> 00:23:09.300 Matthew Schwartz: Frederick and I go way back, but also with the 342 00:23:09.300 --> 00:23:13.110 project methodology that you have there, I would think even 343 00:23:13.110 --> 00:23:15.120 in the government sphere, especially in the private 344 00:23:15.240 --> 00:23:19.950 sphere, with a project getting some wins, with each phase 345 00:23:19.950 --> 00:23:24.060 proving the value, as you go on, you have to almost probably keep 346 00:23:24.060 --> 00:23:26.490 selling it as you go, even though a lot of the people on 347 00:23:26.490 --> 00:23:28.980 board will be on board with the benefit of it. 348 00:23:29.830 --> 00:23:32.110 John Kindervag: Absolutely. And that's why I created a maturity 349 00:23:32.110 --> 00:23:36.610 model to track how well you're doing. So that you can see we're 350 00:23:36.610 --> 00:23:39.970 becoming more mature in this particular area, on a per 351 00:23:39.970 --> 00:23:43.960 protect surface basis. So maturity is a great way to track 352 00:23:44.170 --> 00:23:48.550 how well you're doing. I'm not a big fan of trying to say, well, 353 00:23:48.550 --> 00:23:51.640 we've reduced risk or something like that, because that's pretty 354 00:23:51.640 --> 00:23:55.240 ephemeral. But if I can say you're more mature, because 355 00:23:55.240 --> 00:23:59.020 you've done this, based upon specific definitions of 356 00:23:59.020 --> 00:24:08.710 maturity, then we can demonstrate success. And we can 357 00:24:08.710 --> 00:24:14.770 also plan for projects and greater successes. And this is 358 00:24:14.770 --> 00:24:18.520 something that I see in my own practice, where leaders will 359 00:24:18.520 --> 00:24:22.630 look at the maturity scores of various protect surfaces and 360 00:24:22.630 --> 00:24:26.200 say, hey, I want this particular thing to be more mature. Let's 361 00:24:26.200 --> 00:24:31.180 put a project around making this one area. It might be like 362 00:24:32.020 --> 00:24:35.260 protecting the Swift gateway if you're in the financial services 363 00:24:35.260 --> 00:24:37.810 thing. Well, that's a pretty important thing. Let's make that 364 00:24:37.810 --> 00:24:41.530 more mature. And so now you have a project just to do that one 365 00:24:41.530 --> 00:24:45.790 thing, and then you can demonstrably prove that you 366 00:24:45.790 --> 00:24:46.510 succeeded. 367 00:24:48.940 --> 00:24:51.100 Anna Delaney: Talking on projects, what's next on the 368 00:24:51.130 --> 00:24:54.010 John Kindervag agenda? What exciting projects are you 369 00:24:54.010 --> 00:24:55.900 working on in the next few months? 370 00:24:56.860 --> 00:24:58.270 Matthew Schwartz: Would you like to share with us John? 371 00:24:58.000 --> 00:25:04.870 John Kindervag: I think one of the most exciting things is I 372 00:25:04.870 --> 00:25:11.050 partnered with ISMG, who you guys know well, right? One of 373 00:25:11.050 --> 00:25:14.920 your subsidiaries or brands called CyberTheory, we've 374 00:25:14.920 --> 00:25:18.370 created the CyberTheory Institute, which is an 375 00:25:18.370 --> 00:25:21.910 independent think tank on various topics around 376 00:25:21.910 --> 00:25:24.790 cybersecurity. 
And the first thing we've done is the zero 377 00:25:24.790 --> 00:25:28.330 trust council. So we've got a lot of the top thought leaders 378 00:25:28.330 --> 00:25:32.260 involved. And we've been shooting really interesting 379 00:25:32.260 --> 00:25:39.010 videos about these topics. They're actually videos, we have 380 00:25:39.010 --> 00:25:41.530 dinner, and we have all these interesting people, Greg 381 00:25:41.530 --> 00:25:45.940 Touhill, who runs CERT; or Tony Scott, the former CIO of the US 382 00:25:45.940 --> 00:25:53.680 Federal Government; Chase Cunningham, doctor zero trust, 383 00:25:54.520 --> 00:25:58.210 and people like that. We sit around and we have dinner and we 384 00:25:58.210 --> 00:26:00.970 talk about these things. And it's like, the audience can 385 00:26:00.970 --> 00:26:05.500 listen into these conversations. And they've been very successful 386 00:26:05.500 --> 00:26:08.050 with the clients who've underwritten them. And so I'm 387 00:26:08.050 --> 00:26:11.530 excited about that, because it's generating a whole new set of 388 00:26:11.530 --> 00:26:15.220 research. And then also for CyberEd, which is also one of 389 00:26:15.220 --> 00:26:21.550 your brands, I'm working on training materials around zero 390 00:26:21.550 --> 00:26:25.960 trust, and then tying it into some design thinking concepts, 391 00:26:25.960 --> 00:26:30.130 so people can understand how you get to these places by thinking 392 00:26:30.580 --> 00:26:34.630 as a designer, more than a cybersecurity person. 393 00:26:35.650 --> 00:26:38.980 Anna Delaney: A part of it is getting the message out, being 394 00:26:38.980 --> 00:26:41.830 the voice, helping with the education. 395 00:26:42.650 --> 00:26:49.610 John Kindervag: Yeah, there's a lot of noise in the market based 396 00:26:49.610 --> 00:26:52.520 upon vendor spin, because vendors, of course, and I 397 00:26:52.520 --> 00:26:57.440 understand this. If I'm an MFA company, then zero trust has to 398 00:26:57.440 --> 00:27:01.520 equal MFA. If I'm a proxy company, then zero trust has to 399 00:27:01.520 --> 00:27:06.380 equal proxy. And those things aren't true. We consume those 400 00:27:06.380 --> 00:27:10.190 technologies inside of zero trust. But again, it's the 401 00:27:10.190 --> 00:27:17.690 strategic side. And the strategic side doesn't resonate 402 00:27:17.720 --> 00:27:23.120 with everybody. So yeah, there are tactical and tool kinds of 403 00:27:23.210 --> 00:27:27.140 discussions we can have. But the strategic side resonates with 404 00:27:27.140 --> 00:27:30.710 business leaders. And that's what I'm trying to do is 405 00:27:30.860 --> 00:27:34.550 articulate the business value to this because that's what hasn't 406 00:27:34.550 --> 00:27:38.540 happened in cybersecurity. We're thought of as the Department of 407 00:27:38.540 --> 00:27:43.970 No, right? And we we want to become the enablers of business, 408 00:27:43.970 --> 00:27:48.320 because in reality, we're no longer just overhead, we are 409 00:27:48.590 --> 00:27:54.350 part of the function of every single business. No business 410 00:27:54.350 --> 00:28:00.620 runs without the computer systems running. So an airline, 411 00:28:00.620 --> 00:28:03.260 you can have the three Ps of the airline business - planes, 412 00:28:03.260 --> 00:28:07.940 pilots, and passengers. But if the computer is down, that plane 413 00:28:07.940 --> 00:28:13.010 doesn't get off the ground. And I grew up working for a little 414 00:28:13.010 --> 00:28:18.080 airline, Air Nebraska. 
Imagine flying Air Nebraska. It's as 415 00:28:18.080 --> 00:28:23.540 scary as it sounds. But we hand wrote the tickets back then, and 416 00:28:23.540 --> 00:28:27.380 everybody hand wrote tickets And they were in triplicate, and you 417 00:28:27.380 --> 00:28:30.980 pulled them off. And it was a very manual process. Well, you 418 00:28:30.980 --> 00:28:34.490 can't do that anymore. It all has to be computerized. And so 419 00:28:34.850 --> 00:28:38.270 I've been on planes where we're sitting there and the computer 420 00:28:38.270 --> 00:28:40.880 system goes down, and we can't take off. 421 00:28:42.740 --> 00:28:45.560 Anna Delaney: It's exciting times for zero trust. You must 422 00:28:46.010 --> 00:28:48.110 love watching this evolution, John. 423 00:28:48.720 --> 00:28:49.380 John Kindervag: Yes. 424 00:28:50.250 --> 00:28:52.710 Anna Delaney: I have a final question for you. Buzzwords. We 425 00:28:52.710 --> 00:28:57.060 know this industry loves a good buzzword or term. What's this 426 00:28:57.060 --> 00:29:00.780 year's buzzword/phrase, five months in? 427 00:29:02.130 --> 00:29:03.930 Matthew Schwartz: Good or bad, Anna? Or either? 428 00:29:04.110 --> 00:29:07.950 Anna Delaney: Either. Or one you love to hate, or vice versa. 429 00:29:11.650 --> 00:29:13.000 John Kindervag: You go first, Matt. 430 00:29:13.000 --> 00:29:21.580 Matthew Schwartz: I'm not going to sacrifice cybersecurity. I'm 431 00:29:21.580 --> 00:29:25.180 going be an optimist. And it's only May. So hopefully, 432 00:29:25.270 --> 00:29:28.270 something horrible will come along that I can use when you 433 00:29:28.270 --> 00:29:32.080 ask this question again. But I did notice the theme of RSA this 434 00:29:32.080 --> 00:29:37.240 year, if I'm recalling correctly, is change. And I 435 00:29:37.240 --> 00:29:39.520 don't know is that low ball is that high ball? I do think it's 436 00:29:39.520 --> 00:29:42.700 appropriate. Everything's changing so fast. And that 437 00:29:42.700 --> 00:29:46.600 includes what we're doing with cybersecurity at the societal 438 00:29:46.600 --> 00:29:50.470 level. It's such a big concept, but I'm just going to say 439 00:29:50.470 --> 00:29:53.410 change. That's my buzzword. Hopefully, it's a good one. Not 440 00:29:53.410 --> 00:29:53.920 a bad one. 441 00:29:55.880 --> 00:29:58.610 John Kindervag: Yeah, change is an easy buzzword because you can 442 00:29:59.030 --> 00:30:04.700 take it in. Of course there's change. The sun rose, the 443 00:30:04.700 --> 00:30:11.330 sunset, there was a change. So you're not going very far out on 444 00:30:11.330 --> 00:30:23.750 a limb with that one, are you? The buzzword for RSA this year, 445 00:30:23.780 --> 00:30:33.140 in my mind, is shark. Have they jumped the shark? I don't know, 446 00:30:33.140 --> 00:30:36.920 I kind of think RSA might have jumped the shark. If you 447 00:30:36.920 --> 00:30:44.900 remember that old term from, are you now irrelevant. Just last 448 00:30:44.900 --> 00:30:50.420 night, I was talking to a major CISO. And he's like, well, I'm 449 00:30:50.420 --> 00:30:53.150 not going to RSA this year. I called all my other CISO 450 00:30:53.150 --> 00:30:56.210 friends, and they're not going either. So we're going to see 451 00:30:58.310 --> 00:31:04.580 has it jumped the shark? Is it on its last legs? Is it ready to 452 00:31:05.840 --> 00:31:12.170 go on the firepit and become barbecue? I don't know. It 453 00:31:15.500 --> 00:31:22.010 doesn't seem like people missed it in the last two years. 
So is 454 00:31:22.070 --> 00:31:25.970 it going to be successful, or are people going to show up? I don't 455 00:31:25.970 --> 00:31:30.530 know. I think that's the question coming up in the next 456 00:31:30.530 --> 00:31:31.310 two months. 457 00:31:32.140 --> 00:31:36.040 Anna Delaney: See ya next month! I was going to say something 458 00:31:36.370 --> 00:31:38.800 around...there seems to be a lot of discussion on cyber war, 459 00:31:39.130 --> 00:31:44.260 cyber warfare, hybrid war. And there's no clarification or 460 00:31:44.260 --> 00:31:47.830 consolidation of these phrases. So maybe that's not a buzzword 461 00:31:47.830 --> 00:31:52.390 or they're not buzzwords, but I wonder how that debate will 462 00:31:52.390 --> 00:31:53.020 change.
463 00:31:55.040 --> 00:31:58.670 John Kindervag: We're always in a cyber war, right? So 464 00:31:58.670 --> 00:32:02.600 cybersecurity is one of three adversarial businesses. You have 465 00:32:02.930 --> 00:32:05.720 the military, law enforcement and cybersecurity. So we all 466 00:32:05.720 --> 00:32:09.140 have adversaries. And we're all in the same cyber war, because 467 00:32:09.350 --> 00:32:12.050 we're all directly connected to the world's most malicious 468 00:32:12.050 --> 00:32:19.100 actors. So when a nation-state attacks some company, they're 469 00:32:19.100 --> 00:32:22.640 sending a digital missile at that company. Now, they wouldn't 470 00:32:22.640 --> 00:32:27.500 do that kinetically, they wouldn't launch a missile at 471 00:32:28.400 --> 00:32:32.540 SolarWinds, for example, at the SolarWinds headquarters. You 472 00:32:32.540 --> 00:32:37.610 wouldn't launch a physical ICBM at SolarWinds headquarters to 473 00:32:37.610 --> 00:32:40.880 take it down, but you would launch a digital missile. So 474 00:32:40.880 --> 00:32:45.230 we're already in that cyber war. And we just need to acknowledge 475 00:32:45.230 --> 00:32:49.400 that, because so many of the attackers are either 476 00:32:49.430 --> 00:32:51.710 nation-state attackers, or they're sponsored by 477 00:32:51.740 --> 00:32:56.330 nation-states. They're financed by nation-states. So 478 00:32:56.540 --> 00:32:59.720 we're in that cyber war. And whether we acknowledge it or 479 00:32:59.720 --> 00:33:03.650 not, that becomes the question. It's the acknowledgement, not the 480 00:33:03.650 --> 00:33:08.180 reality, that I think is the question that needs to be 481 00:33:08.180 --> 00:33:16.130 answered by people like you and your team over there. I have a 482 00:33:16.130 --> 00:33:19.400 whole thing I call cyber war with zero trust, because I 483 00:33:19.400 --> 00:33:27.410 believe we're all in a cyber war. And if you're fighting a 484 00:33:27.410 --> 00:33:32.960 nation-state that has unlimited resources, that's a hard thing 485 00:33:32.960 --> 00:33:38.840 to fight, if you don't have the right tools. So you look at the 486 00:33:38.840 --> 00:33:45.590 history of warfare, when you're fighting a war with Napoleonic 487 00:33:45.770 --> 00:33:50.600 tactics, and suddenly the machine gun is invented. That's 488 00:33:51.710 --> 00:33:58.190 a nasty outcome. And so we're often fighting with really old 489 00:33:58.190 --> 00:34:00.830 tools. We're fighting with muskets and they've got machine 490 00:34:00.830 --> 00:34:04.430 guns. And we need to up our game a little bit, I think. 491 00:34:06.920 --> 00:34:09.410 Anna Delaney: John, always informative. I think we've got a 492 00:34:09.410 --> 00:34:13.340 few more discussions ahead of us.
You just dropped so 493 00:34:13.340 --> 00:34:17.480 many bombs at the end. I know Matthew's got questions. 494 00:34:17.000 --> 00:34:19.790 Matthew Schwartz: You'll be hearing from me, John. 495 00:34:20.510 --> 00:34:20.960 John Kindervag: Okay! 496 00:34:22.610 --> 00:34:24.260 Anna Delaney: This has been absolutely brilliant, though. 497 00:34:24.260 --> 00:34:27.200 Thank you, John. Thank you, Matt, for a great discussion. 498 00:34:27.860 --> 00:34:28.250 Matthew Schwartz: Thank you! 499 00:34:29.090 --> 00:34:29.780 John Kindervag: Thanks! 500 00:34:30.080 --> 00:34:32.600 Anna Delaney: Thank you so much for watching. Until next time!