Anna Delaney: Hello, and welcome to Proof of Concept, the ISMG talk show where we analyze today's and tomorrow's cybersecurity challenges with experts in the field and discuss how we can potentially solve them. We are your hosts. I'm Anna Delaney, director of productions here at ISMG.

Tom Field: I'm Tom Field. I'm senior vice president of editorial at ISMG. Anna, always a pleasure.

Anna Delaney: Always a pleasure. So Tom, what is on your mind this week?

Tom Field: Mardi Gras! Not Louisiana Mardi Gras, not New Orleans, but San Francisco and the return of RSA Conference to San Francisco after a two-year absence. That's about two weeks away, Anna, and we're both going to be there for it.

Anna Delaney: We are! Marathon schedule ahead, I think. But I'm going to be curious as to what the atmosphere will be like this year. I don't know if it was the same in the US, but definitely in the UK, there was a period of conference fatigue, and a bit of groaning when people sort of had that thought of large conferences. Maybe this year, there'll be excitement and an appreciation of in-person events again.
And of course, the world looks very different now as well, post-COVID, a pandemic which accelerated home working and cloud adoption. And then of course, a war in Ukraine, which is sort of another geopolitical tension, which is really destabilizing the cyber structure. And what else have we got? Tom, have I missed anything? Oh, yes, ransomware! Ransomware attacks are now declared a national security threat. So there's a lot going on. I wonder how these will shape conversations?

Tom Field: Well, it's interesting, because if you look at our own events, they're a microcosm of what's happening in the world. You and I have both been to some of our summits or conferences. We've hosted live roundtable discussions. And you see that people, after two years of quarantine, are very eager to get together again in groups and discuss these issues. At the same time, every one of these events is a first for somebody. It's the first time they've gotten out of their home office in two-plus years to see other people. And naturally enough, there are still some people tentative about that, given that the pandemic continues to rage in pockets around the world.
Tom Field: But I think there's so much to come and talk about in terms of the geopolitical situation, in terms of ransomware-as-a-service, in terms of software supply chain security. You know, we still haven't gotten all together yet to talk about SolarWinds, and Kaseya, and Colonial Pipeline, and Log4j. Here's the opportunity upcoming. I think people are ready. I think it's going to be a lively event. And I think, as always, it's going to set the tone for our conversations for the remainder of the year.

Anna Delaney: So, Tom is an RSA veteran. What are your survival tips?

Tom Field: Comfortable shoes.

Anna Delaney: And maybe Paracetamol.

Tom Field: This is not the place you break in a new pair of shoes.

Anna Delaney: Yeah, I would agree on that point, having found that out last time. We'll be doing a lot of sitting as well this time, I presume, because we're interviewing a series of people. Who are you looking forward to interviewing?

Tom Field: I am so excited, because we do have two studios that we are going to have going for four days. We have one on Broadcast Alley, as people walk into Moscone West and go into the showroom there. And we'll have our traditional ESPN-style setting, as I always call it, at the Marriott Marquis. And we are filling the slots for four days right now. We'll be doing a lot of sitting and talking. I'm always excited to bring in individuals and panels. And I think we have a great opportunity to create some unique programming. The slots are filling up right now. It's going to be exciting. That's all I can say.

Anna Delaney: Looking back over the years, how have you made the most out of your experience there?

Tom Field: Very little sleep and a lot of talking. We have the opportunity to talk with the cybersecurity leaders in the world: people from government, people from industry, people from every one of the sectors that we deal with. It's just a great opportunity to find out what is top of mind with the key decision-makers globally in cybersecurity. It is overwhelming. And you need to come away from it and have some time to think and parse your thoughts and really determine what are the key overall themes you're hearing. But I will say this: you know, 18-hour days, dozens of conversations over the course of a day, events to go to in the evening. It fills your time, it fills your mind. It sucks your soul to some extent, but you always come away with a sense of what the most important themes are, now and for the next six to 12 months going forward.
So it always is a great opportunity to reconnect with the cybersecurity world.

Anna Delaney: Yes, very well put. Well, I am looking forward to it. And I want to bring on stage someone who has told me he will be there too. Grant Schneider, are you there?

Tom Field: Are you at RSA now?

Grant Schneider: I'm not at RSA yet.

Anna Delaney: But soon, Grant. For those who don't know, you are the senior director of cybersecurity services at Venable LLP and the former federal CISO. Hello, thanks so much for joining us, Grant.

Grant Schneider: Thank you. Great to be here, Anna and Tom. Good to see you guys again. I look forward to seeing you in person.

Anna Delaney: Yes. So, Grant, we've just passed, can you believe it, the first anniversary of Biden's cybersecurity EO. Are we where we need to be? I ask this because your colleague Jeremy Grant told us recently on an episode of Proof of Concept that whilst progress has been made, the challenge is in translating policy decrees into results, and that will take time. Do you agree? And what are your thoughts?

Grant Schneider: Yeah, I completely agree with that statement.
I think that if you look at, you know, where are we a year into the EO, I think the administration hit most of the deadlines, at least most of the deadlines that were publicly available, over the last 12 months. And there were a lot of actions and a lot of things that had to happen. I think, and this is where I think Jeremy's probably going, a lot of those actions don't generate the outcomes, or the results that he mentioned, that we really want to get to. A lot of them were plans, and they were developing and sort of getting things started. But all of those really kicked off activities that agencies need to take to really achieve the desired cybersecurity enhancement outcomes that we'd like to see. So definitely, more work to do. But I think it has, you know, galvanized the agencies and really focused them on cybersecurity in a great way.

Anna Delaney: Well, it's been a busy week for government. News also this week that endpoint detection and response deployments will be underway at more than half of federal civilian agencies by the end of September. Now that's not too far away. You're well aware of how government operates. What is your perspective on this? And what potential challenges do you foresee?

Grant Schneider: So I think "getting underway," that's a great phrase when you're in government, right? I'm going to be started on things, as opposed to necessarily complete with things. And so I think that's an achievable goal, to get underway. And I think it's great. I think, you know, a long journey starts with the first step, and I think this is really about getting those first steps taken at a number of different agencies. Rolling things out in one environment is a challenge; rolling it out across agencies is a whole bunch more challenges. And so I think, you know, that is an achievable target. I really would love to see, you know, and maybe it's the end of this fiscal year or calendar year: our agencies are started, but where are they in their implementation? And are they really leveraging the value of EDR, where you match that up with, you know, other sensors in your environment, the other things in your extended detection and response environment, if you will, to be able to have better situational awareness and then be able to react more quickly to malicious activity?

Anna Delaney: So often, we hear that having adequate resources at the government level is a challenge. Do you see that shifting in any way?
Grant Schneider: I mean, agencies are just now building their 2024 budget, right? They just got their 2022 budget; the 2023 budget is with Congress, and hopefully that will come this fall, we'll see. But agencies are really getting their first opportunity since SolarWinds, which drove the executive order, to ask for new money or make significant realignments to their cybersecurity and technology dollars. So, you know, everything they've been doing so far, they're kind of taking out of hide, with the exception of some SolarWinds dollars and the 2022 appropriations for a handful of agencies. But in general, agencies are definitely strapped, right? They came out of the pandemic that you mentioned, where they had to shift to work from home. They shifted to cloud capabilities. They did a lot of things there. And so I think they've already been strapped, and this is an additional challenge. And I think agencies are really going to have to make hard tradeoffs on how they're investing not just their cyber dollars, but their technology dollars writ large, which is typically a much bigger pot of money than their cybersecurity dollars, to be sure that they're meeting their critical requirements.
Anna Delaney: Well, this week also, cybersecurity agencies from across the Five Eyes countries have published a joint report on the most common methods and techniques used by threat actors to gain an initial foothold into corporate and government networks. What did you make of the findings? Because it's not really about sophisticated, sexy zero-days, is it?

Grant Schneider: No, it's really not. A friend of mine years ago said that cybersecurity is like working at a brewery. You know, it sounds all exciting and sexy, but working in a brewery is really about cleaning stuff. It's about sanitization, sanitization, sanitization. And cybersecurity is about the basics and the fundamentals. And I think that's reflected in the guidance. I think that the guidance is great, and it's great to have all of that in one location. And, you know, industries should certainly look to that. Those are the types of things that I tell my clients to really focus on: multi-factor authentication, patching your systems, updating your software, you know, doing those basic hygiene activities, as we sometimes call them. So I don't think there was anything terribly new in it.
However, it's always good to have a resource in one place. It's good to get some additional media, more people like us talking about it, so that more companies pay attention. Because, you know, a lot of companies and entities haven't implemented multi-factor authentication, or phishing-resistant multi-factor authentication, where they really need to move to. So, you know, great sets of actions in there. And I think every organization should benchmark themselves against what's in that alert.

Anna Delaney: That's great. Grant, we said that you're going to RSA. How are you going to make the most of this experience?

Grant Schneider: So the first thing I would say is disable all your wireless connections on your phone when you get to San Francisco. So that's step one. I agree with Tom on comfortable shoes. But it really is about the engagements, and it's about the meetings, and to me, a lot of that is the sidebars, right? The sidebars that you have with people, sometimes impromptu, because you run into them on the street and you just haven't seen them in a few years.
I think that's going to be the case again this year: reconnecting with people, understanding, you know, where they're at in their cybersecurity journey, and seeing what you can learn from them and how we can help each other. That's really what it's about. It's about sharing and helping each other at the conference.

Anna Delaney: Excellent advice, Grant. I'll see you there. But in the meantime, I'm passing the baton over to Tom to introduce our next guest.

Tom Field: Excellent. Terrific time to talk about this. In the past weeks, we have seen volatility in the marketplace. We have seen, essentially, the crypto exchange hack of the week. It is the crypto Mardi Gras. Here to talk about that with us is Captain Crypto. He is the crypto Blue Devil from Duke University, head of legal and government affairs with TRM Labs. Introducing Ari Redbord. Ari, how's that for a sports introduction?

Ari Redbord: I love everything about that introduction. All we need is sort of like the cool, you know, entering-the-arena type of music, which we'll work on for the next one for sure.

Tom Field: Ari, last week, we got a stark reminder of the volatility of cryptocurrencies. As you know, Bitcoin plummeted to its lowest value in 16 months.
So it raises a question: is the stablecoin economy as stable as we'd like it to be?

Ari Redbord: Look, I think what we're seeing here is sort of, you know, a reminder that we're still at day one when it comes to sort of building this new crypto economy. And there are just myriad projects, sort of, you know, across the landscape, some more stable, for lack of a better description, than others, some more volatile. And I think what you're seeing here is, quite frankly, sort of a winnowing out of the crypto economy as we build. I think, you know, look, even regulators and governments, Janet Yellen, the Treasury secretary, came out and said, look, I don't see a systemic risk here. I mean, there's no question that these are the types of events that could spread. I think what we've seen is it did not. But look, a lot of the warnings around sort of stablecoin runs and volatility and systemic risks, you know, I think we saw it play out a little bit last week, again, in a contained way. But what we will likely see is a continued push, although I'm not sure how much more you can push, because there's already been a lot of discussion about sort of how to regulate the stablecoin space.
And while I'm not, you know, betting on clear regulation, or clear legal frameworks, or clear legislation for crypto over the course of the next several months or a year or so, we may see action on stablecoins. But I think that would have been true even before this sort of latest market volatility.

Tom Field: Fair point, because as you know, stablecoins were typically lightly regulated, which is a nice way of saying sometimes not regulated at all. What do you see as the state of crypto regulation today? And what trends are you foreseeing?

Ari Redbord: Yeah, I think it's a really exciting moment. And look, we have sort of a week like last week, and you see the price of bitcoin. But you know, look, the price of bitcoin does not determine sort of the state of crypto. And I think what we're seeing right now is, you know, we had never seen a US president talk about cryptocurrency. And now we have an executive order on it. And that executive order, from a few months ago, didn't just talk about sort of the risks, which is typical of a document like that. It talked about the need for US leadership in the world, and the importance of, you know, looking really hard at and studying a central bank digital currency.
And I think what we've seen since then is regulators and, you know, policymakers globally kind of continue that message, that clarion call for leadership in the space. You know, a couple of weeks ago, we saw a handful of UK regulators come out and say, hey, you know, this summer, the Royal Mint is going to mint an NFT. We want to lead in sort of the regulatory space, in the stablecoin space and in CBDCs. And I think we're seeing sort of more and more of this need for leadership, you know, and call for leadership in this space. So I think we've sort of gone past this business about, we should, you know, ban crypto to stop ransomware and other bad things from happening, to: we need thoughtful, sound regulation and responsible innovation. And I think that's a really kind of good place for the overall crypto economy to be.

Tom Field: I liked that you mentioned the executive order, because as Anna said, it's been a year since the cybersecurity executive order from President Biden. It's been about two months now since the cryptocurrency executive order was released. I know it's still early days, but I know you've got great sources in government as well. What progress do you see so far?
Ari Redbord: Yeah, you know, look, the executive order basically tasked federal agencies, really across the interagency, with coming up with reports and guidance in the crypto space. Admittedly, a lot of them had already done that. And, you know, we heard a speech from Janet Yellen a few weeks ago. We've heard a lot of comments by Gary Gensler, the SEC chair. We've seen the CFTC come out and sort of make statements. The OCC. I think we're going to see those come out in sort of reports, but what we've already heard is sort of a preview. And a lot of it sort of echoes the EO, and that is this sort of call for leadership in the space and sort of what that looks like. But I can tell you, sort of, talking to folks at DOJ and Treasury, that work was very quickly spun up, those teams were very, very quickly put together, and they are definitely working on it. You know, I think, interestingly, sort of in the meantime, we've seen cyberattacks, you know, continue. I know you were sort of talking about that in the last segment with Grant, but a lot of those attacks are on cryptocurrency businesses. Because you know, in the age of crypto, an attack could mean you can steal money directly to fund weapons proliferation, sort of destabilizing activity. And we've seen that. But really interesting, we've also seen regulators start to take action in real time in response to these types of attacks. And that's sort of an only-in-crypto type story, where a regulator can sort of actually be following the flow of funds in a cyberattack and start bringing sanctions actions in real time. And it's just a really interesting development that shows sort of the paradox of, you know, in cryptocurrency, you can steal a lot of money, and you can move it faster. But law enforcement also has tools that we never had before. It's a really interesting moment in the regulatory, in the law enforcement, and sort of, you know, across the cryptoverse.

Tom Field: I guess the news is, with blockchain and cryptocurrency, everyone knows you're a dog. Ari, as you said, we're in the age of crypto. We're going to be talking about this a lot. Thank you so much for being here with us today.

Ari Redbord: Love it. Always a pleasure. Thank you so much for having me.

Tom Field: Anna, want some fun with our guests?

Anna Delaney: Well, let's all come together again, then. I think it's time.
So I want to start this conversation around 339 00:19:44.740 --> 00:19:49.300 collaboration. And how we can improve as an industry on that 340 00:19:49.300 --> 00:19:52.900 front because we all know it's important, but I feel sometimes 341 00:19:52.900 --> 00:19:56.710 it's given a bit of a superficial brushstroke. Ari, at 342 00:19:56.710 --> 00:20:01.420 TRM Labs, you look at blockchain analytics to follow the money. 343 00:20:01.420 --> 00:20:04.720 It's definitely part of the cybercrime puzzle. But it's not 344 00:20:04.720 --> 00:20:07.720 the complete picture, is it? And I just want to know how you 345 00:20:07.720 --> 00:20:10.930 collaborate with law enforcement, with other data 346 00:20:10.930 --> 00:20:13.240 analytics companies and other companies? 347 00:20:13.630 --> 00:20:15.700 Ari Redbord: Yeah, it's a great question. And, you know, to me 348 00:20:15.700 --> 00:20:18.850 collaboration is everything, public private partnerships. 349 00:20:19.000 --> 00:20:22.270 And, you know, it's something that's easy to say, and I get 350 00:20:22.300 --> 00:20:25.000 that it's harder to do. But, you know, I've sort of seen this 351 00:20:25.000 --> 00:20:28.390 from both sides. And I know, Grant has as well. You know, 352 00:20:28.390 --> 00:20:31.840 when I was in the government, particularly at Treasury, there 353 00:20:31.840 --> 00:20:35.140 was a huge effort at the Office of Terrorism and Financial 354 00:20:35.140 --> 00:20:40.240 Intelligence to provide guidance, and, you know, 355 00:20:40.480 --> 00:20:45.850 advisory to private sector. And it was a critical part of what 356 00:20:45.850 --> 00:20:48.610 we did. We hoped that every enforcement action was something 357 00:20:48.610 --> 00:20:50.080 that, you know, the private sector would read and 358 00:20:50.080 --> 00:20:53.020 understand, because that was sort of a preview of the 359 00:20:53.020 --> 00:20:55.720 guidance that was coming down the road. 
And look, now at TRM, 360 00:20:56.440 --> 00:21:00.100 that's exactly what we do. We are a tool that is used by law 361 00:21:00.100 --> 00:21:03.550 enforcement, by regulators to track and trace the flow of 362 00:21:03.550 --> 00:21:08.080 funds, to make licensing determinations when it comes to 363 00:21:08.140 --> 00:21:12.010 a crypto business. And, you know, those partnerships are 364 00:21:12.010 --> 00:21:16.690 absolutely critical to the process. You know, tools are 365 00:21:16.690 --> 00:21:19.690 very powerful, particularly in crypto, where you have sort of 366 00:21:19.720 --> 00:21:22.420 open blockchains. And you can follow the flow of funds. But 367 00:21:22.420 --> 00:21:25.030 honestly, at the end of the day, these investigations, sort of 368 00:21:25.060 --> 00:21:28.780 whether it's Bitfinex, or Ronin, or Colonial Pipeline, come down 369 00:21:28.780 --> 00:21:33.340 to great investigators, using just, you know, one 370 00:21:33.340 --> 00:21:36.610 of many tools in their toolbox. And it's that sort of 371 00:21:36.610 --> 00:21:39.460 coordination and collaboration that is so critical. 372 00:21:40.690 --> 00:21:43.600 Anna Delaney: That's great. And Grant, I'd love your perspective 373 00:21:43.600 --> 00:21:47.290 on how we can improve. Where can we close the gaps? 374 00:21:48.260 --> 00:21:51.218 Grant Schneider: Yeah, I mean, I agree with you, and I agree 375 00:21:51.282 --> 00:21:55.269 with Ari that it's absolutely critical, right? And it is hard. 376 00:21:55.333 --> 00:21:59.063 And, you know, it's sort of the new version of information 377 00:21:59.127 --> 00:22:02.857 sharing, right? For years, it was we need more information 378 00:22:02.921 --> 00:22:06.908 sharing. And that became, you know, I want to say it became a 379 00:22:06.972 --> 00:22:10.766 throwaway term, and I don't really think it is.
But I think 380 00:22:10.831 --> 00:22:14.560 with collaboration and with information sharing, really to 381 00:22:14.625 --> 00:22:18.612 be successful, like, what do you want to collaborate about and 382 00:22:18.676 --> 00:22:22.534 on? And what are the outcomes that you're trying to achieve? 383 00:22:22.599 --> 00:22:26.521 And I think the more specific you can get, the better ability 384 00:22:26.585 --> 00:22:30.122 you have to create the right partnership with the right 385 00:22:30.187 --> 00:22:34.045 partners, and be able to build that trust, because both with 386 00:22:34.109 --> 00:22:37.710 information sharing and with collaboration, you know, it 387 00:22:37.775 --> 00:22:41.762 really is successful when you can create a safe space where you 388 00:22:41.826 --> 00:22:45.556 can toss out ideas that maybe seem a little silly or don't 389 00:22:45.620 --> 00:22:49.350 seem quite right, but spark other people's imagination, to 390 00:22:49.414 --> 00:22:52.951 help, you know, whether it's an investigator, to help a 391 00:22:53.015 --> 00:22:57.066 forensics person, to help figure out what's actually happening. 392 00:22:57.131 --> 00:23:00.603 So I think it's getting specific about what you want to 393 00:23:00.667 --> 00:23:04.590 collaborate about, who you want to collaborate with, and then 394 00:23:04.654 --> 00:23:08.384 how do you create that safe environment, so that everyone can 395 00:23:08.448 --> 00:23:12.500 bring their complete capabilities to the engagement. 396 00:23:12.000 --> 00:23:17.340 Tom Field: Anna, if I may, I've hosted a number of roundtable 397 00:23:17.340 --> 00:23:19.650 discussions on this topic recently. And just the other 398 00:23:19.650 --> 00:23:23.010 night I did one with some critical infrastructure organizations. 399 00:23:23.010 --> 00:23:25.770 And I know that talking to CISOs, there was a hunger for 400 00:23:25.770 --> 00:23:29.430 this.
And I think that they are ready now to be able to share 401 00:23:29.610 --> 00:23:33.000 their information, their threat intelligence in order to receive 402 00:23:33.000 --> 00:23:36.450 that and certainly desire to do it with the public sector. Now 403 00:23:36.450 --> 00:23:39.300 credit where credit's due. I hosted this discussion with 404 00:23:39.300 --> 00:23:43.980 Michael Ehrlich. He's the former CTO with IronNet Cybersecurity. 405 00:23:44.190 --> 00:23:47.550 I thought he described the mentality perfectly when he 406 00:23:47.550 --> 00:23:50.940 called it a kindergarten mentality between the public and 407 00:23:50.940 --> 00:23:54.810 private sectors. There's a desire that I would like to have 408 00:23:54.810 --> 00:23:58.260 you share your toys, but I'm kind of reluctant to share my 409 00:23:58.260 --> 00:24:02.010 own. But he also described a good end game, which he 410 00:24:02.010 --> 00:24:07.080 described as the Waze app of cybersecurity. If we can get a 411 00:24:07.080 --> 00:24:11.520 way to be able to share information about the car wreck 412 00:24:11.520 --> 00:24:14.280 or the construction up ahead, or even that there's a police 413 00:24:14.280 --> 00:24:17.640 officer looking for speeders up ahead and be able to do that 414 00:24:17.640 --> 00:24:21.750 anonymously in real time, that's a place that we need to go and I 415 00:24:21.750 --> 00:24:25.050 think increasingly, there's a desire and capability to get 416 00:24:25.050 --> 00:24:25.260 there. 417 00:24:25.050 --> 00:24:27.750 Ari Redbord: You know, Tom, it's interesting, and this is, I 418 00:24:27.807 --> 00:24:31.082 promise you, this is not a shameless plug.
But yesterday, 419 00:24:31.140 --> 00:24:34.185 we released something called Chainabuse, which is in 420 00:24:34.242 --> 00:24:37.747 collaboration with five or six other leading industry groups, 421 00:24:37.804 --> 00:24:41.539 including Circle and Binance and others, and it is a free website 422 00:24:41.596 --> 00:24:45.158 where we crowdsource hacks and scams and fraud in crypto. And 423 00:24:45.216 --> 00:24:48.433 the idea is it's a Waze. It absolutely is! It is a place 424 00:24:48.491 --> 00:24:52.053 where people could go to share a hack or a scam that they have 425 00:24:52.110 --> 00:24:55.500 been a victim of, so that it doesn't happen to other people 426 00:24:55.557 --> 00:24:58.890 and I think that, you know, part of the ethos of crypto is 427 00:24:58.947 --> 00:25:02.050 obviously sort of this democratization of finance. But 428 00:25:02.107 --> 00:25:05.554 there's also an understanding that you have to have controls 429 00:25:05.612 --> 00:25:09.231 in place. But if the community can kind of come together and do 430 00:25:09.289 --> 00:25:12.966 that together, at least in part, outside of law enforcement, 431 00:25:13.023 --> 00:25:16.700 I think that definitely is sort of some of the power and promise 432 00:25:16.758 --> 00:25:20.205 there. But yeah, I think I could not agree more on that Waze 433 00:25:20.262 --> 00:25:23.940 analogy, and that's what we've essentially been trying to build. 434 00:25:23.000 --> 00:25:25.910 Tom Field: There you go, Anna! 435 00:25:26.930 --> 00:25:29.300 Anna Delaney: Totally, I was just going to say in our roundtable 436 00:25:29.300 --> 00:25:32.540 yesterday as well, that came up. The kindergarten approach, some 437 00:25:32.540 --> 00:25:34.490 are eager to take and not to share. 438 00:25:34.900 --> 00:25:37.030 Ari Redbord: It's hard.
You know, it's hard, having been in 439 00:25:37.030 --> 00:25:39.790 government and having been on the other side, I guess, there's 440 00:25:39.790 --> 00:25:45.490 a lot you can share, and I know Grant does too. So I usually try 441 00:25:45.490 --> 00:25:51.550 to, you know, just be a giver. You know, because it's gotta be, 442 00:25:51.850 --> 00:25:56.230 you know, give freely, just love, because you're not going 443 00:25:56.230 --> 00:26:00.880 to get a ton in return. And, you know, it's sort of, I don't 444 00:26:00.880 --> 00:26:02.980 know, a domestic violence relationship or something. But 445 00:26:02.980 --> 00:26:08.320 like, it's important. And, you know, I do understand why it 446 00:26:08.320 --> 00:26:10.480 isn't always a two-way street, and I just try to be as 447 00:26:10.480 --> 00:26:11.440 supportive as possible. 448 00:26:13.150 --> 00:26:14.050 Anna Delaney: Are we ending on that note, Tom? 449 00:26:14.000 --> 00:26:16.970 Tom Field: I think it's a terrific place to wrap things 450 00:26:16.970 --> 00:26:17.030 up. 451 00:26:17.030 --> 00:26:17.480 Grant Schneider: Be supportive. 452 00:26:18.410 --> 00:26:18.980 Tom Field: Share. 453 00:26:19.550 --> 00:26:20.270 Ari Redbord: Be a lover. 454 00:26:21.200 --> 00:26:23.390 Anna Delaney: That was absolutely fantastic. I really 455 00:26:23.390 --> 00:26:26.090 enjoyed this discussion. Ari Redbord and Grant Schneider, 456 00:26:26.210 --> 00:26:27.380 thank you for joining us. 457 00:26:27.740 --> 00:26:28.490 Ari Redbord: Thank you so much. 458 00:26:29.800 --> 00:26:31.750 Anna Delaney: Thanks so much for watching. Until next time!