WEBVTT 1 00:00:06.330 --> 00:00:08.520 Anna Delaney: Hello and welcome to this government security 2 00:00:08.520 --> 00:00:11.430 special edition of the ISMG Editors' Panel. I'm Anna 3 00:00:11.430 --> 00:00:14.490 Delaney. And on this episode we're discussing particular 4 00:00:14.490 --> 00:00:17.520 challenges facing U.S. government agencies, but also 5 00:00:17.520 --> 00:00:20.280 security leaders, more generally, including the 6 00:00:20.280 --> 00:00:24.060 Cybersecurity and Infrastructure Security Agency's, or CISA's 7 00:00:24.180 --> 00:00:27.990 updated self-attestation form for secure software development, 8 00:00:28.200 --> 00:00:31.590 increasing liability worries for CISOs post-SolarWinds SEC 9 00:00:31.590 --> 00:00:36.690 lawsuit and concerns about a 25% budget cut to CISA and potential 10 00:00:36.690 --> 00:00:40.170 alternative strategies. And to do this, we are honored to be 11 00:00:40.170 --> 00:00:44.160 joined by former federal CISO Grant Schneider, senior director 12 00:00:44.160 --> 00:00:48.210 for cybersecurity services, Venable LLP. Grant, it's been a 13 00:00:48.210 --> 00:00:50.370 while, but it's always great to have you join us. 14 00:00:51.000 --> 00:00:53.400 Grant Schneider: Great to be here with you guys. It's 15 00:00:53.400 --> 00:00:56.190 fantastic seeing you and I'm excited about the conversation. 16 00:00:56.490 --> 00:00:58.560 Anna Delaney: Yeah, absolutely. And also with us are my 17 00:00:58.560 --> 00:01:02.010 colleagues, Tom Field, senior vice president of editorial, and 18 00:01:02.010 --> 00:01:04.830 Mathew Schwartz, executive editor of DataBreachToday and 19 00:01:04.830 --> 00:01:09.720 Europe. So Grant, as you know, we like to start these sessions 20 00:01:09.720 --> 00:01:13.170 off by sharing where we are in our virtual world. So where are 21 00:01:13.170 --> 00:01:13.680 you? 22 00:01:14.440 --> 00:01:18.040 Grant Schneider: So I am clearly in Washington, DC.
The important 23 00:01:18.040 --> 00:01:23.260 part though is that I'm finally alone in Washington, DC. So 535 24 00:01:23.260 --> 00:01:28.300 of our best friends in Congress departed last week for a 25 00:01:28.300 --> 00:01:32.050 much-needed recess. We were talking a little beforehand 26 00:01:32.050 --> 00:01:35.170 about some of the antics that happened over the last few 27 00:01:35.170 --> 00:01:39.430 weeks. It's been a bit of a crazy time. And I really think 28 00:01:39.640 --> 00:01:42.550 that members of Congress need a break. They need to settle down 29 00:01:42.550 --> 00:01:45.520 and hopefully come back rested and ready to get some work done. 30 00:01:45.000 --> 00:01:48.150 Tom Field: It's a nice way of saying they need a time out. 31 00:01:51.990 --> 00:01:54.180 Anna Delaney: Time out in the snow perhaps, Tom Field? 32 00:01:54.450 --> 00:01:56.790 Tom Field: Indeed, this is what I woke up to this morning. I was 33 00:01:56.790 --> 00:01:59.970 expecting maybe an inch of snow overnight, but woke up to the 34 00:01:59.970 --> 00:02:03.420 news it's 1 to 3. So in New England, it indeed is the 35 00:02:03.420 --> 00:02:05.640 Thanksgiving holiday with frosting. 36 00:02:06.460 --> 00:02:09.435 Anna Delaney: Well, very pretty indeed. And Matt, are you out on 37 00:02:09.494 --> 00:02:10.030 the town? 38 00:02:10.020 --> 00:02:13.479 Mathew Schwartz: Yeah, the only thing cold here, Anna, is the 39 00:02:13.562 --> 00:02:18.669 Guinness. This is Dublin where I was for the annual IRISSCERT 40 00:02:18.751 --> 00:02:23.200 Irish anti-cybercrime conference, which is held every 41 00:02:23.282 --> 00:02:28.142 year, again by IRISSCERT, which is Ireland's first Computer 42 00:02:28.225 --> 00:02:33.085 Emergency Response Team.
It's always a fun event, gathers a 43 00:02:33.167 --> 00:02:38.028 bunch of different security practitioners together to chase 44 00:02:38.110 --> 00:02:42.641 down the latest trends and advice in cybercrime and for 45 00:02:42.723 --> 00:02:45.030 cybersecurity professionals. 46 00:02:45.570 --> 00:02:47.430 Anna Delaney: But you could have fooled us there because that is 47 00:02:47.430 --> 00:02:59.220 not an Irish flag, is it? 48 00:02:49.259 --> 00:02:52.654 Mathew Schwartz: There seems to be some crossover here with the 49 00:02:52.728 --> 00:02:57.157 American, I don't know, vibe. Yeah, this is Temple Bar. So I 50 00:02:57.231 --> 00:03:01.956 lived in Ireland about 30 years ago, and it was a bit rundown in 51 00:03:02.029 --> 00:03:06.090 this area. But it's a bit of a tourist trap these days. 52 00:03:06.000 --> 00:03:08.160 Tom Field: Seeing the American flag, I saw the Guinness, I 53 00:03:08.160 --> 00:03:09.150 thought that was the flag. 54 00:03:10.560 --> 00:03:12.960 Grant Schneider: I had to look for it as well, Tom. It's like 55 00:03:13.380 --> 00:03:15.030 there's something besides beer there. 56 00:03:16.890 --> 00:03:21.540 Anna Delaney: Well, I was in Frankfurt, Germany last week, 57 00:03:21.660 --> 00:03:24.540 and I'm sharing a snapshot of the main square, the Römerberg, 58 00:03:25.110 --> 00:03:27.630 already with its resplendent Christmas tree. And the 59 00:03:27.630 --> 00:03:31.080 buildings appear to be medieval, which is quite cool. But as you 60 00:03:31.080 --> 00:03:33.630 know, Frankfurt was heavily bombed in the Second World War. 61 00:03:33.630 --> 00:03:37.050 So all the buildings have been reconstructed in the past 70 62 00:03:37.080 --> 00:03:40.410 years or so. And it's very impressive. So worth a visit if 63 00:03:40.410 --> 00:03:44.250 you're in the area. Oh, Grant, as you know, we have a few 64 00:03:44.250 --> 00:03:47.310 questions for you. So I'll hand over to Tom at this point.
65 00:03:47.810 --> 00:03:50.720 Tom Field: Thank you very much. Grant, Anna mentioned upfront 66 00:03:50.750 --> 00:03:54.350 the updated secure software development self-attestation 67 00:03:54.350 --> 00:03:58.190 form from CISA. And it reminds me of something CISA once 68 00:03:58.190 --> 00:04:01.460 said to me, which was "attestation is not a security 69 00:04:01.460 --> 00:04:05.540 control." So your thoughts on this new updated form? 70 00:04:06.560 --> 00:04:09.650 Grant Schneider: Yeah, so I agree. Attestation is not a 71 00:04:09.650 --> 00:04:13.310 security control. And the Secure Software Development Framework 72 00:04:13.430 --> 00:04:17.030 is indeed intended to be a framework and not be a security 73 00:04:17.030 --> 00:04:22.070 control as well, I think if you talk to folks at NIST. So CISA 74 00:04:22.070 --> 00:04:25.340 put this out as a draft about - I don't know, five months ago, 75 00:04:25.340 --> 00:04:28.280 maybe even six months ago, and they received a lot of public 76 00:04:28.280 --> 00:04:32.570 comments. They really didn't make many changes, if any, to 77 00:04:32.570 --> 00:04:36.590 the form though. Candidly, I was a little surprised and a little 78 00:04:36.590 --> 00:04:41.810 disappointed.
You know, they made one change that basically had the 79 00:04:41.810 --> 00:04:44.780 signature saying "to the best of my knowledge" as opposed to an 80 00:04:44.780 --> 00:04:49.790 absolute like, you know, these are 100% accurate, so I think 81 00:04:49.790 --> 00:04:54.410 that was helpful, but they changed the signature back - it 82 00:04:54.440 --> 00:04:58.940 originally was the CEO or their designee, and they've now 83 00:04:58.940 --> 00:05:02.870 changed it to be a company's CEO or their chief operating 84 00:05:02.870 --> 00:05:08.480 officer, and you know, at many big companies, it's a significant 85 00:05:08.480 --> 00:05:13.400 hurdle to get the CEO's signature or even the COO's signature on 86 00:05:13.400 --> 00:05:16.640 something, particularly one that implies the level of liability 87 00:05:17.090 --> 00:05:20.750 that this potentially has with an administration that's talking 88 00:05:20.750 --> 00:05:26.120 about shifting liability to, you know, providers of software and 89 00:05:26.120 --> 00:05:29.240 services. So, you know, I'm in industry, and the people I'm 90 00:05:29.240 --> 00:05:32.690 talking to in the industry have significant concerns. There's a 91 00:05:32.690 --> 00:05:35.690 couple of things in the attestation that I think are going to be 92 00:05:35.690 --> 00:05:39.200 very problematic. One of them is around provenance of software, 93 00:05:39.440 --> 00:05:43.880 and it explicitly includes third party, which would include 94 00:05:43.910 --> 00:05:48.380 open-source software. And, you know, no one really knows the 95 00:05:48.380 --> 00:05:52.220 complete provenance of the people, the processes and the 96 00:05:52.220 --> 00:05:55.490 technologies that go into open-source elements that, you 97 00:05:55.490 --> 00:05:59.240 know, companies are using in their software. So I think that one's 98 00:05:59.240 --> 00:06:02.480 going to be problematic.
This is out for another 30-day comment 99 00:06:02.480 --> 00:06:07.640 period, with OIRA inside of the Office of Management and Budget. 100 00:06:08.180 --> 00:06:10.670 I don't think they're going to take a lot of comments, but we're 101 00:06:10.670 --> 00:06:14.750 certainly going to reengage and try to explain some of the more, 102 00:06:14.960 --> 00:06:18.080 you know, just challenging aspects of what this form could 103 00:06:18.080 --> 00:06:22.850 bring in really discouraging - and, my concern is, driving - 104 00:06:22.850 --> 00:06:26.060 people out of the federal ecosystem, right. Companies are 105 00:06:26.060 --> 00:06:30.170 getting piles and piles of things put on them - requirements to 106 00:06:30.170 --> 00:06:33.200 work with the government - and it's a lot of work to work with 107 00:06:33.200 --> 00:06:37.220 the government. The margins aren't always great. So I think 108 00:06:37.220 --> 00:06:38.660 this might cause some challenges. 109 00:06:38.000 --> 00:06:40.850 Tom Field: So something to look forward to in the new year. 110 00:06:44.180 --> 00:06:46.820 Grant, also to look forward to in the new year, we expect we're 111 00:06:46.820 --> 00:06:49.490 going to have a new national cyber coordinator. As you know, 112 00:06:49.490 --> 00:06:53.840 the acting coordinator Kemba Walden stepped down recently. 113 00:06:54.860 --> 00:06:57.350 Not sure when we're going to have one up for approval, but 114 00:06:57.350 --> 00:07:00.830 what is your advice going to be for the person who occupies this 115 00:07:00.830 --> 00:07:01.490 role next? 116 00:07:02.620 --> 00:07:05.530 Grant Schneider: Yeah, I think for the person that occupies 117 00:07:05.530 --> 00:07:11.080 this role, my advice is, you know, I think Chris Inglis came 118 00:07:11.080 --> 00:07:14.530 in and did a fantastic job of building out the organization.
119 00:07:15.040 --> 00:07:21.160 And going from one person - himself - to the 85 or so that 120 00:07:21.160 --> 00:07:24.580 they have now, getting the footprint inside the Executive 121 00:07:24.580 --> 00:07:28.300 Office of the President and really becoming an institution 122 00:07:28.300 --> 00:07:32.050 there, he did an amazing job. And I think Kemba, you know, 123 00:07:32.050 --> 00:07:35.530 picked up with the strategy and getting the strategy over the 124 00:07:35.530 --> 00:07:37.780 finish line and the implementation plan over the 125 00:07:37.780 --> 00:07:42.670 finish line and out. I think for whoever the next director is - 126 00:07:42.700 --> 00:07:47.200 presumably, it will be Harry - I think it's building 127 00:07:47.200 --> 00:07:51.310 relationships inside the compound, inside the complex, 128 00:07:51.310 --> 00:07:54.250 inside the Executive Office of the President. There's a lot of 129 00:07:54.250 --> 00:07:58.060 external work to do, and the workforce needs to do that. But 130 00:07:58.120 --> 00:08:00.970 I would recommend, you know, that they spend a concerted 131 00:08:00.970 --> 00:08:05.080 amount of time meeting with their peers and other politicals 132 00:08:05.110 --> 00:08:07.630 inside the Executive Office of the President, and 133 00:08:07.810 --> 00:08:11.560 understanding - you know, I'm certain, having worked at the EOP, 134 00:08:11.560 --> 00:08:14.440 that bringing in a new organization that's now the 135 00:08:14.440 --> 00:08:19.090 third- or fourth-largest of the 19 or 20 136 00:08:19.120 --> 00:08:22.000 organizations inside the EOP, you ruffle a lot of feathers, you stole 137 00:08:22.000 --> 00:08:27.820 people's office space, you got more budget than they've had.
So 138 00:08:27.820 --> 00:08:31.750 I think going and doing some work with your peers would be 139 00:08:31.750 --> 00:08:38.500 really helpful in making sure that the ONCD becomes, you know, 140 00:08:38.830 --> 00:08:43.120 a long-standing part of the Executive Office of the 141 00:08:43.120 --> 00:08:46.510 President and really becomes part of that institution, as 142 00:08:46.510 --> 00:08:49.300 well as obviously all the cyber work they need to do with the 143 00:08:49.360 --> 00:08:50.860 agencies and the policies. 144 00:08:52.840 --> 00:08:54.970 Tom Field: Well said. I speak for many: when Congress comes 145 00:08:54.970 --> 00:09:00.220 back, all we want for Christmas is a new cyber director. I'll 146 00:09:00.340 --> 00:09:02.650 pass it to my colleague Matt. Matt, your witness.
They've made it 160 00:09:55.270 --> 00:09:59.500 personal, by actually, you know, charging an individual person 161 00:09:59.500 --> 00:10:02.350 and not just the organization. And I think most 162 00:10:02.350 --> 00:10:05.590 people know they're part of an organization that's probably 163 00:10:05.890 --> 00:10:09.640 going to be sued or has been sued or is currently being sued 164 00:10:09.640 --> 00:10:14.830 for one thing or another. But to really say that, you know, that 165 00:10:15.040 --> 00:10:20.020 he personally had a role in personally misleading investors, 166 00:10:20.020 --> 00:10:22.840 right, which is essentially the charge with some of the things 167 00:10:22.840 --> 00:10:27.580 that, you know, the comments that he said - and so CISOs are 168 00:10:27.580 --> 00:10:29.650 definitely concerned, they're concerned about what their personal 169 00:10:29.650 --> 00:10:33.220 liability is going to be. I am certain there are 170 00:10:33.640 --> 00:10:36.640 conversations happening about liability insurance, and what 171 00:10:36.640 --> 00:10:42.130 that will look like. And are they covered by directors and officers - you 172 00:10:42.130 --> 00:10:46.390 know, D&O - insurance inside the companies? But I think, you 173 00:10:46.390 --> 00:10:49.570 know, the advice is, what people really are going to need to 174 00:10:49.570 --> 00:10:53.380 think about is clearly both public statements, and in this 175 00:10:53.380 --> 00:10:58.360 case, internal statements, right - internal emails are 176 00:10:58.360 --> 00:11:03.370 being potentially used against the CISO, and being sent back 177 00:11:03.370 --> 00:11:06.100 as, "Hey, you said we were actually doing fine." And then 178 00:11:06.100 --> 00:11:10.810 something bad happened. And, you know, I've long said that if a 179 00:11:10.810 --> 00:11:16.000 CISO's conversation with the CEO is the CEO saying, "Are we safe 180 00:11:16.000 --> 00:11:21.130 yet? Are we secure yet?" Right?
The CISO either has to say yes, 181 00:11:21.130 --> 00:11:24.880 to keep their job, but get fired the first time something happens 182 00:11:24.880 --> 00:11:29.260 because no one's really secure, right? Or they say no, and they 183 00:11:29.260 --> 00:11:32.200 get fired right then because we're not secure. And so you 184 00:11:32.200 --> 00:11:35.530 need to have a far more nuanced conversation. And I think we've 185 00:11:35.530 --> 00:11:39.310 got to find a way, working with the SEC to be able to have more 186 00:11:39.310 --> 00:11:43.570 nuanced conversations with investors, which they're trying 187 00:11:43.570 --> 00:11:47.800 to do, because there isn't a state of your organization 188 00:11:47.800 --> 00:11:51.940 that's now secure. You can do all of the right things and 189 00:11:51.940 --> 00:11:56.710 still have an incident. And tying back and being able to 190 00:11:56.710 --> 00:11:59.650 say, well, this is the thing that caused the incident or 191 00:11:59.650 --> 00:12:03.820 wasn't - like being able to prove that the incident was not 192 00:12:03.820 --> 00:12:08.770 caused by a failure or a lapse someplace, is really, really 193 00:12:08.770 --> 00:12:13.180 hard, very, very time consuming, very resource intensive. And so, 194 00:12:13.450 --> 00:12:16.480 you know, I think my concern is that this is again going to have 195 00:12:16.480 --> 00:12:20.290 a chilling effect on CISOs. You know, it may have a chilling 196 00:12:20.290 --> 00:12:23.170 effect on people wanting to become CISOs. Right? 
If that was 197 00:12:23.170 --> 00:12:25.420 your, you know, "what I want to be when I grow up," maybe I 198 00:12:25.420 --> 00:12:31.600 don't anymore, but also is it going to really either mute some 199 00:12:31.600 --> 00:12:34.840 of their conversations, or make them have to run around 200 00:12:34.840 --> 00:12:37.480 internally saying the sky is falling, the sky is falling, the 201 00:12:37.480 --> 00:12:40.870 sky is falling, right, which I think in a business context is 202 00:12:40.870 --> 00:12:44.980 going to be really hard because senior leadership wants to 203 00:12:44.980 --> 00:12:47.500 understand risks, but they want to understand what are 204 00:12:47.500 --> 00:12:52.300 manageable and acceptable risks. And if the CISO is in a position 205 00:12:52.300 --> 00:12:55.450 where they can't talk about acceptable risks because those 206 00:12:55.450 --> 00:13:00.010 might be considered misleading statements, I think that's just 207 00:13:00.010 --> 00:13:02.410 a challenge going forward. So we're going to have to figure 208 00:13:02.410 --> 00:13:05.890 out a balance with this. And it's going to be really 209 00:13:05.890 --> 00:13:08.800 interesting to watch. And I think a lot of eyes are on this. 210 00:13:08.800 --> 00:13:09.880 And there's a lot of concern. 211 00:13:10.570 --> 00:13:12.790 Mathew Schwartz: I'm struck by the boilerplate you see in so 212 00:13:12.790 --> 00:13:17.860 many SEC quarterly filings from public businesses: "unforeseen 213 00:13:17.860 --> 00:13:19.870 events may yet occur." 214 00:13:19.000 --> 00:13:26.860 Grant Schneider: Yeah, I think you will get a lot of, you know, 215 00:13:27.310 --> 00:13:29.890 I want to say weasel words. Maybe that's inappropriate. But 216 00:13:31.540 --> 00:13:35.860 hygiene, yeah, great hygiene in comments that people are coming 217 00:13:35.860 --> 00:13:41.860 out with which, again, is that a service to investors as well? 
I 218 00:13:41.860 --> 00:13:44.680 mean, you know, you're going to see liability statements, as 219 00:13:44.680 --> 00:13:49.720 opposed to, again, more nuanced conversation around security and 220 00:13:49.720 --> 00:13:53.530 real risks. And with cybersecurity, there's just so 221 00:13:53.530 --> 00:13:57.130 many unknown risks and potentially unforeseen events, 222 00:13:57.130 --> 00:13:57.940 as you pointed out. 223 00:13:58.560 --> 00:14:00.960 Mathew Schwartz: Well, in terms of dealing with these risks, 224 00:14:01.290 --> 00:14:05.310 having good threat intelligence is something that people have 225 00:14:05.340 --> 00:14:09.960 been seeking for a long time. And the government is getting 226 00:14:09.960 --> 00:14:13.410 behind this as well. There's a new proposed rule from the DOD, 227 00:14:13.530 --> 00:14:18.120 the GSA, and NASA, of all people, on modernizing 228 00:14:18.150 --> 00:14:21.900 cyberthreat and incident reporting. And as I understand 229 00:14:21.900 --> 00:14:25.920 it, this aims to remove contractual barriers that might 230 00:14:25.920 --> 00:14:29.550 be in place with contractors, I believe, and to facilitate 231 00:14:29.550 --> 00:14:32.880 easier sharing of data, so the government has a better 232 00:14:32.910 --> 00:14:37.350 defensive posture. Am I reading that correctly? And I'm just 233 00:14:37.350 --> 00:14:42.030 interested in your perspective about what this rule is trying 234 00:14:42.030 --> 00:14:45.990 to do? What's required to make it successful and if you think 235 00:14:45.990 --> 00:14:47.010 it might be successful. 236 00:14:47.660 --> 00:14:51.380 Grant Schneider: Yeah. Matt. 
So this is another thing that's 237 00:14:52.220 --> 00:14:56.060 creating a little consternation inside of industry, I would say, 238 00:14:56.360 --> 00:14:59.750 and, you know, the way you phrase it, removing contractual 239 00:14:59.750 --> 00:15:02.330 barriers, I think that is probably the way the government 240 00:15:02.330 --> 00:15:06.470 would look at this, that it's removing barriers where, 241 00:15:06.650 --> 00:15:09.320 you know, they heard during SolarWinds and during some other 242 00:15:09.320 --> 00:15:13.550 incidents, where agencies said, "Hey, I either can't share that 243 00:15:13.550 --> 00:15:16.820 with you," or probably it was "I don't want to share that. And I 244 00:15:16.820 --> 00:15:20.540 don't have to, and therefore, I'm going to say I can't." I 245 00:15:20.540 --> 00:15:24.980 think industry says that this is imposing contractual, you know, 246 00:15:25.010 --> 00:15:28.220 barriers, if you will, or mandates that are going to 247 00:15:28.220 --> 00:15:33.770 require a lot more sharing around incidents. And, you know, 248 00:15:34.070 --> 00:15:38.300 which I think, in general, industry is probably okay with: 249 00:15:38.330 --> 00:15:42.110 "Hey, we should share more information." Because really, 250 00:15:42.110 --> 00:15:45.200 this is about being able to protect federal information, 251 00:15:45.200 --> 00:15:49.040 federal information systems, the broader ecosystem, and this type 252 00:15:49.040 --> 00:15:53.780 of sharing can be helpful in that way. One of the things 253 00:15:53.780 --> 00:15:58.790 that is in this proposed rule, though, is that it talks about a lot 254 00:15:58.820 --> 00:16:02.960 of access for the government onto contractor systems. And the 255 00:16:02.960 --> 00:16:08.030 way it's phrased is a little unspecific because it talks 256 00:16:08.030 --> 00:16:12.080 about, you know, the FBI and CISA. And then I think later it 257 00:16:12.080 --> 00:16:16.610 talks about anyone the U.S.
government wants to designate as 258 00:16:16.610 --> 00:16:21.380 a third party would have access to any system used in the 259 00:16:21.380 --> 00:16:24.170 performance of this contract. And "in the performance of the 260 00:16:24.170 --> 00:16:27.410 contract" could mean, you know, directly here, this is the 261 00:16:27.410 --> 00:16:30.680 software I'm providing to the government, or it could mean my 262 00:16:30.680 --> 00:16:34.130 HR system, because that's what I use to hire the people that 263 00:16:34.280 --> 00:16:37.130 perform a service for the government. So it's pretty 264 00:16:37.130 --> 00:16:40.610 broad. And there's a lot of concern around essentially what 265 00:16:40.640 --> 00:16:43.550 reads like potentially unfettered access for the 266 00:16:43.550 --> 00:16:46.700 government to just come in, whether you've reported an 267 00:16:46.700 --> 00:16:50.360 incident or they've identified that you had an incident or had 268 00:16:50.360 --> 00:16:56.060 a potential incident. And so the scoping for it is very, very 269 00:16:56.060 --> 00:16:59.360 broad. And I'm hopeful that we're going to be able to narrow 270 00:16:59.360 --> 00:17:05.150 that down. Also, the reporting timelines are, I think, it's 271 00:17:05.150 --> 00:17:11.210 within eight hours of discovery of an actual or potential 272 00:17:11.210 --> 00:17:15.320 incident. And again, you know, that means someone in the middle 273 00:17:15.320 --> 00:17:18.380 of the night at a company is having to make the determination 274 00:17:18.560 --> 00:17:21.260 that their company should report something to the federal 275 00:17:21.260 --> 00:17:24.860 government that was a potential incident, not even an actual 276 00:17:24.860 --> 00:17:28.250 incident. And so I understand them wanting to get 277 00:17:28.250 --> 00:17:32.240 information quickly.
What I hear 278 00:17:32.240 --> 00:17:34.760 from the government is, "Hey, if 279 00:17:34.880 --> 00:17:37.370 you want to play with the government, if you want to be a 280 00:17:37.370 --> 00:17:41.240 contractor, you need to play by the government's rules, you need 281 00:17:41.240 --> 00:17:45.920 to play by the same rules that government agencies play by, and 282 00:17:45.920 --> 00:17:51.290 they have to report within an hour's timeline to CISA." And I 283 00:17:51.290 --> 00:17:54.590 appreciate that, having been the federal CISO, but the challenge, 284 00:17:54.590 --> 00:17:59.090 or one of the differences, is that federally, there's no 285 00:17:59.090 --> 00:18:01.220 federal agency jail, there's no FISMA jail. We said this all the 286 00:18:01.220 --> 00:18:04.520 time when I was in government: there's no FISMA jail. So if an 287 00:18:04.520 --> 00:18:07.580 agency doesn't actually meet their requirements, or they fall 288 00:18:07.970 --> 00:18:10.280 down somewhere, they can get yelled at. Are there 289 00:18:10.280 --> 00:18:15.290 ramifications? I don't want to imply there aren't, but there 290 00:18:15.290 --> 00:18:18.920 certainly can be. But for a contractor, it's the False Claims Act, 291 00:18:18.920 --> 00:18:22.190 right. And it's a false claim that can come back to 292 00:18:22.190 --> 00:18:25.850 every invoice that they send to the government, if it turns out 293 00:18:25.850 --> 00:18:29.720 that they didn't report something that the government 294 00:18:29.720 --> 00:18:32.240 thought should have been reported. So I think - I'm a big 295 00:18:32.240 --> 00:18:35.330 fan of information sharing, I'm a big fan of getting more 296 00:18:35.330 --> 00:18:39.080 information so that we can help protect the ecosystem. There's 297 00:18:39.080 --> 00:18:41.450 nothing in this rule about how the government's going to use 298 00:18:41.600 --> 00:18:45.170 this information and how they're going to share back with industry.
It's an information sharing requirement that you 299 00:18:45.170 --> 00:18:48.740 share more information with us or report it to us. It's really 300 00:18:48.740 --> 00:18:51.320 an information reporting requirement, as opposed to 301 00:18:51.320 --> 00:18:55.250 information sharing, in my opinion, so I think they're 302 00:18:55.250 --> 00:18:57.920 going to get a whole lot of comments on this. These were 303 00:18:57.920 --> 00:19:02.390 originally due on December 4 - the industry 304 00:19:02.780 --> 00:19:05.300 had asked for a 60-day extension, and they provided 305 00:19:05.300 --> 00:19:09.260 that. So industry gets to work over the holidays on this, but I 306 00:19:09.260 --> 00:19:12.470 think it's going to be important work to understand what the 307 00:19:12.470 --> 00:19:15.800 government's trying to get to. I think there are some ways to 308 00:19:15.800 --> 00:19:16.670 improve this though. 309 00:19:17.570 --> 00:19:20.720 Mathew Schwartz: Fantastic, wonderful, nuanced reading 310 00:19:20.720 --> 00:19:23.930 there, especially with your experience. Thank you. Handing 311 00:19:23.930 --> 00:19:26.510 you over now to Anna. 312 00:19:27.320 --> 00:19:31.010 Anna Delaney: Good. Thank you. So I want to discuss concerns 313 00:19:31.040 --> 00:19:35.810 about a proposed 25% budget cut to CISA as part of the fiscal 314 00:19:35.840 --> 00:19:41.480 2024 Homeland Security spending bill. Grant, how might this cut 315 00:19:41.510 --> 00:19:45.650 impact CISA's core functions? And what do you think the 316 00:19:45.650 --> 00:19:49.880 potential ramifications will be on CISA's ability to fulfill its 317 00:19:49.880 --> 00:19:51.080 cybersecurity mission?
318 00:19:52.620 --> 00:19:58.320 Grant Schneider: Yeah, it's unfortunate that this cut is in 319 00:19:58.710 --> 00:20:03.420 the House bill, because cybersecurity for so long has 320 00:20:03.420 --> 00:20:08.250 been a bipartisan, you know, a bipartisan topic; we've been 321 00:20:08.250 --> 00:20:13.080 able to get bipartisan work done. And it's becoming more 322 00:20:13.080 --> 00:20:17.490 partisan. And it really is related, I think, to CISA's work 323 00:20:17.490 --> 00:20:21.720 around elections and election security, and the perception 324 00:20:21.720 --> 00:20:26.280 that CISA was censoring messaging on some of the 325 00:20:26.400 --> 00:20:30.240 disinformation campaigns that they had worked on. So, you 326 00:20:30.240 --> 00:20:34.980 know, it's concerning to me from a political kind of a broader 327 00:20:34.980 --> 00:20:38.010 ecosystem where we've had bipartisan support. I think, to 328 00:20:38.010 --> 00:20:42.180 your question, though, on, like, where the rubber meets the road 329 00:20:42.240 --> 00:20:46.440 - 25% is a big cut. Now, CISA has grown a ton, they've grown a 330 00:20:46.440 --> 00:20:50.460 ton over the years. And with that, they're able to do a lot 331 00:20:50.460 --> 00:20:53.610 more, they are far more involved in the ecosystem, they're far 332 00:20:53.610 --> 00:20:57.870 more able to provide alerts, provide information to industry, 333 00:20:57.870 --> 00:21:00.810 to critical infrastructure, to work with government agencies, 334 00:21:01.050 --> 00:21:03.390 and I think they've really stepped up across the board 335 00:21:03.390 --> 00:21:09.240 there. It is concerning to me what a 25% cut would look like. 336 00:21:09.540 --> 00:21:12.870 Obviously, CISA would have to determine what to reduce, and 337 00:21:12.870 --> 00:21:16.050 how to do that, and would ultimately be the one driving 338 00:21:16.710 --> 00:21:21.450 the impacts.
But this is something that, you know, it 339 00:21:21.450 --> 00:21:27.090 could end up cutting a lot of manpower, it could impact some of the pay 340 00:21:27.090 --> 00:21:32.430 scale that CISA has been using. So CISA has a cyber 341 00:21:32.430 --> 00:21:36.060 personnel system that, you know, ideally lets them hire 342 00:21:36.060 --> 00:21:39.480 and bring in and recruit, you know, really good talent, 343 00:21:39.480 --> 00:21:41.970 because they can pay them a little better than normal 344 00:21:41.970 --> 00:21:44.940 government salaries; maybe they'd have to pull back on that. And 345 00:21:44.940 --> 00:21:49.230 it's going to make it harder for them to retain people, harder to 346 00:21:49.230 --> 00:21:53.700 recruit people. So it could have a lot of impacts on CISA's 347 00:21:53.700 --> 00:21:58.080 ability to really be, as I think they like to refer to themselves, 348 00:21:58.080 --> 00:22:02.760 the "cyber risk manager for the nation" with a far smaller 349 00:22:02.760 --> 00:22:03.120 budget. 350 00:22:04.860 --> 00:22:07.830 Anna Delaney: In the event that budget cuts are implemented, are 351 00:22:07.830 --> 00:22:11.280 there alternative strategies or even approaches that could be 352 00:22:11.340 --> 00:22:14.460 considered to mitigate the impact on cybersecurity efforts? 353 00:22:15.560 --> 00:22:19.970 Grant Schneider: Well, I mean, anytime you're faced with - and 354 00:22:19.970 --> 00:22:22.460 I have been in my career before - some pretty 355 00:22:22.460 --> 00:22:27.800 drastic budget reductions, you know, you can run around with 356 00:22:27.800 --> 00:22:31.970 your hair on fire and say, "The sky is falling," and there will 357 00:22:31.970 --> 00:22:36.830 be a lot of that.
And you need to sit down and say, "Okay, 358 00:22:36.950 --> 00:22:39.950 where am I really investing my core money?" And in the 359 00:22:39.950 --> 00:22:43.880 government, you know, we usually spend a lot of time 360 00:22:43.880 --> 00:22:47.780 focused on potential new money, we don't spend a lot of time 361 00:22:47.960 --> 00:22:50.930 focused on how we're spending the money that we get every 362 00:22:50.930 --> 00:22:55.010 year, if you will. And so this is an opportunity for Jen 363 00:22:55.010 --> 00:22:57.950 Easterly and Eric Goldstein and others to really sit down and 364 00:22:57.950 --> 00:23:03.170 say, "Okay, what are we investing in? Where are our resources 365 00:23:03.170 --> 00:23:06.350 prioritized? What are the efforts that are really 366 00:23:06.350 --> 00:23:09.710 generating results for us, where we can see tangible, measurable 367 00:23:09.710 --> 00:23:13.370 results? And are there any that aren't, and maybe we should, you 368 00:23:13.370 --> 00:23:17.360 know, think about realigning or adjusting in some way, shape or 369 00:23:17.360 --> 00:23:20.360 form." So, you know, at the same time, there's a little bit of 370 00:23:20.360 --> 00:23:23.810 an opportunity there. I don't know specifically - because 371 00:23:23.810 --> 00:23:26.600 I don't have the metrics inside of CISA - kind of where they may 372 00:23:26.600 --> 00:23:32.750 pull back, decide to de-emphasize, and emphasize other areas. But I 373 00:23:32.750 --> 00:23:35.840 would imagine that they're going to pull back toward more core 374 00:23:35.840 --> 00:23:38.870 functions, which is supporting federal agencies and federal 375 00:23:38.870 --> 00:23:41.930 information systems and critical infrastructure, and the kind of work 376 00:23:41.930 --> 00:23:45.860 they're doing beyond that will be a little harder to justify in 377 00:23:45.860 --> 00:23:49.040 that type of budget scenario, if that's where we end up.
378 00:23:49.790 --> 00:23:51.530 Tom Field: Ultimately, Jen Easterly will be hosting a 379 00:23:51.530 --> 00:23:52.220 GoFundMe. 380 00:23:57.860 --> 00:24:02.030 Grant Schneider: She might do well with one. We can auction 381 00:24:02.030 --> 00:24:05.150 off Rubik's cubes. And you know, I think there are a lot of 382 00:24:05.150 --> 00:24:06.410 options they could take. 383 00:24:06.860 --> 00:24:10.220 Anna Delaney: A 25% increase! Well, hopefully they won't be 384 00:24:10.220 --> 00:24:12.140 introduced - these cuts. But for now, that was a really 385 00:24:12.140 --> 00:24:16.340 excellent overview of the situation, Grant. So here's our fun bit. 386 00:24:16.550 --> 00:24:20.480 Finally, and for fun, we're talking about fortune cookies. So 387 00:24:20.480 --> 00:24:23.690 if you had to put a cybersecurity tip inside of a 388 00:24:23.690 --> 00:24:26.990 fortune cookie, what would it say? Tom, do you want to start 389 00:24:26.990 --> 00:24:27.320 us off? 390 00:24:27.830 --> 00:24:29.600 Tom Field: Well, two sides: one side, you get the lottery 391 00:24:29.600 --> 00:24:35.090 numbers, of course. So, probably 123456. On the flip side, it 392 00:24:35.090 --> 00:24:37.070 would say, "Change your bloody password." 393 00:24:39.500 --> 00:24:40.580 Anna Delaney: Wise words. Matt? 394 00:24:41.000 --> 00:24:44.300 Mathew Schwartz: Yes, I don't mean to repeat, but I would say 395 00:24:44.300 --> 00:24:47.600 - my fortune cookie would say, "Use a password manager and 396 00:24:47.660 --> 00:24:49.430 two-factor authentication." 397 00:24:51.760 --> 00:24:55.330 Anna Delaney: I would say, "Beware the phishing tides to keep cyber 398 00:24:55.360 --> 00:25:07.030 dragons at bay." Grant, anything to wow us with? 399 00:25:08.200 --> 00:25:12.430 Grant Schneider: So my fortune cookie would say, "The only limit 400 00:25:12.700 --> 00:25:16.630 to our realization of tomorrow is our failure to implement 401 00:25:16.660 --> 00:25:20.740 multi-factor authentication today."
So similar to Matt and 402 00:25:20.740 --> 00:25:24.970 Tom on the authentication focus. 403 00:25:25.960 --> 00:25:28.300 Mathew Schwartz: Much more poetic, much more eloquently 404 00:25:28.300 --> 00:25:32.200 put, Grant. You're hired at the fortune cookie production 405 00:25:32.800 --> 00:25:37.270 Grant Schneider: I did actually go and ask this question to an 406 00:25:37.270 --> 00:25:42.130 AI bot, and the answers were not very inspiring, to be honest. 407 00:25:43.240 --> 00:25:49.630 Anna Delaney: Grant beats AI, every time. Well, we are so very 408 00:25:49.630 --> 00:25:53.230 grateful to you for your time and invaluable insight. As 409 00:25:53.260 --> 00:25:56.050 always, thank you so much, Grant, and Happy Thanksgiving to you 410 00:25:56.050 --> 00:25:56.290 all. 411 00:25:56.830 --> 00:25:57.400 Tom Field: Thanks so much. 412 00:25:57.400 --> 00:25:59.860 Grant Schneider: Thank you very much. We appreciate it. Great 413 00:25:59.860 --> 00:26:01.210 spending time with you guys today. 414 00:26:01.990 --> 00:26:04.060 Anna Delaney: And thanks so much for watching. Until next time.