
What Business Leaders Can Learn from Higher Ed Cybersecurity
Ann Johnson: Welcome to Afternoon Cyber Tea, where we explore the intersection of innovation and cybersecurity. I'm your host, Ann Johnson. From the front lines of digital defense to groundbreaking advancements shaping our digital future, we bring you the latest insights, expert interviews, and captivating stories to help you stay one step ahead. [ Music ] Today we are excited to welcome Micah Czigan, the Chief Information Security Officer at Georgetown University. With a distinguished career spanning government, defense, and the private sector, Micah has led critical cyber initiatives at the Department of Energy, the Department of Defense, and Symantec. Now, as the CISO of Georgetown, he is tasked with ensuring the security and privacy of the global Georgetown community and protecting the university's critical data. Welcome to Afternoon Cyber Tea, Micah.
Micah Czigan: Thank you. Glad to be here.
Ann Johnson: So you have this amazing and diverse background. I talked about it just a little bit in the introduction, and you even served in the U.S. Navy as a cryptologic communications specialist. Can you share with the audience your career journey and what drew you to cybersecurity?
Micah Czigan: Sure. So I come from a long background of military service. My family has been serving since the Revolutionary War. Although my parents would have been fine with me doing whatever, my original course was not cyber. I actually went to Texas A&M University and studied marine biology, so completely different, and I joined the Merchant Marine, served as a Merchant Mariner, and then just felt like it really wasn't the place for me, and there's nothing wrong with it. After a couple of months sitting on my dad's couch, he said, "Son, you need to do something." So I joined the Navy, went back to sea, which I love, I love being on the water, and I didn't go in with any idea of what I really wanted to do. The Navy said, "Hey, we've got this thing called 'crypto'." I was like, "I don't know." I called my dad and asked, "What is this?" He said, "Take it." So I did, and it was amazing.

It really has a lot to do with intelligence, tactical intelligence, what we call "signals intelligence," which has now kind of morphed into cyber intelligence, but at the time, cyber wasn't really a thing. This was back in the '90s, so I don't know if the word "cyber" had even come up yet. In the Navy, we were intercepting signals from adversaries, determining where those signals came from, trying to determine, if we could decrypt them, what they were saying and who they were meant for, those kinds of things, and that was really my first professional foray into IT, into this world of what is now cyber.

I spent eight years in the Navy, four years active duty, four years in the reserves. When I got out, I had a very high-level clearance, and for people who get out, there are a lot of employers who want you simply because of your clearance. I was in San Diego, where there are a lot of high-tech companies and a big military presence, the Navy has the Space and Naval Warfare Center there, so I was able to get a job very quickly in IT. That's where I really started my civilian career, doing system administration, IT management, those kinds of things. After a little while, my wife and I decided it was time to leave California and I took a job with the U.S. Army doing network security. I'd done some of that already; back then I think they called it "information assurance," and there were very few people at that time who really understood it and could do it. It was a skill set the Navy had taught me, and that's really how I got started, because there was just a lack of people with the training the military had given me.

After working for the Army for a little while, I eventually made my way to the Pentagon, and I spent 11 years there. Very few civilians go into the Pentagon and leave the Pentagon doing the same job. You're going to change positions. You're going to do all kinds of different things. In my 11 years, I was all over the Pentagon doing different things in the IT organization. I started doing system administration and I left doing cyber. The Pentagon is really where I found my love for cybersecurity. That's where, at that time, it made the transition from information assurance into cybersecurity, and we really started to see very codified policies and procedures at the federal level, NIST 800-171 and all of these NIST standards actually coming to fruition, and I was there.

I knew the guys who were writing these and was part of that, so it just made sense. I knew this was not a fad; this was really going to be a thing. It was a great time to get in and make that career change. I tell you, it's been great. I love it. I love cyber. As you know, every day is a new day. Nothing we did yesterday is the same. There's always a new challenge. I think one of the things that has really helped me from my career in the military and DOD is that it's warfare, right? We are fighting the cyber adversaries. For people who don't come from that background, it's a bit of a step to realize that the adversaries on the other side of the screen, whether they're bots or real people, are real, and we really are in a battle, we're at war, and a lot of the techniques that we see in cyber have parallels to military warfare techniques.
Ann Johnson: You know, we talk a lot about the intersection of kinetic and cyber war. I love to recruit, to be honest with you, from transitioning military members, because even if they're not cyber skilled, they know how to work as a team. They know how to work under pressure. They understand warfare. There are just so many intangibles there, and we can teach them cyber skills as time progresses, right? So we have a whole transitioning military program at Microsoft, and it's one of the best places for us to recruit talent into the cyber world.
Micah Czigan: Exactly, and I've definitely found the same. The other thing that I've found, for people who've been in the military or in federal service, is the ability to think critically, to just figure it out, right? I had a conversation with some of my team today and I'm like, "Look, we've got this thing, and nobody knows how to do it, nobody does. Okay, you don't know that coding language? We've got it. We're going to figure it out. Get the manual out, we're going to go line by line. We're going to start reading the code." A lot of people who haven't had that experience are like, "What do you mean? I haven't been trained on this. I can't do this." But you get folks from the government and they're like, "Oh, okay, yeah, let's do it. We'll order pizza, and we're here until it gets finished," right?
Ann Johnson: Yeah. There's a culture. It's hard to explain to folks, but when we build cyber teams, it's just a fertile recruiting ground for us, a way to bring people into commercial environments who have those intangible skills that I can't teach. They just come innately with them. You know, universities are really attacked, right? They're some of the most attacked entities in the world; I think I read somewhere that they're the third most attacked sector, or something like that. What drew you, with this background in military, government, and warfare, to a university? How did you end up there?
Micah Czigan: When I was in the Pentagon, one of my mentors was helping me, before I made the move into cyber, figure out what I wanted to do with my career. One of the exercises she had me do was to list out all of my experiences, the places I've been, and then, looking forward, the things I haven't done. If I look at myself at retirement, at age whatever, 65, 70, and look back at those experiences, what do I want to see and where are the gaps? So I'm like, okay, fine, I'll do that. And one of the items I had just never done before was work at a university; it was on my list. At the time I was working for Symantec, and then Symantec got bought out by Broadcom and split up, and the entire cybersecurity department, as well as many others, got laid off, and this was in the middle of COVID, just as COVID was starting to spike. So, you know, I'm like, okay, well, I'm comfortable, right? I got a good payout from the company. I can survive for a while. I don't have to be picky. I mean, I can be picky; I don't have to just take anything. And at the same time, nobody was hiring. So as I'm kind of sitting around, I came across Georgetown and I'm like, okay, fine. I mean, I know what Georgetown is, but I'll never get that. That's too much. But I said, why not? There's no harm in it.
Ann Johnson: All they can do is say no.
Micah Czigan: Exactly, that's right. And surprisingly, they said yes, and I'm like -- so, funny, I talked to my wife and I'm like, "I don't know if I really want to do this," and she goes, "Take it, because if you don't, you're just going to sit around the house all day long."
Ann Johnson: So I love that.
Micah Czigan: So I'm like, okay, great. If I don't like it, I'll quit, and I told her, "Okay, I'll promise you a year. I won't quit. I won't even look anywhere else for a year, and if in a year I'm good, I'll give you another three years after that," because I know that for a lot of CISOs, and I've talked to a few others at universities, it's a high-turnover, high-burnout role, and I didn't know what to expect, having never worked at a university. I went to college, but I was never on the other side. So I've been here now four and a half years.
Ann Johnson: That's fantastic.
Micah Czigan: Yeah.
Ann Johnson: So a long time ago, I was actually at the University of Wisconsin, it's been a very long time, talking to them about cybersecurity, and I bring this up because they were talking about this need they have to balance research, right? They need to be open, open for research. There's student privacy, student access. At that point in time, students were just getting mobile devices; people were showing up on campus with their own computers and their own mobile devices and all of these things. So it's a really complex environment, right? It's even more complex than a lot of corporate environments because of the population you're dealing with and because of the research elements. How do you balance across all of those things, to be open for innovation, to respect student privacy, but also keep the network locked down, basically?
Micah Czigan: Yes. So I think the biggest surprise for me is that universities, most of them, are like cities. I conceptually knew that people live there, people work there, it operates 24/7, and it is like a small city. It's open to the public. People can just walk onto campus. We don't have 20-foot-high fences. It's not a secure facility, and that brings massive amounts of complexity. We have buildings. We used to have our own power plant. We have water systems, sewage. We used to have our own hospital. We've sold those. But all of those things come together, and as I was getting on board, I'm like, this is a city, this is a town. So I started to look at how governments, small governments, run a town or a city, and what those complexities are, and I'm like, wow, that really fits. Obviously, I don't, and I'm sure you don't, know everything that goes on in Microsoft. It's just too big, and you have to rely on your teams, on the people that you hire, and on the teams outside of your teams. So I think the big key is, one, understand your environment. To use a military term, what's the battle space? Then, what do I have to work with? What's the army I'm going to battle with? Where are my gaps? And how do I fight that? The program today looks nothing like it did when I started.

The cyber program at Georgetown, and at a lot of universities, is not very old. This whole tech thing is, I think, still conceptually fairly new in the minds of the university. And one of the good things, if I can say that, about COVID is that it pushed us, and everybody, remote and forced us to recognize that everything we do relies on technology. Everything. Where previously a lot of the administration saw IT as just another department, they really saw how IT and security are integral to everything we do: building maintenance, all the facilities maintenance, even the groundskeeping, timekeeping, all the research. Everything has some kind of nexus with IT and security. That meant the administration has been very supportive of what we're trying to do. I really do think that they get it. I've definitely been in a lot of jobs where I thought, man, the leadership doesn't get it, they don't understand. But I can say for sure they really do understand. They really do get it.

When I started, it was, okay, we need a framework. We need a structure, right? And why reinvent the wheel? So we started with NIST 800-53, and that's what we built on, and everything in the cyber program revolves around risk. What's the risk? One of the first questions I asked was, all right, what's the risk appetite of the university? What are the risks, so that I can inform the board and the administration of the risks I'm asking them to accept, so they understand? That was quite a process to figure out, because none of it had been done and none of it was documented. So we've been on this journey of understanding the landscape of risk, defining that risk, understanding what the university's risk appetite is, and then walking the path of 800-53, and now CMMC and 800-171.
Ann Johnson: And do you think the standards approach, let me ask about that, because we've talked in security before, and I'm not trying to be contentious, about how just because you're aligned with standards doesn't mean you're secure, though that's more about regulation than about NIST or CMMC. Do you feel like the standards approach actually gives you security maturity?
Micah Czigan: No, it gives you a framework. It gives you a start, right?
Ann Johnson: That's a great way to frame it. No pun intended, right?
Micah Czigan: Yeah, it is. I mean, if you tell somebody, here's a problem, go fix the problem, how do you approach it? With any problem, any project, you want to start with some kind of framework that's familiar, that's at least going to get you some percentage of the way there, 25%, 50%, whatever it is, it doesn't matter. NIST and the 800 series give us a defined pathway to something, and I'm not just making it up. The other thing it does is prevent a lot of arguments, because people say, "Well, why do I have to do this?" You know what? This is not Micah's greatest idea. We're actually following people who've done a lot of thinking, a lot of research on these things, right? So this control isn't there just because Micah thought it was a good idea. It's because NIST says, look, this is really a good idea, and there's real research behind it. That makes the arguments for why we have to do these things a lot easier for me, because then I can say, "Argue with the standards, not with me."
Ann Johnson: Yeah, I love that approach, by the way, because you're actually -- and because you work at a university, right? You're talking to intellectuals. You need to bring a lot of data, and that just gives you a framework to talk from.
Micah Czigan: Because they are very quick to remind me I do not have a Ph.D. and they do, so they're smarter than I am.
Ann Johnson: Oh, boy. You know, we can laugh about this, but as a graduate of a state college, I hear those things even in my own world: "I have an MBA from," name an Ivy League school. And that's fantastic. I went to a tiny state college in Utah, and I'm doing okay.
Micah Czigan: Great. Yeah, I think you've been successful.
Ann Johnson: Yeah, I'm doing okay. So, with those frameworks as a baseline, can you talk a little bit about how you drive cyber hygiene? You have faculty, you have staff, you have students. How do you educate? How do you make people aware? How do you drive it with your users?
Micah Czigan: Yes, the human element is the most difficult element. It's easy for us to implement the technical controls. That's simple. It's the human piece, and to use somebody else's phrase, I can't fix stupid, but I can mitigate it, and so --
Ann Johnson: I might steal that, by the way.
Micah Czigan: Yeah, at least we try. We try to mitigate the stupid. We try to separate out, again, the risk, right? What's the risk? I honestly thought that students were going to be my biggest problem, and they're not. We have very little interaction, very few cases where we have real problems with students. Not that they're doing all the right things, but students are there to learn. They're not actually using a lot of our resources. They're using the educational systems, learning from the learning modules, looking at their grades, but unless they're also an employee, we're not paying them, so they're not handling a lot of that really sensitive data. They're not dealing with a lot of research; some of them are, but most are not. So they haven't been the struggle. I thought computer science was going to be my Achilles heel, because they're going to tell me they know, and they really do probably know a lot more than I do, but they have been some of my best advocates since I've been here. I've even been able to reach out to computer science and say, "Hey, look, I'm dealing with this problem that's a university culture thing. What am I missing?" And they've been very helpful and have helped me navigate the university ecosystem, the university ethos. Where I have had some issues is research, and, like everybody else, shadow IT. Most of the shadow IT has not been a willful, purposeful "I'm not going to cooperate." It's just a lack of knowledge. They weren't trying to do anything malicious; they just really didn't know any better. And all of our researchers, I think I can say that, all of our researchers are very happy to come into compliance, because for them, research is their career. If they don't publish and they don't do that research, they're not going to be successful as a Ph.D. For them, it's like, "Oh, my God, yes, please protect me. I just don't know how. Make it easy." So we've really tried to work with our researchers to bring them into the fold in an easy fashion, giving them the easy button, the easy path, to get to not just compliance but real security. We haven't had a lot of resistance to that.
Ann Johnson: So I want to ask you, Micah, about researchers; I think this is probably a good segue. How do you balance that? What are your approaches? I don't want you to give me any keys to the kingdom, but what are your approaches to balancing open collaboration and keeping that sensitive research data and that unique data secure? How do you think about it?
Micah Czigan: Right. So one of the first questions I ask them, assuming this person doesn't know anything about technology, is: is the data what I call truly open? Meaning, I could leave it on a bus and I really don't care who gets it; I want to give it away. If that's the case, then the only thing we're really concerned about is making sure the data doesn't change, that somebody who shouldn't modify it doesn't modify it. If that's not the case, then let's figure out, what is the sensitivity of this data? What's the classification of the data? And so we've developed a pretty easy matrix by which the researchers, not us, it's their data, determine what level of classification, and I don't mean government classification, I mean sensitivity classification, that data carries.
Ann Johnson: Like we use.
Micah Czigan: Yup, and they were very happy to do that, because they don't want any more restrictions than there have to be, but if it is sensitive, then they really do want that protection. Once they've been keyed into that and once we walk them through it, they're like, "Oh, okay, yeah, I got it. Now I understand." And they're very quick to make those determinations. Then we say, okay, now we need what we call a "data custodian." Who's going to be the gatekeeper to make sure that only people who should have access actually have access? Not from a technical standpoint, right? Just from an access standpoint. And once we do that, we're like, okay, now we've got it. Now we're going to make everything else invisible to you. You just go do your thing and we will handle all the back-end stuff.
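To make the approach Micah describes a little more concrete, here is a minimal sketch, in Python, of a researcher-chosen sensitivity classification paired with a custodian who gatekeeps access. The tier names, the DataSet structure, and the request_access function are hypothetical illustrations for this write-up, not Georgetown's actual matrix or tooling.

```python
from dataclasses import dataclass, field
from enum import Enum


class Sensitivity(Enum):
    """Hypothetical sensitivity tiers, not Georgetown's actual matrix."""
    OPEN = 1        # freely shareable; integrity still matters
    INTERNAL = 2    # limited to the university community
    SENSITIVE = 3   # regulated or contractually restricted research data


@dataclass
class DataSet:
    name: str
    sensitivity: Sensitivity           # chosen by the researcher, who owns the data
    custodian: str                     # the non-technical gatekeeper for access decisions
    approved_users: set[str] = field(default_factory=set)


def request_access(dataset: DataSet, user: str, approved_by: str) -> bool:
    """Grant access only if the dataset's custodian has approved the request.

    Open data skips the approval step; everything else goes through the custodian.
    """
    if dataset.sensitivity is Sensitivity.OPEN:
        return True
    if approved_by != dataset.custodian:
        return False
    dataset.approved_users.add(user)
    return True


# Example: a researcher classifies their own dataset and names a custodian.
genomics = DataSet("genomics-study", Sensitivity.SENSITIVE, custodian="dr_lee")
print(request_access(genomics, user="grad_student_1", approved_by="dr_lee"))    # True
print(request_access(genomics, user="visitor_42", approved_by="someone_else"))  # False
```

The point of the sketch is only that the classification decision sits with the researcher and the access decision with the custodian, while the technical enforcement stays invisible to both.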
Ann Johnson: That's fantastic, make it frictionless for them, make it easy for them, right?
Micah Czigan: Exactly.
Ann Johnson: And then they want to cooperate.
Micah Czigan: They do, and as long as we're able to make it that frictionless, they're happy. They're very happy. There is that small percentage where they say, well, we want to manage our own, there's some special stuff here, this is our special sauce. Okay, well, let's talk about it, right? I've been very clear to my staff and to the other teams we work with: we are not going to be the office of "no." We're going to be the office of "how do I get to yes."
Ann Johnson: Love that.
Micah Czigan: And that path might be a little difficult, but this is our job, right? We're not going to say we can never get to yes, because I can always get to yes. Always. It might be hard, it might be expensive, and it might take some time, but we can always get to yes.
Ann Johnson: I like the way you put that, getting to yes. I say, "No, but." We're the office of "no, but." If it's a no, it's "No, we can't do it this way, but let's talk about ways we could do it," and that's how I talk to my team. We never want to give a hard no. We want to give a no, but here's the way we could approach it, and let's brainstorm together. It's the same philosophy, but yours is a little more positive, so, getting to yes. Let's talk about one other thing. You've now worked in a university, you've worked in government, you were with Symantec. What do you think the corporate world could learn from university cybersecurity?
Micah Czigan: One of the interesting things that I had not recognized when I was at Symantec or in the government, but that I see now at the university, is that profiles, maybe that's not the right word, personas, change. At the university, we have a student, and that student is not always a student, because sometimes they're an employee, sometimes they're a researcher, sometimes they're an alum, and sometimes they're all of those, and each time they take on those personas, we need to make sure their access and restrictions, or lack thereof, are not static. When you're a student, you only need access to a certain number of resources, but when you have your day job as a researcher in computer science, that's a whole separate persona. At Symantec, we had the same thing, because when you're on the SOC and you're doing your investigation, you're going to need access to things that you don't need access to when you're not doing the investigation.
Ann Johnson: Yeah, you're doing your email or whatever.
Micah Czigan: Exactly, and so if I always have this persona as the incident responder and now I've got keys to everything and I'm just checking my email, well, now we have a threat vector that we shouldn't have, and I had never really made that connection until I got to university. I'm like, man, why didn't I figure that out before? I don't know that we've cracked that nut, but that is something we are definitely working on.
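As an illustration of the persona point, here is a minimal sketch of persona-scoped access in which the incident-responder persona only carries elevated rights while an investigation is actually assigned. The persona names, the resource map, and the can_access function are assumptions made for the example, not anything from Georgetown's or Microsoft's environments.

```python
from enum import Enum, auto


class Persona(Enum):
    STUDENT = auto()
    EMPLOYEE = auto()
    RESEARCHER = auto()
    INCIDENT_RESPONDER = auto()


# Resources each persona may reach; elevated tooling is tied to the responder persona only.
PERSONA_RESOURCES = {
    Persona.STUDENT: {"lms", "grades"},
    Persona.EMPLOYEE: {"email", "payroll"},
    Persona.RESEARCHER: {"email", "research_storage"},
    Persona.INCIDENT_RESPONDER: {"edr_console", "log_archive"},
}


def can_access(persona: Persona, resource: str, active_investigation: bool = False) -> bool:
    """Check access for the persona the user is acting under right now.

    The incident-responder persona only carries its keys while an investigation
    is actually assigned; routine work like email happens under a lower-privileged
    persona instead.
    """
    if persona is Persona.INCIDENT_RESPONDER and not active_investigation:
        return False
    return resource in PERSONA_RESOURCES.get(persona, set())


# The same person, acting under different personas at different times:
print(can_access(Persona.STUDENT, "grades"))                  # True
print(can_access(Persona.INCIDENT_RESPONDER, "edr_console"))  # False, no case assigned
print(can_access(Persona.INCIDENT_RESPONDER, "edr_console", active_investigation=True))  # True
```

Keeping the elevated keys bound to an active investigation, as in the example, is one way to avoid the threat vector Micah describes when a responder is just checking email.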
Ann Johnson: Well, yeah, that's why at Microsoft we use privileged admin workstations for folks to do admin work. You're also doing great at helping me with my transitions today, because my second-to-last question was about incident response. Again, without giving away any of your secrets, if you had a major event, what does your incident response effort look like? Not just the technical controls, but how are you notifying your faculty, how are you notifying your leadership, and how are you notifying anyone external that you would have to notify, students, etc.?
Micah Czigan: I don't know why I was surprised, but I was surprised. One of the things that we do, and that I've seen my colleagues at other universities do very, very well, is crisis response. When we have an incident or an outage or whatever, and I as the CISO reach out to the rest of the executive team and say, "I need help, we have a problem," I have never, ever had any trouble getting copious amounts of people on the Zoom, on the bridge, working the problem, and everybody works it to resolution every time. Thankfully, we have not had any major incidents since I've been here, but we have a plan, and that's one of the first things I did when I came on board. Okay, what do we do if we have a major incident like ransomware that infects everything? Let's figure that out. We use that as our kind of straw man for other major incidents, because ransomware is probably one of the worst. So we have developed our playbook, not just for the SOC, not just for IT, but for the entire university, which includes the provost, general counsel, HR, public affairs, our communications folks, and our executive vice president, all the way up to the president. What does that messaging look like? How do we deal with that? In fact, just last year, we had a proctored tabletop on ransomware with our executive leadership team at the university, so that they could see what we're seeing and what it's going to look like for them. What can they expect from messaging? What are the decisions in a ransomware event? Are you going to pay, or aren't you? Those kinds of things. I think it was very eye-opening for our executive leadership to see the kinds of things they had never considered: "Oh, wow, I am going to have to make some real hard decisions, and I may have to make them really, really fast."
Ann Johnson: Excellent. So I'd like to wrap up. Thank you again for joining me, but I'm a cyber optimist, and I always say that. After 25 years of doing this, if I weren't optimistic, I wouldn't still get out of bed, I can tell you that. So what are you optimistic about? As you think about all these threats, your job, it can be really hard some days, but what are you optimistic about?
Micah Czigan: So, and I don't know how many people feel this way, I am really optimistic about our ability to utilize AI in cyber warfare. As it started to ramp up, I wasn't sure, like everybody else, where it was going to end up or what it was going to look like. But I think, like many other tools, we are seeing, and are going to continue to see, that specialized AI tools that do a few very specific things are really, really good. Say I want to write a firewall rule and I can't figure it out; help me figure it out. When I've got an AI that's trained on firewall rules, and I wish there was one, then, oh my gosh, it actually works, right? Not that it's going to write the rule for me every time, but it's going to help me get over those roadblocks where previously I'd call the chief engineer, we'd spend three or four hours going through code, and seven or eight hours later, we might have a fix. Now, I'm optimistic, we're going to go, "Hey, Copilot, I want to write a rule. Tell me how to write a rule for a Palo Alto firewall, this model, doing this thing," and I'm going to get the rule, and I'm going to say, "Oh, can we have it do this, or change this?" And it's like a conversation with an expert. That's what I'm optimistic about.
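As a rough sketch of the conversational workflow Micah imagines, here is how one might prompt a general-purpose LLM API to draft a firewall rule for human review. Micah mentions Copilot; the OpenAI Python client, model name, and prompt below are assumptions used purely for illustration, not his actual tooling, and any rule the model returns would still need an engineer's review before deployment.

```python
from openai import OpenAI  # illustrative client only; Micah mentions Copilot, not this library

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = (
    "You are a network security assistant. Draft firewall rules in PAN-OS "
    "set-command syntax and explain each field so an engineer can review it."
)


def draft_firewall_rule(requirement: str) -> str:
    """Ask the model for a draft rule; a human still reviews it before deployment."""
    response = client.chat.completions.create(
        model="gpt-4o",  # hypothetical model choice for this sketch
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": requirement},
        ],
    )
    return response.choices[0].message.content


# Example of the back-and-forth Micah describes: ask, read the draft, then refine it.
print(draft_firewall_rule(
    "Allow HTTPS from the research VLAN (10.20.0.0/16) to the data-lake subnet "
    "(10.30.5.0/24), and deny everything else between those zones."
))
```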
Ann Johnson: I love that, and I'm actually optimistic, too, about how AI will grow with us and make us better cyber defenders. I'm waiting for when it gets to be anticipatory, too: what does the next type of attack look like, and can we predict it and work on the defenses beforehand? Micah, thanks for making the time. I know you're busy. I really appreciate you joining me today.
Micah Czigan: Thanks, Ann. It's been a pleasure.
Ann Johnson: And many thanks to our audience for listening. Join us next time on Afternoon Cyber Tea. [ Music ] I invited Micah to join me because in higher education, security leaders balance an open learning environment with a large attack surface. It really does raise unique questions about how to enforce cyber hygiene. It was an incredibly dynamic conversation, and I know the community is going to learn a lot from Micah. [ Music ]