
12 July · Episode 163
Higher Ed Cybersecurity, MOVEit Hack, and 3rd-Party Risk
32 Min · By Dr. Drumm McNaughton
The recent MOVEit hack has put higher ed cybersecurity and third-party risk top of mind. Learn what institutions can do to protect themselves.
The recent MOVEit hack has serious implications for higher education. MOVEit, an application used by the National Student Clearinghouse and many other organizations to move large files, was compromised, directly affecting numerous higher ed institutions and solution providers. This, coupled with Gramm-Leach-Bliley Act compliance requirements going into effect in early June of 2023, has (or should have) put higher ed cybersecurity and third-party risk top of mind for college and university decision-makers.
In this episode, Dr. Drumm McNaughton speaks with returning guest, Virtual Chief Information Security Officer Brian Kelly, to discuss the ramifications of the MOVEit software vulnerability, tools that can help higher ed institutions assess their security risk, the nine elements of Gramm-Leach-Bliley Act (GLBA) compliance that colleges and universities must meet to continue receiving federal financial aid, what GLBA enforcement could look like, and an online hub that states and higher ed can emulate to bring more students into the cybersecurity field.
Podcast Highlights
- MOVEit, a third-party tool used by the National Student Clearinghouse (NSC) and others to move large files, was recently compromised, exposing institutional data. This is having a downstream impact on higher ed since many institutions engage with the NSC.
- In addition to performing triage and internal assessments, higher ed institutions must reach out to all of their vendors and contractors and ask whether they use MOVEit and, if so, what they are doing to protect their data.
- Institutions need a process in place for vetting third-party risk. EDUCAUSE’s Higher Education Community Vendor Assessment Toolkit (HECVAT) helps address it: a standard set of questions that institutions can ask third-party vendors about security and privacy. Over 150 colleges and universities use the HECVAT 3.0 questionnaire in their procurement process, and large vendors like Microsoft and Google have completed it.
- HECVAT makes life easier for vendors, who no longer have to answer bespoke questionnaires, each with its own nuances, from numerous institutions. It also lets the community of CISOs and cybersecurity and privacy practitioners in higher ed have a conversation grounded in a standardized set of questions.
- The Federal Trade Commission’s updated Safeguards Rule, which changed the standards for safeguarding customer information, went into effect on December 9, 2021. As of early June 2023, higher education institutions must meet the nine elements of those rule changes under the Gramm-Leach-Bliley Act.
- The first element is designating a CISO or other qualified individual responsible for protecting customer information, including student financial aid data. The second is performing a risk assessment at least annually, either internally or through a third party.
- The third covers access and safeguard controls. Institutions must annually vet the employees granted access to information and confirm that access hasn’t expanded beyond those approved. They must know where all data resides and identify all incoming data; ensure data is encrypted and protected when it’s being stored and in use; follow secure practices when coding or developing any software that interacts with Department of Education data; properly dispose of data that has aged out or that they should no longer have; and implement change management. They must also log who accesses customer information and regularly review those logs.
- The fourth ensures that institutions annually validate that these controls are in place and working as intended. The fifth mandates that the individuals who interact with the Department of Education and use customer information are appropriately trained and aware of the risks involved. The sixth ensures institutions have a program and process to address and test for third-party risks.
- The eighth mandates having a prescriptive written plan for responding to incidents, regularly testing and validating that plan, and identifying lessons learned. The ninth mandates that the CISO report annually to the board or president.
About Our Podcast Guest
Brian Kelly supports the safeguarding of information assets across multiple verticals against unauthorized use, disclosure, modification, damage, or loss by developing, implementing, and maintaining methods to provide a secure and stable environment for clients’ data and related systems.
Before joining Compass, Brian was the CISO at Quinnipiac University and, most recently, the Cybersecurity Program Director at EDUCAUSE. Brian is also an Adjunct Professor at Naugatuck Valley Community College, where he has developed and teaches cybersecurity courses.
Brian has diverse experience in information security policy development, awareness training, and regulatory compliance. He provides thought leadership on information security issues across industries and is a recognized leader in his field.
Brian holds a bachelor’s degree from the University of Connecticut and a master’s degree from Norwich University. He has served in various leadership roles on the local boards of the ISSA, InfraGard, and HTCIA chapters. Brian is also a retired Air Force Cyber Operations Officer.
About the Host
Dr. Drumm McNaughton, the host of the Changing Higher Ed® podcast, is a consultant to higher ed institutions in governance, accreditation, strategy and change, and mergers.
Read the Transcript Online
Transcript: Changing Higher Ed Podcast 163 with Host Dr. Drumm McNaughton and Guest Brian Kelly
Higher Ed Cybersecurity, MOVEit Hack, and Third-Party Risk
Welcome to Changing Higher Ed, a podcast dedicated to helping higher education leaders improve their institutions, with your host, Dr. Drumm McNaughton, CEO of the Change Leader, a consultancy that helps higher ed leaders holistically transform their institutions. Learn more at changinghighered.com. And now, here’s your host, Drumm McNaughton.
Drumm McNaughton
Thank you, David.
Our guest today is Brian Kelly, a virtual chief information security officer for Compass IT Compliance. As a virtual CISO, Brian supports safeguarding information assets across multiple verticals against unauthorized use, disclosure, modification, damage, or loss by developing, implementing, and maintaining methods to provide a secure and stable environment for his clients’ data and related systems.
Before joining Compass, Brian was the CISO for Quinnipiac University and, most recently, the cybersecurity program director at EDUCAUSE. He joins us again today to talk about the recent MOVEit hack, third-party risk, and the Connecticut cyber hub, a model for training more cyber professionals.
Brian, welcome to the show, or I should say welcome back to the show.
Brian Kelly 01:24
Glad to be back. Thanks for having me.
Drumm McNaughton 01:27
Thank you for coming back. This is going to be fun. There’s so much going on in cybersecurity right now. But before we jump into that, please give folks some background on who you are and how you got here.
Brian Kelly 01:42
Yeah. I haven’t been here in a while, so I’ll remind your listeners. I’ve had a long career in higher education cybersecurity. I started way back in 2006 as the chief information security officer for Quinnipiac University. I did that through 2019, when I left Quinnipiac to be the cybersecurity director at EDUCAUSE. I did that through the pandemic and left EDUCAUSE about a year ago to go from being a coach and consultant back to being a player, as a virtual CISO for a consultancy here in the Northeast. I support higher education clients and other verticals, including biopharma, manufacturing, and software. So I can’t get away from cybersecurity. There’s so much to do, and I enjoy being back in the field doing cyber.
Drumm McNaughton 02:33
How did you get into cyber?
Brian Kelly 02:36
That’s the million-dollar question. I wanted to be a police officer when I started my academic career in a criminal justice program at a community college. I finished that program and talked to an Air Force recruiter. You know the joke: how do you know when your recruiter is lying to you? Their lips are moving. They convinced me to take a computer operator position in the Air Force that came with a top-secret clearance. It was probably the best lie I’ve ever been told.
That moved me into computer security for the military. I did that through most of the ’90s. Then I left the military to work for Aetna full-time, doing similar work around cybersecurity and information security from 2000 to 2006. Then I went to Quinnipiac. So it all goes back to that Air Force recruiter who shifted me from police work to cybersecurity.
Drumm McNaughton 03:26
Well, thank heavens for little lies, right?
Brian Kelly
Right. Yeah. Sometimes. One door closes, another door opens, right? You don’t necessarily see those opportunities at face value initially.
Drumm McNaughton 03:35
Well, as I see you, I don’t see you as a policeman. I think you’re exactly where you need to be.
Brian Kelly 03:41
Thank you. I’ve heard that a lot. And a lot of that comes from my father, who was a policeman for his career. And that’s where my desire to follow in his footsteps came from. All along my career path, he would say the same thing. He would say, “You’re better off where you are. This is where you’re supposed to be. You’re policing different things.” It’s very similar without the danger and exposure.
Drumm McNaughton 04:04
Although, they’ve lately been sending bombs through the internet, haven’t they? No, I’m just kidding. Let’s not give them any ideas.
Brian Kelly 04:15
Well, we’re close to the kind of kinetic effect you can have through the internet. It’s happening.
Drumm McNaughton 04:22
Oh, that’s scary. We’ll have to talk about that offline. So let’s go back to cyber from bombs. I’ve been reading too many novels lately, I think. We’ve been seeing a lot of news about the cyber world lately. What’s been going on?
Brian Kelly 04:39
So everything is going on. As you’ve said, we’re seeing more interest around it from our leadership within higher education, including our presidents and boards, simply because there is so much in the news about it. Recently, we saw the National Student Clearinghouse (NSC) have a third-party incident, which has a downstream impact on many higher ed institutions since the NSC is an organization many of them use.
When we think about those types of risks, we call them third-party risks. So all our colleges and universities that engage with the NSC share data. And the NSC is using a third-party application. In this case, it was ironically called MOVEit. Be careful of what you name your application. The application was called MOVEit to help you move large pieces of data, and that application was compromised. And as such, institutional data was compromised. Those notices just started going out within the last couple of days. In this context, it’s important to understand what that third-party risk is for our institutions.
Drumm McNaughton 05:53
Well, that’s huge. I’ve read about the MOVEit incidents that are going on. I did not realize it affected the National Student Clearinghouse.
Brian Kelly 06:01
Right. When we look at these things, such as the MOVEit vulnerability we see in the news at the consumer level or from the layman’s perspective, we say, “Okay, does our institution use that product or that service? Are we at risk? Do we need to patch? Do we need to update our environment to protect us from the MOVEit environment?” I believe institutions are very good at doing that. But in this case, it’s a third party. It’s the NSC.
So when we see these types of vulnerabilities or exploits reported, it becomes super important not only to do the triage and internal assessments but also to start the process of reaching out to all the vendors and contractors our institutions work with to ask, “Do you use the MOVEit product? If you do, what are you doing to protect the data we shared with you?” That’s just one example. If you think of all the different pieces of software that every company uses, it becomes important to have a process in place for vetting the risk around the third parties we engage with.
Drumm McNaughton 07:14
Well, I am a complete neophyte when it comes to this. So help me understand this. Many folks use the cloud for data storage, e.g., Amazon or Google. There are so many different ones out there that they can use. Is this MOVEit tool used to take data from Joe University or Mary’s University and put it in the cloud?
Brian Kelly 07:39
Yeah, conceptually, you can use it to move data from what would traditionally be your on-premise data center to the cloud. And, again, at the heart of it, it’s a file transfer service that moves files from one location to another, whether across an institution or from cloud to cloud. Many institutions are cloud first now with AWS, Google, or Microsoft Azure. Certainly, those environments have more security and capability than some of the on-prem capabilities that we have at our institutions. But it comes back to configuration and making sure we’re not just willy-nilly putting things into the cloud. We need to ensure we are securing those environments and, again, understanding what software is being used on-premise and in our clouds to ensure they’re not vulnerable.
Drumm McNaughton 08:32
Break it down to something simple for me to understand. A colleague of mine has a website that got hacked. The software for the site had not been updated multiple times. There were multiple security vulnerabilities. They came in and they were loading porn videos onto it. This website was ranked number 13 on Google for porn videos. If you’re a consultant for US government entities, that’s probably not what you want to be known for.
Brian Kelly 09:08
That is an example of patching, which we’ve talked about for years. We’re all familiar with our Windows or Mac operating system patches. But every application potentially has vulnerabilities. So, is your friend’s company, or an institution, doing vulnerability management? Do they have a program they’re using to scan programmatically? What are those vulnerabilities? Are there patches that can be applied to mitigate them? Or are there other compensating controls that can be put in place, like firewalls, that can help prevent hackers from installing files or malware or posting videos on your site and using it in a way that’s not intended?
A lot of that is having processes and policies to do that work. Your friend probably didn’t have those resources or awareness, right? Just knowing what to do is hugely important.
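The vulnerability-management loop Brian describes (know what you run, know what’s vulnerable, patch or compensate) can be sketched in a few lines. This is a minimal illustration only: the package names and advisory data below are made up, not pulled from any real vulnerability feed.

```python
# Hypothetical sketch of the version check at the heart of vulnerability
# management: compare installed software against known-vulnerable versions.

def parse_version(v):
    """Turn a version string like '14.0.1' into a comparable tuple (14, 0, 1)."""
    return tuple(int(part) for part in v.split("."))

def find_vulnerable(installed, advisories):
    """Return (name, installed, fixed) for packages below the first fixed version."""
    findings = []
    for name, version in installed.items():
        fixed = advisories.get(name)
        if fixed is not None and parse_version(version) < parse_version(fixed):
            findings.append((name, version, fixed))
    return findings

# Illustrative inventory and advisory data (invented for this sketch).
installed = {"file-transfer-app": "14.0.1", "cms-plugin": "3.2.0"}
advisories = {"file-transfer-app": "15.0.3"}  # version that patches the flaw

for name, have, need in find_vulnerable(installed, advisories):
    print(f"{name} {have} is vulnerable; upgrade to {need} or later")
```

A real program would pull the inventory from an asset-management system and the advisories from a vendor or government feed; the point is that the comparison itself is simple once both lists exist.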
Drumm McNaughton 10:09
Which is why universities and colleges should have a CISO who understands these things, correct?
Brian Kelly 10:16
Absolutely. So that chief information security officer role is important, especially now. We’re at an inflection point where most higher ed institutions have that role. That’s a subject we talked about when discussing the Department of Education’s Federal Student Aid initiatives around the Gramm-Leach-Bliley Act. That’s the first element of compliance. It’s to designate a responsible, qualified individual at your institution, typically the CISO. It could be a director of information security. But having that responsible person is now a requirement if the institution wants to continue to receive student aid.
Drumm McNaughton 10:55
And that’s huge. It was a long time coming. The corporate side has been doing this for a long time. But it makes total sense for higher ed institutions to do this with the amount of data they have.
Brian Kelly 11:09
Absolutely. It’s because they’re viewing colleges and universities now as financial institutions. They’re like a bank or any other credit or loan agency that will be held to the same standards.
Drumm McNaughton 11:24
That’s good. Let’s go back to third-party risk. It’s like a computer, where everything is linked together. If you don’t have the right patches, you’re in trouble. Before we came on the air, we talked a little about tools universities can use to see how much risk there is.
Brian Kelly 11:46
Absolutely. One that’s near and dear to my heart is the HECVAT. And when we say HECVAT, everybody’s first question is, “What the heck is that?” It’s such an ugly acronym that it’s beautiful, right? It’s in its sixth year now. It was developed in partnership with EDUCAUSE, Internet2, and the community of higher education practitioners. The acronym stands for the Higher Education Community Vendor Assessment Toolkit.
Initially, the “C” stood for “cloud” because, to your point, everyone was moving their data to the cloud. We wanted to ensure that, institutionally, we were vetting and looking at the risk of those cloud applications. But it became a word used universally in higher education for vetting all third-party applications. The acronym HECVAT had already grown legs, though, and we didn’t want to change it. So we thought, “How do we change the ‘C’ to something more inclusive without changing the acronym?” It became “Community,” reflecting that the toolkit is for the community and built by the community.
HECVAT provides a standard set of questions that institutions can ask those third-party vendors about security and privacy. Most recently, there are some accessibility questions that have been added to the HECVAT for version 3.0. For the third version, EDUCAUSE has over 150 colleges and universities using the questionnaire in their procurement process. We know that large vendors like Microsoft and Google have completed it. So we’re all asking the same questions about that risk and getting the same answers, right?
HECVAT makes it easier for the vendors. They are not answering bespoke questionnaires from 150 different colleges and universities, each with its own nuances and differences. But it also allows the community of CISOs and cybersecurity and privacy practitioners in higher ed to have a conversation around a grounded, standardized set of questions. So, again, it enables everyone in higher ed to work together. We’re not a one-system institution anymore. We have so many different systems: your student information system, ERP, CRM, right? How do we vet the security of all of those consistently?
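The value of a standardized questionnaire is that every vendor answers the same question IDs, so institutions can compare responses side by side. A toy illustration of that idea follows; the question IDs, question text, and vendor answers are invented for this sketch and are not actual HECVAT content.

```python
# Invented stand-in for a standardized vendor questionnaire: same question
# IDs for every vendor, so gaps are directly comparable across vendors.
QUESTIONS = {
    "SEC-01": "Is customer data encrypted at rest?",
    "SEC-02": "Is customer data encrypted in transit?",
    "PRIV-01": "Is data deleted at contract end?",
}

def gaps(answers):
    """List question IDs a vendor did not answer 'yes' to."""
    return sorted(q for q in QUESTIONS if answers.get(q) != "yes")

# Hypothetical vendor responses to the shared question set.
vendor_answers = {
    "VendorA": {"SEC-01": "yes", "SEC-02": "yes", "PRIV-01": "no"},
    "VendorB": {"SEC-01": "yes", "SEC-02": "yes", "PRIV-01": "yes"},
}

for vendor, answers in vendor_answers.items():
    print(vendor, gaps(answers))
```

Because every vendor is scored against the same IDs, a procurement office can rank or filter vendors programmatically instead of reconciling 150 differently worded questionnaires by hand.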
Drumm McNaughton 14:25
And you should have a system to do that even for simple stuff like ordering a Coca-Cola.
Brian Kelly 14:31
As we say, there’s an app for that. We used to call in to order a pizza, and now we can use an app. It’s the same thing with Starbucks. We can order our coffee online. We don’t have to talk to human beings.
Drumm McNaughton 14:42
Just talking about these third-party devices, people bringing their own computers or cell phones and connecting to your Wi-Fi creates headaches, too, doesn’t it?
Brian Kelly 14:54
Higher ed has been in the BYOD (Bring Your Own Device) world for five to seven years, even pre-pandemic. We had our students, faculty, and staff doing that long before it was an accepted practice. In corporate America, there’s still some reluctance to allow employees to bring or use their own devices. But higher ed had to be able to enable that securely. So there are risks there, but you have to be aware of them to address them.
Drumm McNaughton 15:29
Let’s swap horses a little bit. The GLBA or Gramm-Leach-Bliley Act just came about to protect student information for federal financial aid. What is it? What are the implications?
Brian Kelly 15:43
Largely, the implications went into effect in early June of 2023. The Federal Trade Commission’s updated Safeguards Rule, which changed the standards for safeguarding customer information, went into effect back on December 9, 2021. So here we are in mid-summer of 2023, and it is now a requirement that higher education institutions meet the elements of those rule changes.
There are nine elements within these rule changes that are important for institutions to be aware of and track. As we talked about, the primary rule change is designating a responsible, qualified individual.
The second one is performing a risk assessment at least annually. You can have a third party do this or do it internally, where you’re evaluating the risk of your federal student aid data within your institution.
Then there are a number of controls that go in around access review. Are you vetting the employees within your institution who are granted access to information? Are you reviewing annually that the access hasn’t changed or that people haven’t been added?
You want to make sure that you know where that data is and that you’ve identified all that data that’s coming in. Are you making sure that the data is protected and encrypted both at rest—when it’s being stored, whether on-prem or in the cloud—and also when it’s in transit or being used?
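The annual access review Brian describes boils down to diffing who currently has access against who is approved to have it. Here is a minimal sketch of that comparison; the account names and the idea of a simple two-set review are illustrative assumptions, not a prescribed GLBA procedure.

```python
# Minimal access-review sketch: compare the current access list for
# financial aid data against the approved roster and flag both directions.

def access_review(approved, current):
    """Flag access granted beyond the approved roster, and approved
    users whose access has been silently dropped."""
    return {
        "unapproved_access": sorted(current - approved),
        "revoked_or_missing": sorted(approved - current),
    }

# Hypothetical account names for illustration.
approved = {"fin_aid_director", "registrar", "ciso"}
current = {"fin_aid_director", "registrar", "ciso", "former_intern"}

print(access_review(approved, current))
```

In practice the two sets would be exported from the identity-management system and the data-access control lists; the review itself is exactly this diff, performed and documented at least annually.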
Drumm McNaughton 17:17
When you say on-prem, you mean on-premise, right?
Brian Kelly 17:21
Yeah, on-premise, which is when you have a data center on campus. Sorry. It’s the geek in me. Also, if you’re coding or developing any software at your institution that interacts with Department of Education data, are you making sure you’re following secure coding practices? We also talk a lot about lifecycles, right? How are we securely disposing of data that we should no longer have or that has aged out of our systems? We want to make sure there’s change management involved.
There are also controls for logging and monitoring access to customer information. Who has access to it? Are those logs being reviewed? One of the cardinal sins of logging is capturing all of that access institutionally (what is Drumm working on? what has he downloaded? who is he interacting with?) but never reviewing it. That’s a problem. You must have both the logging capability and the ability to monitor it, so you can go back and say, “Yes, we’re regularly looking at that.”
The fourth element is the regular testing of controls. Are you annually validating that these controls are in place and that they are working as intended or designed?
The fifth element is awareness and education. I like to say “education” versus “training,” especially in a higher education context. Sometimes training is a dirty word to say to our faculty and staff. The fifth element mandates that the individuals who are interacting with the Department of Education and using customer information are appropriately trained and aware of the risks involved with protecting that PII.
Can’t get away from vendor management. The sixth element covers that third-party risk we talked about: making sure we have a program and process around it and that we’re regularly testing it so it continues to work.
And then there’s incident response planning. We usually want to bury our heads in the sand and say that it’ll never happen to us, that it’s somebody else’s problem. This mandates having a prescriptive written plan for how you will work through an incident, identifying the roles of the folks involved during an incident, and then regularly testing and validating the plan to see if it’s working and identifying the lessons learned.
The last one takes us back to the first element, right? We’ve named a CISO or qualified individual responsible for protecting customer information and student financial aid data. We want that person to report annually to the board, president, cabinet, or whatever leadership structure is in place on campus. This raises the visibility of the program at the board level, and it’s a requirement. When auditors come to see whether we can continue receiving financial aid, they will ask, “How often is the CISO briefing the board?” It’s important because it elevates the role of the CISO on campus and ensures leadership has the necessary visibility into issues.
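Brian’s point about logging is that capture without review fails the requirement. A small sketch of the review half follows; the log line format, the sensitive-path prefix, and the account names are all invented for illustration.

```python
# Hypothetical log-review sketch: scan simple access-log lines for reads of
# customer information by accounts outside the approved set.
import re

# Invented log format: "<user> <action> <resource>".
LOG_PATTERN = re.compile(r"(?P<user>\w+) (?P<action>\w+) (?P<resource>\S+)")

def review_logs(lines, approved_users, sensitive_prefix="fin_aid/"):
    """Return log entries touching sensitive data by unapproved users."""
    flagged = []
    for line in lines:
        m = LOG_PATTERN.match(line)
        if not m:
            continue  # skip malformed lines rather than crash the review
        if m.group("resource").startswith(sensitive_prefix) and \
                m.group("user") not in approved_users:
            flagged.append(line)
    return flagged

logs = [
    "registrar read fin_aid/awards-2023.csv",
    "drumm download fin_aid/fafsa-batch.csv",
    "webadmin edit www/index.html",
]
print(review_logs(logs, approved_users={"registrar"}))
```

A production review would run against the institution’s real log pipeline on a schedule and record that the review happened, which is what lets the institution show, not just say, that logs are being monitored.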
Drumm McNaughton 20:39
Well, because those are a lot of requirements, I’m curious to know who will be performing the evaluations to ensure they are being done properly. Is this going to fall on the accreditor?
Brian Kelly 20:53
Typically, the Department of Education will audit some number of institutions annually. It doesn’t have the means to do it for everyone. It will typically be a self-assessment that the institutions will do themselves. They can also contract with third parties.
One of the jobs the company I’m with has been doing over the last six months is helping higher ed institutions determine whether they’re compliant with the standards that took effect in early June of 2023. Institutions can outsource that assessment to a consultant. In my experience as the CISO at Quinnipiac, it was always helpful to have a third party come in and do it.
It helps facilitate the process. Sometimes your colleagues are more apt to be transparent with a third party than they might be with you. There’s a whole list of socio-psychological reasons why you do that. But from a compliance perspective, it certainly checks the box when you have a third-party compliance firm come in and make sure that you’re meeting that standard.
Drumm McNaughton 21:55
So it’s going to be random testing.
Brian Kelly 21:58
It will likely be either event-driven, where the Department of Education comes out and performs an audit if there was an incident, or random, where the Department audits a subset of institutions annually.
Drumm McNaughton 22:10
I bring up accreditation again, but the Department has delegated the assessment of quality with respect to federal financial aid to the accreditation bodies. It makes me wonder if they will have accreditors go through a HECVAT type of assessment and bring in a third party, etc.
Brian Kelly 22:37
Yeah, it might be part of the program participation agreement. You may see some language change in that when they’re onboarding or in regard to accreditation. Government-wide, there’s another acronym, CMMC, the Cybersecurity Maturity Model Certification, which is currently very much focused on the Department of Defense.
Its applicability to higher ed is specific to those large R1s doing potential DoD work. The longer-term play is whether the CMMC model will be adopted by FSA, other departments, or other areas within the government as the standard for vetting.
To your point about the accreditation, for many years, that was a self-assessment. The institution would do the self-assessment and tell the government, “We’ve done our self-assessment, and we meet the standard.”
Where CMMC is moving the needle is that it’s a “show us your homework” type of situation, right? Whenever I ask my high-school-aged son if he’s done his homework, he always says, “Yes.” But then, when I ask, “Can I see it?” he says, “Dad, I’ll be back in a couple of hours.” He has yet to do it.
This is moving institutions from saying they’re compliant to showing that they’re compliant.
Drumm McNaughton 24:04
That makes a lot of sense. When we started, we talked a bit about the importance of having a CISO. But before coming on the air, we talked about how few CISOs there are. You mentioned how there are about 750,000 jobs available in the cyber world.
Brian Kelly 24:28
Not all at the CISO level, but cyberseek.org, a website that tracks open positions nationally, shows a workforce gap at all levels, up to and through the CISO level. Some of the CISOs are my age or older. We’re getting near retirement.
We’re in our mid-to-late 50s. So how are we cultivating, mentoring, and developing the mid-management workforce? Also, how are we bringing more youth and diversity into our ranks?
So I’m based in Connecticut. Connecticut recently kicked off an initiative called the CTE Cyber Hub, which is designed to provide a pipeline for that talent by working with academic institutions in Connecticut. So it’s a small state. There are 24 higher ed institutions in Connecticut.
Twelve are community colleges, and there are four state university system colleges. The flagship is the University of Connecticut. Then we have schools like Yale, Quinnipiac, Sacred Heart, Fairfield, and a number of private institutions. This provides a framework for these institutions with cybersecurity curricula and for students there to have a pathway for work.
If you think back to other trades, where there were journeyman apprentice programs that coincided with an academic setting, now you have an applied learning setting where the cyber hub will allow employers offering internships and jobs to partner with these higher ed institutions.
We’re known for insurance here in Connecticut and some defense contracting as well, but cyber employers are signing on to be the beneficiary of the output of these programs. So the students who come through the programs get hands-on applied learning skills and do internships and work for those employers. Meanwhile, employers are getting to vet and see those students in action and that potential workforce.
So I’m excited about the Cyber Hub. It’s a model that can scale nationally to help bridge that workforce and skills gap while bringing not only more diverse but also younger talent, with different ways of thinking, into the career field. If you go to cybersecurity conferences and look around, it’s a lot of older people. So how do we continue to bring new people into the field?
Drumm McNaughton 27:04
Well, that’s a great thing. And this will probably be a model that other states and state university systems can emulate.
Brian Kelly 27:11
Absolutely. There’s a company called iQ4, helping behind the scenes with the state of Connecticut, that has created a repeatable model that will be brought to other states. I believe they’re looking at Rhode Island, and I want to say they’ve done some work in New York State already. So there’s some traction around this.
The thing about higher ed is that we have that talent. When I was the CISO at Quinnipiac, I had one or two full-time employees. That is probably the standard for a lot of our higher ed institutions. They might have that named person but not have a ton of full-time employees behind them. But we are seeing some models where institutions are leveraging student talent and the academic programs around cybersecurity.
Cal Poly has a program where students manage its security operation center, which helps Cal Poly stay in compliance with the Department of Education’s GLBA requirements on monitoring and logging, as we talked about. But Cal Poly can staff that with students, which gives the students real-world experience and provides Cal Poly the resources to address some of those control requirements without having to budget and fund it.
Drumm McNaughton 28:41
That’s a great thing. And it gives students credit as far as their degree program.
Brian Kelly 28:45
Absolutely. It’s a win-win.
Drumm McNaughton 28:48
That’s fabulous. I knew this was going to happen, Brian. We’re at the end of our time already. So, if you would, please: what are three takeaways for higher ed presidents and boards?
Brian Kelly 28:59
I always say talk to that CISO. Make sure you’re engaging with your chief information security officer. It’s a two-way street. I always encourage institutions to pay attention to third-party risk, as we talked about. It’s a board-level concern, not just an IT-level concern. Be aware, engaged, and supportive, and know this is an institutional issue. I believe most presidents and boards are aware that cyber is a board issue. It’s not an IT issue or a CISO issue. This is an institutional issue.
Drumm McNaughton 29:37
I think you’re right. I’m going to add two more from the board’s perspective. Make sure you have someone on your board who understands cybersecurity, perhaps a CISO. Most boards don’t have that. The second is that your risk committee needs to be fully engaged with this, because if it’s not, you’re opening yourself up to hacks or even worse.
Brian Kelly 30:04
The risk committee is a huge call-out. That’s something we didn’t talk about in detail. But you do see higher ed including enterprise risk management, where cyber risk is part of the conversation around institutional risk at the risk committee level. Great call-out.
Drumm McNaughton 30:21
Thank you. So, Brian, what’s next for you? I mean, you’ve left EDUCAUSE. You’re working with another firm. You haven’t got any gray hair that I can see.
Brian Kelly 30:31
Yeah. There are days when I think that what’s next for me might be mowing lawns or something that doesn’t involve a computer. Every day is challenging and exciting to know we’re helping, right? That’s part of my military background. I enjoy helping clients and customers in higher ed. That’s where I’ve spent the last 17 years, so I lean heavily toward higher ed. I’ll probably still be in that environment, helping those members.
Drumm McNaughton 30:58
Well, not to sound trite, but thank you for your service to the community. I always enjoy our conversations. I learned so much. So thank you.
Brian Kelly 31:06
Likewise. Great to see you again.
Drumm McNaughton 31:08
And you too. Take care.
Thanks for listening today, and a special thank you to Brian Kelly for sharing his insights on what higher education institutions can do to deal with the mounting cyber risks confronting them. Tune in next week for my talk with David Linton, an author and economist who will share the insights he gained about higher education and student debt while researching and writing his newest book, Crushed: How Student Debt Has Impaired a Generation and What to Do About It. Thanks again for listening. See you next week.
31:49
Changing Higher Ed is a production of the Change Leader, a consultancy committed to transforming higher ed institutions. Find more information about this topic and show notes on this episode at changinghighered.com. If you’ve enjoyed this podcast, please subscribe to the show. We would also value your honest rating and review. Email any questions, comments, or recommendations for topics or guests to podcast@changinghighered.com. Changing Higher Ed is produced and hosted by Dr. Drumm McNaughton. Post-production is by David L. White.