Dr. Scott Shackelford is the Provost Professor of Business Law and Ethics at the Indiana University Kelley School of Business and the executive director of the Center for Applied Cybersecurity Research.
Summary of the episode
In this episode of the Cybersecurity Guide Podcast, host Steve Bowcut interviews Dr. Scott Shackelford, Provost Professor of Business Law and Ethics at the Indiana University Kelley School of Business. They discuss cybersecurity education at Indiana University and the broader field of cybersecurity. Dr. Shackelford emphasizes the importance of a broad foundation in cybersecurity, including technical skills, soft skills, and interdisciplinary knowledge. He also highlights scholarship opportunities such as the CyberCorps program and the Google.org-funded cybersecurity clinics. Dr. Shackelford predicts that the future of cybersecurity will be shaped by technological trends, regulatory developments, and geopolitical factors. He emphasizes the need for ethical considerations in cybersecurity and the importance of codes of conduct and responsible behavior. Dr. Shackelford advises early-career professionals and students to embrace uncertainty, explore diverse opportunities, and stay informed about emerging trends and ethical considerations in cybersecurity.
Listen to the full episode
Transcript of the episode
Steve Bowcut:
Thank you for joining us today for the Cybersecurity Guide Podcast. My name is Steve Bowcut. I am a writer and an editor for Cybersecurity Guide and the podcast’s host. We appreciate your listening. Today, our guest is Dr. Scott Shackelford, Provost Professor of Business Law and Ethics at the Indiana University Kelley School of Business.
We’re going to be discussing cybersecurity education at Indiana University, but because Dr. Shackelford is so eminently qualified, I think we’re going to extend that a little beyond IU and talk about these subjects in kind of a broader sense. And I’m going to read you just some of his bio and I think you’ll understand why I feel that he’s so eminently qualified.
At IU, Professor Scott J. Shackelford serves as the executive director of the Ostrom Workshop and the Center for Applied Cybersecurity Research. He’s also an affiliated scholar at both the Harvard Kennedy School’s Belfer Center for Science and International Affairs and Stanford’s Center for Internet and Society. Professor Shackelford has written more than 100 articles, book chapters, essays, and op-eds for diverse publications.
Similarly, Dr. Shackelford’s research has been covered by an array of outlets, including POLITICO, NPR, CNN, Forbes, Time, The Washington Post, and the L.A. Times. And now, he’s even going to be able to add Cybersecurity Guide. He is the author of… And there are three books that are out now that I want the audience to be aware of. They are: The Internet of Things: What Everyone Needs to Know, Governing New Frontiers in the Information Age: Toward Cyber Peace, and Managing Cyber Attacks in International Law, Business, and Relations: In Search of Cyber Peace. And coming soon, we can look forward to another book, Forks in the Digital Road: Key Decisions in the History of the Internet. I’m really looking forward to that one.
He is also the lead editor of the first volume dedicated to cyber peace entitled Cyber Peace: Charting a Path Toward a Sustainable, Stable, and Secure Cyberspace. Both Professor Shackelford’s academic work and teaching have been recognized with numerous awards, including a Harvard University Research Fellowship, a Stanford University Hoover Institution National Fellowship, a Notre Dame Institute for Advanced Study Distinguished Fellowship, the 2014 Indiana University Outstanding Junior Faculty Award, the 2015 Elinor Ostrom Award, and the 2022 Poets&Quants Best 40-Under-40 MBA Professor Award. And with that, welcome, Scott. Thank you for joining me today.
Scott Shackelford:
Well, it’s an honor and a pleasure to be with you.
Steve Bowcut:
Yeah. I’m very excited.
Scott Shackelford:
My goodness, I was worried the whole podcast would be the intro.
Steve Bowcut:
I know. Well, and oftentimes, I cut those things down, but this all seems very relevant and germane. So I wanted to make sure our audience-
Scott Shackelford:
You’re too kind.
Steve Bowcut:
… understands just how qualified you are. So thank you. We appreciate you giving us some time today. So before we get-
Scott Shackelford:
It’s a pleasure.
Steve Bowcut:
… into the specifics of a cybersecurity education, both at IU and/or elsewhere, I think the audience would be really interested to hear about your journey. How did cybersecurity become something that you’re interested in and focused on?
Scott Shackelford:
Yeah, happy to. It was not a well-thought-out plan, which maybe is often the case, right?
Steve Bowcut:
It is.
Scott Shackelford:
I had started off actually with a very different set of interests, mostly around space, frankly. I was a bit of a sci-fi nerd, still am, growing up. So toward the end of my undergraduate career, when I was looking at grad school, I started writing on space policy, and this was shortly after the Columbia Space Shuttle disaster and the new NASA Vision for Space Exploration.
But when I was digging into it, I started researching all this stuff on the commons, which, as you would imagine, led me to some of the work of Elinor Ostrom, who was the first woman to win the Nobel Prize in Economics back in 2009, an IU professor. We can talk more about her and all the barriers she overcame along the way. We actually just did a children’s book about Lin last year called Lin’s Uncommon Life, and that led me to looking at all these other commons and the similarities and differences between them, including, of course, cyberspace.
But by the time I really was digging into that, things had escalated. There were, shortly after, cyberattacks against Estonia back in 2007. And at the time, I was looking to write an article, and one of my professors then said, “Hey, no one’s ever really looked at how international law applies to these newfangled things called cyberattacks. Maybe you could look at that.” So I did, and I basically took the red pill and haven’t looked back since.
Steve Bowcut:
Very good. Okay. Well, thank you. Thank you for sharing that. So let’s talk a little bit about what you’ve learned along the way. So maybe you could talk to us about emerging threats and challenges that you’re aware of, maybe the most significant cybersecurity threats and challenges that we’re facing today.
Scott Shackelford:
Yeah. I mean, I could rattle off a laundry list, and I’m sure you and a lot of the listeners are familiar, frankly, with quite a few of them. But in my mind, there’s a few particularly pressing things. One is a set of technological trends, and two are a set of regulatory and, increasingly, three, geopolitical trends. So I think all three of those are really fueling the rise of a variety of new types of cyberattacks and some older ones taking new forms.
So in my mind, the top, especially, technological trend that we’re living through at the moment is just the revolutions that we’re seeing in these large language models and the role of AI more broadly, both on offense and defense when it comes to cybersecurity, and we can talk more about that. We do a lot of work with a variety of stakeholders here at IU, from super sophisticated clients (one of our collaborations right now is with Microsoft’s Responsible AI team) all the way to local governments and how they’re figuring out how to use AI, especially those that are pretty constrained for resources and personnel.
And then, of course, on the regulatory front, we see a variety of states starting to become more definite with what reasonable cybersecurity looks like, and that’s having some positive knock-on effects, but it’s also causing attackers to change their tactics. Right? So we still see that 90-plus percent of breaches are happening because of the basic stuff that we keep forgetting to do. It’s the core of cyber hygiene. We’re never going to get to 100% of folks not clicking what they shouldn’t. But still, that’s the predominant way in. Right?
Steve Bowcut:
Yeah.
Scott Shackelford:
And then, three, on the geopolitical front, I mean, nation-state-sponsored cyberattacks have been around for a long time. That’s one of those things we explore in that history book that you kindly mentioned at the beginning, but just because of the incredible amount of instability we’re seeing around the world. It’s not just Russia with its back against the wall. It’s not just North Korea, which has long used cyberattacks to help fund its defense industrial base.
It’s a real laundry list of both state and, now increasingly, non-state actors that are going after a lot of institutions, reserve banks, et cetera, with deep pockets. And that means that any, frankly, private sector organization could be at the front lines, and we see that even here in Indiana, frankly. So those are a few of the top things that come to mind.
Steve Bowcut:
That is really so interesting. And I know at least from my perspective, and I’ve been doing this for many, many years, but it’s evolved quite rapidly actually. I think cyber threats have evolved. And the idea of a nation-state threat maybe a decade or so ago, I think, was relatively unique, and we knew that it happened. And maybe it’s just what we had visibility into. I really don’t know, but it didn’t seem like we really thought of hackers as being part of a nation-state. And when I say that, I mean there are large organizations that aren’t really a nation-state themselves, but they may be backed by a nation-state.
And so, sometimes I think about the image that we started with in this industry of the hacker. So he’s got a hoodie on and he’s in his grandparents’ basement. Right? So that is so non-applicable to what we actually see in today’s world, because it’s become so sophisticated, both the technology involved and the technological capabilities of the attackers, but also the size and organization, I think, of the backers of these kinds of threats or the origins of these kinds of threats. And AI, of course, is just the next iteration of how this gets more and more complicated, and it sounds like you and I see that pretty much the same. So were you going to say something? Go ahead.
Scott Shackelford:
Yeah. No. No, you’re right. I mean, I think we really have seen a lot of consistency in some ways. And on that front, at least since Stuxnet, Operation Olympic Games, I mean, it’s challenging to put together these most advanced attacks that make use of a lot of zero-day exploits. But I guess the only thing I would add to what you described, which I think is spot on, is we’ve also seen just such a rapid diffusion of capabilities on the dark web, like ransomware as a service, et cetera.
I mean, you can get a lot of these pretty advanced tools, in some cases built on even the NSA’s hacking tools that were pilfered by Shadow Brokers and released on the dark web to cause, frankly, a lot of damage without a lot of know-how. So I think that that’s another aspect of what’s going on here. It’s certainly the case that there’s a tremendous amount of sophistication that’s required to do something, like take down a nation’s energy grid. That’s still not a simple matter, thankfully. But at the same time, you can cause a world of hurt even if you’re a pretty small group but have access to some of these advanced tools that have been stolen over the years.
Steve Bowcut:
Oh, that is an interesting perspective. I guess I really hadn’t given that a lot of thought. So I think the organizations are becoming more organized, but because the technology is more advanced, you could probably do more damage. A smaller organization could do more damage, as you pointed out. That’s an interesting thing, I think, to keep in mind. Let’s talk about educational pathways. So in your opinion, what are the key academic and professional qualifications that people are going to need for a career in cybersecurity?
Scott Shackelford:
Yeah. That’s a great question. Cybersecurity is just such a broad field, as you pointed out. It’s not just those with a technical background who are hands-on-keyboard, day in, day out in some dark room somewhere. There’s certainly a role for that. AI is quickly even changing some of how that job is being done, but that’s really just the tip of the iceberg. Right?
So I guess my first advice would be to think and, most importantly, read broadly and have conversations about just the diverse range of roles and responsibilities in the field of cybersecurity. You could go down the road of being a CISO or a chief privacy officer. Right? You could focus more first and foremost on the legal aspects. There’s a lot of law firms that just can’t get enough of lawyers that have an advanced degree, for example, in cybersecurity as well. That’s still pretty few and far between.
You could think about a role in government. We have the CyberCorps program here at IU. I’m actually the principal investigator for it. And I just came back from the conference in D.C., that’s an annual thing, just the other week, and we had 1,000 students from across the country, more than 100 universities there. But federal agency after federal agency was going up and just bemoaning the fact that they could not get the talent they need.
CISA alone at DHS has hired thousands of people the last couple years, and it’s been tough. CyberCorps is great, but frankly, there’s just not enough slots. So, I mean, that’s a whole nother direction that you can go into. So there are a few things that just broadly, I think, are useful. So having an interest in technology, even if you don’t want to be hands-on-keyboard for your whole career, I think is important.
For example, here at IU, we have some fundamentals courses, kind of like a cyber boot camp that we use as an on-ramp for folks who might’ve been music majors or something else and are interested in a career in cybersecurity. So being conversant in some of the key tools and terms is really helpful from that perspective. But not being too myopic, I think, is also just as important, so not only focusing on the technology.
But in our case, there’s a good reason why we married together the law, business, and computer science schools for our degree, because we really thought that the only way to really attack this problem of cybersecurity risk management is by using that broader risk management lens. Right? So think about the legal, the business, the policy, and the technical implications for all this. So that’s why taking a more diverse range of courses outside of just CS or an engineering school, I think, is really paramount, especially if you want to rise through the ranks and maybe think about senior leadership either in the public or private sectors.
Steve Bowcut:
Interesting. So I think diverse options is the key there. So I can envision, as you mentioned, someone whose real love is law, so their educational goal is to become a lawyer, but to specialize, they would also take or include a degree in cybersecurity. So that’s on one end of the spectrum, at least in my mind. And the other end of the spectrum would be somebody whose real love is cybersecurity. And so, that’s where their focus and their goals will be. And yet, you want to be qualified for a particular type of cybersecurity, like you said, if you want to be a CISO.
You don’t want to spend your whole career sitting in an op center in front of a keyboard. So you need some education that gives you some entrance into a specific type of cybersecurity. So I guess I’m just trying to be helpful to the audience here. And I guess maybe that’s the best we can do, is say, “Look, we’ve come to a point with cybersecurity that you can do it either way. You can either be a lawyer who specializes in cybersecurity or you can be a cybersecurity expert who has some knowledge about another industry and, therefore, let that take you in a different direction.”
Scott Shackelford:
Mm-hmm. Yeah. I think having-
Steve Bowcut:
Is that a fair summation? Go ahead.
Scott Shackelford:
I think that’s fair. Yeah. I think having that broad foundation is really vital, and there are lots of different ways to go about it. It really just depends on what your end goal is. Right? So in our case, for example, we have a combined JD/MS in Cybersecurity Risk Management program that you can do in three years. Right? So you can get both your law degree and an advanced master’s in cybersecurity in that same time frame.
And that’s one way to go about it. So that would allow you to, for example, have some flexibility with your courses. You have the foundation in both. You’ve gone through the core fundamentals of cybersecurity, networks, and computing. You’re conversant in the technology aspects. You’ve taken some core courses on the business side, like IT risk management, information system security, and you’re also cognizant of the cyber and privacy law and policy challenges.
So that type of well-rounded individual, I think, is still pretty rare, and I think it’s just vital. And the only other ingredient I’d add to that interdisciplinary mix is, frankly, just really leveraging as many applied opportunities as possible. And we can talk about some of them we have here at IU, like the Cybersecurity Clinic, and we’ve been involved with lots of other universities with helping to create their own clinics, with some great support from Google.org recently, which has been really appreciated. So I think those types of applied opportunities too are really a way to help set students apart.
Steve Bowcut:
Excellent. And that’s kind of where I wanted to go next. I’m really interested in learning more about the Center for Applied Cybersecurity Research. Maybe you could talk to us about that.
Scott Shackelford:
Yeah. No, happy to. So the Center for Applied Cybersecurity Research or CACR has been around for a little over 20 years now, so it was founded back in 2003. And back then, it was one of the first interdisciplinary centers looking at cybersecurity. Right? And it’s grown gradually over the years. There’s been lots of different chapters along the way. These days, we have 16 staff that I have the pleasure of working with, and we do a variety of things, including lots of training programs, assessments.
So we’re working with Purdue University, which is a bit of a rival of IU, but we can work together on important things like cybersecurity. And we’re working with the Indiana Office of Technology, partnering up with Purdue, to conduct 400 cybersecurity assessments of local governments across the state over the next couple years. And the idea there is really to, frankly, see how we’re doing as a state and make sure that the bipartisan infrastructure funding is going in a useful direction, because there was a certain amount of that earmarked for local government cybersecurity.
But we also do a lot else. So, for example, Trusted CI is a major program that’s sponsored by the National Science Foundation, which is basically looking at cybersecurity for major research facilities across the country and indeed across the Five Eyes, so the closest intelligence-sharing partners. And CACR runs that, and it has for many years at this point. And that’s just the tip of the iceberg.
So one thing we try to do is give lots of opportunities for students who are interested in this stuff to come, do ride-alongs, do research projects. We’re actually expanding that internship program later in the spring, which is going to be a lot of fun. And I’ve found it to be a really useful bridge between the operational and the academic side of the house, because oftentimes, there’s not enough opportunities for those two communities to talk to each other.
Steve Bowcut:
Excellent. So, Scott, I wanted to get your perspective on skill development, so the kinds of skills that someone wanting to get an education in cybersecurity should focus on. And given what we’ve talked about so far, I’m guessing that your answer will be that these skills are going to be quite diverse, because your opportunities and paths in cybersecurity are quite diverse, as we’ve talked about, but if you could distill that down a little bit. So what skills, both technical and soft, do you think are crucial to developing a career in cybersecurity?
Scott Shackelford:
Well, absolutely. Well, I think, first, it’s really important to make sure that you’re getting those soft skills, which are often maybe the last thought that you might have as you’re looking at these different graduate programs or undergraduate programs. But one of the most frequent things we hear from recruiters or members of our Cybersecurity Advisory Council at IU is just the fact that a lot of, especially, entry-level folks, but even, frankly, some mid-level folks, still don’t have those core communication skills that are just so vital to either representing or working with clients and, frankly, even talking to boards about different decisions.
I mean, you can have all the facts in the world to back you up, but if you can’t communicate why a given investment, even something as basic as multifactor or end-to-end encryption is important, you’re just going to see people’s eyes glaze over and you’re not going to get the investment that you need. So that’s why it’s really vital to do some of the basics. Right? So go through, even though they can be annoying at times, just like a core business communications class. Right? Practice it.
So we have a capstone project, for example, that we do every spring with a variety of clients. So this spring, for example, we’ll be working with Palo Alto Networks and Microsoft, as well as ACT-IAC, which is an organization that has a lot of different civilian federal agencies as members. And the students work with these clients over the course of the term. They do a professional presentation at the end of it based on what they’ve learned and their deliverable, and we found that to be just a really useful way to work with these more sophisticated clients on cutting-edge issues. But equally important, they’re working with clients on the opposite end of the spectrum.
So that’s why at the Cybersecurity Clinic at IU, we focus on nonprofits, local governments, community foundations. And similarly, these students work in interdisciplinary teams. Usually, there’s multiple degree programs represented. So you could have a master’s of public affairs student working with a lawyer, an MBA, and a cybersecurity risk management student on a common problem. Right? And that’s usually how things work in the real world, whether you’re working in consulting or whether you’re working in a big organization. So getting used to that, I think, is really, really useful and important.
Steve Bowcut:
Yup.
Scott Shackelford:
Now, that’s not to say that you still don’t need to have those other core technical skills, and that’s why making sure you have your bases covered matters, and there are useful things like the NICE Framework and otherwise to make sure that the programs you’re considering have the basics down. But I would just make sure that we don’t ignore that broader question at the same time.
Steve Bowcut:
Very good. Thank you. I appreciate that. So I’d like to get you to talk about internships and scholarship opportunities. Now, this may be more specific to IU, but you may be aware of some other programs or scholarships that’s outside of IU. What are your thoughts there?
Scott Shackelford:
Yes. Yes. Well, the first one I’ll mention is CyberCorps. So if they’re unfamiliar, CyberCorps is the National Science Foundation Scholarship for Service Program, and it’s been around a long time. There are 104 participating universities all across the country now. So regardless of what state you might be listening to this from, there’s a decent chance that there’s a university close by with the CyberCorps program.
It’s a really cool opportunity for interested students. So it’s basically a deal where you agree to sign up for usually a minimum of two years. And in return, not only do you get those two years of schooling completely paid for, you get a stipend on top of it. You get a laptop. You get funding to go to a big job fair in D.C., where there’s a lot of fast-tracking that happens. Sometimes there’s interviews and even exploding offers for federal agencies that otherwise can be a bit tricky to get your foot in the door, and it’s just a great community to be a part of. And there’s a lot of other help, like with security clearances, et cetera.
So that’s one of those scholarship programs that I think is maybe relatively well-known in certain parts of the community, but maybe not others. So if it sounds interesting, if you’re interested in federal, state, local, tribal government service, check it out. It’s a really great way to start your career, and I’ve already found it to be transformational for a number of our students. So that’d be the first one, and I’m happy to talk about others.
Steve Bowcut:
Please. No, please.
Scott Shackelford:
Yeah. Yeah, yeah. So that’s a really, really, I think, empowering one for sure. There are discussions about creating companion ones in other hot fields that are kind of modeled on the CyberCorps idea, but they haven’t really seen the light of day yet. So other scholarship opportunities are largely school-dependent. There are not enough, frankly, major national competitions or funders.
One for a long time had been the Hewlett Foundation. They had a Cyber Initiative that helped to start a lot of, especially, cyber policy programs at schools, including lots of HBCUs across the country. But they just sunsetted it last December, so I guess just last month now as we’re recording this. So that is no longer an option, unfortunately, for students. And because they were the biggest funder basically in that space, that’s going to be a glaring hole for a little while. That’s not to say that other groups won’t step forward.
Google.org and their investment in cybersecurity clinics, I think, has been really empowering, and the goal there is, by 2030, to have at least one cybersecurity clinic in every state, either at a university or community college. So we partnered with MIT and Berkeley and Alabama to launch a consortium of universities with cybersecurity clinics just a couple years ago, and that was kind of the original kernel for this. And it’s expanded really quickly, now to Europe as well. We’re hoping to expand even more broadly.
So I think leveraging that, making use of the funding that’s now available to the consortium and to these universities starting clinics is also a really cool opportunity for students, which comes with other benefits as well. There are Titan Security Keys, you get to work with Googlers as mentors, and there are other aspects too. So it’s not just the funding.
Steve Bowcut:
Very good. So let’s get you to just dust off your crystal ball a little bit. Talk to us about how you see the future of the field of cybersecurity, even the next five years or 10 years. And specifically, I think what I’m looking for here is, what do students need to be aware of so that they’re prepared to take advantage of what that future might look like?
Scott Shackelford:
Yeah. And let me get my opaque crystal ball out. Right?
Steve Bowcut:
Yeah. Exactly.
Scott Shackelford:
I think to be ready for what’s coming, I do think, first, and this is one reason we wrote the history book, it’s important to be mindful of whatever the latest fad might be and recognizing that it might just be that. Right? It might be the next NFTs. So AI right now is clearly that thing. Whether it’s going to be just as big a deal in five years as it is today, or even bigger, is hard to say.
The technology is advancing rapidly. It’s already changing a lot of our personal and professional lives, and it does not seem to be going away, to the extent to which it just becomes baked in as yet another tool, like Copilot, to use Microsoft’s offering, and just another thing that we do as we’re creating, whether that’s coding or podcasting or you name it. I think that remains to be seen.
But getting well-versed in the basics there, I think, is just really important. Right? So one way that we’ve done that here at IU is we now have a cybersecurity and AI track in our MS in Cybersecurity Risk Management program. So students can now take courses squarely in that area across the university and each of the partner schools, and I’d like to see a lot more universities following suit.
One part of our National Science Foundation grant is we have a team that’s actually looked at how universities across the country are baking AI into cybersecurity curricula. So there’s a report coming out on that soon. If anybody listening happens to be interested, I’m happy to share the link after we’re done. And that can be a useful thing to look at as well as you’re considering universities, especially those that might have competencies in both those areas.
So that’s one thing. But again, I think the other is just to be really mindful of what’s happening on the regulatory side of the ledger here, because that’s shaping a lot of business practices already, not only here in the US, where some states, including Indiana, now have comprehensive data privacy bills that are going to be coming into force in the next few years, but even more importantly, what’s happening globally, so whether it’s the EU and their new Digital Services Act, the AI Act.
That’s really going to be shaping not just businesses in Europe, but keep in mind that any US business that wants to do business in Europe has to comply with this too. So those laws are already shaping a lot of both cybersecurity and, now increasingly, AI practices too. So getting some background, reading about it on your own, taking a couple classes, doing what you can to get up to speed on that, I think, is going to help really set you apart as well, along with, of course, considering those different professional certifications, whether it’s more on the policy side through IAPP or whether it’s CISSP or the alphabet soup of other certs that are out there.
Steve Bowcut:
Very good. Thank you. And please do send me links on anything that you think would be useful to our audience. We’ll be sure to put them in the show notes so that they can find them easily.
Scott Shackelford:
Mm-hmm.
Steve Bowcut:
So before we run out of time, I really wanted to talk to you a little bit about cybersecurity ethics and responsibility. I don’t think it’s something we stress enough and I know how crucial it is. So can you talk to us about the work that you’ve done there and how you see ethics in cybersecurity?
Scott Shackelford:
Absolutely. A lot of this is becoming black-letter law, but it’s happening very slowly and in a really fragmented way. So still, frankly, what’s going to be driving behavior, both individual and organizational, day in, day out, is, frankly, ethics. It’s codes of conduct. And that means how we talk about it in the classroom, but even more importantly, in policy circles and business circles and boardrooms really matters.
I’m remembering, for example, just to kind of couch this discussion a little bit, how much attention there was a couple years ago when Colonial Pipeline was breached, at least their IT systems. Right? And they very publicly paid the ransom, and the CEO defended it at the time as being the right thing to do, the ethical thing to do, because in this case, “Hey, it’s critical infrastructure. There’s clear national security interests.” And the concern was, if they didn’t pay up, things could have gotten really out of hand.
And there was a pretty fierce reaction to that when the CEO came out so forcefully in some of his testimony in the aftermath of that breach. And since then, we’ve seen a big pushback. I mean, there are efforts even to make it illegal to pay ransoms. And just because something’s legal doesn’t make it ethical, and there are good reasons not to pay, because you’re incentivizing exactly this kind of behavior. Of course, it could happen more frequently as a result.
But still, oftentimes plenty of organizations, including lots of small businesses, but also libraries, police departments, you name it, do pay ransoms, and that leaves them in the lurch, because oftentimes they just don’t have another option. Look at a hospital, where people’s lives are literally on the line. And if they don’t pay, the care really could be sacrificed and people could get hurt or worse. So that’s one of those situations where how we think about the ethical undertones here and, more broadly, how we conceptualize and measure ethics and, for that matter, corporate social responsibility in this context, I think, is really, really key.
We’ve done a little bit of work on this in our most recent piece. It was something called Cyber Silent Spring. It was an article that was published by Penn, oh boy, maybe about six months ago or so now. And in that one, we looked at this intersection of sustainability and cybersecurity, because there’s a lot of interest in cross-pollination going on right now between those fields, including this idea of ESG+T, with the T being technology, so looking at the broader technological footprint of organizations and how the decisions made there can affect much broader ecosystems.
Just like what happens in the natural environment, that can happen with IoT as well. And we’ve seen that specifically in looking at how different countries are regulating consumer devices at this point. So in Europe already, as part of their CE trustmark, you get information about the privacy and security features of smart products as a consumer, kind of like ENERGY STAR here in the States.
And the US has a pilot program that’s going to be coming out to do something similar, and the idea is, “Hey, let’s put the consumers in the driver’s seat. Let’s empower them to make choices, reward companies that are making what we think of as smart decisions using different metrics when it comes to cybersecurity and privacy.” That’s not going to be regulatory. It’s not going to be top-down. It’s going to be basically out there for companies to use, if they deem it in their own interest, and it’s a great example of what we’re talking about here.
I have some doubts about how useful it’s going to be in the way it’s conceptualized right now, but it’s something. And I think, thinking through that lens, the classic questions of, “Hey, what does responsible behavior look like in this context, whether it’s paying a ransom or thinking about the utility of cost-benefit analysis to make these cybersecurity investment decisions?” All that, I think, should be much more broadly discussed. And there have been some useful steps in that direction, like MIT’s Moral Machine, but that’s all we have so far, just some steps.
Steve Bowcut:
Yeah. Well, I think I know it’s extremely important, and there is kind of a broad principle that I think applies there, and it’s that the more we can apply ethics and personal or organizational responsibility to cybersecurity, the less black-letter law and regulation we need. And I think that serves everybody’s purpose. And obviously, there are some things that just have to be regulated, and we have to have rules that govern how we all play in a society or a culture.
Maybe I’m just getting a little out there, but it seems like in any industry or technology, the more rules and regulations we get, the more people feel like it’s okay to get away with what they can, because they don’t apply ethics and personal or organizational responsibility when the rule doesn’t say they have to. Right? “The law doesn’t say I can’t do this, so now I’m going to do it.” But if there isn’t a law there and you’re just expected to be on your best behavior, I think sometimes the behavior would turn out to be better. And that would be true for some people, certainly not all people, but it is something worth thinking about.
Scott Shackelford:
Absolutely right. No, I couldn’t agree more. And some professional organizations now do have some pretty useful codes of conduct baked into them. So whenever I’m teaching this stuff to my students, we go through the classic Western approaches to ethics and we apply it to cybersecurity issues like ethical hacking, certifications, or bug bounty programs, and all that jazz.
But I also make the point that, “Hey, if you’re a member, for example, or aspire to be CISSP-certified, well, let’s look at the code of conduct for CISSP-certified professionals.” And a lot of what we talk about from that ethical perspective of having a positive impact on society, practicing honor, integrity, et cetera, it really can be useful and guiding when you’re asked to do something maybe that you’re not comfortable with, or have to think about being a whistleblower in some way.
Steve Bowcut:
Yeah. Excellent. All right. So we’re about out of time. I did want to leave with one final note from you. And given everything we’ve talked about, is there some advice that you would like to offer early-career professionals or students that are trying to make those decisions, which are really pretty heavy ones, because they’re going to shape your future? So what kind of maybe general advice would you offer?
Scott Shackelford:
Yeah. I think my general advice would be: don’t be afraid of that uncertainty. Right? Really live it. Have that broad set of conversations and experiences that can help inform. Right? As I said at the start, when I first started thinking about how to apply some of my own personal interests to my professional career, 20 years ago at this point, I never would have thought that I’d wind up basically teaching cybersecurity and running a research center dedicated to it. But it’s been fun. Right?
And each step along the way made sense at the time, and you put one foot in front of the other. And before you know it, you’re having a fulfilling and exciting career. So I think the most important advice is just to do that. Right? Don’t be afraid to go broad before you go deep, especially in a field like cybersecurity as you’re just starting out. It’s really useful.
Hey, pick your niche. Figure out what that is. It’s nice to do that, maybe sooner rather than later, and especially with an eye toward whatever is particularly hot at the moment, whether it’s AI or… I haven’t even mentioned we have this new executive certificate in space cybersecurity, which has been just terrific. And we got some funding from Microsoft and the National Science Foundation to set that up with scholarships for students, which has been wonderful. There are very, very few places in the world that are offering anything in space cybersecurity right now.
So there are different ways. There are those different intersections, whether it’s AI or space or sustainability and ESG, where you can really make your mark. I mean, this is a field that’s maturing. But even since I got into it 15-plus years ago now, there’s not nearly enough people that are really dedicating themselves to, frankly, improving how we use technology in a responsible, ethical, and useful way. So I think there’s a world of opportunity for those who are interested, and I’m happy to do my part to connect and facilitate. So don’t hesitate to reach out.
Steve Bowcut:
Thank you so much. I really do appreciate you. I can’t tell you how much I appreciate you spending some time with us today. I think what you’ve offered our audience is invaluable, and I appreciate you doing that.
Scott Shackelford:
Of course, it’s no problem at all. Again, the pleasure is all mine. Thanks again for reaching out and for what you do.
Steve Bowcut:
All right. And a big thanks to our listeners for being with us today. Please remember to subscribe and review if you find this podcast interesting, and join us next time for another episode of the Cybersecurity Guide Podcast.