Dr. Richard R. Brooks is a professor of electrical and computer engineering at Clemson University, with expertise in cybersecurity, game theory, and strategic reasoning.
His academic and professional background includes work at organizations like Radio Free Europe, the French Stock Exchange, and the World Bank, a diverse set of experiences that informs his research interests.
Dr. Brooks’ work focuses on the intersection of computer security, civil rights, privacy, and freedom of speech, with an emphasis on defending against the misuse of technology by authoritarian regimes.
His research explores areas like supply chain attacks, mobile code security, and the economic and policy factors influencing cybersecurity vulnerabilities.
A summary of the episode
Dr. Brooks discusses his work on using game theory and strategic reasoning to analyze cybersecurity problems, as well as his efforts to develop secure systems for critical applications like automotive software updates.
He also highlights the importance of considering user experience and making cybersecurity accessible to non-technical users, rather than placing blame on them.
Listen to the episode
A full transcript of the interview
Steve Bowcut:
Welcome to the Cybersecurity Guide podcast. My name is Steve Bowcut and I am a writer and editor for Cybersecurity Guide. I will be your host for today’s podcast episode. Thank you for joining us. We appreciate your listening.
Today, our guest is Dr. Richard R. Brooks, an electrical and computer engineering professor at Clemson University. Today we’re going to be talking about cybersecurity education, civil rights, and global threats. I’m very excited about the topics that we’re going to cover today. I’m looking forward to this.
Let me tell you a little bit about Dr. Brooks. Richard R. Brooks, PhD, is a professor of electrical and computer engineering at Clemson University, specializing in cybersecurity, game theory, and strategic reasoning. With a PhD in computer science from Louisiana State University, Dr. Brooks has led numerous research projects funded by organizations such as the Air Force Office of Scientific Research, the National Science Foundation, and the Department of Energy. His work explores the intersection of computer security, civil rights, privacy, and freedom of speech, with a focus on defending against the misuse of technology by authoritarian regimes. And hopefully we’ll get to talk about that a little bit.
He has also contributed to research on supply chain attacks, mobile code security, and the economic and policy factors influencing cybersecurity vulnerabilities. Dr. Brooks’ expertise in combating large scale network attacks and understanding the economics of computer vulnerabilities makes him a leading figure in the field, offering critical insights into how policy and technology shape the future of cybersecurity. And with that, welcome Dr. Brooks. Thank you for joining me today.
Richard Brooks:
Oh, thank you.
Steve Bowcut:
All right. This is going to be fun and interesting. I really appreciate your time. I’d like to start with helping our audience see the academic journey that you took, to maybe give them some idea of what they may want to do with their own academic journey. So tell us how you got interested in cybersecurity. Was that at the beginning, or was it post-graduate, and how did all that work?
Richard Brooks:
My path was indirect, and I’ll start with, I got a bachelor’s in applied math from Johns Hopkins in ’79. In ’79 they did not have computer science or anything like that. Applied math actually had one computing course. And then after that I got a job with NASA helping test satellites. And I then took what I euphemistically call a 12-year European vacation.
I went to work for Radio Free Europe in Munich. I was running their networks and computer systems there. Radio Free Europe was, well, still is, a company that’s funded by the US government, and at that time it was broadcasting to the Soviet Union and Eastern Europe. A lot of the people there were Eastern European dissidents. I was running the computer systems, which included protecting them.
The early 80s was also a big time for hacker culture to start coming out in Germany. I was in Munich, so I was exposed to a lot of the hacker scene, and that was also when computer viruses first started, which were also being produced in Eastern Europe and so on. So my interest started there. I then moved from the Munich office to the Paris office, which was audience opinion research. I was the computing person dealing with all of the Soviet-area analysts, and I dealt a lot with people from the Soviet Union and what they were dealing with, and so on and so forth.
Then I left Radio Free Europe. I worked for the French Stock Exchange, where we set up an automated stock trading system, which of course came with its own special security concerns. And then I left them; I was a consultant for the World Bank, and we ran the World Bank network from Delhi to the Atlantic Ocean, which included what was by that time the former Soviet Union and Eastern Europe. I also started traveling a lot to the World Bank regional offices in Sub-Saharan Africa, because a lot of that region speaks French and we were francophone. And with that, it went from my not being allowed to go to Russia to installing the system in Moscow and in Warsaw. I also traveled a lot in Sub-Saharan Africa to see what was going on there and how that went.
After that, I decided to go to graduate school, which was in large part because the French were getting on my nerves and my wife’s German. We had discussions; I would’ve gone back to Germany, she kind of wanted to go to the U.S. If I was going back to the U.S., I was going to go to graduate school. So then I got a PhD in computer science from Louisiana State. I eventually went to the Penn State Applied Research Lab, which was, and basically still is, a Department of Defense research facility. And with that, I started getting research grants looking at computer and network security, which was an area I had remained interested in, but now I had the academic background, and we were doing research for DARPA, the Office of Naval Research, and so on. So that’s sort of how, after seven years at the Penn State Applied Research Lab, I moved to Clemson 20 years ago, and I’ve been a faculty member here ever since, running my own research programs, which are funded by the various people who might be interested in my ideas.
Steve Bowcut:
Excellent. And one of the takeaways that I get from that is that it’s okay to do your undergraduate work, then go get some life experience, and then come back and work on your doctorate. Because, as we’ll see when we talk about the things that I hope to cover as we move forward here, many of those interests, the things you decided to focus your research on, came from that time in between. Is that correct?
Richard Brooks:
It worked for me. I’d say it was non-traditional, and I have absolutely no regrets. I feel that it has helped, with a lot of the life experiences I had. Also, I learned French and German, and with a lot of the things I’m doing now, the French especially is very useful.
Steve Bowcut:
Well, from the wealth of experiences that you’ve talked about and that I reviewed in your bio, one thing that really stuck out to me was, first of all, this intersection of computer security, civil rights, privacy, and freedom of speech. I think there are a lot of people that grapple with these ideas. So maybe it would be interesting for our audience to get your perspective on how these areas fit into the current digital landscape, if you could talk to that a little bit.
Richard Brooks:
Yeah, it’s inescapable, and a lot of what I’m doing with that now, there are actually two NGOs, non-governmental organizations, that I’m involved with. One is an African NGO in Senegal, which was actually put together by people who had been trained by my State Department project. After I finished with that, the trainees incorporated and repeated my security training themselves in all of the African countries and in Haiti, which I have to admit, I have trouble imagining my graduate students just suddenly evangelizing the courses I gave them. And then there’s a French NGO where I’m a pro bono CTO; we’re putting in place systems collecting data on gender-based violence in southern Chad. And with that, if you’re dealing with any sort of sensitive data, it’s very important that you can keep it private.
And then, for example, in southern Chad you also have groups like Boko Haram, which is a terrorist group there that is extremely bad with anything involving women. So you want to be certain that you’re not exposing data or the identities of any people. And particularly in countries where the regime has bad tendencies, you also want to be careful to not only not expose the data, but also not expose that people are using your tools, because that would reveal that you’re dealing with them.
The groups in Africa I dealt with, that was during a time which they called the Sahel Spring, and I was very pleased. I don’t want to take too much credit, but three of the groups I was dealing with used some of the knowledge I had, and it ended up creating free and fair elections in states that used to be run by dictators. And I think the ability of the people to keep their communications private, and to keep private who was talking to whom, probably helped in that. And also just being able to coordinate.
Steve Bowcut:
That is so fascinating. So that’s an environment where computer data protection and privacy will impact people’s lives in ways they don’t just anywhere. We can lose our bank accounts and we can have ransomware on our business systems, but you’re talking about an environment where people’s lives are really on the line, with terrorist groups and the like gathering that data and then using it to adversely impact people’s lives in ways that we just don’t have to deal with here in the West. So that is totally fascinating.
Richard Brooks:
I mean, it’s not just developing-world dictatorships. We’ve also talked with lawyers’ groups and women’s rights groups in Western countries that also want to be certain they’re keeping things private. Let’s say for women that have an abusive partner, it’s very important that they’re able to keep their location private, that type of thing.
Steve Bowcut:
Yeah. Well, thank you for raising at least my awareness of that. When I think of governments and cybersecurity, the first thing that always comes to my mind is nation-state cyber warfare, which is a big concern, something that is growing and something we know to be aware of. But you’ve shed kind of a different light on that, I think, in terms of authoritarian regimes using privacy issues against their own constituents or people in their own region. So that’s an important way to look at it.
Richard Brooks:
I’m also increasingly concerned about, or the way I put it, I mean, there are so many companies in the US, with the data brokers, collecting information about their clients. I just frequently say, I wonder about these people that read 1984 and thought that’s a business model.
Steve Bowcut:
Exactly.
Richard Brooks:
We have a paper that’s under review, and one part of it is actually from a report from Mozilla on cars. Nissan, they’ve changed it since the report came out, but they had in their terms of service that they had the right to sell information about the sex lives of their clients, anybody sitting in a Nissan car, including, which still really flips me out, the sexual orientation of the people in the car. Now, I don’t know how or why my car would be trying to figure out if I’m gay, but I don’t think that’s something people should normally have to be concerned about.
Steve Bowcut:
Yeah, yeah. No, I agree. That is kind of scary actually. So I want to maybe make a pivot here a little bit. I’m not a gamer, but I know a lot of our audience probably is, because I think we have a younger demographic that follows this show pretty closely. And you’ve done some work in game theory and strategic reasoning. I would be really interested to understand how those things play into cybersecurity.
Richard Brooks:
This is going to fork in two directions.
The first fork is closer to video games. With that, for example, one thing I’ve done with the trainees I had in Africa, I was dealing a lot with activists and journalists and so on who weren’t very technical, but I wanted to have some sort of test of what they had absorbed, of their ability to use the information. And so what we did is, at the end of the training, we had what we call the war game. Basically, I developed role-playing games where I and my graduate students were the authoritarian government, and they had tasks where they needed to get and share information without us being able to track them online. In some ways it was fun, and in some ways it allowed them to see better how they could actually use the information. So that’s one way of just inserting gamification.
Another thing that we’ve done, which relates to video games is, do you know the Onion router?
Steve Bowcut:
Yes.
Richard Brooks:
Okay.
Steve Bowcut:
I think most of our audience would understand what the Onion Router is.
Richard Brooks:
Okay, so Tor, to get around the nation-state firewalls, they have what they call pluggable transports, which are a way of hiding the fact that you’re using Tor. And so one project that we had was taking Tor browser traffic and translating it into Minecraft video game sessions.
Minecraft was good for a lot of different reasons, but one thing that really helped was that the modding community had libraries we could work with. And that’s always been something that people enjoy hearing about. And one thing that we did was, with one of those countries, we just checked: we had people in the country playing Minecraft on a server that was outside the country. And it turns out that the authoritarian governments don’t want to mess with game traffic.
You have young people who are tech-savvy and playing games, and if they’re not upset, why upset them? And also, if you have people who might be spending time doing things that the government doesn’t like, it’s just better to let them play video games.
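To make the idea of disguising traffic as game traffic a little more concrete, here is a toy sketch in Python. It is not the actual Tor pluggable transport or the Minecraft encoder Dr. Brooks’ group built; the message format, size limit, and player name are invented for illustration, and a real transport would also have to mimic the game’s timing and packet structure rather than just its message contents.

```python
import base64

# Toy illustration of the covert-channel idea: application bytes are chunked,
# encoded, and wrapped so they look like ordinary in-game chat messages.
# Message format and sizes here are invented, not a real game protocol.

MAX_CHAT = 100  # hypothetical per-message payload limit


def encode_as_chat(payload: bytes) -> list[str]:
    """Split a payload into chat-sized, printable messages."""
    text = base64.b64encode(payload).decode("ascii")
    return [f"<player> {text[i:i + MAX_CHAT]}"
            for i in range(0, len(text), MAX_CHAT)]


def decode_from_chat(messages: list[str]) -> bytes:
    """Reassemble the payload from the received chat messages."""
    text = "".join(m.split(" ", 1)[1] for m in messages)
    return base64.b64decode(text)


if __name__ == "__main__":
    original = b"GET / HTTP/1.1\r\nHost: example.org\r\n\r\n"
    frames = encode_as_chat(original)
    assert decode_from_chat(frames) == original
    print(f"{len(original)} bytes carried in {len(frames)} chat message(s)")
```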
Steve Bowcut:
Interesting.
Richard Brooks:
We found it interesting as well. And then also, the interactive games have pretty good throughput; it’s a fairly fast channel. Now, game theory itself actually has little to do with video games.
Game theory was started by von Neumann, who is also the von Neumann that came up with computers. It’s the type of math where you have one player that’s trying to do one thing and another player that’s trying to do something else, and it’s generally used in analyzing economics, is one way of looking at it. We’ve been using it to phrase computer security problems as: somebody wants to break into my system, I don’t want them to break into my system, what resources would they put in, what probability of success would they have, how much would it cost me to mount a defense. That’s one way of looking at it.
You have to have mathematical functions in the middle that allow you to say what the probability of it working or not working is. And one thing that’s problematic with a lot of that area of research is that many people tend to use functions they know how to solve rather than functions that really reflect the problem.
Finding a way of taking this adversarial relationship and phrasing it that way is the problematic part. Some of the problems I’ve looked at, well, this is kind of an easy one to discuss; it’s related to computer security, but it’s an easier one. I just want to hide from the police and the police want to find me. They’ll be in a car, and there will be some regions where it’s easier for them to detect cars at a long distance, but there are other, wooded regions where you’re not going to see the cars. And so if I’m the police, where should I be looking, and if I’m trying to evade the police, where should I be?
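For readers who want to see what this kind of formulation looks like, here is a minimal sketch of a two-player zero-sum "hide from the police" game in Python. The payoff numbers are invented for illustration, not taken from Dr. Brooks’ research; the point is only that once the payoff matrix is written down, the optimal randomized strategies for both sides fall out of the math.

```python
# Toy zero-sum "evade the police" game. Entries are the evader's (assumed,
# illustrative) probability of escaping for each combination of choices:
# rows = where the evader hides, columns = where the police patrol.
#                 police: open   police: woods
PAYOFF = [[0.20, 0.90],   # evader hides in the open
          [0.70, 0.40]]   # evader hides in the woods

(a, b), (c, d) = PAYOFF
denom = a - b - c + d

# Closed-form mixed-strategy equilibrium for a 2x2 zero-sum game with no
# saddle point: each side randomizes so the opponent cannot exploit them.
p_evader_open = (d - c) / denom       # evader's chance of hiding in the open
q_police_open = (d - b) / denom       # police's chance of patrolling the open
game_value = (a * d - b * c) / denom  # escape probability at equilibrium

print(f"evader hides in the open with probability {p_evader_open:.2f}")
print(f"police patrol the open with probability {q_police_open:.2f}")
print(f"escape probability at equilibrium: {game_value:.2f}")
```

With these illustrative numbers the evader should hide in the open 30% of the time, the police should patrol the open half the time, and the evader escapes with probability 0.55 no matter what the other side does.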
Steve Bowcut:
Interesting.
Richard Brooks:
Yeah, I solved that one. That was a fun problem. There are also variants like, directly, I’m in a car and you’re chasing me, that sort of thing. And there are similar things with bandwidth and denial-of-service attacks.
Steve Bowcut:
Interesting. And it’s just applying that same reasoning that we’d use in, say, game development to cybersecurity. You’ve got good guys and bad guys; how do they interact with each other?
Richard Brooks:
Another application where we’ve looked at that is with autonomous vehicles and platooning. So you’ve got the intelligent cruise control: you’re driving, the car in front of you is driving, and my car syncs with the car in front. And if you have a whole group of cars like that, you can save a lot on gas. Then we looked at what sort of disturbances we could put into that which would cause you to use a lot more gasoline. And the actual point of that isn’t to find ways of messing it up; the purpose is, if you can design the platooning system so that it can’t be disturbed by others, you’re going to have a better and more secure platooning system.
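As a rough illustration of the kind of analysis being described, here is a minimal platoon simulation in Python. The vehicle count, controller gains, and the braking disturbance are invented assumptions, not the project’s actual model; the idea is simply that you can inject a disturbance at the lead vehicle and measure how spacing errors propagate down the string, which is what a robust platooning design tries to keep small.

```python
# Minimal sketch (not the project's actual model) of a vehicle platoon with a
# simple spacing controller, plus a disturbance injected at the leader.
# All parameters and gains below are illustrative assumptions only.

N = 5               # vehicles: index 0 is the leader
DT = 0.05           # simulation step (s)
GAP = 10.0          # desired inter-vehicle gap (m)
KP, KD = 0.5, 1.0   # gains on spacing error and relative speed

pos = [-GAP * i for i in range(N)]   # start at the desired spacing
vel = [25.0] * N                     # m/s
max_gap_err = [0.0] * N

for step in range(int(60 / DT)):
    t = step * DT
    # Leader disturbance: a brief braking pulse between t = 5 s and t = 7 s.
    lead_acc = -3.0 if 5.0 <= t < 7.0 else 0.0
    acc = [lead_acc]
    for i in range(1, N):
        gap_err = (pos[i - 1] - pos[i]) - GAP
        rel_vel = vel[i - 1] - vel[i]
        acc.append(KP * gap_err + KD * rel_vel)
        max_gap_err[i] = max(max_gap_err[i], abs(gap_err))
    for i in range(N):
        vel[i] += acc[i] * DT
        pos[i] += vel[i] * DT

for i in range(1, N):
    print(f"vehicle {i}: worst spacing error {max_gap_err[i]:.2f} m")
```

Whether the worst-case errors grow or shrink as you move back in the platoon depends on the controller design, which is exactly the property an attacker-aware analysis probes.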
Steve Bowcut:
Exactly. Yeah, exactly. Alright, we better move on. I wanted to talk a little bit about supply chain attacks. It’s a subject that everybody in cybersecurity is aware of and concerned about. And I thought it might be interesting to get your input on supply chain attacks. What are the critical things to know about that?
Richard Brooks:
Okay, to start off: at the beginning, hackers would find ways of getting into your computer, and they just wanted to mess up your hard drive or something. Then they wanted to steal your data and/or credit card numbers. And then they realized, hey, if I get the server instead of the individual computers, I can just get everyone’s data. So basically they’re paying less attention to your PC; they’re trying to get the servers that have your data. And then they realized, well, if the servers are all using the same software products, then instead of attacking the server, if I infect the software product, I can get all of these guys.
And therefore you end up with a bigger payoff, which means they can invest more, which means you’ve got attackers with a more sophisticated skillset. And at some point, nation states and criminals just sort of merge or get involved.
In North Korea, the hackers are the state, and cyber attacks are how they’re funding their nuclear weapons programs. In Russia, you’ve got a number of different criminal groups and a number of different intelligence agencies all going out and trying to hack into things. And they have a symbiotic relationship. If you’re a cyber crime group in Russia, you’re not going to mess with people in Russia.
In fact, if you’ve got a good relationship with their state cyber war groups, you can be friendly and actually work together on things. And that’s the supply chain; that’s sort of why it’s going up there. One thing that we’re doing now, we have an automotive over-the-air software update project, and one thing we’re doing is bringing together a few different ideas.
There’s a thing called reproducible builds, which is: you don’t just do the software build on one system, you do it on multiple systems, and you have it set up so that builds on different systems should be bitwise identical. That gets around the problem of the build server being infected and things being inserted there, which is a common way of doing that. And we’ve integrated that with smart contracts and distributed ledger technologies, so you also have redundant copies of the Git repositories, and you can also automatically trigger the builds and so on. We have a prototype that we’re using for that, and we’ve got other people dealing with the guts of how to securely transmit it to each ECU in the car.
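To make the reproducible-builds check concrete, here is a minimal sketch in Python of just the verification step: the same source is built on several independent machines and the result is accepted only if every artifact is bit-for-bit identical. The builder names and file contents are hypothetical placeholders, and the smart-contract and ledger integration described above is not shown.

```python
import hashlib
import tempfile
from pathlib import Path

# Minimal sketch of the reproducible-builds check: the same source is built on
# several independent machines, and the artifact is accepted only if every
# copy is bit-for-bit identical. The demo below fakes the "independent builds"
# with identical temporary files.


def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def builds_agree(artifacts: list[Path]) -> bool:
    """True only if all independently built artifacts share one digest."""
    return len({sha256_of(p) for p in artifacts}) == 1


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        copies = []
        for builder in ("builder_a", "builder_b", "builder_c"):
            artifact = Path(tmp) / f"{builder}_firmware.bin"
            artifact.write_bytes(b"\x7fELF...pretend firmware image...")
            copies.append(artifact)
        if builds_agree(copies):
            print("Builds are bitwise identical; safe to sign and publish.")
        else:
            print("Digest mismatch: a build environment may be compromised.")
```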
Steve Bowcut:
I didn’t know that. So you use a distributed ledger, essentially a blockchain, to make sure that nobody’s messed with the build that’s being propagated over the air.
Richard Brooks:
That’s our innovation, the special sauce for that portion. There have been other people working on kind of similar things, but I find that there are a lot of people that are critical of distributed ledgers and blockchains, and definitely there’s a lot of grift in that space. I personally find the currency application the least interesting and useful part of the whole thing, which sets me apart from a whole group. But a lot of the things with redundancy and signing of updates and all, it’s just a good design approach. The student that’s working on that, the next thing she’s going to do is the software bill of materials. That’s one of the things that NIST wants everybody to have, so that each product has attached to it something listing all the components, and then if some component is found to be compromised, you’ll be told, oh, your thing is compromised. Currently that’s just sort of done manually, and we’re going to automate it and integrate it into the same system.
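Here is a minimal sketch of that software-bill-of-materials idea in Python. The component names, versions, and the compromised-component feed are invented placeholders rather than a real SBOM format such as SPDX or CycloneDX; the point is just the lookup that turns "this component is compromised" into "these shipped images are affected."

```python
# Minimal sketch of the SBOM idea: every shipped image carries a list of its
# components, and when a component is later reported as compromised, affected
# images can be identified automatically. All names below are hypothetical.

sbom = {
    "image": "ecu-firmware-2.4.1",
    "components": [
        {"name": "openssl", "version": "3.0.13"},
        {"name": "zlib", "version": "1.3.1"},
        {"name": "example-ota-agent", "version": "0.9.2"},
    ],
}

# Hypothetical feed of components reported as compromised: (name, version).
compromised = {("example-ota-agent", "0.9.2")}

affected = [c for c in sbom["components"]
            if (c["name"], c["version"]) in compromised]

if affected:
    names = ", ".join(f'{c["name"]} {c["version"]}' for c in affected)
    print(f'{sbom["image"]} is affected via: {names}')
else:
    print(f'{sbom["image"]}: no known-compromised components')
```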
Steve Bowcut:
Interesting.
Richard Brooks:
NIST is calling supply chain attacks the new normal and it just makes sense because you’ve got a much bigger attack surface on the backend.
Steve Bowcut:
Exactly.
Richard Brooks:
And also, things have consolidated, so you don’t have that many vendors. For example, Okta is the one that comes to mind. You have security vendors like Okta that sell authorization and authentication software to major corporations and very sensitive parts of the US government. And they’ve been compromised by foreign governments, which has given those governments access to the entire email systems of the Department of State, and so on.
Steve Bowcut:
Interesting stuff. In your bio, I read a little bit about your research on monopolies and tech policy and the influence those things have on cybersecurity. Can you talk about that a little bit?
Richard Brooks:
Okay. One thing that I did, I did an NSF proposal based on this, which wasn’t funded. It was an education proposal, and the idea was to bring together cybersecurity, economics, and the laws that we have. A good reference for students who might be interested is a book called “Geekonomics”.
Steve Bowcut:
Geekonomics, got it.
Richard Brooks:
Basically, the laws that we have in place. Another person to look at for information on this is Ross Anderson, who recently passed away. He was a professor at Cambridge, with excellent work on economics and computer security. Under the laws we have in place, computer hardware and software vendors are not liable for the financial problems related to security issues or failures in their products. Just imagine what cars would be like if car manufacturers were not liable for injuries caused by their cars being defective.
Steve Bowcut:
The wheels fall off, right?
Richard Brooks:
And so what we have now is just a logical extension of that. I mean, a generic tech billionaire, I originally picked one at random, could pay 5 cents per person to protect everyone from losing their data and so on. And why would they? You losing your life savings doesn’t mean anything to them. Moral hazard is the term.
The people that would have to pay money are not the people that are losing money, and what you have is an extension of that. And that’s part of the reason why the security systems are the way they are. For example, passwords: you’re supposed to have a long random sequence of numbers, letters, and special characters that shouldn’t make any sense. You should have a different one for each account, and you should just memorize them all. And of course, human beings are so good at memorizing long random strings.
So whoever came up with that as the business model for security, it makes no sense. And you end up with people saying, well, it’s the user’s fault, they used a password that people could have predicted. We are making the systems; it’s not the user’s fault. Passwords should have been gotten rid of. They almost made sense in the 1960s, when almost nobody had computers, so the threat was kind of small. But now, okay, definitely everyone should have a password manager and that sort of thing, but you should not be relying on something like that for authorization to access the machines.
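As a small illustration of why "just memorize it" never scaled, here is a sketch in Python of what the advice actually demands: a distinct, cryptographically random password per account. The site names and the 16-character length are arbitrary assumptions; generating and remembering strings like these is exactly the job a password manager automates.

```python
import secrets
import string

# A 16-character password drawn from letters, digits, and punctuation has
# roughly 16 * log2(94), about 105, bits of entropy, and the advice is to use
# a different one per site. No human memorizes dozens of these.

ALPHABET = string.ascii_letters + string.digits + string.punctuation


def random_password(length: int = 16) -> str:
    """Generate a cryptographically random password."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))


if __name__ == "__main__":
    for site in ("bank", "email", "work-vpn"):  # hypothetical accounts
        print(f"{site}: {random_password()}")
```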
Steve Bowcut:
Yeah, that is interesting. And I like what you pointed out there, that sometimes the tech industry will look at the user and say it’s their fault, they didn’t follow what is essentially an impossible rule to begin with: memorize all of these unique, individual, long, random passwords. Humans can’t do that.
Richard Brooks:
I used to have that same sort of attitude. And then I was at a conference where I talked to the guy that was running security for one of the national labs, and he told me what he told everyone that worked for him. Do you know the ID10T error?
Steve Bowcut:
Yeah, I do.
Richard Brooks:
Also known as problem between keyboard and chair. And he told his people, if anyone ever writes that down, you’re fired right away. We put the systems together. If normal people can’t use them, it’s our fault, not theirs.
Steve Bowcut:
There you go. That’s what I was getting at. I appreciate that
Richard Brooks:
Anybody working in security should really take that to heart.
Steve Bowcut:
Yep. Alright. There are a couple of things that I want to talk about before we run out of time that kind of help our audience chart their course in cybersecurity education. I thought it was fascinating, the work that you’re doing in West Africa and the kind of social implications that you’ve been involved with. So what would a student need to do, from where they’re at now, to be able to work in that field and do some impactful social work? What do they need to keep in mind as they’re charting their course?
Richard Brooks:
Actually, Bruce Schneier has been putting a lot of his weight behind trying to get people interested in social computing-type things. I’ve also talked with journalism schools; they are also starting to integrate computer network security into their curriculum, which is obvious if you think about it, but if you haven’t thought about it, it’s not necessarily a connection you would’ve made. And there are a number of, oh, I forget the name of it, there’s an NGO around Silicon Valley that’s looking specifically at social applications. I think the Tor Project is still running. I’m particularly interested in privacy issues, and there are a number of different projects looking at ways of coming up with systems that maintain user privacy. There’s also I2P, the “Invisible Internet Project”, which is another one, run by a set of volunteers as far as I’m aware.
And then there are a number of NGOs where you can be active. For example, the French one I’m dealing with is mainly interested in the responsibility to protect people that are stuck in conflict zones. A number of systems like that have needs for people with computing and security skills, and you can get involved with that. Another thing to think about: when I first started going to Africa, I was working with the World Bank, and I could make one phone call a week home because it was so expensive. And Conakry, which is the capital city of Guinea, a city of like 8-9 million people, at that time there were two phone lines leaving the city. Now when I go there, they have better 5G than the rural US.
Steve Bowcut:
Really?
Richard Brooks:
And quite a few people have cell phones. I’m also putting this out there: China is very active in selling their equipment and products to the region. Africa is the youngest continent on earth, and within a number of decades it’s going to have the largest population. Also, if you’re looking at, say, technology markets, North America and Europe are saturated, right? Everybody has these things. Africa is a growth market. And another thing is, you see a lot of innovation in that type of region. “M-Pesa” from Kenya is actually one of the largest mobile payments platforms, and they developed that because they didn’t have banks. I’ve also dealt with people who’ve done citizen-based election monitoring systems, because if anyone here thinks we have election irregularities... but again, the people I dealt with in Senegal had a free and fair election for the first time, and that was all citizens getting together with cell phones and tracking what was going on at the different election precincts. So there’s a lot of innovation going on in these places, and there’s a lot of possibility there. Again, China sees it as a good growth market. I’m kind of disappointed; we are supposed to be capitalists, right?
Steve Bowcut:
Yeah.
Richard Brooks:
Why aren’t people out there trying to sell things? I’m just saying. So there are commercial uses; come up with your own innovations. And in terms of education, there are different levels at which you can do things. If you want to go into the field and do things that are interesting and reasonable, I would suggest a college degree. Not everybody wants a college degree, and college degrees are not for everyone. But I would suggest that as a starting point, and if you’re getting a college degree, then in terms of just business opportunities and pay, a master’s degree generally makes sense. Put in the extra time and you’ll have a skillset that’s more in demand by employers, and that would really give you the skills you need to do the work.
Steve Bowcut:
So if someone wanted to end up doing research like you’ve done, for example, would you say that they should probably focus on a cybersecurity-specific master’s degree or is computer science fine with maybe some emphasis in security? How critical do you think that cybersecurity-specific piece is?
Richard Brooks:
I have personally been disappointed in the cybersecurity-specific degrees.
I think computer engineering, computer science, possibly MIS, or, actually, at Clemson here we have “ICAR”, the International Center for Automotive Research; we have an automotive research campus, and automotive engineering, mechanical engineering, and industrial engineering. The overlap between computing and any of those fields is so large, and there are people doing research in those areas. If you want to do research specifically, that’s where the PhD is worthwhile. But the motivation should not be that if you can call yourself doctor, people will think that you’re smart, or that you think you’re going to get a big monetary payoff.
The motivation should be, I kind of see it as borderline obsessive-compulsive disorder: there are questions that I just really want to look at, and I want to look at them in depth. For that, the PhD is your entry card. And then the other thing that the PhD can give you, which for me is the payoff: you have more control over your life and over what you’re looking at intellectually. People are hiring you to lead, to give direction and guidance at that level. So you’re the person who should have the big ideas and say, this is what we’re going to be looking into. And for that, I would take a 60% pay cut.
Steve Bowcut:
Excellent.
Richard Brooks:
Might happen in academia.
Steve Bowcut:
All right, Dr. Brooks, we’ve used up our allotted time, but this has been fascinating. Thank you so much. I really appreciate you taking some time out of your day to share with our audience. This is going to be very valuable and useful information for them. So thank you for your time today.
Richard Brooks:
Okay.
Steve Bowcut:
All right. And a big thanks to our listeners for being with us, and please remember to subscribe and review if you find this podcast interesting. And join us next time for another episode of the Cybersecurity Guide Podcast.