Afternoon Cyber Tea with Ann Johnson
Ep 121 | 12.23.25

Lorrie Cranor: Why Security Fails Real People

Transcript

Ann Johnson: Welcome to "Afternoon Cyber Tea," where we explore the intersection of innovation and cybersecurity. I'm your host, Ann Johnson. From the front lines of digital defense to groundbreaking advancements shaping our digital future, we will bring you the latest insights, expert interviews, and captivating stories to stay one step ahead. [ Music ] Today, I am joined by Dr. Lorrie Cranor, Director of the CyLab Security and Privacy Institute at Carnegie Mellon University, and one of the world's leading researchers on usable security and privacy. Lorrie's groundbreaking work has transformed how we think about authentication, passwords, and the human side of cybersecurity. Lorrie, welcome to "Afternoon Cyber Tea."

Dr. Lorrie Cranor: Thank you.

Ann Johnson: So I am really excited to dig into your research and what it means for our chief information security officers who are trying to build security that works not just in theory, but in practice, and I definitely want to start with the usability gap we have in cybersecurity. I know you have spent your career studying how people actually interact with security tools. So can you tell the audience why so many security controls fail in practice, and what that tells us about the usability gap?

Dr. Lorrie Cranor: Yeah, I think in practice, when people are designing security tools, they're focused on security. And they often don't take the time to think about the users and how the tool would fit into their workflow. And often, the security experts behind the tools are not actually usability or human factors experts. And so, without the security people working in partnership with usability people, we often forget to consider the human and the user.

Ann Johnson: Makes a lot of sense, and let's pull the thread on that a little bit. When you think about CISOs and how they are designing their programs today, what is the most common mistake you see them make in terms of usability?

Dr. Lorrie Cranor: I think just not thinking it through.

Ann Johnson: Yeah, that makes sense because as you said, they're cybersecurity professionals. They're not actually looking at it from a user lens; they're looking at it from a risk lens or from, you know, a securing-their-environment lens.

Dr. Lorrie Cranor: Yeah, and increasingly, I think we are seeing CISOs who get it and who are trying to figure out how they can consider the user end. But that's, I think, a relatively new development.

Ann Johnson: Good. Well, I hope it increases, honestly, because when you think about the work that you've done on passwords and authentication, it's been foundational. But it's another place where we have tremendous improvements to make from a usability standpoint, particularly to make consumers, but also employees at all different types of organizations, safe. We know that passwords themselves are flawed, yet we're still relying on them. So why do you think it's been so hard for the industry to move on from passwords?

Dr. Lorrie Cranor: Well, we haven't really found a great solution that is better than passwords, that meets all the criteria that we have. I think, you know, we want something that is going to be more secure than passwords; easier to use; compatible with a wide range of different devices; and also, by the way, compatible with all sorts of legacy software. And it's really hard to find something that meets all of those criteria. I think in some specific domains, we've been successful. So in the context of mobile phones, the biometrics that are used on a lot of mobile phones, either face recognition or a fingerprint, are effective in that context. But they're not effective in contexts that don't have a camera or a fingerprint reader, and they may not be secure enough for a lot of contexts.

Ann Johnson: And I think that's right. I also think that there's so much friction, right, for end users when you move from some type of biometric to trying to get them to use even a hardware or software token, some type of YubiKey, et cetera. It creates something in their environment. We'd really like to get, honestly, Lorrie, to this place of passwordless authentication, right, but there still then has to be some type of authentication. Do you think passwordless is going to become mainstream? And let's talk for a second about passkeys. You know, I get the prompts on my phone, right, you know, do you want to use a passkey for this app? And it's always like, yes, I want to use a passkey for the app. But as a cyber professional, and also a consumer, I often think about what the user experience is, because I look at it and say, okay, if this is complex for me, who, ostensibly, has been doing this a long time, you know, what's it like for the average person? So do you really think passkeys are the thing that is going to remove the friction?

Dr. Lorrie Cranor: Not anytime soon. I think the concept behind passkeys is good, but they're confusing. And yeah, I also am confused by them. If I accept the passkey here, and then I want to access this account from another device, what do I do? And I often, in the passkey process, you know, get confused about where I am and don't know whether it succeeded or what's going on. And so, you know, when my less technically sophisticated friends say, should I use passkeys, I don't really know what to tell them. Yes, in theory, they're more secure, and it will eventually be easier. But you know, if you run into problems, I'm not going to be able to help you.
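
Passkeys are built on the WebAuthn browser standard, and part of the confusion Dr. Cranor describes starts right at registration, when the browser hands control to a device prompt. As a rough sketch, assuming a browser context with WebAuthn support, here is roughly what creating a passkey involves in TypeScript; the relying-party details, user fields, and locally generated challenge are illustrative placeholders, since a real deployment would get the challenge and user ID from its server.

```typescript
// Minimal browser-side sketch of creating a passkey via the WebAuthn API.
// The relying party, user details, and challenge are hypothetical
// placeholders; in a real flow the server supplies the challenge and user ID.
async function createPasskey(): Promise<Credential | null> {
  const options: CredentialCreationOptions = {
    publicKey: {
      // A real server generates and remembers this random challenge.
      challenge: crypto.getRandomValues(new Uint8Array(32)),
      rp: { name: "Example App", id: "example.com" }, // relying party (placeholder)
      user: {
        id: crypto.getRandomValues(new Uint8Array(16)), // server-assigned in practice
        name: "user@example.com",
        displayName: "Example User",
      },
      pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
      authenticatorSelection: {
        residentKey: "required",        // discoverable credential, i.e., a passkey
        userVerification: "preferred",  // Face ID, fingerprint, PIN, etc.
      },
    },
  };
  // This call triggers the platform prompt the user actually sees.
  return navigator.credentials.create(options);
}
```

The cross-device question she raises happens after this step: the resulting credential may sync through a platform account or stay bound to one device, which is exactly where users tend to lose track of what will work where.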

Ann Johnson: Makes perfect sense. We really need to truly get to the place where we're passwordless, and then, truly get to the place where we make the user experience of logging in incredibly simple. And I know you talked a little bit about biometrics, but for most users, that's at least the simplest thing for them to do, and it's something they're reasonably familiar with.

Dr. Lorrie Cranor: It has worked reasonably well on recent models of cell phones, and it wasn't always that way, though. I remember the first phone that I got that had face recognition, I probably got it about 18 years ago. I turned it on for a couple of weeks when I got the phone, and then I turned it off because any time I was not in a well-lit room, it didn't work. And then, you know, the last straw for me was when I left my phone sitting on the kitchen counter and my six-year-old child picked it up and authenticated. And I was like, okay, maybe I shouldn't be using this, but it's very different 18 years later.

Ann Johnson: It is. The technology has certainly improved in the, you know, 18 years since. I was actually at RSA Security, doing hardware tokens, up until about 11 years ago, and I think about just the light years we've come, right, in just that short period of time. Speaking of that, take us out five to 10 years with your research. What does digital identity look like, and what role is usability actually going to play in making it real and better?

Dr. Lorrie Cranor: Yeah, I'm not very good at predicting the future, and when you say "digital identity," that's not just authentication; there are also issues like age verification, and there's more to identity than just unlocking the phone. I think that things are coming to a head where politicians are getting involved, and, you know, age verification is a good example: in jurisdictions all over the world, politicians are saying, well, we need to age-verify kids before we let them access all sorts of things. And the current solutions that vendors are offering are pretty privacy-invasive, not actually very secure, and can be easily routed around by not-very-clever kids, right? So that's clearly not how we should be doing this. And so, there are also proposals and systems where, you know, everybody has some sort of a digital wallet, which can be used to store various identity information and credentials. And we'd like to get to a point that anytime you need to prove that you're over 18 or over 21 or under a certain age or whatever, you should be able to use this digital wallet to prove that without having to send all your personal information to whatever website wants you to do that. (A toy sketch of this idea follows this exchange.)

Ann Johnson: I think that that is a great example. I love the way that you expand the conversation about authentication, because I did ask you more than an authentication question, and it's also a really great lead into the next topic I want to talk to you about, privacy. And you brought up age verification; there's a tremendous -- when I talk to my own child -- I have a 24-year-old -- when I talk to my own child, this concept of privacy is a little bit foreign to the Instagram, Snapchat, TikTok generation, right? They just don't think about it the same way we do, but I think they should. So how do you see users' expectations about privacy shifting now that we are in an era where we have pervasive data collection and we have AI-driven systems? We have people voluntarily putting all of their information out on social media for the world to see. How do you think about privacy?

Dr. Lorrie Cranor: Yeah, so I've been doing privacy research for about 25 years, and I think people's attitudes have shifted some, but not in the way that it's often characterized. Like, I often hear the media say things like, you know, young people don't care about privacy anymore; actually, nobody cares about privacy; look at all the data they give away. And I don't really think that's true. So when I started doing research in this area, when you talked to people about various technologies that were invading their privacy, they actually were quite surprised. Sometimes they didn't believe that these things were real. I remember talking to people about third-party advertising on the web, and people said, "Really, they can do that? That sounds like science fiction." And, you know, they definitely didn't like it once they heard about it. They said, "It sounds like they're following me behind my back. This is terrible. Are you sure this is really happening?" Today, you talk to people about these sorts of things, even new things that are just barely happening, and people are not surprised. They're like, yeah, I know, everybody can spy on you all the time, and there's nothing you can do about it. They don't like it. They still would like to protect their privacy, but they feel powerless to do anything about it, and many of them will say, well, I've really just given up. I like the convenience of using all these privacy-invasive services, and since there's nothing I can do about it, I've just given in, and I use them.
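
To make the digital-wallet idea concrete, here is a toy TypeScript sketch of selective disclosure: the wallet holds the full credential, and the website only ever receives a yes/no answer to "over 18?". All names and types here are invented for illustration; real systems such as mobile driver's licenses or verifiable credentials enforce the same property cryptographically, with the verifier checking the issuer's signature rather than trusting the client.

```typescript
// Toy model of selective disclosure: the wallet stores the full credential,
// but the website only ever sees a predicate, never the birth date itself.
interface WalletCredential {
  birthDate: Date;   // stays private inside the wallet
  issuer: string;    // e.g., a government identity provider (placeholder)
}

interface AgeProof {
  claim: string;     // the only information the website receives
  satisfied: boolean;
  issuer: string;    // a real verifier trusts the issuer's signature
}

function proveAgeOver(credential: WalletCredential, years: number, now = new Date()): AgeProof {
  // Anyone born on or before this cutoff date is at least `years` old today.
  const cutoff = new Date(now.getFullYear() - years, now.getMonth(), now.getDate());
  return {
    claim: `holder is over ${years}`,
    satisfied: credential.birthDate.getTime() <= cutoff.getTime(),
    issuer: credential.issuer,
  };
}

// The website receives only the predicate, not the underlying data:
const proof = proveAgeOver({ birthDate: new Date(2001, 5, 14), issuer: "dmv.example" }, 18);
console.log(proof); // { claim: "holder is over 18", satisfied: true, issuer: "dmv.example" }
```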

Ann Johnson: Yeah, I agree, and to your point, I don't think people fully understand. Everyone says they are concerned about privacy, but when you explain to them the data they're freely giving away, they suddenly realize, oh, well, I'm not actually acting on my own concerns, or something like that.

Dr. Lorrie Cranor: Well, you said "freely giving away," and I would argue that often it's not free. Like, you don't have to give away this data, but then you're going to miss out on something, or it's going to be a lot harder to do the thing that you want to do, and the workarounds to not give away the data are cumbersome, and time-consuming, or expensive. And so, when people feel like they don't really have a choice, in some ways, they're right.

Ann Johnson: Yeah, exactly. If they need access to a service -- and most people actually don't read the terms of service -- they're going to give their data away if they need that access, right?

Dr. Lorrie Cranor: Yeah.

Ann Johnson: I wish terms of service were a little less complex and more explicit, and said, here, you know, there's a summary, right? TL;DR, here are the five things you're agreeing to. That would be ideal.

Dr. Lorrie Cranor: It would be, but then beyond that, we need to actually have real choices for people, so that you can get useful services without having to give everything away.

Ann Johnson: Exactly. So when you advise organizations, speaking of my wish for a small TL;DR, what do you tell them about designing not just for compliance, but actually designing their systems for transparency and trust?

Dr. Lorrie Cranor: Right, yeah, so the first thing to realize is that compliance is not enough if you want to actually have a trustworthy and pleasant user experience. So, you know, you could say, well, we comply with these 10 things, but that doesn't mean you're done. It's really important to actually do user studies, and to see how users are navigating your system and interacting with the privacy-related features, whether that's, you know, getting information, or changing their settings, or understanding what their current settings actually are. So definitely, you know, start by looking at what users actually do on the system. And then, to improve designs, there's a lot, and I've written a lot on this. We start with things like keeping it simple, trying to put all of the privacy-related things in one place where you can find them, but also putting just the piece you need to know, just in time, in the place where the data is collected. So if I'm filling out a form, having a little blurb to the side of the form explaining what you're going to do with what I fill out is great, and then a link there to the full privacy policy if I want all the gory details (this pattern is sketched below). But probably, I just want to know right now about this form, and not all the other stuff that your company collects, so those are some examples. We're actually working on a framework at Carnegie Mellon called Users First that is designed to help designers actually improve their privacy-related interfaces in their products and services. And it basically has a list of -- we call them threats, but basically, you know, common things that can go wrong. And we ask designers to go through and systematically look at every touchpoint they have with a user related to privacy, and go through this list and say, is the information comprehensible? Are the choices easy to understand? Are there a reasonable number of choices, and things like that?
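
As a concrete illustration of the just-in-time notice Dr. Cranor describes, here is a small TypeScript/DOM sketch that places a one-line purpose statement next to the field that collects the data, with a link to the full policy for the gory details. The element ID, wording, and policy URL are placeholders.

```typescript
// Sketch of the "just in time" notice pattern: a short purpose statement
// rendered next to the field that collects the data, with a link to the
// full policy. The field ID, text, and URL below are placeholders.
function attachJustInTimeNotice(fieldId: string, purpose: string, policyUrl: string): void {
  const field = document.getElementById(fieldId);
  if (!field) return;

  const notice = document.createElement("p");
  notice.className = "privacy-notice";
  notice.textContent = purpose + " ";

  const link = document.createElement("a");
  link.href = policyUrl;
  link.textContent = "Full privacy policy";
  notice.appendChild(link);

  // Place the blurb immediately after the field it explains.
  field.insertAdjacentElement("afterend", notice);
}

// Example: explain the email field right where the data is collected.
attachJustInTimeNotice(
  "email-input",
  "We use your email only to send your receipt; it is never shared with advertisers.",
  "/privacy",
);
```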

Ann Johnson: I think that's all fair, and I do think that the simpler you can make it -- humans are busy. Humans are often in a hurry, so putting in practice, not just for consumers but for your employees, things that make it simple and call out the important things is just fundamental -- which takes me to that question about behavioral insights, right? For security leaders, I know that one of your key contributions has been showing that human behavior is actually central to how we secure enterprises and environments. How do you think CISOs should apply your research to improve adoption of security practices from a human behavioral standpoint?

Dr. Lorrie Cranor: CISOs need to look at the research that is applicable to the particular problem they're trying to solve. So if, for example, they're trying to improve their password policy, they should, you know, read the research on password policy. If they're trying to improve their access control system, they should read that research. And I think looking at what has been empirically tested and then trying to figure out how that applies to their particular situation, because, of course, we most likely haven't tested their exact situation. But nonetheless, there are probably things that they can take away from what we and other researchers have tested to figure out how this would apply in their situation. And then, I strongly recommend, once you think you have a solution, doing at least a small user study to make sure that it actually works the way you think it will.

Ann Johnson: I think that makes a lot of sense, and the one thing that occurs to me is that in academia, you actually have the time, you know, probably never as much time as you want, but you have the time to complete meaningful and well-researched papers and just to do the work that you do; whereas, a lot of businesses are always moving very quickly. So how do you think about advising folks on that balance, right? If they want to move really quickly, what shouldn't they sacrifice as they're moving quickly?

Dr. Lorrie Cranor: Yeah, my job, besides teaching students, is to do research; and so, yes, we spend a lot of time on it. And there are ways, though, that you can get the information that you need to make a business decision a lot more quickly and inexpensively. So there's a range of, you know, what a research study means. At the low end of the range, or the easy end, you might, you know, get a handful of employees to try the system and watch them use it before you launch. That's, like, the easiest low-hanging fruit. Better would be to get people who are not familiar with your product or service to do that, or if it is a security system for employees, make sure that it's not the security team who are testing it, but whichever other employees in the company will have to interact with it. You know, get them to test it. And, you know, even having five to ten people test something can actually give you really useful insights. So at the very minimum, you want to do something like that. And then, you know, depending on what it is that you would like to roll out, there are other ways of getting information. It may be doing some focus groups with people in the target audience. Those also don't take a lot of time, and you can get, you know, eight people in a room and in an hour get a lot of feedback about something, so those are all good things to do. Now, if you have a little bit more time and resources, one of the things that we actually even do in research is that we take advantage of crowd workers in order to do research studies quickly and inexpensively. So you can actually, like, put up a survey for crowd workers, and depending on how particular you are about the demographics of the people you're recruiting, like, you could, in an hour, have a few hundred responses and just pay people, essentially, minimum wage for their time. There are definitely ways that you can get a lot of feedback very quickly.
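
To show how little analysis such a quick pilot needs, here is a hedged TypeScript sketch that summarizes a small test with a task success rate and a Wilson score interval, a standard formula that stays reasonable even at the five-to-ten-participant scale Dr. Cranor mentions. The pilot numbers are invented for illustration.

```typescript
// Quick summary of a small usability pilot: task success rate plus a Wilson
// score interval, which behaves sensibly even with only 5-10 participants.
// The pilot results below are made-up illustration data.
function wilsonInterval(successes: number, n: number, z = 1.96): [number, number] {
  const p = successes / n;
  const denom = 1 + (z * z) / n;
  const center = (p + (z * z) / (2 * n)) / denom;
  const margin = (z * Math.sqrt((p * (1 - p)) / n + (z * z) / (4 * n * n))) / denom;
  return [Math.max(0, center - margin), Math.min(1, center + margin)];
}

// Eight employees tried the new login flow; five completed it unaided.
const completed = 5;
const participants = 8;
const [lo, hi] = wilsonInterval(completed, participants);
console.log(
  `Success rate ${((completed / participants) * 100).toFixed(0)}%, ` +
  `95% CI roughly ${(lo * 100).toFixed(0)}%-${(hi * 100).toFixed(0)}%`,
);
```

Even this crude summary is often enough to decide whether a flow is ready to launch or needs another design pass before the full rollout.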

Ann Johnson: I think that makes a lot of sense, and a lot of organizations will test with a small pilot group to make sure they get things right. So I think that's something for everyone to remember. I'm going to ask you a hard question now.

Dr. Lorrie Cranor: Okay.

Ann Johnson: When you think about closing the usability gap, if you could redesign one widely used security control from scratch to actually make it work with humans instead of against them, what control comes to mind?

Dr. Lorrie Cranor: Oh, I mean, passwords are an obvious one. I think we all realize that the system of having people remember -- supposedly remember -- you know, a hundred unique passwords, which is about the number a lot of people have, is just completely not working. And so, I think there are a lot of efforts to try to replace that with something else, and I think the workarounds that we have right now, including password managers to remember them for you, are a step in the right direction, but we're not there yet.
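
At its core, a password manager does what this short TypeScript (Node) sketch illustrates: generate one unique, cryptographically random secret per site and remember it so the human doesn't have to. The site names are placeholders, and a real manager would of course encrypt its vault at rest and sync it across devices.

```typescript
// Node sketch of the core job a password manager does for you: one unique,
// random password per site, so no human has to remember a hundred of them.
import { randomInt } from "node:crypto";

const ALPHABET =
  "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789!@#$%^&*";

function generatePassword(length = 20): string {
  let out = "";
  for (let i = 0; i < length; i++) {
    out += ALPHABET[randomInt(ALPHABET.length)]; // cryptographically secure choice
  }
  return out;
}

// The vault maps each site to its own secret; none is ever reused.
const vault = new Map<string, string>();
for (const site of ["bank.example", "mail.example", "shop.example"]) {
  vault.set(site, generatePassword());
}
console.log(vault.get("bank.example")); // e.g., "q7R%xN2v@fLw9sB0e$Kd"
```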

Ann Johnson: I think that's fantastic. Let's do the opposite. Is there a security tool that you can think about today that actually gets usability right?

Dr. Lorrie Cranor: So I think encryption in web browsers. You can browse the web and have encryption between your browser and the website, and you don't have to do anything to make it happen.

Ann Johnson: Yep.

Dr. Lorrie Cranor: It says HTTPS and it just does it automatically behind the scenes, and that's beautiful.

Ann Johnson: That's great. I love that, and it was really easy and something everyone will understand.

Dr. Lorrie Cranor: Yeah.

Ann Johnson: So at "Afternoon Cyber Tea," we always close with a note of optimism. What gives you hope that we can finally bridge the usability gap in cybersecurity?

Dr. Lorrie Cranor: Well, we have actually seen progress. When I started working in this area about 25 years ago, first of all, there was very little research. I started looking for usable security papers, and there were, like, two or three out there, and I started looking for usable security researchers, and I found a dozen or so people. And I looked at, well, what companies were actually thinking about this? And there were very few. I think today there are thousands of usable security research papers, and at least hundreds, if not thousands, of usable security researchers. And we're seeing that companies are increasingly trying to make some efforts to find more usable security solutions. There's still a lot of work to be done, but I feel that we actually have made progress. And things like, you know, encrypted web browsing are a good example of how far we've come.

Ann Johnson: I agree with you, and just the fact that you are doing this work and people like you are focusing on it, honestly, gives me optimism, right? Somebody's actually paying attention and has been for a while. We will solve the usability problems, and, hopefully, the next generation of technology, as we adopt it, will continue to help us.

Dr. Lorrie Cranor: Yes.

Ann Johnson: So, Lorrie, thank you so much. I know you're incredibly busy. I really appreciate you joining today. Your research has definitely reshaped how we think about usable security, how we think about privacy, and I know the listeners are going to walk away with some practical advice, which is what we always try to give them. Thank you so much.

Dr. Lorrie Cranor: You're welcome. I enjoyed this.

Ann Johnson: And many thanks to our audience for listening in. Join us next time on "Afternoon Cyber Tea." [ Music ] So I invited Lorrie Cranor to join "Afternoon Cyber Tea" because we don't talk enough about usability in cybersecurity, and I think it's an incredibly important topic, and her research and her work over the past many years have led to fantastic outcomes. Great conversation. I know the audience will like it. [ Music ]