Afternoon Cyber Tea with Ann Johnson
Ep 122 | 1.6.26

The Best of Afternoon Cyber Tea 2025

Transcript

Ann Johnson: Welcome to Afternoon Cyber Tea, where we explore the intersection of innovation and cybersecurity. I'm your host, Ann Johnson. From the front lines of digital defense to groundbreaking advancements shaping our digital future, we will bring you the latest insights, expert interviews, and captivating stories to stay one step ahead. Welcome to a very special edition of Afternoon Cyber Tea. I'm Ann Johnson, and today we are going to do something slightly different. At the start of the year, the cybersecurity industry loves to predict the future. AI will change everything. Quantum will break everything. Nation-states will target everything. And here's the thing. In 2025, we weren't just talking about trends anymore, we were actually living them. This year, the conversation on the show got real. We stopped asking what if, and we started asking what now. We heard from CISOs defending critical infrastructure that never sleeps, academics warning us about encryption's expiration date, leaders fighting burnout in their teams, and storytellers reminding us why the human element still matters the most. So as we close out 2025, I wanted to ask, were we right? Did we see what was coming? And most importantly, are we ready for what is next? Here are the conversations that defined 2025. I'm joined by Professor Amy Edmondson, the Novartis Professor of Leadership and Management at Harvard Business School. Amy is not just an expert in psychological safety in the workforce, she is the expert. So could we start just by talking about psychological safety? Because it's a word that's used a lot in organizational development.

Amy Edmondson: The term itself has a kind of implication of comfortable and cozy and nice, and that's just not what it is. So let me first give you my formal definition of psychological safety. It describes a climate in which people believe their voice is welcome, where they believe they can take the interpersonal risks of speaking up, with an idea, a question, a concern, a mistake, a dissenting view, and not that it will be easy and fun all the time, it usually isn't, but that they believe it's welcome. They believe it's what we do around here. Many of the case studies that I have gone into great detail on, say, the Columbia launch failure of 2003 or, you know, many other sort of real disasters, were literally avoidable had people spoken up in a timely way. So it's -- I can't tell you how much I think about and value the speaking up about early warning signs. They're not worried about, oh, how do I look? They're like, this could be a nothing, but I'm going to raise it. I'm not going to be afraid of being called Chicken Little, the sky is falling, when of course it isn't. So I'm much more interested in that topic, right, that people can speak up about early warning signs of a potential breakdown or failure. But early warning signs about psychological safety, you know, and whether or not it's present, I don't think about that quite as much, but to freewheel a little, I think in today's, you know, complex, turbulent world, an early warning sign is the signal that doesn't happen. It's the bad news, the questions, the dissent, the mistakes, the failures that you're not hearing about. So if you are a leader of a team and you're hearing an awful lot of good news, you know, everything seems to be green and nothing seems to be red, that is probably a warning sign that you don't have enough psychological safety. Because it just can't be the case that things aren't going wrong or that people don't see things differently, but it can be the case that you're not hearing about it. Because this is in fact the kind of environment where psychological safety is most important. And, you know, maybe ironically, when leaders call attention to the fragility, the complexity, the ever-present potential for breakdown, that makes it more psychologically safe, not less. Because fundamentally, it makes it discussable. It makes the reality of the situation discussable. And when leaders don't do that, people will naturally assume or think of the situation in the old-fashioned way, the conception of the work environment where people are supposed to hit their targets and always do a good job and expect certainty and be perfect. That is not the world that cybersecurity professionals live in. So when leaders call attention to that reality, to what's at stake and how much very real uncertainty and complexity and interdependence there is, that gives permission for people to speak up about it. You're saying, we should expect things to go wrong. The only real question is, will we hear about it; will we hear about it in a timely way? Things are going to happen. There will be breakdowns. There will be coordination and communication breakdowns. But by naming it, and by naming it early and often, it gives people permission to be part of the catch and correct system.

Ann Johnson: I'm excited to welcome Christina Morillo, who is the head of Information Security at the National Football League's New York Giants. Welcome to Afternoon Cyber Tea, Christina.

Christina Morillo: Thank you so much, Ann, thanks for having me.

Ann Johnson: So when you think about your journey and you think about new organizations and different roles, how do you go about assessing where the team is on their cyber journey? And what is your approach to actually taking and shaping a strategy that meets them where they are, but gets them to the place of maturity where you want them to be?

Christina Morillo: So that's always a tough one. One thing that I will say is that I never walk in with a checklist, I always walk in with curiosity. One of my first moves is to listen across functions, right? I want to know how people have experienced security, if they understand security, how our corporate leaders feel about security, and whether there are any gaps in terms of the culture as well, that's super important for me. In parallel, I also assess fundamentals, right? I look at our policies, architecture, our identity, awareness, detection. But I'm not really just looking to audit, I'm kind of looking for alignment, I'm looking to see where our security goals are in sync with business priorities, where they're not in sync. And then I build the strategy rooted in where we are, right? Not where we wish we were.

Ann Johnson: So cyber is full of misconceptions, right? How do you go about helping people get from that misconception to actually having a really mature understanding of the industry and a responsible understanding?

Christina Morillo: That's such a great question. One of the biggest misconceptions that I see within cybersecurity is that it's just an IT thing. It's IT's job. It's something technical that sits off to the side. IT will take care of it. It falls under IT, and that's it. The truth, as we both know, is that it's a business risk issue, not just a technical one. So part of what I do is, you know, I work really hard to bring security into broader conversations, like, you know, with operations, with finance, even with HR, right, in terms of like identity and onboarding and all of that stuff, so that people understand like how their day-to-day decisions impact the organization's risk posture. Something else that I see a lot is, oh, if we're compliant, we're secure, right? Like just check the box and that makes us secure. And that's not true. And that's something that I have to emphasize over and over again. I try to incorporate real-world examples. There's so many breaches and examples nowadays, I feel like we see one every other week, where companies are fully compliant and still got hit, right? Because maybe they weren't actually secure where it mattered the most, right? Maybe there was a process failure as an example, not necessarily a technical one. So my real focus is just to make security relatable, right, across the organization.

Ann Johnson: How do you think about risk when you're building a security strategy? And how do you think about compliance? And how do you get your leadership and your peers aligned around the risk and aligned around the cyber risk, even if it isn't related to compliance?

Christina Morillo: I will say that is always a journey, it's a never-ending journey. But one thing I've learned is that risk isn't always about the math, it's about the story, right, or your ability to tell the proper story. So for me, you know, when I get pushback, I don't really argue. What I try to do is I try to reframe the conversation around business impact. And again, I go back to those real-world scenarios. You know, I'll say something like, hey, here's how this type of risk has played out for others. Or, hey, if this happened here, what would this cost us in downtime or reputation? Or how would this impact football operations, right? So I always start with that business impact and what's at stake if the risk plays out in terms of revenue, reputation, operations, etcetera. I listen for pushback, of course. I tell stories around it. I give examples. I, you know, listen at scale. I try to understand where the pushback is coming from, if there's just like a lack of awareness, if there's a misconception somewhere. And then, you know, ultimately, if things start to feel a little bit subjective, I try to turn them into decision points. You have to be flexible. You have to pivot. I think the most important thing, though, is to keep protecting the mission top of mind. Like, whatever our mission is, right -- if our mission is to win football games, if our mission is to, you know, delight our fans and our customers -- like I have to keep that at the forefront.

Ann Johnson: Well, let's talk about that, because strategy is only successful if it's well adopted and if you measure it, right, if you continually measure it, continually get feedback, and get everyone on board, going on a journey. How are you collaborating across the internal departments and with key stakeholders across the other NFL teams? And what is the key to managing those relationships?

Christina Morillo: So it's amazing. I mean, I won't take credit for like the community that has been set up. That's, you know, credit to the NFL CISO and his information security office. They've done a great thing with bringing us all together, like the 32 clubs. So, you know, we're always on phone calls multiple times a month. We have -- we share threat intel. We meet in person a few times a year as well. At the end of the day, we all have the same shared goal, right? Which is to protect our fans, protect our clubs, protect the overall league. One of my favorite elements of this entire journey has been meeting other information security officers across the different teams and learning more about their strategy, their processes, and us kind of like comparing and exchanging notes. That has been a joy because it's like our own little security community. I'm always encouraging people to share more externally so that the overall cyber community can get more of this goodness. But it's all about relationships, I think. You know, it really, for me, has been about relationship building, making that time, not only when there are urgent moments, but just overall.

Ann Johnson: I'm excited to be joined by Frank Shaw, chief communications officer at Microsoft. Welcome to Afternoon Cyber Tea, Frank.

Frank Shaw: It's so great to be here. It's always nice to spend time with you, Ann.

Ann Johnson: Cybersecurity is not just a technical conversation. It's about how people understand risk and ultimately how trust is built. And communication is the bridge that connects the technical reality with human perception.

Frank Shaw: When I think about all the different topics that we have to deal with, security and cybersecurity sort of tests us the most. Because they're inherently complicated topics. They come with an enormous amount of risk. And they're easily misunderstood. We give people the information they need to take action without scaring them into taking the wrong actions, which can easily happen.

Ann Johnson: One of the things that we struggle with, and you and I have had a lot of conversations about this, is that at the beginning of any event, we're in the fog of war. So we want to get the information out there so people can protect themselves. We want to be as accurate and as transparent and as fast as possible, but the facts are also changing.

Frank Shaw: Transparency is absolutely the key. And our ability to, as an industry, to talk about what has happened and what we have experienced in a way that allows others to learn from it is absolutely critical.

Ann Johnson: The year is 2025, so we're going to talk about artificial intelligence. And you've spoken often about how AI is transforming communications. How do you see AI changing the way organizations handle communications, including cybersecurity communications, and the crisis response to how we shape trust?

Frank Shaw: Effective use of AI allows us to move more rapidly in moments of crisis because we have better access to information and we have better access to then insights about what we might be able to do.

Ann Johnson: Perception can become reality very quickly. A breach doesn't just unfold in technical terms. It trends, it's debated on social media, and sometimes misinformation will outpace the facts.

Frank Shaw: The big challenge we've got from a communication standpoint is this absolute fragmentation of influence. In order to reach the people you want to reach, you have to really be crystal clear on the most important audience for you and then understand who reaches that audience.

Ann Johnson: Security awareness at Microsoft depends on how well we engage our employees. We can patch all day long, but at the end of the day, we need, you know, over 200,000 people to take phishing seriously.

Frank Shaw: From the top on down, we've established security as a high order priority. And one of the ways that I know it's successful is because people complain about it. And they complain about it because they're having to do something differently. So I do look at that little friction in the system, that sense that I have to do something differently, as a good sign that we're landing our messages internally and that behavior has shifted. You have to have strategic patience. Because it's going to operate on its schedule, not yours. Trying to fix it at the last minute is also, you know, a little bit of a fool's errand. On the proactive work, we have to think super hard about what is the story we want to tell, and to whom, and what can we say, and when can we say it, and be looking for things all the time.

Ann Johnson: Exactly. So we've also had fun along the way. Sometimes a creative campaign or a great story can really land and stick with people. I would love if you'd walk me through one of your favorite cybersecurity campaigns or stories that you and your team helped bring to life and what made it successful in cutting through the noise.

Frank Shaw: Some of the best ones are where we get permission to look back at a big problem, a challenge, and then take a reporter through what happened there. And this is the transparent part as well. So we detailed all of this in a report for the audiences like we ordinarily do with customers and industry analysts, and, you know, they all want the technical details, and we provided it to them. But we also know that this is something that consumers care about. AI is still relatively new for consumers, and when they hear about things like cyber criminals targeting them with AI, that's scary. So we wanted to land this in a mainstream way as well. You could say something in one market and have it be effective, and then you say the exact same thing in another market without considering some of the cultural differences and just get a lot of negativity. We rely deeply on the local sensibilities to make sure that it makes sense for them.

Ann Johnson: I consider myself a cyber optimist because I do know that for everything you see in the news, as an industry, we've blocked thousands of events. So despite the challenges, there's always something to look forward to in this field, whether it's new talent, new innovation, and I truly believe AI will be innovative here, the spirit of collaboration, or how we make our communications more effective.

Frank Shaw: A lot of my optimism is grounded in the fact that I get to work with these incredibly smart people from across the company in the security space. And, you know, anytime I'm dealing with an incident or an outage or a new program we're putting in place to prevent these things, you just get to talk to people here at Microsoft, and I'm sure across the entire security industry, who are such bright, committed people doing amazing work to stay ahead of what is just this relentless onslaught. And, you know, every day I feel like, wow, I'm so glad that I have these people on the team here, and everybody should feel great about that.

Ann Johnson: I am thrilled to welcome Dr. Hugh Thompson, the managing partner at Crosspoint Capital Partners and the executive chairman of the RSA Conference. He helps build, execute, and secure the world's largest cybersecurity conference. Welcome to Afternoon Cyber Tea.

Hugh Thompson: Thanks so much for having me.

Ann Johnson: Talk about what goes into building the event. How far in advance do you start planning each conference?

Hugh Thompson: You think about 44,000 humans getting together, there's a lot to pre-plan. So we start about 18 months in advance of the actual event. And it's everything from, you know, what is the theme going to be? How much space do we think we need for different types of sessions? What have we learned from, I guess, the conference two years prior in order to plan for the one that's coming up 18 months from now? So it's a long cycle, and there's an amazing team that's been working on this for a long time.

Ann Johnson: What is your approach to choosing a theme; how does that work? How do you think about a theme that resonates with such a diverse, such a global audience?

Hugh Thompson: It's tough, and there's a lot of debate that goes on internally around the theme every year. And about, I'd say, 12 years ago, we started a track called "The Human Element." And it was all about how people interact with systems. And it was really popular. And then the next year, when the debate came up, you know, geez, what's the theme for 18 months from now, everybody agreed "human element" was the right one. Because cyber really comes down to people. Whether it's the folks that you're trying to protect, the folks that are the defenders that are in cyber, or the attackers. And ever since then, I think you'll notice if you go back over the last six or seven years, many of the themes have had this human element touch to them.

Ann Johnson: You get these speakers that have such high profiles. You also get everything from hackers to CEOs. So how do you ensure the program, again, appeals to all levels of experience as you work through those program committee decisions?

Hugh Thompson: So as part of the submission, there is a level rating of how technical you have to be to really get something out of this talk. And what we aim for, depending on the track, is to match up the level of technical sophistication with the track. And we always strike the balance between things that are very specific to a field and also things that are accessible to just a wide variety of folks who are curious and want to learn more. It's been an expansion of our programming to not just have some of the very technical sessions, but also have these higher level, philosophical, futures, policy sessions, too. And it really is a testament to how important this industry has become in society.

Ann Johnson: Do you ever get to experience the conference like as an attendee? Do you get to walk the floor and be an attendee?

Hugh Thompson: Yeah, absolutely. I make sure to carve out some amount of time, obviously, it's very busy during the conference week, but some amount of time to walk the show floor. Because it's very important to go to at least two sessions where I don't know the person and it's something that's very interesting to me and it's something that I feel like I don't know very much about. Even though I've been in security my whole career and have written three books on it, you can always learn something from somebody else, no matter who they are. You can't walk away from the RSA Conference, especially this past year, and not be optimistic about what we can accomplish if we band together as a community, you just can't, because you see the ethos of the people that are in the fight with you. They're folks that really care. They actually care. It is a mission for them. It is a calling. And when you have smart people that are aligned together with a mission against a common enemy, amazing things can happen.

Ann Johnson: Thank you for joining me. I know you need some downtime post the conference. I hope you get that downtime. And I appreciate you making the time because I know how incredibly busy you are. Security often fails not because the technology is broken, but because the technology does not work for people. Breached data is really a story about us. It is about how attackers adapt and how people keep repeating the same mistakes.

Troy Hunt: As of the time of recording, we've got just over 17 billion breached records in this service, nearly 7 billion unique email addresses. When someone gets breached, they usually get breached more than once. Time on the internet increases risks and increases likelihood of exposure. I find that really the biggest blocker for organizations disclosing is that their number one priority is not to their customers, despite what the disclosure emails often say, their number one priority, and it's probably not surprising, is to shareholders. And what that means is protecting organizational value, making sure that the share price doesn't take a hit, that investors don't lose confidence. And that's this conundrum that people are referring to as data breach fatigue, where we're getting so many of these notices that we're sort of like, oh, well, you know, it happened again. But maybe what it's doing as well is changing our behaviors or necessitating that we change our behaviors and we stop sort of treating each individual incident as some major thing. And we structure ourselves such that we expect breach and we're resilient to breach. I'm a cybersecurity guy, and I got phished earlier this year, like proper successfully phished. I was jet lagged and I had this email allegedly from MailChimp about my account being locked because of spam complaints, and that seemed very feasible. And I followed the link and my password manager didn't auto-complete my strong unique password. So I copied and pasted it. I had two-factor turned on and it requested the six-digit token, which I copied and pasted from my code generator into the phishing site. And about five seconds later, my brain went, hang on a second, you know, this isn't right. So I demonstrated these human weaknesses that social engineering and scams and attackers take advantage of. One of them was fear, losing access to my mailing list. It caught me in a moment of weakness. People have moments of weakness. You know, they're tired, they're rushed, they're concerned about losing something. And the great thing about transparency is that it's almost like a self-evident proof. Open transparency can very quickly disprove, in this case, fraudulent claims.
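The service Troy is describing here is his breach-notification project, Have I Been Pwned. As a purely illustrative sketch, not something walked through in the episode, the snippet below shows one way to check a password against the companion Pwned Passwords range API, which uses k-anonymity: only the first five characters of the password's SHA-1 hash are sent, and the match against the returned suffixes happens locally. The endpoint is the publicly documented api.pwnedpasswords.com one, and the sample password is hypothetical; checking an email address against the breach corpus itself is a separate, API-key-gated service.

```python
import hashlib
import urllib.request

def pwned_count(password: str) -> int:
    """Return how many times `password` appears in the Pwned Passwords corpus.

    Uses the k-anonymity range API: only the first 5 hex characters of the
    SHA-1 hash are sent; the matching suffix is checked locally.
    """
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    with urllib.request.urlopen(f"https://api.pwnedpasswords.com/range/{prefix}") as resp:
        body = resp.read().decode("utf-8")
    # Each response line is "<35-char hash suffix>:<count>"
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate.strip() == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    # Hypothetical example; any widely reused password will show a large count.
    print(pwned_count("password123"))
```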

Ann Johnson: In the same vein, do you think that we're moving toward more transparency, more disclosure, openness? Or will organizations try to minimize what they share unless it's mandated or regulated?

Troy Hunt: Yeah, unfortunately, I think that's what it is. One of the things that a lot of people don't understand is around what are the obligations, the regulatory obligations, of organizations for disclosure. For things like disclosure, the regulatory obligations are usually around reporting to the regulator. So you might have to, if you're in the UK, for example, report to the Information Commissioner's Office, and you have to report to them within 72 hours. But then you get to self-assess around the necessity to report to individuals. And GDPR uses terms like "jeopardizing the rights and freedoms of individuals." In Australia, we have what we call the "Notifiable Data Breach Scheme." And if the breach is likely to cause serious harm to the individuals, you need to disclose to them. But outside of that, and outside of particular specific classes of data, such as medical data or other financial data or other sensitive classes, you just don't need to disclose. And people, when the penny drops, they're outraged. They're like, how on earth do we not have to hear about this? So what will often happen for me is someone will send me data. And while we're doing this podcast, I saw one pop up where someone said, look, this organization has had a data breach, and also here's a link to them denying it. And the link is to a tweet, which basically just says "fake news." Now, I'll have a look at that data, and I'll be able to verify it, and if it's legitimate, I'll get in touch with that organization and say, look, you know, I think you should look at this more closely. It's not consistent with what you've said online. And the advice I'll normally then give is, look, the truth is in the data. We will get to the bottom of the truth. And particularly, if it's in public circulation, you cannot escape that truth. Now, this is your opportunity to have some control over the narrative. You can either analyze this, come up with reasonable conclusions, make statements about it, and deal with it appropriately, or everyone will draw their own conclusions. And they have the data. They will be able to draw accurate conclusions in some cases, inaccurate conclusions in others. But unless you control the narrative, you have no ability to control what people say about it.

Ann Johnson: Welcome, Jack Rhysider, the creator and host of fellow cybersecurity podcast Darknet Diaries. Welcome to Afternoon Cyber Tea, Jack.

Jack Rhysider: Thanks for having me.

Ann Johnson: I know you started on your own. You had no background in podcasting. What drove you to tell these stories, and what drove you to choose a podcast as the medium to tell them?

Jack Rhysider: I wanted the show to exist, and nobody really understood, because I pitched it to a few podcasters, and they're like, I don't really understand what you're talking about. Why would anybody want old news? We only do new news here. And so I said, well, I guess this might be something I have to make myself. If I want to hear it and it's not out there, I've got to make it myself. It is maybe one of those, what is it, like overnight success, but took 10 years to make, right? Of all the things that I tried to do, this one is maybe one of the hardest. Because with a podcast, it's not like you're done and that's it, you can walk away and let it ride. It's like every week, every day, you've got to go and make another one. It's ridiculous how much work it is to just keep it going. And I almost wish I just had like a basic SaaS app that just generates money every month without me having to do anything. But this is quite a lot of fun, the ride that this has taken me on.

Ann Johnson: It's a lot of work. You know, and with you doing it on your own, 90 million downloads in less than eight years is extraordinary. And the humility you're showing is probably a lot of the reason why you're that successful, right?

Jack Rhysider: I mean, I take a lot of inspiration from people who have been successful before me. I want to do that, too. Teach me how you got there. And I want to join you. I want to follow in your footsteps, right? So that's kind of how I look at people who are more successful than me: it's very inspiring, and I want to get there as well.

Ann Johnson: So how do you go about your storytelling? How do you make the stories relatable? How do you decide which stories you're going to tell?

Jack Rhysider: One of the tricks that I think is interesting is that we start the story in a specific direction, knowing that we're not going to end in that direction, we're going to end somewhere else. And so we have this strong, you know, right turn or this left turn or something, and these turns that are in the story are the critical parts. And so there's a lot of people that just tell me a story of like, oh yeah, one day I hacked into a company and I stole the assets that they wanted me to steal. And I'm like, okay, great, where's the twist and turn? Like, did you go to the wrong company first? Did you hack the wrong thing first? Did you fail the first 20 times? That way, you know, I can pull those out in the story. That's what I'm looking for in stories, stuff that has all these twists and turns that you never expected, where we have to switch into that or go there. And that's what makes a good story for me.

Ann Johnson: Has a story ever challenged your perspective on right and wrong in cybersecurity?

Jack Rhysider: I think challenging my view is always interesting. I like to pick stories that do challenge my view. Because if I'm interviewing a hacker and he's like, yeah, I hacked the police, and I'm like, that's kind of a jerk thing to do. So I want to back up and I want to say, okay, my first reaction is, I don't like this. My second reaction is probably similar to that. So what's my third reaction? Okay, my third reaction is, I probably don't know enough about your backstory. Tell me, what have the police done to you as you were growing up? Or what is your relationship with this? Tell me about your teenage years. And so then you start to get into this empathy situation where you're understanding their situation and you're like, oh, I see. I might've done the same thing as them if I was in this position. And now you're practically cheering them on, like, yeah, I feel for you, man. Go get 'em. Let's see what you got. What happens next, right? And so I have to kind of back up and put that context into place to give me their worldview.

Ann Johnson: Can we talk about human beings? Human beings are a big part of cybersecurity. They're both victims and they're also folks that perpetrate attacks. What do you think about for the average person? So, you know, if you can think about someone who's not a cyber pro, how should they be thinking about privacy, given everything going on in the world?

Jack Rhysider: I think there's an asymmetry here of what we think our apps are doing and our computers are doing versus what they are doing. Like there's just a whole bunch of, you know, data collection, cookie collection, monitoring, app fingerprinting, all this kind of stuff, that I don't think the average person knows. And I think the cards are almost stacked against them, like, we don't even want you to know that we're collecting this data, right? And so we're doing extra work to keep you in the dark. And I think that asymmetry of just how much privacy you're losing versus knowing you're losing it, like what you think is safe isn't safe, and what you think is private isn't private, and all this sort of thing, is growing. And I think that's a problem. I guess some people would become hopeless, like, oh, my data's always going to be in a breach or whatever, and maybe even turn to the dark side. Like, you know what, screw it, I'm going to start my own ransomware company. I think what's changed in me over time is I've realized, wait, I do have the ability to not be impacted by these breaches. Like obviously the breaches are going to continue to happen, and my data is going to be in there, whether I like it or not. But could I do something about that? And I think the answer is being more private, right? So I try to use fake names everywhere I go, fake email addresses or, you know, burner email addresses, burner phone numbers, burner credit cards, like everything that I can possibly do. So that, okay, my data got breached. Well, that's fine. That's Sam Walters and some other phone number and address that's not even in my state. And so even though your data's out there, you can still cut it off, and that gives you a bigger advantage in terms of what your privacy is today. Because if somebody knows every move you're going to make every day, that is totally different than them knowing a couple of things about you from 10 years ago because it was in a breach. So I think that there's still some value in cutting it off and not giving up entirely.

Ann Johnson: As I look back on these conversations, a pattern emerges. 2025 was not about new threats, it was about new responses. We moved from defense to resilience, from tools to teams, from prediction to preparation. The guests you heard from this year did not just tell us what to worry about, they told us how to lead through it. They reminded us that technology problems require human solutions, that collaboration beats competition, that diversity makes us stronger, and that psychological safety, storytelling, and transparent communication are not soft skills, they are survival skills. So were we right about 2025? Yes and no, like most things. The threats we predicted did show up. But what we did not fully predict was how much the people in this industry would rise to meet them. I close every Afternoon Cyber Tea with a bit of optimism. And to end on a note of optimism in a field that's defined by pressure and constant evolution, let me share what gives me hope for the future of cybersecurity and for resilience. What makes me most optimistic? It is absolutely the shift I have witnessed in how we are approaching the work. Throughout 2025, I saw leaders who stopped pretending they have all the answers and started building cultures where their teams could speak up. I saw organizations move from siloed competition to genuine partnership. I saw an industry that is finally recognizing that our greatest vulnerability is not our code, it is failing to invest in our people. I am optimistic because the conversations we are having have fundamentally changed. We are acknowledging that the human being behind the keyboard, whether they are defending networks or trying to break them, is the most important thing. And I am optimistic because every guest who sat down with me this year, despite facing enormous pressure, despite facing impossible odds, chose to share transparently what they have learned. They chose transparency over perfection. That generosity, that openness, that commitment to collective progress, that is going to carry us into 2026 and beyond. The best cybersecurity strategy is not the one with the most advanced tooling, it is the one built by teams that trust each other, organizations that learn from failure, and leaders brave enough to ask for help. Thank you for an incredible 2025. Keep your tea warm and your defenses even warmer. [ Music ]