Transcript
1
00:00:00,360 --> 00:00:50,094
Daniel Nestle: Welcome, or welcome back to The Trending Communicator. I'm your host, Dan Nestle. Man, oh man. What the hell is happening out there? Every day, inch by inch, I feel like the trust crisis is just getting worse. I mean, we can argue about who or what is to blame. Social media, legacy media, politicians, corporations, certain governments, certain NGOs, organized crime, terror organizations. Like I said, man, oh man. Now believe me, I don't own a tinfoil hat and I have no intention of making one. It's bad enough that we have to contend with misinformation, disinformation, deepfakes and more online and offline in our daily lives every day.
2
00:00:50,262 --> 00:01:38,410
Daniel Nestle: But we as communicators, as the PR profession, have to go further and figure out how to protect, defend, mitigate, and when possible, expose and discredit bad information and bad actors, all in the service of our brands, customers, clients. Sometimes those clients are governments. You know how it works. But it's frustrating, and sometimes it's even demoralizing. Until you understand that we're on the front lines of a conflict that we not only have the responsibility to fight, but that people are counting on us to fight, counting on our expertise, on our mastery of language and narratives, and on our ethics. Then dealing with all this becomes a mission and a purpose and a cause that we can all get behind. And I think it's fair to say that my guest today is on this very mission and leading the cause.
3
00:01:38,830 --> 00:02:19,854
Daniel Nestle: Named to the PRWeek 40 Under 40, the Forbes 30 Under 30, and recently Inc. Magazine's Female Founders 250 list. Her experience working with US legislators, think tanks, and national laboratories has made her intimately familiar with defensive digital strategies against misinformation and disinformation. So much so that she founded an online risk mitigation technology company that recently secured $20 million, I believe it was, in Series B funding. She is a fellow at the National Security Institute, on the board of trustees of the Institute for Public Relations, and is the founder and CEO of Alethea. Please welcome to the show my new friend, Lisa Kaplan.
4
00:02:19,942 --> 00:02:22,318
Lisa Kaplan: Thanks so much for having me. Excited to be here.
5
00:02:22,494 --> 00:03:17,002
Daniel Nestle: My pleasure. And it's interesting, because we had organized our recording to take place a couple of times. Scheduling happens, and this episode is going to go live about a week or so from today. So I think it's fair that we can reflect on the environment that we are currently in. And unless you've been under a rock, there's been a lot happening these days. And I think it definitely impacts the way that communicators are dealing with misinformation, disinformation, bad information online. And I imagine that you've had a heck of a time over the past five or ten days, or however long it is, in the role that you're in, and certainly at Alethea, and I think we can get into that.
6
00:03:17,026 --> 00:03:35,874
Daniel Nestle: But first, if you can, tell us a little bit about Alethea and how you got there, and your journey towards becoming a founder and an Inc. Magazine Female Founders 250 honoree.
7
00:03:35,922 --> 00:04:23,780
Lisa Kaplan: Oh, my God. I feel like at this point, if there is a list to point out that I'm female or a youth, I'm on it. So it's like a participation trophy at this point. But, no, I appreciate the kind introduction. So, yeah, it's been quite a week. I will say I'm glad that we are audio only, because you'd be able to see otherwise. And just since this is coming out a week later, this is the week that President Trump faced an assassination attempt and announced JD Vance as his running mate. We also woke up this morning to the news about CrowdStrike, and we also, of course, had some interesting legal opinions come out. And the reality is, all of that impacts companies and corporate brands, whether you want it to or not.
8
00:04:24,360 --> 00:05:18,770
Lisa Kaplan: The reality is, and what I tell everybody is, listen, if you're not at the table, you're on the menu. And so, welcome to our digital environment. So I'm glad that this is audio, because nobody can see the bags under my eyes, but this is also my Super Bowl. So our team at Alethea detects and mitigates instances of online risk, and we define online risk as narratives that can cause a risk to an organization. So whether that's your brand and communications, your physical security posture, your regulatory environment, our role is to be able to identify these risks and threats early and give you those insights and great recommendations for what you can actually do about it, so that you're finding stuff out when it's a level one, not a level ten.
9
00:05:19,350 --> 00:06:07,900
Lisa Kaplan: Our internal mantra is, let's reduce the number of days that everybody has to clear their calendar and get into a tiger team. The way all this started is I was actually the digital director on a Senate campaign in 2018. Prior to that, I'd been working in policy and communications in Washington, DC, mostly consulting, with a little bit of time on the Hill, and I got a phone call in 2018. Basically, the pitch was, hey, wanna come be hire number three? And so I went for it and built out a digital strategy. And at the time, it was super innovative. It was a question of how do you basically localize your brand. We knew enough about what happened in 2016 to know that disinformation might be a risk in 2018, even though it was a midterm year.
10
00:06:08,090 --> 00:06:55,094
Lisa Kaplan: The senator I was working for is an independent from Maine. He's on Armed Services, Intel, and Energy. So I kind of sat there and said, he's probably not trading Christmas cards with the Kremlin. And we were going to have this coming from the left and the right, because by definition, we weren't backed by a party, and we were going to be dealing with what was coming out of any side. So I really got to know this issue not as a political issue at all, but as a business continuity risk and national security threat that sometimes manifests itself in politics. We came up with a way to do both proactive measures and defensive measures. The proactive measures are what you can control. It's how do you make sure that you're buttoned up? How do you make sure that you have the right cybersecurity posture?
11
00:06:55,222 --> 00:07:36,294
Lisa Kaplan: It's also, what are the communications campaigns that you're putting out? We took the effort of really localizing our strategy because we knew our voters, 1.1 million people, and it's not six degrees of separation in Maine, it's one. And so we were able to really boil down our target audience so that we would know who these people are and how to reach them. Communications 101. The benefit of that, though, also in our digital world, when you think about the defensive side, is we don't get to pick when a foreign adversary or your competitor or a criminal group decides to launch an attack against you online. What we can do is be prepared.
12
00:07:36,422 --> 00:08:20,280
Lisa Kaplan: And so being able to go out and identify early when something is starting helps you to take those proactive steps, whether it's counter messaging, whether it's beefing up physical security in the case of risks, and be able to identify what might be out there that could potentially harm your goal. So flash forward, we won. Winning is fun. And this was just the issue that was keeping me up at night. And so, as I say to friends and family, in a little bit of a temper tantrum, I just decided, I can do this better, and I'm gonna start a company. So I did that in 2019 and just started going out and pounding the pavement. Got laughed out of a lot of conference rooms. 2019 was way too early, and then 2020 happened.
13
00:08:20,700 --> 00:09:14,134
Lisa Kaplan: And I think Covid and January 6 are what made people realize that consumer trust on key issues like the pandemic was at risk and a true challenge. And I think people started to realize that this was a risk that was coming for all of us. And then I think January 6 was the example of what happens when some of these narratives are believed to the point that people think they're doing the right thing, that they're actually taking action. We saw this with Pizzagate as well, but the idea being it was a very visible moment where online narratives turned into offline action in a way that threatened physical security. So since that time, we have built a technology platform that automates all of the work that we've been doing for five years, at scale.
14
00:09:14,262 --> 00:09:46,150
Lisa Kaplan: So we built a platform that pulls in disparate data sources and tells you what narratives are emerging, who's behind them, how they're spreading, what you can do about it, and whether you should even care in the first place. Is it the onesie-twosie mean tweet, keep it moving? Or is this a onesie-twosie mean tweet that also has a home address and a threat of violence? Is this a onesie-twosie mean tweet that's going to turn into thousands of tweets calling for a boycott or attempts to short your stock? So that's the work that our company does, and it's quite a year to be doing it.
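The triage Lisa describes, separating the onesie-twosie mean tweet from a narrative with a threat attached or boycott potential, could be sketched roughly like this. Every field name, threshold, and rule here is invented for illustration; this is not Alethea's actual scoring logic.

```python
from dataclasses import dataclass

@dataclass
class Narrative:
    # Hypothetical fields for illustration only; not Alethea's schema.
    mention_count: int      # posts pushing the narrative so far
    growth_rate: float      # mentions today divided by mentions yesterday
    contains_threat: bool   # explicit threat of violence or a posted home address
    coordinated: bool       # shows indicators of coordinated behavior

def triage(n: Narrative) -> str:
    """Bucket a narrative into ignore / monitor / escalate (toy thresholds)."""
    if n.contains_threat:
        return "escalate"   # home address plus threat: a level ten, not a level one
    if n.coordinated or (n.mention_count > 100 and n.growth_rate > 2.0):
        return "monitor"    # could turn into thousands of tweets or a boycott
    return "ignore"         # onesie-twosie mean tweet: keep it moving

print(triage(Narrative(3, 1.0, False, False)))   # ignore
print(triage(Narrative(500, 3.5, False, True)))  # monitor
```

The point of a rule like this is only to decide how much attention a narrative deserves before it grows, which is the "find it at level one" idea from the conversation.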
15
00:09:46,650 --> 00:10:06,790
Daniel Nestle: I mean, I have lots of questions about the technology itself and how you manage to identify the right threats or the right issues for different companies. I mean, does it require a lot of discovery first to understand what the brand is about? Or is there, like, is there a process that you're able to share about how that happens?
16
00:10:08,010 --> 00:10:39,340
Lisa Kaplan: So, it's a great question. And the way to think about the platform is, it's super flexible. So, like a social listening tool, it runs on Boolean logic. So it just depends on what you want to know. Do you want to know about your brand? Do you want to know about your executives? Do you want to know about, for example, the go-woke-go-broke communities, which are known for shorting stocks or targeting DEI initiatives? Do you want to understand a black swan event, like what we've seen happen now a couple of times this week?
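The Boolean logic behind a social-listening query like the ones Lisa mentions can be modeled as a tiny expression tree evaluated against post text. This is a toy sketch; the tuple syntax, brand names, and query shown are invented, and real platforms compile far richer queries.

```python
import re

def matches(query, text: str) -> bool:
    """Evaluate a tiny Boolean query tree against one post.

    A query is either a keyword string or a tuple of the form
    ("and", q1, q2, ...), ("or", q1, q2, ...), or ("not", q).
    Keywords match case-insensitively on word boundaries.
    """
    if isinstance(query, str):
        return re.search(r"\b" + re.escape(query) + r"\b", text, re.I) is not None
    op, *args = query
    if op == "and":
        return all(matches(q, text) for q in args)
    if op == "or":
        return any(matches(q, text) for q in args)
    if op == "not":
        return not matches(args[0], text)
    raise ValueError(f"unknown operator: {op}")

# Hypothetical query: mentions of the brand or its CEO, excluding job posts.
q = ("and", ("or", "AcmeCorp", "Jane Smith"), ("not", "hiring"))
print(matches(q, "AcmeCorp is hiring engineers"))    # False
print(matches(q, "Jane Smith faces boycott calls"))  # True
```

Building the query is the flexible part she describes: the same machinery monitors a brand, an executive, or a particular hostile community, depending on what you ask it.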
17
00:10:41,040 --> 00:11:24,872
Lisa Kaplan: So you build that query, and it pulls in all sorts of disparate sources of data, and then our machine learning models go to work. They've been trained by our subject matter expert analysts and our data science teams, who are the ones who do things like attribute Russian propaganda campaigns online. And a lot of our team actually came from trust and safety at some of the big platforms, on the data science side, when they did all their trust and safety layoffs. So we built models to be able to identify what we call coordinated behavior. We've created these different indicators of coordination that help us not look at the content, because when it comes to the content, we're a democracy. We have to protect freedom of speech, full stop. We look at the behaviors.
18
00:11:24,976 --> 00:11:53,080
Lisa Kaplan: So anybody can say anything that they want, but it's actually a violation of terms of service. Depending on the situation, it may even be illegal to purchase a network of accounts that are basically trying to game curation algorithms. They're trying to make something look newsier than it is. And that can oftentimes signal that there is a direct or an individual actor that is delivered deliberately and maliciously targeting your brand or your company.
19
00:11:53,380 --> 00:11:56,828
Daniel Nestle: Yeah, that's where bot farms and such come in, correct?
20
00:11:56,884 --> 00:12:09,960
Lisa Kaplan: Exactly. Or, lately, you can buy a bot for $7, or you can buy a whole bot network out of a vending machine in St. Petersburg, according to some interesting research that I read recently. Yes. So inexpensive.
21
00:12:10,860 --> 00:12:33,898
Daniel Nestle: Yeah. And that's the issue, right? That's one of the main issues: it doesn't take tremendous amounts of funding to bring down a brand, or to bring down a person, or something even larger than that, or to try, anyhow. You know, I saw some videos recently running around on Reels, because I don't do TikTok, because I'm a grown-up, but I look at Reels, you know.
22
00:12:33,914 --> 00:12:36,922
Lisa Kaplan: I'm like you. If you have TikTok, you should just burn your phone.
23
00:12:37,066 --> 00:13:34,420
Daniel Nestle: I believe so. Because, you know. Well, I mean, I would ask you about that, actually, in particular and specifically: what a threat TikTok could be. Or is it a threat, indeed? Is that all misinformation, disinformation, or noise? But let's just put that to the side. I was looking at these reels that were showing footage from inside bot farms, from, I believe, China, but it could have been from India or someplace else. But the setup that they have is ridiculously small in real estate footprint. An office room that has hundreds and hundreds of cell phones attached together, and just a few human operators doing whatever it is, whatever horrible and malicious things, they're doing. And that's just one. So that could be a big business, or a lucrative business, really anywhere.
24
00:13:34,460 --> 00:14:00,820
Daniel Nestle: You can see how the overhead is so low that it's so easy to spread. It must be so difficult to combat. So identifying that coordinated behavior, which is clearly what that setup is designed for. You see a wall of cell phones, and it's coordinating something that's going out there. How do you then combat that? You identify it, and then what?
25
00:14:01,600 --> 00:14:52,852
Lisa Kaplan: It really depends, and it depends on what's your goal. And you're absolutely right. I think what we're really talking about is influence operations, which are not new in a military context. It's just we're not talking about dropping leaflets out of a helicopter in Vietnam. So there's not really a high barrier to entry, to your point. And so what started as a government tactic is now an anybody-can-do-it situation. So we do see more than just state actors. We see these criminal groups, we see these lone individual operators. I joke that it's the proverbial guy in his basement, but it's the category I also put myself into. Not that we do this. We actually have an ethical statement against spreading disinformation. But I think the real key here is, to your point, anyone can do it.
26
00:14:52,916 --> 00:15:48,560
Lisa Kaplan: And so to make the right decision, you have to understand what's out there and how it could impact your goal. So I'll give you an example. We were working with an international client, and we caught a network of accounts that were targeting them outside of the United States, in a country that's like a democracy with a question mark at the end of the word. And it was alleging that the company was acting inappropriately in that country, in terms of trying to influence, essentially, politics in that country as an outsider. What ended up happening was, well, that's not really a country where you want to speak up. We saw this in our coordination models. We were able to quickly see, hey, this is affiliated with a group that may actually be part of the state.
27
00:15:48,860 --> 00:15:58,026
Lisa Kaplan: And that's not a situation where you want to raise your hand and expose it and say, look over here, we found you. That's a quick way to get all of your equipment seized. Like you don't want to go down.
28
00:15:58,058 --> 00:15:59,258
Daniel Nestle: That path for sure.
29
00:15:59,394 --> 00:16:43,964
Lisa Kaplan: But they caught it early, and they were concerned that it could be a big deal. And so what they did instead was they actually were able to use fact checkers, who started reaching out to say, hey, is this true? And they were able to prove no. And the fact check, because it happened early, was actually the largest part of the narrative. And we see that happen time and time again. So I always like to warn people, though: fact checking is a thing of the past. Like, people don't care whether or not something's true. I mean, some people care. I like to think that I care. I'm sure that you care, too, Dan. Yeah, but we're not everybody.
30
00:16:44,052 --> 00:17:13,579
Lisa Kaplan: So I do think, though, a lot of people are looking for information that confirms their biases, and it's a steady stream of information confirming your biases over time. So a fact check alone oftentimes doesn't work. It worked in that specific instance because it was a new narrative caught early, and because they were able to leverage fact checkers who had a very high following to get the information out early that the narrative was false altogether.
31
00:17:13,920 --> 00:18:11,430
Daniel Nestle: Yeah, it seems to me, and this goes back to the whole idea of the trust crisis, and this might be fundamental to it: there's this reluctance to dig at all, this tendency to accept narratives quickly, easily, almost like a direct-in-the-blood injection, where the narrative gets implanted and you have sometimes no time at all, sometimes hours, sometimes days, but generally speaking a very short window to combat that narrative before it's either already kind of embedded and believed, or the subject has moved on to the next thing. So, you know, fact checkers, I understand completely. You see things come out and you know, oh, that's wrong, that's wrong. And then by the time the fact checkers get around to it, or by the time the counternarrative comes out, it's too late; the damage has already been done.
32
00:18:11,470 --> 00:18:48,298
Daniel Nestle: You can't convince people who've already been convinced. What is it? I forget who said it, but a psychologist or a psychiatrist somewhere observed that once somebody's mind is fixed, there's almost no way to change it. The last thing that people want to do is change their minds. It's one of the hardest things to get people to do. Even presenting arguments, presenting rational discussions and facts and figures, doesn't necessarily work anymore the way that it once did. And that's partly the proliferation of all these narratives. So I totally agree, and I think.
33
00:18:48,354 --> 00:19:37,878
Lisa Kaplan: There are some groups that are doing some really amazing work, though. So. And this is not the work that we necessarily do. The work that we do is early detection, which if you're going to be able to deal with these narratives, to be able to manage your online risk. And every organization has online risk because every organization has to exist online. You have to have that early detection component. But I think there's some really great work that's being done right now in terms of how do you reduce polarization and how do you reach people where they are? So there's some great campaigns happening right now, like the disagree better campaign coming out of, actually, the National Governors association, where they have red state, blue state governors, and they're doing PSA campaigns, and they're kind of fun around. How do you disagree?
34
00:19:38,014 --> 00:20:31,750
Lisa Kaplan: And how do you chip away at how polarized we've become and these types of campaigns and ads, it's not going to be the end all, be all, but they matter, and they're making progress. There are also some groups that are looking at, well, okay, if we are all susceptible to disinformation, and we are. I've fallen for it before, and I do this for a living, we will, we all have people that we trust. It works because. Not because anonymous sock puppet, showed us a meme that's not. Nobody brought their zip ties to the capitol because of one meme, is what we like to say. But it's. It is true that, it's that over time, it's almost like radicalization. So how do you deradicalize? There's a group called one America movement who's doing really amazing working with religious leaders to help them bring their. Their.
35
00:20:32,490 --> 00:21:21,598
Lisa Kaplan: Whether it's their synagogue or their mosque or their church, their members back from conspiracy theories. And because, again, they impact all of us, all types of religions, all races, all genders. And what they're doing is they're working with people through context and language that's familiar with them and using their position of trust to be able to help them understand how. Help them understand, like, hey, this might not be right, and so there are ways to do it, but it's much better done on a peer to peer level, because you have to really trust the person and be willing to be vulnerable and be willing to hear you might not be right or that something else might be more right or a better explanation. And so we find that's the challenge.
36
00:21:21,654 --> 00:21:34,330
Lisa Kaplan: It's the disinformation and radicalization happens, and polarization happens at scale to de radicalize, to bring back into the fold. That happens person to person in most effective ways.
37
00:21:35,390 --> 00:21:45,828
Daniel Nestle: It's totally the same, or it sounds the same as, like, cult deprogramming to me. I mean, exactly like that. And I've seen one too many documentaries about this.
38
00:21:45,974 --> 00:22:22,070
Lisa Kaplan: I was going to say, I think that there are probably a lot of similarities. But again, it's challenging, right? Because we're all susceptible. So just to talk a little bit about the time that I fell for disinformation: come back in time with me to March 2020, something I'm pretty sure that we've all blocked out. We couldn't buy toilet paper, we couldn't buy pasta. We were like, we don't know if we're getting Covid through our HVAC systems. Do we have to wipe down our groceries? Do we not? Are we going back to work in two weeks? Those sorts of things. There was a rumor that was going around, and I was living in Washington, DC.
39
00:22:22,610 --> 00:23:01,092
Lisa Kaplan: There was a rumor going around, and I got a text message from a friend saying, hey, the whole country is going to be shut down for two weeks. It's happening through the Stafford Act. You have to go buy groceries and make sure you're stocked for two weeks right now. And disinformation targets our fears, our anxieties, our uncertainties. And it's like, oh my God, I only have like two days' worth of food. What do you mean I'm not gonna be able to DoorDash a large pizza and make it last for a week, which was my MO at the time? And I went out and was part of the run on groceries. And I obviously called everybody I knew and cared about and told them, oh my God, my friend, who would know.
40
00:23:01,236 --> 00:23:32,540
Lisa Kaplan: Well, what I didn't know is that my friend was paraphrasing a text message that was going around and spreading like wildfire, and it all came out later that it wasn't true. And so, because I was scared, because there was an information vacuum, which is consistently where disinformation really thrives, and that's something to think about from a corporate strategy perspective, I fell for it. And I do this for a living. And I definitely don't identify as a member of a cult, but maybe at the end you can tell me if I should.
41
00:23:33,290 --> 00:24:28,766
Daniel Nestle: Well, I mean, I think that was a national or society-wide emergency, and so much uncertainty everywhere. I mean, I guess the argument can be made that we were all victimized, or part of, to one degree or another, a much larger cult-ish way of thinking at the time. And I certainly was susceptible to some of the misinformation or disinformation, what we now know was misinformation, at the time. But there's no shame in any of that, I think. It's when it comes down to spreading these things, getting really, I guess, what is it when advocacy is going in the wrong direction? Negative virality, or whatever you want to call it. That's when you're taking part in that process, and it's important to reflect on why you've done that and what's happened there.
42
00:24:28,798 --> 00:24:33,486
Daniel Nestle: But, you know, I wouldn't beat myself up about it anymore, Lisa. I think we're good. We're good with that.
43
00:24:33,518 --> 00:24:34,510
Lisa Kaplan: But I agree.
44
00:24:34,670 --> 00:25:15,586
Daniel Nestle: Yeah, but it's interesting. You said you're talking about the idea of de radicalization on a peer to peer level. Now, as communicators and as people in our profession, we are, and let's bring this down to brands and to what a lot of us do for a living, we communicate to stakeholders. I mean, the most bottom line thing is we talk to stakeholders, tell them stories, and try to convince them of something, persuade them to do one thing or think one way or et cetera. And in recent years, thankfully, one of the largest stakeholder groups that we've come to really value, and in some case, value more than any other, is our employee groups.
45
00:25:15,618 --> 00:25:15,786
Lisa Kaplan: Right?
46
00:25:15,818 --> 00:26:22,900
Daniel Nestle: So our friend Ethan is all about that. We have employee communications, employee engagement. Employee activation has just taken a huge leap over the last decade, I'd say. The activism that we see, the spread of misinformation and disinformation that we see, and also the solutions that we see, I think, often take place in the workplace. This is where you're with the people that you trust the most. So the employee community is a petri dish. Petri dish, petri dish. Yeah, I like petri as well, but it's like niche and niche. I'm not sure which way to go, but I'll go with petri. It's a petri dish that develops the flow of information in a lot of ways. Employee resource groups start things up and spread them around, usually, most of the time, for positive results and with positive intent.
47
00:26:24,120 --> 00:27:00,076
Daniel Nestle: But employee feelings and employee panic, or employee reactions to these different pieces of information, are something that companies have to consider more than they ever have before. So what should companies be doing? Let's say you're detecting something. You're detecting a risk. You have an actor out there, or there's a rumor spreading, or something like this. What's the role that employees can play? And what should companies and communicators be doing with their employee communities as part of this solution package?
48
00:27:00,268 --> 00:27:52,774
Lisa Kaplan: So when I answer this question, I'm also operating on the assumption that there's been an investment in internal communications and culture. People are generally happy with where they work. They're excited to come into work every day, high NPS scores, that sort of thing. I actually think that the communications teams that take the broadest view of how to combat that risk, of potential ways to damage the brand or create security risks or other types of business continuity risks, the communications teams that don't just think about communications as PR and owned social media or influencer marketing or advertising or all of those other things, are the most successful. Because when you think about it, you really can't separate individuals and their place of employment online anymore.
49
00:27:52,862 --> 00:28:36,782
Lisa Kaplan: This is why there's the forced legal disclaimer of, my opinions don't reflect those of my employer. You know, we've seen people get fired. There was that woman who flipped somebody off while she was on her bike ride commute a couple of years ago, and she got terminated as a result. You just can't separate the two anymore. So I think that's something where you can frankly engage your team in helping you to communicate, and you have to think about what it is you're trying to solve for. So if you are facing allegations or something that may call into question, is this a great place to work, you want your employees to be the ones saying, yes, this is a great place to work. You don't want to be the one coming out saying, I promise this is great.
50
00:28:36,846 --> 00:29:21,790
Lisa Kaplan: Like, people aren't going to trust management on that, understand? But you shouldn't have to. You should be investing in your culture so that you don't have to ask employees, they're going to be saying to their friends, offline members of their community, hey, yeah, I know what's being said out there, but I've always had a great experience. That's, I think, and your employees, too, when you think about it from like a recruiting perspective, it's way better to have them saying, come join our team on LinkedIn. So definitely always look at your community as your employees that way. I do caution people, every internal communication is an external communication. And so whether you choose for that to be true or not, and so there does need to be that level of care.
51
00:29:22,250 --> 00:30:12,622
Lisa Kaplan: Where I think we've seen companies really struggle is what do you speak out on? What do you not speak out on? And that's something that we saw, I think after the protests over the death of George Floyd, we saw every company coming out saying, this is not something we stand for. I also need to be honest. Like, that's from a communications perspective, the obvious thing to do once that video was surfaced, we see companies have more challenges around. Do we say something? Do we not say something? When it comes to emotionally charged issues, and it's complicated, like, do you say something or do you not say something? If China invades Taiwan and you have employees there and customers there, do you say something or do you not say something? If, like, the.
52
00:30:12,726 --> 00:31:05,154
Lisa Kaplan: Depending on how, what happens on any given election moving forward, we saw a lot of companies say, we won't give to people who voted not to certify the election, and then they started giving to those candidates again. And so then, because if you make those bold statements and you change course, even if it's the normal course of business, people are going to rightfully have questions. So my advice is always, don't flip flop. Know where you stand. If it's core to your business to say something. It would be very strange, for example, if CPAC didn't congratulate President Trump on an election victory, if he wins the election. It would also be very strange if, you know, Planned Parenthood congratulated President Trump on an election. So you have to kind of know where your brand is and know where he stands.
53
00:31:05,202 --> 00:31:22,390
Lisa Kaplan: I use elections because it's an easy example for people to understand. And also understand that sometimes it's okay to say nothing. If you are selling toothpaste, nobody's looking to you to solve a decades-old conflict in the Middle East. It's okay to not weigh in.
54
00:31:23,050 --> 00:32:14,244
Daniel Nestle: That's a lesson I think a lot of companies, a lot of brands, have learned over the last few years. When do you have standing, in other words? And I think everybody felt that they had standing at certain times, or they were being performative in the sense that, okay, well, if I don't do this, I'm gonna lose employees, or I'm gonna lose brand share, or something like this. Even companies that were in no danger of any such thing, or companies that have such broad audiences and customer bases that simply changing a square on your social media account can alienate 50% of your customer base. That's the state of the world we live in now. So, in that case, does the company make the stand? Does the company not?
55
00:32:14,292 --> 00:33:12,608
Daniel Nestle: That's where you have to look at your mission, vision, and values and say, does it make sense to do so and really make a statement? Only if it makes sense. So I think that's a learning from the last few years. And we certainly see companies now really holding back, stepping back almost entirely from any involvement in the melee out there, and maybe even reacting more to activism within their own companies by saying, this is not permitted here anymore. We're done. And full disclosure, when Google was firing the anti-Israel protesters, you couldn't find a more thrilled person than me. Everybody knows where I stand on this whole issue. Was that the right thing to do or the wrong thing? I don't know. It's Google's DNA. It's their call. I hope it was the right thing to do.
56
00:33:12,744 --> 00:34:05,038
Daniel Nestle: But you have to take a stand and say, okay, this is tolerated or it's not tolerated. This is that kind of a place to work, or it's not that kind of a place to work. Roll the dice a little bit. Now, how does all that fit in? I'm sorry we've been veering a little bit away, but how does that all fit into proactive risk mitigation for threats you might find coming down the pike? For example, as you were talking, I thought, okay, what if you hear that there's an attack coming, or there's something brewing that's going to defame your CEO? Employees don't know about it yet, but you do as a communicator. Where do you go with that? Is there an internal comms play on that? Is there a mitigation strategy?
57
00:34:05,094 --> 00:34:24,255
Daniel Nestle: Certainly for all the socials, you want to be ready with all your statements, do your fact checking ahead of time, and so on, get ahead of it. But simply by talking about it internally, does that make it real? I don't know the answers. Have you had experiences with such things? Is there anything you can advise us on for that?
58
00:34:24,367 --> 00:35:04,860
Lisa Kaplan: Yeah. This is where early detection is so important, because first you have to figure out, and this is not something that we can help you with, whether it's true. People always want us to say, but is this disinformation? And I always have to say, I don't know. They seem like a very nice person, but I have no idea if they've ever met Jeffrey Epstein or cheated on their wife or whatever it is. You've got to go figure that out. But the point is, you have time to go figure those things out, and those are the kinds of things that are really damaging. Our job is to tell you, one, that it's out there, so that you're hearing it from us and not because you're getting a phone call from the Wall Street Journal.
59
00:35:05,680 --> 00:35:50,640
Lisa Kaplan: And two, being able to figure out, is this true? Is this something we need to deal with? Is this going to lead to an actual business risk? Do people care? It depends on what the allegation is. There are certain allegations in the category of Jeffrey Epstein where obviously one post is too many posts. Other things, like, this CEO is mean, and it's one tweet on Twitter or X or whatever we're supposed to call it, and nobody's engaging with it, and the account has no followers, and you're like, all right, whatever. So you have to have that early detection, that context, and that ability to monitor: is this going away? What people also have to consider is where this is occurring.
60
00:35:50,720 --> 00:36:41,148
Lisa Kaplan: A lot of teams just have, for example, insight into what's happening on X. And yes, X is influential, particularly to reporters, but that's probably not where most of your stakeholders are getting most of their news and most of their information. So one of the challenges is how you think about making sure that the people who need to hear the message, or need to hear a counter narrative, or need to have something pre-bunked, are actually seeing it. Your Wall Street Journal or your New York Times article, for example, may not do well on Gab, which is an alt-right free speech platform, because alt-right free speech platform users are not New York Times center-left readers.
61
00:36:41,284 --> 00:37:10,044
Lisa Kaplan: So you have to play that game of three-dimensional chess: if something's not true, where do you want to get the correction out? And of course, you don't want to make gross, sweeping generalizations. I'm sure there are people on Gab who read the New York Times. But you have to think about a much more complex stakeholder map than communications has been forced to think about in the past. One other thing to consider as well: it used to be that you were an effective communicator.
62
00:37:10,212 --> 00:37:56,854
Lisa Kaplan: That is, if you were able to get this type of disparaging or potentially damaging story shut down in the first place. You maybe get the, hey, it's the Wall Street Journal or the New York Times or Bloomberg or whoever, and here's a list of 20 questions, and you have two hours to answer. If you could call an editor and get the story killed, if you could make it come out differently, if you could really push back, if you could run a ground game, you were considered a good PR person. Now the question can and should be, why didn't you see this coming? Because oftentimes these narratives are out there long before they become mainstream and pose a different type of risk. So what are the narratives that are out there? What can you do about it?
63
00:37:56,902 --> 00:38:22,990
Lisa Kaplan: Do you need to put out more information? Less information? What we do find is that, to the extent companies can, staying true to your core values, your messaging, and who you are as a company is the most important thing you can do in the face of all of this. That's how you maintain trust, because you continue to be who you are.
64
00:38:27,250 --> 00:39:34,060
Daniel Nestle: The hamster wheel in my mind is running, because some of the things you're saying have just lit something up. I keep thinking about this idea of predictability, and that's where Alethea is playing: predicting risks, or identifying risks that are out there that could become something bigger. To oversimplify, predicting what topics are going to be great for a brand to lean into, what's going to be very powerful, is gold for communicators and marketers alike. We want to know what our stakeholders are going to be interested in, what they're going to be into, what's the next big thing coming down the road. If you could be that trend predictor, the futurist of the company who says, oh yeah, the next big thing is chartreuse, whatever it is, predict something for your people.
65
00:39:35,840 --> 00:40:14,700
Daniel Nestle: But the same applies to the dark side. Just as we're trying to be predictive and focus our energy and our budgets on what topics are going to hit, shouldn't we be looking at what potholes are in the road, what obstacles are out there, and what can hit us back in the same way? It seems to me that the commercial incentive for that is too small.
66
00:40:16,440 --> 00:41:09,020
Lisa Kaplan: It's interesting, because I think people really are starting to get hammered by this threat. What's happened is there have been too many bad days, too many missed earnings calls that can all be traced back to a moment online that nobody knew about until they did the retrospective. And that's where we end up coming in. I will say, just as a startup: when you're in a crisis, you're calling your PR firm. You weren't calling a seed-stage startup back in 2021. You didn't get to us until you tried and failed and tried and failed, and then you said, hey, that crazy lady who talked to us about QAnon back then, does anybody still have her business card? Is she still around? Because we're out of ideas.
67
00:41:09,760 --> 00:41:59,482
Lisa Kaplan: I had one client recently; we started working with them about six months after they had a major incident. I walked in, and as we were meeting, the client looks at me and goes, I have a feeling I'm going to wish I'd known you the day this happened. I said, yeah, me too. But here we are, and we've got to figure it out. This is a different situation. This isn't how do we stop the bleeding; this is how do we build back. So there are two different modes. What I always tell people is, these are technology solutions, AI models, all of that. Sure, it's definitely more expensive than the average social listening tool, because the average social listening tool isn't helping you manage risk. You've got to run the analytics and all of that. And what you're saving is time.
68
00:41:59,626 --> 00:42:42,060
Lisa Kaplan: And what you're getting is a broad view of the Internet. Because if you just have Twitter, and you're only finding out how your brand is being talked about on Twitter or on CNN, or how the op-ed that you knew about because you penned it is being received, you're totally missing the point. You're completely vulnerable. You're waiting until something becomes a level ten crisis. You're losing executive time for the week it takes to figure out what to do and how to recover. You're pulling your team off the projects that are moving the business forward. What we find when we start working with people is actually funny: everybody goes through this learning curve, because we're early detection.
69
00:42:42,100 --> 00:43:22,304
Lisa Kaplan: Yes, we tell you, hey, this one's high risk, you have to do something about it. But we're also just like, hey, FYI, this is happening. And people are like, what am I supposed to do with this FYI information? We're like, you're supposed to not be an idiot. If you have the opportunity to step in it, don't. You just need to know it's out there, so that if your CEO is being accused of doing whatever it is, maybe you move the CNBC interview, maybe you change the remarks, maybe you're prepared that they might be asked a question about it. It's those subtle things that can also help you avoid unforced errors. Organizations need to get proactive. They need to get predictive.
70
00:43:22,352 --> 00:43:31,130
Lisa Kaplan: They need to understand how everybody from a state adversary to a short-seller community works in order to be able to protect the organization.
71
00:43:31,550 --> 00:44:06,490
Daniel Nestle: Yeah. And it seems to me that if you're called in after something's already happened, clearly something's already been identified, and now you're there to plug some of the holes and then set something up so they can predict anything coming in the future. But as you said, you're not the one that's going to tell them what to do about it, right? You're just telling them there's a problem. For what to do about it, that's where you really need an excellent partner inside the comms team, or a great agency partner to work with.
72
00:44:06,610 --> 00:44:08,906
Lisa Kaplan: And we do provide counsel.
73
00:44:08,978 --> 00:44:09,250
Daniel Nestle: Yeah.
74
00:44:09,290 --> 00:44:46,140
Lisa Kaplan: So we do provide counsel, because we've seen this happen across industries, for everybody from a handful of the Fortune 10 to nonprofits that you've only recently heard of, those sorts of things. So we do have that bird's-eye view, and we do provide perspective and strategic counsel. For our own ethical purposes, though, we don't do the hands-on-keyboard counter messaging. We're not lawyers; we're not writing your cease and desist letter, but we'll tell you what the magic words are that need to go in it. And that's for two reasons. One, you shouldn't be letting your vendors grade their own homework.
75
00:44:46,480 --> 00:45:16,920
Lisa Kaplan: The people doing your messaging to try to make it better shouldn't also be the ones grading it and saying, yes, this is working. There's a bit of a boundary there. Two, we don't want to be accused of spreading anything that could be false, misleading, or disinformation. And we don't have the in-house information, nor do we want it, to actually do that for you. So we end up coming in and providing mitigation options, and then it's up to you to see what makes sense for your context, your goals, your risk profile.
76
00:45:17,300 --> 00:46:19,938
Daniel Nestle: Yeah, that makes sense. Where I was going was that communicators right now, I think, are increasingly generalists, or people who've run comms teams. Certainly people in certain leading industries, or in highly exposed companies and industries, are probably a little ahead of the curve on this kind of stuff. But communicators really need to understand, okay, if there's a risk, what do we do? And it's not just the crisis playbook. It's not just saying, okay, I'm a good crisis communicator, so therefore I know what to do. There are a lot of new and ever-changing tools and techniques and mitigation methods that we really need to stay on top of. So, kind of thinking about that, what do you see coming?
77
00:46:19,994 --> 00:46:40,470
Daniel Nestle: Or what's happening out there now? For lack of a better phrase, how's the digital world changing in ways we need to be really aware of? And what are the kinds of tools and skills that communicators and marketers should now have under their belt as table stakes?
78
00:46:41,490 --> 00:47:42,390
Lisa Kaplan: I think it's table stakes to understand technology. It's automating the grunt work of our jobs. It's being able to rapidly deploy counter messaging. Even if you're in PR and you don't deal with social listening or the intelligence side, embrace technology and know how it works. The other piece is that communications used to be something done from a podium, and now it's a conversation, whether you want it to be or not. It's interactive. It's hearts and minds. It's telling a story. So really embrace not just the storytelling that makes an effective communicator on stage, but the person-to-person interaction too. Because now every interaction you have is an opportunity for a person-to-person interaction, and every person-to-person interaction is a communications opportunity.
79
00:47:42,510 --> 00:48:31,680
Lisa Kaplan: Start thinking about how to leverage your call center. Start thinking about, if you have brick and mortar, how you're preparing people with talking points, FAQs, that sort of thing. The other thing I say is, we're all always on, thanks to cell phone cameras. And be ready for deepfakes and those types of risks, too; they're all coming. So it's really about being prepared. But I would say the biggest way the communications role has changed is that it's not just about writing copy, a fun marketing campaign, how you position with press, or writing the op-ed. You're the frontline risk manager of the entire company. How do you use those tools to achieve the company's objectives? And that's both in terms of risk and opportunity now.
80
00:48:33,940 --> 00:48:56,978
Daniel Nestle: I've always had a slide or two in the different strategy decks I've presented over the years saying that conversation is the communicator's superpower. And I shouldn't say the superpower; it's one of them, because we have a few. But conversation itself is the domain of the communicator and should be the domain of the communicator.
81
00:48:57,034 --> 00:48:59,002
Lisa Kaplan: Oh, yeah, we are chatty people.
82
00:48:59,146 --> 00:49:41,610
Daniel Nestle: We're chatty people, and we're supposed to be able to be, to a degree, empathetic or empathic, to respond and go back and forth and interrogate when necessary, but always know where the line is that we're not going to cross. Be aware of the messaging, have it really solid and in our person, dyed in the wool in a lot of ways, so that we are the safe kind of conversationalists for a company. But we shouldn't be the only conversationalists. It should be the executives, and not only the executives; it should also be our salespeople. It should be anybody out there who has an interaction with anyone in the public.
83
00:49:41,650 --> 00:50:43,500
Daniel Nestle: So basically, everybody in the company who's representing a brand should have a grasp of how to have a conversation. Where I was going is that those conversations we have digitally are the best tool we have, and the least expensive tool we have, to build trust with our stakeholders. And it amazes me that even now, executives outsource that to others, so that their voice isn't really their voice, or they don't see the immediate value of it: I'm not going to waste my time; I have a team to do all this stuff. Just as an example, we talked about X, Twixter, whatever you want to call it.
84
00:50:45,480 --> 00:51:51,660
Daniel Nestle: A lot of execs aren't on there, and I understand that's not necessarily the main platform. But LinkedIn? LinkedIn is a major center, especially for B2B, but certainly for person-to-person contact. In my humble opinion, it is the most solid of all business platforms and the most real, the most authentic, for business people. Why would you not be taking advantage of the conversational capabilities within LinkedIn? The more you educate people and follow the LinkedIn algorithms, the more you understand that the number one thing LinkedIn wants is for its members to be happy. LinkedIn does not care as much about advertising. They don't care as much about brands. They care about the members, because it's a membership model. So generate conversations with people and get into conversations. Nice comments: hey, how are you doing? Congratulations on this and that.
85
00:51:52,480 --> 00:52:14,136
Daniel Nestle: That's a great product, whatever the conversation is. A couple of comments a day isn't a lot of work, but it gets shoved aside in the name of, hey, we don't have a lot of time. And then a crisis happens and you're like, why don't these people believe me or trust me? You suddenly get out there on LinkedIn and nobody's responding to you. Well, you haven't built up any rapport.
86
00:52:14,288 --> 00:52:24,300
Lisa Kaplan: You've got to invest. The best thing you can do is invest in trust. It's a lot easier to protect a strong reserve of trust than to try to build trust when no one trusts you.
87
00:52:25,080 --> 00:52:29,472
Daniel Nestle: It boggles the mind, Lisa. It really does.
88
00:52:29,576 --> 00:53:10,362
Lisa Kaplan: One thing you mentioned that's also resonating with me: you're right, it's fast, cheap, and easy to do. We talked about this a little at the beginning in the geopolitical context, but this is not enriched uranium; it's memes. And this is why, if you're not at the table, you're on the menu, because other people are targeting you. Whether you want to hear it or not, it's true. All of us have been targeted, and it's going to happen. We've tried, even for myself, to have other people write my posts and that sort of thing, and I just end up rewriting the whole thing.
89
00:53:10,466 --> 00:53:56,682
Lisa Kaplan: It's the kind of thing where people actually want to know what you have to say, what you think. Even for an actual speechwriter, it takes so much time to develop the ability to write for a single person, because everybody is so different. Subject matters are so nuanced. Companies are complicated, wonderful, crazy beasts, and there's nobody better than you to talk about yours. So I completely agree with everything you're saying. Your stakeholders, whether that's your employees or your investors or your customers, are there because of you, and not necessarily you as an executive, but you as a company. That's why you have to tell that story. Because if you let other people tell it to their stakeholders, you have no moat.
90
00:53:56,826 --> 00:54:13,688
Lisa Kaplan: So the best thing you can do is build up that trust, that credibility, people understanding who you are. So that when you say, hey, that's not true, that's not who we are, we've been telling you who we are this whole time, that's when pointing out that something's inaccurate will work.
91
00:54:13,834 --> 00:54:53,440
Daniel Nestle: Yeah. You build that trust, and then you're working with Alethea, and Alethea says, hey, something's bubbling up. Well, guess what? You've got a lot of people who are already in your corner. You can have conversations about it. It's surprising: people will hear news and they may just reach out and say, hey, Dan, is that true? But if you've never had a conversation with that person, they'll never reach out to you. They'll just believe what fills the space that's open; whatever space is there will be filled by somebody else, just like you said. I can't believe we've already been talking for almost an hour. I had one more question for you, and then we can wind it up.
92
00:54:53,520 --> 00:55:29,078
Daniel Nestle: And it wouldn't be the trending communicator if I didn't mention these two letters: AI. How can we go on the offense here? I've talked with other guests about AI as a sort of accelerator for some bad actors here and there. But let's say on the mitigation side, how can we go on the offense, and does AI play a role in that?
93
00:55:29,094 --> 00:56:15,186
Lisa Kaplan: You can, but it has to be ethical, and it's not ethical, it's just an LLM war, if you're just churning out content. But if you catch a narrative that's out there that could pose risk to you or your company, it is perfectly appropriate to use AI or an LLM to play around with drafting a response. Make it the first draft; it's way faster. That's totally fine. What you also have to know, though, is that you are assuming responsibility for whatever goes out the door. Just ask that guy who filed an amicus brief citing a court case that didn't exist whether that's a good idea. So you do have to be careful about what it is that you're saying.
94
00:56:15,338 --> 00:56:39,018
Lisa Kaplan: So it's completely fine to use it to automate your workflows, but you have to be the expert in the loop, as we call it. Using AI to identify, using AI to draft: great. Creating an LLM war where you're spinning up bot networks: not great. That one's going to come back to bite you, because a company like ours is going to expose it. Save yourself the heartache. We're good at our jobs.
95
00:56:39,194 --> 00:57:26,950
Daniel Nestle: Yeah, and I'm glad you said be the expert in the loop. Ethan Mollick calls it the human in the loop. I love the idea of using AI as a testbed, using it to run scenarios, to understand a lot of different options available to you, to help you with your thinking. And generating content is exactly as you said: it's got to be done ethically. So we're right up on time, Lisa, and I understand that as an Inc. Magazine Female Founders 250 honoree, you have a thousand different meetings coming up today. So thank you so much for your time. Before you go, I think everybody out there wants to know how to find you. It's alethea.com. That's alethea.com, and it'll be in the show notes.
96
00:57:27,070 --> 00:57:43,530
Daniel Nestle: You can also find Lisa on LinkedIn. Just look for Lisa Kaplan. It's not an uncommon name, but look for Lisa Kaplan, the CEO of Alethea, and you'll find the right person. And on X, Twixter, it's Lisa C. Kaplan. Anyplace else that people can find you, Lisa?
97
00:57:43,650 --> 00:57:44,482
Lisa Kaplan: That's it.
98
00:57:44,626 --> 00:57:51,914
Daniel Nestle: That's it. Any last words that we can pass along to our listeners, who hopefully are going to reach out to you after this?
99
00:57:52,042 --> 00:58:17,180
Lisa Kaplan: No, just, thank you so much for the opportunity to come on. And I would say to listeners: if you're just starting to figure this out, you're not alone. There are a lot of organizations right now realizing that this is a true risk. There is a way to deal with it, and you don't have to go at it alone. I'd also encourage you to talk to your peers, because everybody's figuring this out together right now.
100
00:58:18,080 --> 00:58:27,856
Daniel Nestle: That's brilliant. Thank you so much. I'm sure our listeners are going to do just that. So thanks again for coming on, Lisa, and I hope to have you on again sometime soon.
101
00:58:28,008 --> 00:58:29,140
Lisa Kaplan: Thanks so much.
102
00:58:36,280 --> 00:59:00,570
Daniel Nestle: Thanks for taking the time to listen in on today's conversation. If you enjoyed it, please be sure to subscribe through the podcast player of your choice, share with your friends and colleagues, and leave me a review. Five stars would be preferred, but it's up to you. Do you have ideas for future guests, or do you want to be on the show? Let me know at dan@trendingcommunicator.com. Thanks again for listening to the trending communicator.