A Misinformation Expert’s Top Tips to Slow the Spread of Misleading Information in the 2024 Election

In every election, conspiracy theories, false accusations, half-truths and outright lies from candidates jostling for elected office can run amok.

The 2024 election in the United States is no exception as candidates vie for the Oval Office.

Leticia Bode is a professor of Communication, Culture and Technology in the Graduate School of Arts & Sciences and the research director for the Knight-Georgetown Institute.

“Misinformation is actually really consistent, especially when it comes to elections,” said Leticia Bode, professor of Communication, Culture and Technology in the Graduate School of Arts & Sciences and the research director for the Knight-Georgetown Institute. “As long as people are trying to convince you of something, they’re going to sometimes use things that are less than true to convince you of what they want in the world.”

With the rise of generative artificial intelligence in the last year, election observers are bracing for what’s to come as the November 2024 election kicks into gear with the start of primary season.

We asked Bode to explain how you can help stop the flow of misinformation as well as early trends to watch out for in the 2024 election cycle and what a doomsday scenario with AI looks like.

Ask a Professor: Leticia Bode on Misinformation, Technology and the 2024 Election

How can you spot misinformation online? What are some warning signs to pay attention to?

The biggest red flag for me is when something evokes an emotion in you. That doesn’t necessarily mean that it’s misinformation, but it signals that somebody’s trying to manipulate your emotions, and that’s something you should be aware of. When you recognize it, remember to take a deep breath and look into it a little bit before you believe it.

How is the threat of election misinformation different today than it was in recent elections?

Misinformation is actually really consistent, especially when it comes to elections. In one study that looked at election misinformation in 16 different Latin American countries, researchers found that basically all of the election tropes were exactly the same. When they looked at the 2020 U.S. election, they found the exact same kinds of misinformation. So it tends to be things that make you skeptical of the process or the result, and that ends up being consistent across elections.

What makes misinformation different today is how prepared people are for those predictable tropes. In the 2022 elections, I think the media did a much better job preparing people for what they should expect with regard to electoral processes. For example, the media told voters it’s going to take a while to count votes and explained how different states count absentee ballots differently, which can shift the perception of which candidate is ahead depending on when votes are counted. I think there’s a lot of what’s called “pre-bunking” that can be really effective in preparing people to identify possible misinformation.

Leticia Bode speaks at GlobalFact 9, a fact-checking summit in Oslo, Norway, in 2022. Photo credit: Angela Trajanoski.

What role does technology play in misinformation today?

Technology is often a mixed bag. It doesn’t necessarily make anything better or worse, but it can make things more complicated by amplifying the things that already exist in the world. So misinformation is not new. As long as people are trying to convince you of something, they’re going to sometimes use things that are less than true to convince you of what they want in the world.

I’m not sure that technology has necessarily made misinformation more prevalent, but I think it’s made it different in terms of the forms it can take. And we’re seeing that especially right now with large language models and AI that generates images really effectively.

What are some early trends you’re seeing in the misinformation landscape in the 2024 election cycle?

It’s still really early in the election cycle, and I think a lot of the predictable misinformation that we see in elections tends to come later when we get closer to actual electoral contests. Having said that, there’s definitely misinformation that’s being spread by people who are competing for the presidency, so we’re seeing a lot of elite candidate misinformation about each other. And, again, that’s nothing new.

How do you correct someone online without making it personal or “canceling” them?

Think about it as how you would want to be corrected. If you accidentally shared something that wasn’t true, how would you want someone to approach you? You probably want whoever is correcting you to be kind, understanding and empathetic. 

You’d want them to say, “I understand this is complicated. I was confused by it too. Here are some sources that I found that seem to say the opposite of what you’re saying. I’m happy to have a conversation with you about it.”

Make it more of a dialogue as opposed to accusing others, and that will make people more likely to be open to having this conversation.

If you accidentally shared something that wasn’t true, how would you want someone to approach you? You probably want whoever is correcting you to be kind, understanding and empathetic.

Leticia Bode

What role do the government and the media have in stemming the spread of misinformation?

There is a lot that both government and media companies can do and have done over the years. We as a society have to decide how much we value truth versus how much we value other things like free speech. There are a lot of trade-offs that have to be made, and if we’re only prioritizing true information being shared, that means a lot of people are going to be marginalized and their voices are not going to be heard in the world. 

We’ve lived through that version of reality — during the broadcast television era — and it has a certain set of trade-offs with it. So we have to decide as a society what we’re willing to accept.

Is reducing misinformation the responsibility of the user or social media platforms?

I take the opinion that everybody has a responsibility in this. You can think of the platform as the first line of defense, while the user is the last line of defense against the misinformation that gets through.

Having said that, platforms must also have a difficult conversation about where they want to draw the line. We don’t necessarily want to rely on large corporations alone to decide what is true and what we are allowed to talk about. I don’t think anybody’s fully comfortable with giving them that kind of power.

Do you think misinformation will continue to be pervasive in future election cycles? Or do you have a more hopeful outlook?

Most of the time when I’m asked this sort of question, I say misinformation is fairly stable. The thing that gives me pause is ChatGPT and the image generator DALL-E because I think there will be a critical shift before people learn how to recognize misinformation in those new formats. There are a bunch of elections happening around the world in 2024 — some people are calling it the “World Cup of elections.” So I do worry about the 2024 election cycle because it could get critically bad if the wrong kind of misinformation were disseminated at the wrong time.

With artificial intelligence, what’s the doomsday scenario?

In one simulation I was a part of, the scenario is that it’s only days before the election and effective debunking can’t be disseminated in time. Multiple AI-generated videos appear showing the exact same scene from different angles, which is exactly how we tell people to verify information — check to see if there are multiple people showing the same incident. 

And what would it show? I don’t even want to speculate about what that would be — something terrible about one of the candidates that would completely change people’s willingness to vote for that person.

We have close elections a lot of times in this country. It doesn’t take that many votes to change the outcome of the election. So I think that there’s a real scenario where something like that could happen.