Danielle Lee Tomson, research manager for election rumors at the University of Washington Center for an Informed Public, at the GeekWire offices for a recent podcast recording. (GeekWire Photo / Todd Bishop)
For all the concern about deepfakes fooling the electorate, AI is having another impact on the 2024 campaign. In some cases, like heroic AI-generated images of the candidates, it’s less about veracity, and more about the vibe.
“When AI is used in ways that are illegal, you can face consequences,” explains Danielle Lee Tomson, research manager for election rumors at the University of Washington Center for an Informed Public. “But when AI is used to create an ambience online, you could say, ‘Oh yeah, Donald Trump is not a Steelers linebacker,’ but you feel something — and you can’t fact-check a feeling.”
That’s one of the insights on this week’s episode of the GeekWire Podcast. With just days to go before the Nov. 5 election, guest host Ross Reynolds speaks with Tomson about AI, social media, and trends in the spread of rumors online.
Related links and stories:
Substack: Election Rumor Research @ Center for an Informed Public
New York Times: As Election Looms, Disinformation ‘Has Never Been Worse’
Washington Post: Don’t say ‘vote’: How Instagram hides your political posts
Listen to the episode below, and continue reading for highlights, edited for context and clarity. Subscribe in Apple Podcasts, Spotify, or wherever you listen.
The work of the Center for an Informed Public (CIP): “The Center for an Informed Public was founded by professors in information studies, in law, in science of science studies, to try to inform the public about increasingly specialized knowledge. My team of election rumor researchers is just one of many at the center. We have an amazing group that runs a misinformation day. It’s like a media literacy project. We have other teams that are demystifying psychology research and media literacy. So it’s quite a multi-functional team, and part of our work is communicating research to the public in our various verticals so that they understand it.”
CIP’s election rumors research: “Rumors can be true, they can be false, but most of the time they’re somewhere in between. … Rumoring is part of this natural process of trying to make sense of reality. We do our best to point out, ‘OK, this little piece of truth is here, but it’s being conflated and framed into this larger phenomenon that just isn’t true.’ We try to acknowledge when there are pieces of truth that are being twisted or reframed or repackaged in such a way that they become untrue. We have to think of mis- and disinformation a little more holistically in that sense.”
The state of fact-checking, trust and safety on social media: “A lot of the platforms have gone through larger layoffs in the past couple years, and a lot of folks on trust and safety teams have been laid off. … We’ve seen this dynamic across a lot of social media platforms that are not sharing as much political content. … [They’re] not going to show you these political memes. Even though they’ll keep you on the app for a long time, because they’ll outrage you or get you excited, it’s maybe not going to be good for the ROI of that company.”
AI’s unexpected impact on the campaign: “We were talking about deepfakes in 2022 and 2020. We’ve been talking about the role of social media now for over a decade. And I think that anytime you have a new technology, there’s a fear of how it may be used.
“Anytime something is explicitly false and explicitly untrue, like an AI-generated image of Donald Trump wearing a Steelers jersey after he went to a Steelers game last Sunday, OK, that’s clearly false, but the veracity here isn’t what’s important. It’s the vibe. It’s this feeling of kinship, this kind of imaginative play on the internet.
“There are instances where there were robocalls using a candidate’s voice telling people to vote on the wrong day in the wrong place. That was prosecuted. So when AI is used in ways that are illegal, you can face consequences. But when AI is used to create an ambience online, you could say, ‘Oh yeah, Donald Trump is not a Steelers linebacker,’ but you feel something — and you can’t fact-check a feeling. So in many ways, facts don’t care about your feelings, but also, feelings don’t care about your facts.”
It doesn’t end with the Nov. 5 election: “I keep two countdowns on my office whiteboard, one till Election Day and one till certification, because I believe there will definitely be a lot more rumoring and sense-making across the political spectrum of what’s going on, what happened, what’s true. So we’re definitely gearing up for monitoring conversations related to election processes and procedures well past Election Day.”
Listen to the full conversation with Danielle Lee Tomson above, or subscribe to GeekWire in Apple Podcasts, Spotify, or wherever you listen.
Hosted by Ross Reynolds. Edited by Curt Milton. Music by Daniel L.K. Caldwell.