Misinformation has been identified as the most pressing short-term global risk, according to the Global Risks Report 2024 by the World Economic Forum. As digital information grows more pervasive, business leaders and educators face a critical challenge: How can they help individuals discern fact from fiction in a world awash with falsehoods?
In my recent video interview with David Benigson, CEO of Signal AI, we explored how his company’s AI technology aids organizations in navigating the misinformation minefield. Sander van der Linden, a psychologist and misinformation expert at the University of Cambridge, provided insights into psychological resilience. Adding an educational perspective, OECD analyst Miyako Ikeda shed light on the role of media literacy and critical thinking in equipping the next generation to combat misinformation. Together, these voices offer actionable solutions to one of today’s most urgent problems.
Harnessing AI to Distinguish Fact from Fiction
Businesses today operate amid a flood of data that creates fertile ground for misinformation. “The sheer volume of information is overwhelming, and that’s where misinformation thrives,” says David Benigson, CEO of Signal AI. His company pairs discriminative AI, which validates sources, with generative AI, which synthesizes insights, enabling organizations to filter out noise and access credible data. “We’re focused on distilling information down to what really matters,” he explains.
Benigson emphasizes the importance of balancing AI and human expertise—a model he calls “augmented intelligence.” By keeping human analysts in the loop, Signal AI ensures contextual accuracy and critical thinking. “AI can identify patterns, but human expertise provides the necessary context,” he notes. This approach enables organizations to confidently identify emerging risks, spot misinformation, and make informed decisions in real time.
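Signal AI’s actual pipeline is proprietary, but the pattern Benigson describes maps onto a simple, widely used architecture: a discriminative model scores the credibility of each source, a generative model summarizes only what passes the filter, and anything the classifier is unsure about is routed to a human analyst. The sketch below is a minimal illustration of that pattern, not Signal AI’s implementation; the model names, the CREDIBLE label, and the confidence threshold are all assumptions.

```python
# Minimal sketch of an "augmented intelligence" triage step: a discriminative
# classifier scores source credibility, a generative model summarizes what
# passes the filter, and low-confidence items go to a human analyst.
# Model names, labels, and the threshold are illustrative assumptions.
from dataclasses import dataclass
from transformers import pipeline

# Hypothetical classifier fine-tuned to label text as CREDIBLE / NOT_CREDIBLE.
credibility_clf = pipeline("text-classification", model="your-org/source-credibility")
# Off-the-shelf summarization model used purely as a stand-in for the generative step.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")


@dataclass
class Article:
    source: str
    text: str


def triage(article: Article, threshold: float = 0.8):
    """Return ("auto", summary) for credible items, ("human_review", None) otherwise."""
    # truncation=True keeps long articles within the classifier's input limit.
    verdict = credibility_clf(article.text, truncation=True)[0]
    credible = verdict["label"] == "CREDIBLE" and verdict["score"] >= threshold
    if not credible:
        # Keep the human in the loop: analysts review anything the model can't vouch for.
        return "human_review", None
    summary = summarizer(article.text, max_length=80, min_length=20, truncation=True)[0]["summary_text"]
    return "auto", summary
```

The key design choice is the routing step: the generative model never summarizes content the classifier could not vouch for, and borderline cases land with a person rather than being silently discarded.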
Building Psychological Resilience Against Manipulation
Misinformation thrives on cognitive biases and social influence, making psychological resilience a critical skill. Sander van der Linden explains how “psychological inoculation” helps individuals resist manipulation by exposing them to weakened doses of misinformation techniques, such as polarization and emotional manipulation. “These methods equip people to recognize and counteract misinformation tactics,” he says.
Van der Linden’s Bad News game exemplifies this approach, simulating a social media feed to teach users to spot manipulation tactics in real time. These tools have proven especially effective among young people, who are highly active on social media platforms. “Inoculating students against cognitive biases like the ‘illusory truth effect’—where repeated falsehoods start to feel true—is vital for building resilience,” he adds.
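Bad News itself is a rich, narrative-driven game, but the prebunking loop it relies on is simple enough to sketch: show a fabricated post, ask the player to name the manipulation technique, then reveal and explain the trick. The toy example below illustrates only that loop; the posts, technique labels, and scoring are invented for illustration and are not content from Bad News.

```python
# Toy illustration of a prebunking loop: present a fabricated post, ask the
# player to identify the manipulation technique, then explain the trick.
# All rounds and labels here are invented examples.
ROUNDS = [
    {
        "post": "BREAKING!!! They don't want you to see this shocking truth!",
        "technique": "emotional manipulation",
        "explanation": "Outrage and urgency are used to bypass careful reading.",
    },
    {
        "post": "Real patriots already know which side of this issue to be on.",
        "technique": "polarization",
        "explanation": "The post frames the topic as 'us vs. them' to deepen division.",
    },
]

TECHNIQUES = sorted({r["technique"] for r in ROUNDS})


def play():
    score = 0
    for r in ROUNDS:
        print("\nPost:", r["post"])
        for i, technique in enumerate(TECHNIQUES, 1):
            print(f"  {i}. {technique}")
        guess = TECHNIQUES[int(input("Which technique is this? ")) - 1]
        if guess == r["technique"]:
            score += 1
            print("Correct.", r["explanation"])
        else:
            print(f"It was {r['technique']}.", r["explanation"])
    print(f"\nYou spotted {score}/{len(ROUNDS)} manipulation attempts.")


if __name__ == "__main__":
    play()
```

The feedback step is where the inoculation happens: naming the technique and seeing it explained makes the same trick easier to recognize the next time it appears in a real feed.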
Van der Linden also highlights the need to address conspiracy theories with specificity. Using the CONSPIRE framework, students can learn to deconstruct conspiratorial thinking by identifying patterns such as incoherence and immunity to evidence. “Once you see the trick, you’re less likely to fall for it in the future,” he explains.
Fostering Media Literacy Through Education
As digital consumption grows, media literacy has become a cornerstone of education systems worldwide. Miyako Ikeda, an OECD analyst, underscores the urgency: “Fifteen-year-olds now spend an average of 35 hours per week online, making it essential to teach strategies to distinguish fact from opinion and detect biased content.”
The OECD’s Programme for International Student Assessment (PISA) plays a pivotal role in this effort. In 2025, PISA will include new assessments focused on students’ ability to evaluate the credibility of science-related content. This builds on PISA 2018 data, which showed that education systems where more students were taught to detect biased information performed better at distinguishing fact from opinion.
“Teaching critical thinking and media literacy must start early,” Ikeda explains. She points to inquiry-based teaching methods as a promising approach, where students actively engage in evaluating sources and questioning the validity of information. By embedding these practices into curricula, schools can prepare students to navigate the digital information landscape with confidence and discernment.
Ikeda also stresses the need to balance realism with optimism. “Educators must ensure that students are informed without becoming despondent about the challenges of misinformation,” she notes, emphasizing the importance of fostering both awareness and proactive problem-solving skills.
Integrating AI, Psychology, and Education for a Resilient Future
The fight against misinformation demands a multifaceted approach that combines AI technology, psychological resilience, and media literacy. Benigson’s AI-driven solutions enable organizations to filter reliable data and mitigate risks. Van der Linden’s inoculation techniques equip individuals with the tools to recognize and resist manipulation. Ikeda’s advocacy for media literacy highlights the importance of teaching critical evaluation skills early, building a foundation of resilience for future generations.
“By recognizing cognitive biases and social influences,” van der Linden explains, “students and employees alike become empowered to question the information they consume.” Ikeda adds, “Education systems must rise to the challenge of preparing students to critically evaluate information in a rapidly evolving digital world.”
Actionable Steps for Business Leaders and Educators
Adopt AI-Driven Tools for Reliable Insights: Platforms like Signal AI can help businesses filter misinformation, assess public sentiment, and identify emerging risks.
Introduce Psychological Inoculation Techniques: Prebunking games such as Bad News provide practical, engaging ways to train individuals to spot manipulation tactics.
Implement Media Literacy in Educational Curricula: Schools can build critical thinking skills through inquiry-based approaches, as supported by OECD data.
Promote a Human-AI Partnership for Decision Making: Signal AI’s “augmented intelligence” model balances AI insights with human expertise, fostering smarter decision-making.
Expand Media Literacy Programs: Continuous workshops and digital literacy modules equip students and employees with the skills to critically assess information and detect biases.
Building a Future of Resilience
In an age where misinformation undermines societal cohesion and decision-making, resilience is critical for the stability of organizations and communities. As Saadia Zahidi, Managing Director of the World Economic Forum, warns, “An unstable global order characterized by polarizing narratives and insecurity, worsening impacts of extreme weather, and economic uncertainty are causing accelerating risks.”
By combining technology, psychology, and education, business leaders and educators can equip individuals with the tools to combat misinformation and build a more informed, resilient society. “It’s not just about faster decisions, but smarter ones,” Benigson emphasizes. With these strategies, we can move toward a future where misinformation no longer holds sway.