In 2025, the global fact-checking landscape faced mounting pressure as AI-generated and deepfake content became more widespread and increasingly sophisticated, often making manipulation harder to detect than in previous years.
This period also coincided with funding uncertainty for several fact-checking organisations and growing debate over platform accountability. In the United States, some major technology companies began shifting away from third-party fact-checking programmes toward community-driven models, such as Community Notes, raising questions about the effectiveness and consistency of these approaches.
Later in the year, the sector faced additional strain following a U.S. policy shift that restricted visas for individuals who had worked in fact-checking and content moderation. Together, these developments highlighted the increasingly hostile and unstable environment in which fact-checkers now operate.
Against this backdrop, several trends are expected to shape the fact-checking landscape in Nigeria in 2026, particularly around elections and information integrity. As the country moves closer to the next general elections, the quality and credibility of information will play a decisive role in shaping public trust, voter behaviour, and democratic participation.
Recent election cycles have shown that misinformation is no longer confined to polling periods; it emerges early, spreads rapidly, and exploits gaps in digital governance, media literacy, and platform accountability.
In 2026, these dynamics are likely to intensify, presenting new challenges for fact-checkers, journalists, and election stakeholders.
1. Early, prolonged election misinformation
Misinformation in 2026 is expected to take the form of prolonged narrative-building, as seen subtly before the 2023 elections. Political actors may spend months casting doubt on electoral institutions, questioning voter registers, or framing outcomes as predetermined or illegitimate. Once established, these narratives are difficult to reverse and often resurface during key moments, reducing the effectiveness of last-minute fact-checks.
2. More local and convincing AI political fakes
The growing accessibility of generative AI tools means manipulated political content will increasingly reflect Nigerian realities. Deepfake videos, cloned voice notes, and AI-generated images may feature recognisable public figures, local accents, and culturally specific messaging, making it harder for audiences to detect falsehoods. This trend will challenge traditional verification approaches that rely heavily on visual cues or known manipulation patterns.
3. Closed platforms shape political narratives
Encrypted messaging platforms such as WhatsApp and Telegram are expected to remain central to political mobilisation and misinformation in 2026. Messages circulated in closed groups, often framed as “insider information” or community alerts, will be difficult to monitor and correct. By the time these narratives reach public platforms, they may already have shaped opinions at the grassroots level.
4. Rising distrust of independent media
As election-related verification intensifies, fact-checkers and journalists may face growing hostility from political actors and their supporters. Corrections are increasingly at risk of being dismissed as partisan, while verification organisations may be accused of serving foreign or political interests.
Based on past election cycles, there is also a heightened risk of online harassment, legal intimidation, or regulatory pressure being used to discourage scrutiny of political claims ahead of the 2027 elections.
5. Unchecked narratives in local languages
Misinformation in local languages and dialects is expected to continue outpacing fact-checking efforts, particularly in regions where political communication relies heavily on oral, cultural, or religious messaging. At the same time, major technology platforms may depend largely on automated moderation systems, with limited local context and delayed responses to flagged election-related content. These gaps could allow harmful narratives to spread largely unchecked during critical moments.
Together, these trends suggest that the challenge for fact-checkers in 2026 is likely to extend beyond debunking individual claims. It will increasingly involve tracking long-term narratives, responding to more sophisticated AI-generated content, and finding effective ways to reach audiences in private and local-language spaces.
Strengthening information integrity ahead of Nigeria’s next elections will require faster verification, greater platform accountability, legal protections for independent media, and sustained investment in media literacy and pre-bunking efforts.
Seasoned fact-checker and researcher Fatimah Quadri has written numerous fact-checks, explainers, and media literacy pieces for The FactCheckHub in an effort to combat information disorder. She can be reached at sunmibola_q on X or fquadri@icirnigeria.org.


