How job cuts by major social media companies are affecting the global fight against misinformation and disinformation
By Fatimah Quadri on February 18, 2023
JOB cuts by major social media companies are affecting many teams saddled with the task of curbing misinformation and disinformation on those platforms.
Major social media platforms have spent years since 2016 expanding their efforts to tackle misinformation, hiring policy experts and investing in technology to limit the reach of false narratives.
However, in recent months these platforms have cut the staff in charge of handling misinformation, the New York Times has reported.
The report stated that YouTube, owned by Google, fired two of its five misinformation experts, including the team’s manager, leaving behind one person for political misinformation and two for medical misinformation.
It also shed two of its five policy experts who work on hate speech and harassment issues. These experts have played critical roles in determining where the company should draw the line between acceptable and unacceptable content, and have advised executives on difficult content decisions.
The cuts reflect a trend across the industry that threatens to undo many of the safeguards social media platforms have put in place in recent years to curb disinformation.
Twitter, under its new owner, Elon Musk, has slashed its staff, while Meta, which owns Facebook, Instagram and WhatsApp, has shifted its focus and resources to the emerging world of the metaverse.
Faced with economic headwinds and political and legal pressure, these major platforms have shown signs that fighting false information online is no longer a priority, raising fears among experts who track the issue that it will further erode trust online.
Angelo Carusone, president of the liberal media watchdog Media Matters for America, said, “I wouldn’t say the war is over, but I think we’ve lost key battles.”
After years of effort, he described a mounting sense of fatigue in the struggle. “I do think we, as a society, have lost the appetite to keep battling. And that means we will lose the war,” he added.
The companies maintain that they remain diligent, but efforts to combat false and misleading information online have declined at a time when the problem of misinformation remains as destructive as ever, with a growing number of alternative sites competing for users.
Meta recently restored the accounts of former President Donald J. Trump on Facebook and Instagram, barely two years after suspending him for inciting violence ahead of the storming of the Capitol.
Mr. Musk has also invited Mr. Trump back to Twitter, one of the many steps he has taken to dismantle many of the platform’s previous policies. The team that oversaw trust and safety issues — including misinformation — was among those eliminated under Mr. Musk’s leadership.
Researchers at Media Matters found several examples of Covid-19 misinformation being spread on YouTube Shorts, the platform’s service for minute-long videos, in the past week. They also found an array of videos that espoused hateful, misogynistic and transphobic views. Some were from well-known creators.
“YouTube and the other social media platforms are inconsistent in their enforcement of their policies,” Kayla Gogarty, the deputy research director for Media Matters, said.
YouTube said it was always working to strike a balance between allowing free expression and protecting online and real-world communities from harm. Nicole Bell, a spokeswoman for the company, said that YouTube removed six videos flagged by Media Matters for violating its policies, and it terminated a channel for uploading content from a banned creator. But most of the more than two dozen videos flagged by Media Matters did not break the platform’s rules, she said.
Last year, the International Fact-Checking Network, representing more than 80 organizations, warned in a letter addressed to YouTube that the platform was “one of the major conduits of online disinformation and misinformation worldwide,” and that it was not addressing the problem.
The consequences of easing up on the fight against misinformation have become clear on Twitter. A report by two advocacy groups, the Network Contagion Research Institute and the Combat Antisemitism Movement, found a surge in antisemitic content as Mr. Musk took over.
Nora Benavidez, senior counsel at Free Press, an advocacy group for digital rights and accountability, said the experience at Twitter showed that moderating offensive content remained important for the viability of platforms, regardless of economic considerations.
“Content moderation is good for business, and it is good for democracy,” she said. “Companies are failing to do that because they seem to think they don’t have a big enough role to play, so they’re turning their back on it.”