Facebook Misinformation: A People Problem Not Solvable By Technology
There used to be a saying: “A lie can get around the world before the truth can even get its shoes on.” The attribution of that saying is sketchy, just like plenty of Facebook misinformation, misattributed quotes, and falsehoods both accidental and intentional. And that is before you factor in the emotions of cultural debate, the passions of politics, and the profits of the lords of information, who are among the richest and most influential human beings ever.
Prominent among the many ongoing issues Covid-19 has seemed to amplify is Facebook misinformation and social media’s ability to spread a lie around the world before the truth can remember its password. This is not new, of course: Facebook misinformation had a starring role in the 2020 elections, the Trump years, and pretty much every breaking political or cultural story since Facebook became a thing. Media coverage of Facebook misinformation has become a constant of the discourse.
The Washington Post has a story of successfully fighting misinformation the old fashioned way, sometimes despite Facebook’s best efforts:
Amid the online scare stories and anti-vaccine memes, an army of local influencers and everyday users is waging a grass-roots campaign on Facebook, Reddit and other platforms to gently win over the vaccine skeptical. They’re spending hours moderating forums, responding to comments, linking to research studies, and sharing tips on how to talk to fearful family members.
“It feels a lot like covid is something that is completely out of control and there is nothing we can do, like it’s this out-of-control wildfire, and I’m just one person with a little hose,” said Kate Bilowitz, an Oakland, Calif.-based mom who works for a real estate company and co-founded Vaccine Talk. “But when people reach out to us, it feels like it’s making a little bit of a difference.”
Their work exemplifies Facebook’s stated goal to “bring the world closer together.” But Bilowitz and others who run similar forums say that the interventions made by technology companies are often counterproductive, and that software algorithms frequently delete valuable conversations mistaken for misinformation.
“Facebook is attempting to shut down misinformation by shutting down all conversation entirely,” she said. “I strongly believe that civil, evidence-based discussion works, and Facebook’s policies make it extremely difficult for that to happen.”
Facebook leans on software and its army of 15,000 human moderators to detect covid misinformation. Last month, it said it had taken down more than 20 million posts since the start of the pandemic. But it routinely misses new memes and conspiracy theories, while at times scrubbing legitimate information. At the same time, its news feed algorithms boost posts that get the most engagement, often helping the most sensational claims go viral.
Facebook and YouTube spent a year fighting covid misinformation. It’s still spreading.
Twitter and Google-owned YouTube have employed similar strategies, using algorithms to parse text posts and listen to videos, sometimes banning them immediately and other times flagging them for a human to review. The companies have peppered their sites with links to official coronavirus information from the Centers for Disease Control and Prevention and World Health Organization, as well as pushing articles from mainstream news organizations to the top of people’s feeds and search results.
Still, misinformation remains a big enough problem online that last month, U.S. Surgeon General Vivek H. Murthy called it “a serious threat to public health” and President Biden accused Facebook of “killing people.”
President Biden’s characteristic hyperbole aside, misinformation has long been the enemy of sound medical advice. By extension, that same misinformation is the adversary of public health in general. This is nothing new; it is just that in the modern world, technology has replaced word-of-mouth gossip and made things like Facebook misinformation nearly omnipresent. The “information superhighway” of 90s parlance was inevitably going to be used for good and bad information alike as the internet age took hold. The twin advancements of social media and smartphones as the millennium got rolling rendered the interstate analogy archaic. With the entirety of human knowledge now available in the palm of the hand to virtually anyone, anywhere, information became more like the all-encompassing Force from Star Wars: all around us, with every living thing.
Which means misinformation was also going to be all around us, in every device, in every hand. Big Tech leaders like the Zuckerbergs of the world touted the ability of algorithms and technology to manage such things. But the issues of Facebook misinformation are not just math problems. They are people problems. If the social media era has taught us anything, it is the limits of technology and algorithms in keeping up with the complexities of human behavior, especially something as nuanced as conversation and debate. The interactions of human relationships and emotions do not translate well into the rigid calculations tech companies need to make complex social media algorithms work, both in practice and for profit.
Crisis reveals things, and whether it is Covid-19, elections, or general-purpose conspiracy theories, Facebook misinformation and the broader problem of social media falsehood have revealed that online misinformation is never going to be solved by a social media company. It is impossible, since the social media business, like all businesses, is a people business. There is no way to chase the needle of pushing a trending item to maximize shares and views while fact-checking it in real time. There is no way to herd all of humanity into thinking the same thing at the same time, especially when your business model is to fast-track and monetize groupthink. As Facebook is finding out the hard way, controlling speech and thought, regardless of the motivation or reason for doing so, is very much like trying to sculpt Jell-O into a staircase.
In 2017, Facebook’s stated mission was “to give people the power to share and make the world more open and connected.” In 2019, Facebook revised it: “to give people the power to build community and bring the world closer together.” The ongoing problems with Facebook misinformation show that while you can give people the power to build community, they are going to build the community they want to build. They might have the power to bring the world closer together, but usually that means bringing the world as they already see it closer together and more connected, not just because of human nature, but because that’s how Facebook designed their product to work.
The answer to misinformation, insofar as there is one for a problem that has existed for the entirety of human history, is not going to be a technological fix. It is, always has been, and always will be a people problem, addressable only by people. The small victories of people of good faith finding each other and reasoning together need to be praised and showcased as a contrast to the doom scrolling social media can easily become. It is an issue that Facebook designed their product to foster the latter while tending to throw out the good of the former with the dirty bathwater of misinformation. But misinformation is a people problem, and the good news is that people can largely solve it at the lowest level without complicated algorithms, big tech regulatory hearings, or even tsk-tsking from talking heads and elected officials. It just takes a little effort to discern better, to trust but verify information, and to actually talk to people like they are people, including online, instead of treating them as targets to be hit with a smash of the “post” button.
In the modern world, and despite the caterwauling over Facebook misinformation, it is entirely in everyone’s power to send truth around the world just as fast as the lie via social media. When we do not, there is no one to blame but ourselves.
The problem is that I think a lot of the misinformation people are acting in bad faith, and I’m rather angry and tired of constantly being told that the only way to solve the problem is endless assumptions of good faith, followed by hugs and mugs of cocoa. They might as well state “heads I win, tails you lose.” How many of people’s misconceptions are sincere? How much of it is bullshit thrown up to prevent people from doing the right thing? How much is shitposting and trolling?
“the people I disagree with are all acting in bad faith” is cope
Abject misinformation is not “disagreement”.
There is such a thing as objective reality, and in objective reality, no amount of horse dewormer will cure COVID-19, though it can, in fact, kill you.
But you know that. Weird how you defend it though.
Comment in modReport
Released. Really, how hard is it to remember that if you include the vowel in sh*t, the comment will go into moderation?
Doom scrolling – We have become so used to the constant barrage of bad news the media insists on delivering that it’s become entertainment for us.
Even NPR does it. I have to be careful when the kid is in the car because of the constant drumbeat of COVID, or Afghanistan, or shootings in distant places, or violent protests, etc. Perhaps our collective attention span is so short because we get so blasted with a given topic that we almost immediately begin to tune it out and look for something else.
There is a reason we are featuring two pieces of writing about Jeopardy today and tomorrow. Going to openly admit it’s been a struggle the last few weeks keeping a balance. The Afghanistan one is the really tough one because, between the current doom scrolling and knowing what is coming next week, you have to just be like “k, can’t stop it, walk away for a bit.” Also why I Tweet and write about food so much; it’s mostly for me, because to do what we do you have to intake so much news to keep the output up, and I need that food and fun and just normal stuff to feel human again sometimes.
This is why I read a lot of science news.