Facebook is still a cesspool of pandemic hoaxes, and it's killing people
Earlier in the year, Media Matters reported on dozens of Facebook groups promoting the newest hoax cure in the COVID-19 pandemic: ivermectin. Primarily used to fight parasites in livestock, ivermectin is a popular and relatively safe dewormer with many species-specific formulations, though it should never, under any circumstances, be given to certain dog breeds. It is also used as a dewormer in humans, if you are unfortunate enough to need one.
What it does not do is cure COVID-19, because a virus is not a worm that burrows into your intestinal lining. Research at one point suggested ivermectin might kill the virus in a petri dish, but only at concentrations far beyond any safe human dose, which isn’t saying much. A cinder block will also kill COVID-19 in a petri dish, if you douse it in gasoline, light a match to it, and throw it with enough force. So far, home improvement centers haven’t reported a new run on cinder blocks, but give it time. As the pandemic drags on, everything from tanning beds to nebulized ketchup will find momentary purchase as the newest cure. Cinder blocks will have to wait their turn.
A month later, The New York Times revisited Media Matters’ findings for an update, and the update is what you would expect. Facebook has “taken down a handful of the groups,” but the rest continue to thrive. Facebook keeps giving the usual public responses, vowing that it is for sure cracking down on this sort of rampant and dangerous misinformation, but the situation remains what it is. Reporters and researchers can easily find groups promoting ivermectin and coaching members on how to dose themselves with the livestock versions, can report them to Facebook, and can watch as the company removes a bare handful and ignores the rest.
A not-particularly-new wrinkle is the effort by anti-vaccine and pro-miracle-cure group administrators to evade Facebook’s automated moderation by encouraging members to write in code: intentional misspellings and in-group abbreviations so that words like “vaccine,” “masks,” and “ivermectin” do not trigger a Facebook response. Note that Media Matters was able to find a large number of such posts in its own research. It is not possible that Facebook can’t find what outside researchers can so easily find and catalog. It’s just not.
Then there’s the newest Facebook-promoted pandemic hoax: anti-vaccine advocates warning those who do come down with severe COVID-19 infections not to go to hospitals for treatment, allegedly because hospitals won’t let you have your bleach, ivermectin, or Betadine miracle cures and may try to kill you outright in order to boost pandemic death rates.
The most notable side effect of these Facebook groups is, of course, death. The internet is currently awash in examples of anti-vaccine, pro-miracle-hoax believers who eagerly helped spread one or more of the hoaxes, only to end up dead weeks or months later, victims of their own misinformation. But the choice of a livestock medication as the newest fake miracle cure is also wreaking havoc on farms, in horse barns, and among average pet owners: the rampant Facebook misinformation has fueled a run on the animal versions of the drug, leaving them in short supply.
It’s not just going to be humans dying from this particular conspiracy theory. It may have ramifications for both your pets and the nation’s food supply.
I am not a gazillion-dollar tech company premised on monetizing human paranoias, so there are a lot of details about this situation that elude me. Some facts, however, are indisputable. Reporters have regularly been able to find the sort of content Facebook claims not to allow, and have done so with little to no resources at their disposal. Facebook not only has the cash to finance entire teams for the job, but can also build technical tools far more advanced than journalism’s “look around for a day or two and see what you find.”
Even if Facebook were concerned only with maintaining profit margins by understaffing its worldwide content moderation teams to the point that they are less effective than a single Media Matters researcher, it is still difficult to make sense of the company’s seemingly incompetent response here. Pandemic anti-vaccine claims and other misinformation spreading on Facebook are getting people killed, disrupting markets, and painting the company as a malignant societal force.
From a purely profit-and-loss standpoint, an exception to Facebook’s indifference toward the societal damage it causes would seem to be in order. Send in a temporary task force with the resources of not just one random journalist scanning the site, but ten or twenty. Mop up the groups promoting demonstrably dangerous medical misinformation. Make a big deal of it. It would cost the company approximately nothing.
Facebook likely spends as much on individual corporate parties as all of its critics have spent in two years of looking. When it comes to finding specific disinformation about a specific medical hoax during a specific deadly pandemic currently running roughshod over the population, it could handle this one. Maybe not perfectly, but at least well enough that a single journalist could no longer clip dozens of dangerous examples in the span of a single afternoon.
If the company believes misinformation resulting in sickness and death isn’t worth any more effort than it’s giving, it’s reasonable for the public to come to its own conclusions as to why. Company leadership’s eagerness to promote hoax-adjacent conservatism in other contexts may explain a lot about its willingness to tolerate deaths during this one. The company has tried its level best to flatter Trumpian conservatism specifically (whether out of camaraderie or fear remains unclear) and could well feel that cracking down on pandemic hoaxes spread primarily by Trump fans would so upset the hoaxes’ willing promoters that abiding the deaths is the cheaper option.
We can speculate all we like, but the fact of the matter remains: journalists can find dangerous medical misinformation about vaccines, “miracle” drugs, and other potentially deadly hokum without much looking … and Facebook, for its part, can’t.
That ain’t a limitation of their technology, and it’s not a function of their size. It means they’ve chosen to spend less effort removing that deadly misinformation than journalists from Media Matters, The New York Times, and other outlets spend finding it. That is intentional.