OPINION: Censorship not a solution to misinformation
Removing harmful rhetoric promotes, reinforces beliefs
February 17, 2022
Spotify podcast host Joe Rogan and his spread of misinformation surrounding COVID-19 are just one piece of a difficult discussion in our society: how do we tackle misinformation on the internet?
Misinformation has circulated on the internet since its beginning. People have always been encouraged to believe only information from trusted sources online, as many sources promote false information to harm people, to make a profit or simply because they are misinformed themselves.
This problem has received an enormous amount of attention over the course of the pandemic and has deepened a major political divide. Many people have promoted false information, namely anti-vax rhetoric and unproven treatments, which has cost, and will continue to cost, lives.
Rogan’s most infamous moment of the pandemic was his promotion of ivermectin as a cure for COVID-19. Ivermectin has not been proven effective or safe for treating the disease, which makes encouraging its use extremely dangerous.
His comments have driven massive outrage among both the public and the medical community, with many calling for an end to Rogan’s show on Spotify.
Spotify says it will add content advisories to podcast episodes discussing COVID-19 and direct listeners to reliable sources, according to a company press release. However, many are not satisfied and want the podcaster off the platform entirely.
But Spotify Chief Executive Daniel Ek said in a memo provided to The New York Times that silencing Rogan is not the answer to the problem and that canceling voices is a “slippery slope.”
It is possible Ek only said this because Rogan’s podcast is a Spotify exclusive and removing the show would mean a loss of revenue. Nevertheless, it brings up a very important discussion about how to address misinformation online.
The controversy over Rogan and COVID-19 is not the first debate surrounding misinformation we have seen. Former President Donald Trump’s time in office directly inspired a harder push for fact-checking public officials and popular media figures.
Since then, misinformation about topics like climate change and election integrity has faced a major crackdown.
But there are different ways to combat misinformation. While much of it is publicly fact-checked or slapped with content advisories on the posts it came from, many advocate a harsher approach: de-platforming, banning or censoring the people who spread it.
I firmly believe it is our civic responsibility and right to call out lies and bring the truth to light. We live in an era where government officials and news media have manipulated the trust of the people; celebrities and other public figures have capitalized on that reality.
However, I also believe that stopping people from speaking their minds, even when those thoughts are rooted in falsehood, can be more harmful than productive.
Suppressing speech does not suppress someone’s thoughts. In other words, banning someone for misinformation will not make them change their beliefs.
This also has an impact on their following. Banning someone for misinformation is not going to help that person’s followers realize the truth. Instead, they are going to see it as a personal attack against the person they trust. This will only strengthen their beliefs.
Put simply, silencing the leader does not mean their followers are going to disperse. At the end of the day, people will continue to believe what they want to believe. It is often difficult to change that.
Gage Berz, a junior computer engineering major, said he believes misinformation is dangerous to the public. However, he believes silencing people is an ineffective strategy for removing misinformation.
People who spread misinformation will continue to do so no matter how many times they get removed, Berz said.
“While de-platforming may seemingly fix the issue by removing the misinformation from platforms, it creates minority groups on other fringe areas of the internet which will grow silently,” he said. “Alex Jones, a conspiracy theorist, was de-platformed on almost all social media platforms, but, even with that removal, he still has his own site pulling in millions of views.”
In a way, this is akin to the dark web. People who share illegal content did not stop doing so just because mainstream sites do not allow it. Instead, they found other ways to spread their ideas.
People who spread false information will do the same. They may not go to the extreme of posting on the dark web, but they will migrate to obscure parts of the internet where the crackdown on misinformation is nowhere near as strict.
Freshman psychology major Loki Hogman said he believes censorship rarely works as intended.
Like Berz, Hogman said silencing someone will only make that person angrier and encourage them to continue spreading false information, just somewhere else.
“Misinformation is really hard to combat because hooking people’s attention with fabricated stories to make someone sound correct will and has always happened with humans,” he said. “The main tool against misinformation is teaching people how to identify it, so they can stay away and move on to something reliable.”
The idea of teaching people how to identify misinformation brings up a very good point: sometimes, in order to reaffirm what is right, we have to be able to see what is wrong.
A key part of any argument is not only saying why you are right but also saying why the other side is wrong and disproving it with the evidence you have. Removing all false information and letting people access only correct information undermines that concept entirely.
Letting people access information, even if it is incorrect, allows them to criticize and disprove it. I think that can be more influential in getting people to change their viewpoints because you are disproving something rather than removing it entirely.
Stopping false information entirely is, after all, an unrealistic expectation. All we can do is recognize misinformation and call it out when we notice it.