Americans Want Transparency In Content Moderation Decisions On Social Media

On Thursday, Facebook removed a series of Trump campaign ads featuring a Nazi concentration camp symbol, which Facebook said violated its policy against organized hate. The move adds to the debate raging around content moderation and free speech on social media platforms, specifically over how much control we want companies like Facebook and Twitter to have. A joint poll from the John S. and James L. Knight Foundation and Gallup, published on Tuesday, might offer some insight into how social media companies and policymakers can move forward. The key takeaway? Platforms should offer more transparency about moderation decisions.

According to the poll, which drew on interviews conducted between December 3 and 15, 2019, and between March 17 and 30 of this year, Americans largely favor the creation of independent oversight boards to govern content policy on social media, similar to Facebook’s oversight board. Diversity and transparency are vital to those boards’ legitimacy.

“Board transparency, including publishing reports that explain decisions and areas where board members disagreed, was the top consideration [among polled Americans],” writes Daphne Keller, platform regulation director for the Stanford Cyber Policy Center, in the report. “That tells us a lot about both current frustration with platforms (people are sick of opaque content policy and takedown decisions that don’t seem to make sense) and about what it will take to give Facebook’s new board much-needed legitimacy in the eyes of the public.”

The poll also finds that around two-thirds of Americans agree the internet should be a place for free expression, yet 85% support the removal of false or misleading health information from social media, and 81% support the removal of intentionally misleading information about elections or other political issues. Misinformation and disinformation are both legal forms of speech protected by the First Amendment, which makes it difficult for the government to enact policy governing how platforms moderate content. And even though most Americans don’t trust big tech companies to police content on their own platforms, they trust the government to moderate content even less.

That hasn’t stopped the government from trying to regulate internet content, as we’ve seen with the Trump administration’s proposal to Congress to scale back Section 230, a federal law that shields internet companies from legal liability for harmful posts by third parties on their platforms. The law allows platforms to take down content as they see fit, such as when a post violates the platform’s policies. According to the Gallup/Knight poll, most Americans support keeping the law in place and say that individuals who post or share harmful content should be liable, not social media companies.

“For years, social media companies like Facebook and Twitter have been really reluctant to be the ‘arbiters of truth,’ but recent events like the coronavirus pandemic have brought with them a pretty profound shift in this posture, this reluctance to moderate certain kinds of speech online,” said John Sands, director of learning and impact at Knight Foundation. “These companies have exercised their discretion to remove coronavirus and health-related scams and disinformation and even to begin fact checking the president of the United States. These platforms have also turned into the leading venues for the exchange of viewpoints that we think of as fundamental to American democracy. So while Section 230 empowers these companies to determine what third party content to allow, what our poll finds is that there’s a great deal of skepticism about whether and how this broad authority should be used.”

The bundle of contradictions in the report reflects the fact that we’re stuck between a rock and a hard place when it comes to content moderation. Freedom of speech hits differently on the internet. A single false post touting the benefits of hydroxychloroquine in combating coronavirus can reach millions of people and cause real harm, and we saw how Russian Facebook ads divided and targeted US voters in the 2016 presidential election. But social media companies are under no legal obligation to monitor harmful speech, and governments can’t compel them to do so, or to offer remedies like counter speech, without running into First Amendment roadblocks. Nor can people simply choose another platform, because companies like Google, Twitter, and Facebook have a monopoly on our digital public squares. Being silenced on Twitter is like being cut off from the rest of the world.

“Every platform can have a right to its own editorial policy up until the point where they’re the only platform, or close enough to being the only platform, and then the rules should be different,” said Keller in a phone interview. “We do have a history of saying that if you’re the private owner of the speech infrastructure that everybody depends on, and you’re a monopoly or close to one, then the government can regulate you in these special ways. But getting to a point of doing that for today’s platforms would be a First Amendment fight of a generation, and I’m not sure that today’s Supreme Court would agree that it’s ok to do that.”

Ultimately, big tech will do what’s best for its bottom line. But a combination of pressures, from governments that control lucrative markets, from advertisers, and from public outcry, determines what companies decide to do when it comes to moderating speech. That’s why encouraging companies to be transparent, whether through an oversight board or simply by being more upfront about decisions to take down posts, is a step in the right direction.

“Anything that creates more transparency so that we can understand what works and what doesn’t work in terms of platform moderation policy, what backfires, and which policies look good on the surface but turn out to disproportionately harm people of certain racial backgrounds, for example,” said Keller. “Platforms having enough transparency to understand what’s going on would be a really good starting point.”
