Facebook eyes “Election Commission” in possible bid to shed political scrutiny

Global implications —

Zuckerberg is reportedly unhappy with being seen as the decider.

Tim De Chant

Guests stand next to a Facebook Elections USA sign in the Facebook Lounge ahead of the first Republican presidential debate at Quicken Loans Arena in Cleveland, Ohio, U.S., on Thursday, Aug. 6, 2015.

Facebook may finally be acknowledging that its handling of elections around the world has been less than stellar. And this time, the company’s response could amount to more than just another apology from Mark Zuckerberg.

The social media company is considering creating an “election commission” that would guide it on election-related issues around the world, according to a report in The New York Times. The commission would advise Facebook on everything from disinformation to political advertising, and if implemented, the change could be a boon for the company’s public relations. It would ideally also take some heat off CEO Mark Zuckerberg, who, according to the Times, does not want to be the “sole decision maker on political content.”

A Facebook spokesperson declined to comment for this story when contacted by Ars.

“There is already this perception that Facebook, an American social media company, is going in and tilting elections of other countries through its platform,” Nathaniel Persily, a law professor at Stanford University, told The New York Times. “Whatever decisions Facebook makes have global implications.”

Bungled elections

Facebook has caught plenty of flak over the years for its handling of elections in the US and elsewhere. Among its many missteps, the company bungled its response to a disinformation and manipulation campaign by Russian operatives seeking to shape the outcome of the 2016 presidential election. Facebook also drew fire for the Cambridge Analytica scandal, in which a researcher gathered the personal data of 87 million users who hadn’t consented and handed it over to the consulting firm, which then used it to shape election and advertising campaigns for politicians around the world.

The missteps continued well after the 2016 election had receded. In Brazil, Facebook’s WhatsApp messaging service was a main conduit of disinformation in the 2018 elections, and the company’s flagship site was at the nexus of similar campaigns during the 2019 elections in India. Back in the US, Democrats spent years decrying the company’s tolerance of Donald Trump’s inflammatory speech on the platform, and when he was ultimately banned in January 2021 after sparking an insurrection at the US Capitol, Republicans were furious.

In the run-up to the 2020 election, Zuckerberg said that Facebook would take steps to “protect our democracy,” though those steps were relatively small and well-trodden. They included encouraging voter registration, offering information about how elections work, and giving a vague promise to do things that would “reduce the chances of violence and unrest.” The company also added labels to posts with election-related disinformation, a strategy that has seen mixed results at best.

Passing the buck

Handing off contentious decisions related to politics and elections would allow the company to claim that it has sufficient external oversight in such cases, even though implementing any changes is ultimately the responsibility of Facebook’s executives.

The commission apparently takes cues from the company’s so-called Oversight Board, which reviews a small selection of contested moderation decisions—commissions and omissions—and suggests changes the company can make to address any related issues. Though nominally independent, the group still has Facebook’s fingerprints on it. Its board members, who range from nonprofit leaders to professors, journalists, and think tank executives, were initially selected by Facebook, and Facebook is allowed to propose replacements as members step down. Its trustees, who oversee the organization and confirm board members, are selected by Facebook.

The Oversight Board’s rulings are binding, which means that if the board says a post should be restored or an account reactivated, Facebook must accept the decision. The board also provides broader recommendations for how the company can improve, and while these recommendations could drive more substantial changes on its platforms, they are non-binding. So far, the company has taken a pick-and-choose approach, implementing a majority of changes “fully or in part,” studying a few more, and taking no action on at least two of them.

Because of the way it’s set up, the Oversight Board is a reactive organization. Much like the US court system, it cannot rule on cases that are not brought before it. The election commission would apparently be more proactive, offering Facebook a chance to fix problems before they get out of hand.
