How Big Tech is Handling Election Day

As many Americans turn to social media and the internet for election day news, big tech companies have put in place new policies and monitoring measures to ensure a smooth 2020 election.

Facebook

Facebook has put in place new measures to ensure a safe and secure election. In an announcement released Oct. 7, the company said it was focusing on two primary objectives: helping more Americans register and vote, and protecting the integrity of the election by fighting foreign interference, misinformation, and voter suppression.

Since the 2016 election, when Russian interference made use of Facebook's platform, the company has made substantial investments in safety and security.

“We worked on more than 200 elections around the globe since [2016], learning from each, and now have more than 35,000 people across the company working on safety and security issues,” the company said.

The company has removed more than 120,000 pieces of Facebook and Instagram content that violated its voting integrity policies and rejected ad submissions that would have run 2.2 million times because they were not properly authorized.

The company also blocked political ads in the final week before the election in an effort to stop misinformation.

“We’re going to block new political and issue ads during the final week of the campaign. It’s important that campaigns can run get out the vote campaigns, and I generally believe the best antidote to bad speech is more speech, but in the final days of an election there may not be enough time to contest new claims,” said Mark Zuckerberg in a Facebook post.

Additionally, the company created an ad library that lets users see which political ads are being bought, who is buying them, and how much those buyers are spending.

After the polls close, Facebook plans to stop circulating all political ads in an effort to reduce misinformation about the results of the election. The ban will be temporary, and the company will notify users when it is lifted.

Twitter

Twitter has undertaken far more drastic steps than Facebook in the political advertising realm. In October 2019, Twitter CEO Jack Dorsey announced that the company was banning all political ads from its site.

“We’ve made the decision to stop all political advertising on Twitter globally. We believe political message reach should be earned, not bought,” Dorsey announced via a tweet.

In May of 2020, Twitter began labeling misinformation in users’ tweets about COVID-19. In early October, the company expanded that policy to address misleading information about the election and voting.

“We do not allow anyone to use Twitter to manipulate or interfere in elections or other civic processes, and recently expanded our civic integrity policy to address how we’ll handle misleading information surrounding these events. Under this policy, we will label Tweets that falsely claim a win for any candidate and will remove Tweets that encourage violence or call for people to interfere with election results or the smooth operation of polling places,” announced Twitter.

The labels Twitter places on misleading information will direct users to sources the company deems credible and require them to click through a warning before viewing the tweet.

Twitter will not allow either candidate to call the election for themselves before at least two authoritative sources have made independent election calls.

Additionally, any user's post meant to incite violence or interfere with the election will be subject to removal.

Twitter is also providing an election hub where users can find Twitter-curated information about polling, voting, and candidates.

The fact-checking labels and the ban on political advertisements are now permanent fixtures of Twitter and will not go away after the election. The added context on tweets is temporary, but Twitter has not said when it will end.

YouTube

YouTube's approach to the election is far less drastic than Twitter's and Facebook's.

The company does not plan to have a "war room" like Twitter and Facebook and will not ramp up its video removal process, which YouTube leadership fully expects to be sufficient.

“Of course, we’re taking the elections incredibly seriously. The foundational work that will play a really major role for all of this began three years ago when we really began the work in earnest in terms of our responsibility as a global platform,” Neal Mohan, YouTube’s chief product officer, said in an interview.

YouTube’s current policies do not allow videos that mislead voters about how to vote or the eligibility of a candidate.

After the polls close, YouTube will create a playlist of videos offering election coverage to its users, including coverage from both CNN and Fox News.

The company will also provide an information panel above election-related search results and below videos discussing the election. The panel will warn viewers that results may not be final and offer a link to Google's real-time election results feature, which uses data from The Associated Press.

Starting in 2021, YouTube will no longer offer full-day reservations of its "masthead" advertising spot on its homepage. Currently, advertisers can reserve the homepage spot for an entire day, and President Donald Trump's campaign currently occupies it.
