May 15, 2019
Facebook announced that it will tighten its rules to curb the spread of hate speech and terrorist content online in the aftermath of the New Zealand mosque shootings, which were livestreamed on the platform. The news comes ahead of the “Tech for Good” summit, which gathers tech executives and world leaders.
The social media platform is introducing a “one-strike” policy for its Facebook Live feature, which will temporarily restrict access for users who have broken the social network’s guidelines and faced disciplinary action, the company said in a statement.
Facebook will ban first-time offenders from using the feature for set periods of time; the list of offences that could entail such a punishment will also be broadened. However, the company did not specify what actions could lead to a strike, or how long it might last.
The social network also revealed plans to tighten the rules for its other features in the coming weeks, including barring such offenders from creating ads.
“Our goal is to minimise risk on live while enabling people to use live in a positive way every day”, Facebook Vice President of Integrity Guy Rosen said in the statement.
Facebook also pledged to fund university research on algorithms developed to identify so-called manipulated media, something which its systems struggled to detect following the Christchurch shooting in New Zealand.
The country’s Prime Minister Jacinda Ardern, who is spearheading the “Christchurch Call” initiative to curb the spread of online violence, welcomed the announcement, saying it is “a good first step to restrict the application being used as a tool for terrorists”. According to her, the move “shows the Christchurch Call is being acted on”.
Facebook revealed its moderation plans ahead of the second “Tech for Good” summit, scheduled for 15 May. Prime Minister Ardern, along with French President Emmanuel Macron, is co-chairing the event, at which world leaders and tech executives are expected to agree on a pledge to fight the spread of terrorist content on the Internet. Ardern, who earlier denounced the “unprecedented” use of social media in the massacres at two Christchurch mosques in March, indicated that much remains to be done.
“There is a lot more work to do, but I am pleased Facebook has taken additional steps today alongside the Call and look forward to a long term collaboration to make social media safer by removing terrorist content from it”, she said.