Two weeks after Facebook Live was used to broadcast a mass shooting at a mosque in New Zealand, Facebook is likely to introduce new restrictions to its live video streaming service.
Facebook's chief operating officer, Sheryl Sandberg, admitted that the social media giant "must do more" to stop the platform from being used to share violent content.
The site took down approximately 1.5 million videos of the attack globally in the 24 hours after the shootings, 1.2 million of which were blocked at the upload stage.
Sandberg said that users who violate the site's community standards could be banned from using the live video streaming function.
In an open letter published in the New Zealand Herald, Sandberg said: "People with bad intentions will always try to get around our security measures. That's why we must work to continually stay ahead."
She claimed that Facebook is investing in research to establish “better technology to quickly identify edited versions of violent videos and images” and prevent users from re-sharing them.
Sandberg said that more than 900 different videos showing portions of the massacre had been edited to avoid detection.
She said that the firm had also removed hate groups in the aftermath of the attack, including the United Patriots Front, the Lads Society, National Front New Zealand, and the Antipodean Resistance.
She stated: “These groups will be banned from our services, and we will also remove praise and support of these groups when we become aware of it.”
These measures come after 50 people were killed in shootings at two mosques in Christchurch, New Zealand, earlier this month by an anti-Muslim extremist who live-streamed the massacre on Facebook.
Facebook and other social media companies have been criticised over the rapid spread of the footage across their networks and around the globe.
This week, Facebook revealed that it was banning white nationalism on its service, reversing a long-held policy that treated the ideology as not necessarily racist.