
Facebook inaction over livestream 'evidence of need for better regulation' - commentator

6:49 am on 8 April 2019

A technology commentator says it's shocking that Facebook is not taking its responsibility to regulate the livestreaming of videos more seriously.

The logo of Facebook is seen on a smartphone.

Photo: AFP / Alexander Pohl/NurPhoto

Facebook is under pressure after the 15 March mosque attacks were livestreamed on the social networking site.

It took the social media company 29 minutes to detect the livestreamed video of the massacre. About 1.3 million copies of the video were blocked from Facebook, but 300,000 copies were published and shared.

However, Facebook chief executive Mark Zuckerberg told ABC's Good Morning America the company would not look to delay its livestream feature.

Technology commentator Paul Brislen said the decision was disappointing but not surprising.

"We'd hoped Facebook would do the right thing and actually look at its own systems and processes and realise what having a completely unmoderated, unmanaged livestream means for anybody around the world, because this isn't the first time that atrocities have been committed and then screened live," Mr Brislen said.

"Facebook has repeatedly said, yes, it's going to do something. Eventually as it's turned out, yes means no, and it won't be doing anything and I think it's just more evidence of the need for better regulation around the way Facebook behaves."

Facebook could be taken to task with a two-pronged approach, he said.

"We need local regulation that requires publishers who are providing material or making material available here in New Zealand to abide by NZ law.

"Secondly, an international move towards regulation of the way these social media giants operate between borders, because it's not just Facebook live, there are numerous issues with the kinds of fake news - the way they disregard privacy laws or election laws of the world all needs to be addressed, we can't just pick off little bits here and there."

He said New Zealand was in a very strong position to demand change because of what happened in Christchurch, and that beefing up the Harmful Digital Communications Act could be a start.

"I've made dozens of complaints to Facebook, Instagram and Twitter over the years about content. Some of which has been acted on, but the vast majority of which has been rejected as not breaching their community standards.

"I think it's time we say 'your community standards, when you operate in New Zealand, are defined by New Zealand law' and then we tell them what we want and what sort of timeframe."


Facebook founder Mark Zuckerberg is calling on governments to help with the responsibility for monitoring harmful content online. Photo: AFP

Mr Brislen said the best way to force Facebook to take notice was to hit them financially.

"The EU has just introduced some massive fines for breaches of privacy - 4 percent of the company's gross turnover for the year - so we're talking about fines that actually make them sit up and take notice."

He said Germany had successfully suppressed some online speech, which proved it could be done here too.

"They've found a way to make it work for Nazi symbols and far-right extreme content. If you ever want to make your Twitter feed a lot cleaner you change your location setting from Auckland to Germany and 90 percent of the foulness disappears.

"If they can do it at that level, they can do it at this level for New Zealand."

Netsafe chief executive Martin Cocker said there was no easy fix in keeping harmful content off social media.

"We haven't seen the efforts to fine the multi-nationals very successful when it's just a single country. We've seen the Europeans have a go at it as a massive single market ... Australia, it remains to be seen if they can act on it.

"When you think of these companies you think the answer might be simple, a delay or technology to monitor it but in practical terms making that work is really hard ... if it was easy people would be doing it already."

He said any change that New Zealand brought in had to be realistic.

"There's no point in having laws that can't be enforced and don't apply in a practical sense, so what we're hoping is New Zealand will continue to engage fairly constructively with all of these multi-nationals and lean on them as much as we can, but ultimately realise that we'll have to do safety together rather than separately."

Last week Facebook told the Privacy Commissioner it had not changed anything so far about its livestream function.

In an op-ed published in the Washington Post, Mr Zuckerberg last week said the responsibility for monitoring harmful content was too great for firms alone and called for governments and regulators to play a more active role.

He called for new laws in four areas: "Harmful content, election integrity, privacy and data portability."

"Lawmakers often tell me we have too much power over speech, and frankly I agree," Mr Zuckerberg writes, adding that Facebook was "creating an independent body so people can appeal our decisions" about what is posted and what is taken down.

He said new regulations should be the same for all websites so that it would be easier to stop "harmful content" from spreading quickly across platforms.

- with additional reporting from BBC
