Liberal MP Lucy Wicks, chair of Australia’s Parliamentary Committee on Social Media and Online Safety, criticized digital platforms on Wednesday afternoon for touting “very strict community standards policies” despite various cases of users not being protected by those standards.
“My concern is that I see very strong community standards policies, or hateful content policies, or ‘insert community safety name’ policies of various platforms. I almost can’t fault them on paper, but I find a very big gap in how they are applied,” she told Meta representatives, who appeared before the committee for questioning.
Wicks’ comments were made in light of 15 Australian female politicians, including herself, being the target of abusive online comments that were left online for weeks and only removed following an intervention by law enforcement.
Meta APAC policy director Mia Garlick, who appeared before the committee yesterday, acknowledged that heinous bullying material still remains on her company’s platforms.
She then said that Meta’s machine learning models for identifying and removing bullying and harassing content are not as capable as the models it uses to detect other types of content.
“As far as the ability of artificial intelligence to identify all of this type of content, it’s still learning and we’re getting better,” Garlick told the committee.
“[Bullying and harassment] was one of the categories in our Community Standards Enforcement Report that had somewhat of the lowest proactive detection rates, and it is slowly climbing as the machines learn over time.”
Meta’s Australian policy manager, Josh Machin, who also appeared before the committee, added that his company was working to change the perception that it is not effectively enforcing its community standards. He said Meta’s community standards for bullying and harassment follow Australian laws, which he explained means Meta has processes to geo-block or restrict access to certain content that would be deemed inappropriate in Australia.
Wicks said she wasn’t convinced Meta had taken enough action, telling Meta representatives that digital platforms should be doing more to keep people safe online.
“There’s a very distinct difference between saying something that’s against the law, which is about free speech, and this grey area you’re talking about, which is basically saying, ‘Oh no, it has to be made illegal so you can control it,’ yet misinformation and disinformation are not made illegal before you delete them.”
“That seems incoherent to me.”
Earlier today, Wicks also questioned Snapchat about the suicide of Matilda Rosewarne, a 15-year-old girl who suffered various instances of cyberbullying before her death, including the circulation of fake nude images of her on Snapchat and a Belgian pornographic website.
Snapchat’s public policy manager for Asia-Pacific, Henry Turnbull, apologized for her death and, like the Meta representatives, accepted that online abuse exists on his company’s platform.
“I just wanted to say how sorry I am for what [the Rosewarne family] are going through right now,” he said.
When asked how Snapchat is working to end cyberbullying, Turnbull said that Snapchat images are designed to disappear after being viewed, but phones can take screenshots, which, according to him, is beyond the company’s control. He also said that Snapchat does not open to a newsfeed of other people’s content, unlike feeds that he says encourage users to passively scroll through potentially harmful content.
He also said that Australian government regulation could help stop bullying content through the creation of a single regulatory framework, which could see the Online Safety Act expanded so that the proposed anti-trolling laws fall within its remit.
“Complexity is very difficult for small businesses. If you have a clear regulatory framework, it’s easier for businesses to understand their obligations, easier for government and regulators to hold people to account, and easier for consumers to understand their rights,” he said.
Submissions to the Parliamentary Committee on Social Media and Online Safety close next week.
The committee was previously due to deliver its findings last month, but the inquiry has been extended until March 15. The committee has so far heard from major digital platforms, victims of online abuse, and Facebook whistleblower Frances Haugen.
When Haugen appeared before the committee last month, she said Meta does the bare “minimum” to address harmful content, especially content in languages that aren’t prominently spoken in developed countries, because criticism from these underrepresented users is minimal.
The Social Media and Online Safety Inquiry and the recently established Anti-Trolling Bill Inquiry are both seeking to complete their reports before the Australian federal election. Liberal Senator and Attorney General Michaelia Cash has previously said social media reforms are among her party’s top items for this year.