
Social media inquiry: Facebook concedes people might have to contact local member to get action on harmful content

The social media giant says it has noticed a horror trend in what people have been posting over the past couple of years.

Facebook has conceded that users trying to have fake “catfish” accounts impersonating them removed may have to resort to contacting their local member of parliament for help.

Representatives from Meta, the newly named parent company of Facebook, gave evidence at the parliamentary inquiry into social media and online safety on Thursday.

Liberal committee chair Lucy Wicks raised a number of examples of posts directed at her that were not removed by Facebook’s algorithms, including one stating “wh**e deserves to have … sliced open with a chainsaw”.

“None of these were removed with any degree of algorithm,” she said.

She said she ended up taking one of the horrific posts to the AFP to get it removed.

Facebook’s director of policy in Australia, Mia Garlick, said that “sadly” over the last couple of years the company had noticed commentary – particularly about female public figures – had become incredibly abusive and sexualised.

“Over the years, we have made changes in policy to remove gendered abuse and sexualised commentary of that nature,” she said.

Facebook was grilled at the inquiry on Thursday.

Ms Wicks also brought up a local business in her electorate that she said was desperate to get rid of a fake account impersonating it.

She said the business resorted to getting followers to help by mass reporting the page so it would be removed.

She intervened after seeing the plea and had success.

Facebook’s head of public policy Josh Machin said the social media giant used a combination of automation and human review when looking at reports of harmful content, and had differing levels of priority. For instance, terrorism would be more urgent than bullying.

“I think one of the most important things we can do in order to make sure people are able to raise issues with us is to make sure we have got very close relationships with local organisations and individuals who can bring things to our attention,” he said.

He raised the eSafety Commissioner and small business commissioners as examples.

“People may go to their member of parliament,” he added.

“The approach we try to take is having a no-closed door with all our local partners. People can report through the platform or via our contacts.”

Facebook says it has noticed increasingly abusive commentary against female public figures.

Ms Wicks said those entities were often a last resort for people, and she hoped it would not take a member of parliament to get Facebook to take action for an ordinary person.

Ms Garlick said mass reporting of a problematic post or account didn’t actually change the priority.

“I agree with you there is more we could be doing,” she said.

It comes after television and radio personality Erin Molan told the inquiry earlier this week that she had tried to report a threat to kill her unborn baby to Facebook, but received an automated response saying it did not meet the threshold for inappropriate content.

“The testimony of Ms Molan, I want to reassure you the types of threats she received absolutely violate our policies and it is absolutely distressing to hear of her experience,” Ms Garlick said.

“It is also very common for women in public life to receive those types of threats.

“We haven’t been able to locate that original complaint.

“I think a police report was made and we worked through that process to make sure we were taking appropriate action.”

