Dossiers and cover-ups: Facebook puts profits before safety, whistleblower reveals
When Sydney teenager Tilda first started using Instagram, it sparked a crisis of confidence in how she looked that lasted two years.
"I would constantly compare myself to Victoria's Secret models when I was a 12-year-old girl with a flat chest," Tilda says. "I was faced with these images of girls who are fully grown, and have probably had some work done - it was challenging, but it's also what's to be expected when you're a 12-year-old who goes on Instagram with no real idea of reality or expectation of what social media is like."
With the support of her mother, Tilda eventually gained the perspective and maturity to move past this. Now 15, she uses Instagram mainly to follow Liverpool Football Club and message friends, TikTok for the humour, and Snapchat to connect with friends. She no longer worries "about what other girls look like or if they have a smaller waist than I do or bigger lips or whatever."
Sydney teenager Tilda, 15, uses Instagram, TikTok and Snapchat. Credit: Louise Kennerley
Tilda was officially too young when she joined Instagram - the terms and conditions state users must be at least 13 - but this is common. As we now know, thanks to damning testimony from a whistleblower in the US, former Facebook employee Frances Haugen, Instagram is not safe for young people aged 13 and older anyway.
Haugen took a trove of documents - thousands of pages of internal research, briefing notes, presentations and memos, legal advice and messages posted on the Facebook Workplace forums - when she left the company in May.
She shared the documents with The Wall Street Journal (which has been publishing its series The Facebook Files since mid-September), the US Securities and Exchange Commission, and the US Congress, where Haugen testified this week.
The documents confirm what many of us always suspected - that Facebook and its family of products are damaging to society and individuals. They also suggest the company knew how to fix many of the problems, but instead tried to cover them up.
"The choices being made inside of Facebook are disastrous for our children, for our public safety, for our privacy and for our democracy," Haugen said in her testimony to the US Senate Commerce Committee this week.
"Left alone, Facebook will continue to make choices that go against the common good. Our common good. When we realised big tobacco was hiding the harms it caused, the government took action. When we figured out cars were safer with seat belts, the government took action."
As a result of Haugen's testimony, Facebook has come under bipartisan pressure from Democrats and Republicans and has been forced to shelve plans for an Instagram Kids app for the preteen market - at least for now.
Former Facebook employee and whistleblower Frances Haugen provided damning testimony about the social media giant. Credit: AP
Even Facebook itself appears to be no longer standing in the way of regulation.
"It's been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act," a Facebook spokesperson says.
In Australia the story has reignited a debate about how to regulate content on the internet, and not just in terms of cyber-safety. In an extraordinary intervention, both Prime Minister Scott Morrison and Deputy Prime Minister Barnaby Joyce this week flagged the possibility of making the platforms legally responsible for defamatory comments.
Morrison accused the tech giants of allowing their platforms to become a "coward's palace" for anonymous trolls who "destroy people's lives and say the most foul and offensive things to people, and do so with impunity".
Haugen's Facebook dossier is wide-ranging. It reveals details about a program called "cross check" or "XCheck" that whitelists high-profile accounts so that the company's normal enforcement measures against harassment and incitement to violence don't apply. It suggests the company was consistently willing to accept 10-20 per cent more misinformation if it meant 1 per cent more engagement. It describes how, in December, Facebook prematurely removed controls put in place before the November 2020 US presidential election in order to reprioritise engagement - just a few weeks before the US Capitol riot. It exposes the weakness of the company's response to criminal activity in developing countries, from drug cartels in Mexico to human traffickers in the Middle East.
One of the most striking disclosures was what Facebook knew about the harmful effects of its photo-sharing app Instagram for many users, especially teenage girls who account for a large chunk of the audience.
Australia's Assistant Minister for Mental Health David Coleman, a former chairman of NineMSN, says the whistleblower files demonstrate the social media giants "can't be trusted to act in the best interests of children". He is scathing of Facebook and Instagram's "abysmal" efforts to enforce their own age-limit restrictions.
"There are undoubtedly millions of children who are on social media platforms at an age where it is unsafe for them to be there," he says.
"What is the role of society and government if not to protect kids? And we know that we can't trust the social media platforms to do that."
Australia has led the way on cyber-safety, establishing the worldâs first eSafety commissioner in 2015 and passing the Online Safety Act 2021, which technology companies must comply with by mid-2022. Australia was among only a handful of countries to force technology platforms to pay news publishers for content, and debate is now turning to defamation.
There are clear signs of a growing appetite within some sections of the Australian government to crack down further on the social media giants. But with just four already-packed parliamentary sitting weeks left this year, momentum for reform could be stymied by the headwinds of the looming federal election and campaign season.
Facebook says it removed more than 600,000 underage accounts from Instagram over the past three months, and that it has thousands of staff, as well as AI technology, dedicated to removing accounts belonging to underage users.
Many people can use Instagram without being harmed, or the problems fade with time, as in Tilda's case. For others, the app's relentless focus on social competition - and algorithms that can lead users from healthy recipes to pro-anorexia content at warp speed - can contribute to the development of eating disorders or self-harm.
Haugen's documents show that internal Facebook research found more than 40 per cent of teenage Instagram users who reported feeling "unattractive" said the feeling began on the app, one in five teens say Instagram makes them feel worse about themselves, and many teens reported the app undermined their confidence in their friendships. Teens regularly said they wanted to spend less time on Instagram but lacked the self-control to do so.
Facebook researchers concluded some problems around social comparison were specific to Instagram, not social media more generally. Some Facebook executives resisted an internal push for change, saying the social competition was the "fun part" of Instagram for users, and in public the company cited external research which downplayed the correlation between social media usage and mental health harms.
In a public post Facebook founder Mark Zuckerberg said it was false that Facebook prioritised profit over safety. He said the Instagram research had been mischaracterised because it also showed many teenage girls who struggled with loneliness, anxiety, sadness and eating issues said Instagram made these problems better, not worse.
This week Morrison and Joyce seized on the debate about online abuse proliferating on social media as a stick to threaten a further crackdown through defamation law reform.
In a deliberate choice of words, Morrison said that platforms that refused to unmask trolls were "not a platform any more, they're a publisher". Joyce, whose daughter has been the subject of scurrilous gossip by anonymous commenters, declared that platforms "must be held liable", saying if "they enable the vice, they pay the price".
Their comments follow a High Court decision last month that found media outlets were legally responsible as "publishers" for third parties' comments on their Facebook pages even if they were not aware of the comments. The bombshell ruling also has implications for other administrators of Facebook pages, including MPs and regular citizens.
Facebook founder and CEO Mark Zuckerberg said it was false that Facebook prioritised profit over safety. Credit: AP
Associate professor Jason Bosland, director of the Centre for Media and Communications Law at Melbourne Law School, says making the social media giants liable for defamatory remarks circulating on their platforms as soon as they are published would be an "extreme" outcome, and probably make Facebook, Twitter and other companies unable to operate due to the legal risk.
"You would have very few experts that are consulted that would suggest that Facebook should be liable for absolutely everything that's published on their platform without notice," Bosland says.
The nationâs attorneys-general, led by Mark Speakman in NSW, are considering the options for defamation law reform.
Australia's eSafety commissioner Julie Inman Grant says the whistleblower revelations, while not surprising, could galvanise action in the US, and that in turn would bolster Australia's efforts. She too likens it to efforts to regulate car safety and mandate seatbelts in the 1960s to 1980s.
"This is the tech industry's seatbelt moment," she says. "For too long, they have not had any brakes put on them whatsoever and the primary reason is because they served as a driver of innovation and inspiration and growth and development and no government wants to put the brakes on that."
Australia's efforts include giving the eSafety commissioner statutory powers to order the removal of content, working with the industry to promote the concept of "safety by design", and the Online Safety Act 2021. The Act takes a co-regulatory approach - eSafety has produced a white paper outlining the expected outcomes, and technology companies or industry bodies (for sectors such as social media platforms, internet of things, or gaming providers) have until June 2022 to register codes showing how they plan to comply. Those codes need to be approved by eSafety and will be registered under the Act, giving them regulatory force.
Inman Grant says previously Australia regulated for a specific set of harms, such as cyber-bullying, image-based abuse and illegal online content such as child sexual abuse or terrorist content. The new Online Safety Act is about basic online safety expectations or a social licence to operate.
However, the US is a market 12 times larger than Australia and the home jurisdiction for Facebook and most other technology platforms, so regulation in the US would be of far greater import.
Communications Minister Paul Fletcher and Inman Grant this week jointly wrote to the US Senate Committee for Commerce sharing details about Australiaâs regulatory approach and offering for Inman Grant to appear at the hearings.
Inman Grant believes international standards for technology are inevitable, just as they are now embraced by the car industry.
Conflicts of profits and safety
During her two years at Facebook, Frances Haugen says she saw the company "repeatedly encounter conflicts between its own profits and our safety [and] consistently resolve these conflicts in favour of its own profits".
Haugen, who had previously worked at Google, Pinterest and Yelp, grew so concerned at what she saw at Facebook that she resigned and decided to compile evidence before she left.
She says the solution lies not just in regulation but in a demand for full transparency about Facebook's data and algorithms. She says at other large tech companies such as Google, independent researchers can download and analyse company search results from the internet, but Facebook "hides behind walls that keep researchers and regulators from understanding the true dynamics of their system".
However, Inman Grant says it would be very difficult to regulate algorithms because they are not static - you would also need to be given information about how the algorithms adapt and change through machine learning, and regulators like eSafety would have to employ a team of data scientists and data engineers.
There is also a question over whether private companies should be compelled to share proprietary information - algorithms being like the âsecret sauceâ that helps their products compete in the marketplace.
In response to Haugen's testimony, a Facebook spokesperson says: "A Senate Commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives - and testified more than six times to not working on the subject matter in question. We don't agree with her characterisation of the many issues she testified about."
Inman Grant, who has worked at Microsoft, Twitter and as a lobbyist to the US Congress, describes this response as "a classic obfuscation technique".
"I've seen those talking points written before. I don't think they hold much water."
Lisa Visentin is a federal political reporter at The Sydney Morning Herald and The Age, covering education and communications.