Facebook whistleblower Frances Haugen has agreed to a meeting with the tech giant’s Oversight Board after it admitted she’d exposed ‘internal issues’ at the social media firm.
The Oversight Board — which was created to help the company answer difficult questions around freedom of expression online by using an independent panel of experts — announced the meeting with Haugen in a press release Monday.
It said: ‘In the last few weeks, new information about Facebook’s approach to content moderation has come to light as a result of the actions of a former Facebook employee, Frances Haugen.
‘In light of the serious claims made about Facebook by Ms. Haugen, we have extended an invitation for her to speak to the Board over the coming weeks, which she has accepted,’ the release reads.
No further detail on what ‘new information’ the Board was referring to was shared, although Haugen has made multiple damaging claims.
‘The choices made by companies like Facebook have real-world consequences for the freedom of expression and human rights of billions of people across the world. In this context, transparency around rules is essential.’
Facebook’s Oversight Board will meet with company whistleblower Frances Haugen (pictured) in the coming weeks to ‘discuss her experiences’ and ‘gather information that can help push for greater transparency and accountability’
Haugen, who testified to Congress last week after anonymously filing eight complaints about her former employer to the Securities and Exchange Commission, has accused the company of using its algorithm to push negative posts which keep users coming back to their newsfeeds.
The board will speak with Haugen about her alleged findings and will continue its investigation into whether or not Facebook has ‘been fully forthcoming in its responses on its “cross-check” system’.
Facebook, whose corporate leaders have denounced Haugen’s claims, has also asked the Oversight Board to review how the company’s cross-check system can be improved and to offer recommendations.
Haugen went before Congress last Tuesday to accuse the social media platform of failing to make changes to Instagram after internal research showed apparent harm to some teens and of being dishonest in its public fight against hate and misinformation.
‘Facebook’s products harm children, stoke division and weaken our democracy,’ she said as she called on Congress to act.
‘The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people.’
Haugen’s accusations were supported by tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit.
The Oversight Board — which was created to help the company answer difficult questions around freedom of expression online — announced the meeting with Haugen in the above press release on Monday
The announcement came less than a week after Haugen testified at a Senate Commerce hearing (pictured), claiming that the social media giant is harmful to children and contributes to the spread of hate speech and misinformation
They formed the basis of The Wall Street Journal’s Facebook Files exposé, which was so damaging the firm paused efforts to create an Instagram Kids app.
In the wake of the allegations, Facebook announced three new controls it says will help children.
Speaking on CNN’s State of the Union Sunday, Facebook’s Vice President of Global Affairs Nick Clegg unveiled the new ‘nudge’ feature, which he claims will dramatically boost the wellbeing of young social media users.
Clegg said: ‘We’re going to introduce something, which I think will make a considerable difference, which is where our systems see that a teenager is looking at the same content over and over again.
‘And it’s content which may not be conducive to their well-being, we will nudge them to look at other content.’
Clegg didn’t say which platform the tool will be used on.
He did go on, however, to highlight two other features, one of which will prompt users to take a break after too long on Instagram. And the other will see parents given as-yet-unspecified control over how their teenagers use the social media app.
‘We are constantly iterating in order to improve our products. We cannot, with a wave of the wand, make everyone’s life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use,’ Clegg said in an interview with Dana Bash on CNN’s ‘State of the Union’ Sunday.
The announcement also comes one day after Facebook’s Vice President of Global Affairs Nick Clegg, right, made his remarks about the platform’s new child safety controls on CNN’s State of the Union
Clegg was also grilled in separate interviews on CNN and on ABC’s ‘This Week with George Stephanopoulos’ about the role of algorithms in amplifying misinformation ahead of the January 6 Capitol riots.
However, he responded by claiming that Facebook’s algorithms are ‘giant spam filters’ and, if removed, people would see more potentially harmful content like hate speech and misinformation.
Clegg said that Facebook has invested $13 billion over the past few years in keeping the platform safe and that the company has 40,000 people working on such issues.
However, last Tuesday, Haugen, a 37-year-old data expert, said during her blistering testimony in front of Congress that Facebook founder Mark Zuckerberg is only ‘accountable to himself’ and has even been directly involved in company decisions that saw Facebook putting profit over ‘changes that would have significantly decreased misinformation, hate speech and other inciting content’.
She said that executives were aware that Facebook and Instagram, which it owns, were harmful for children, with a leaked internal study revealing that teenage girls had increased suicidal thoughts from using Instagram.
Haugen also testified to Congress that Facebook founder Mark Zuckerberg (pictured) is only ‘accountable to himself’ and has even been directly involved in company decisions that saw Facebook putting profit over ‘changes that would have significantly decreased misinformation, hate speech and other inciting content’
Haugen also claimed that Facebook’s algorithms, centered around ‘likes’ and ‘shares’, rewarded ‘dangerous online talk’ that ‘has led to actual violence that harms and even kills people.’
The tech giant slapped down Haugen after she testified, saying that the data scientist never attended meetings with top executives and that she was wildly misinformed about the company.
Mark Zuckerberg wrote in an open letter to his staff: ‘I think most of us just don’t recognize the false picture of the company that is being painted.’
Facebook’s director of policy communications, Lena Pietsch, responded to Haugen’s testimony by pointing out she worked at the company for less than two years.
Pietsch added that Haugen ‘had no direct reports, never attended a decision-point meeting with C-level executives – and testified more than six times to not working on the subject matter in question.’