Monday, October 25, 2021

Facebook whistleblower 'shocked' by company’s investment in metaverse


Facebook whistleblower Frances Haugen told British lawmakers Monday that she was “shocked” to hear the company planned to hire 10,000 engineers in Europe to work on the “metaverse,” a version of the internet based on virtual and augmented reality, arguing the money would be better spent on safety.

“I was shocked to hear that Facebook wants to double down on the metaverse and that they’re going to hire 10,000 engineers in Europe to work on the metaverse,” she said. “Because I was like, ‘Wow, do you know what we could have done with safety if we had 10,000 more engineers?’ It would have been amazing.”

Haugen, 37, a former product manager on Facebook’s civic misinformation team, testified before a joint committee discussing a draft of an online safety bill to regulate social media. Her evidence will inform the committee’s work in shaping the bill, which is due to be put before Parliament for approval next year. 

The bill would hold social media companies accountable for harmful content shared on their platforms, leading to major fines in the United Kingdom if they don’t remove material like child sexual abuse and terrorist content. 

(Video: Facebook whistleblower describes negatives of company’s engagement-based ranking)

“There is a view inside the company that safety is a cost center, it’s not a growth center, which I think is very short-term in thinking, because Facebook’s own research has shown that when people have worse integrity experiences on the site, they are less likely to retain,” Haugen said.

Drew Pusateri, a Facebook spokesperson, said the company has invested heavily in efforts to improve its platform.

“This criticism makes no sense on its face given nearly every document discussed details the work people did to better understand how to address our challenges,” Pusateri said in an email.

CEO Mark Zuckerberg has pushed back against ideas that Facebook has prioritized growth over safety.

“We care deeply about issues like safety, well-being and mental health,” he said on Facebook this month after Haugen testified before U.S. lawmakers. “If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place?” 

Facebook told NBC News that it has spent $13 billion since 2016 to counter bad content and that it employs 40,000 people to work on safety and security. 

A consortium of 17 media outlets, including NBC News, published dozens of reports Monday based on tens of thousands of pages of leaked internal documents obtained by Haugen, which included disclosures that were made to the Securities and Exchange Commission and provided to Congress in redacted form by her legal counsel. 

Haugen attributed many of Facebook’s failures to effectively police its platform to the company’s “culture of positivity.”

“There is a philosophy that people focus on the good,” she said. “And that’s not always a bad thing, but the problem is that when it’s so intense that it discourages people from looking at hard questions, then it becomes dangerous.”

She said that Facebook employees based in Silicon Valley are likely to look at their own Facebook news feeds and see “a nice place where people are nice to each other,” missing the harms the platform creates elsewhere in the world. In Ethiopia, for example, leaked documents show the platform being used to spread hate speech and incite violence without adequate content moderation in several local languages.

While researchers on Facebook’s integrity teams might be acutely aware of the harms the platform amplifies, the information doesn’t necessarily reach executives, Haugen said.

“Good news trickles up, but not the bad news,” she said. “Executives see all the good they are generating, and they can write off the bad as the costs of doing all the good.”

She said the voices of employees who highlight the harms and call for mitigation strategies “don’t get amplified internally because they are making the company grow a little slower, and it’s a company that lionizes growth.”

(Video: Internal Facebook documents reveal more red flags about misinformation on site)

Damian Collins, the chair of the joint committee, said in a statement before the hearing: “Frances Haugen’s evidence has so far strengthened the case for an independent regulator with the power to audit and inspect the big tech companies. There needs to be greater transparency on the decisions companies like Facebook take when they trade off user safety for user engagement.”

In her appearance on Capitol Hill on Oct. 5, Haugen called on lawmakers to require more transparency from Facebook.

“I believe that Facebook’s products harm children, stoke division, weaken our democracy and much more,” she told the Senate Commerce subcommittee on consumer protection.
