Monday, September 26

The whistleblower behind the Facebook leaks: “Between society and profits, the company always opted for money”


The former Facebook worker who sparked the biggest social media malpractice scandal since Cambridge Analytica has chosen to come forward. Her name is Frances Haugen, and she worked at Mark Zuckerberg’s corporation until last May, when she decided to leave and leak a large number of internal documents. These reports, published by the Wall Street Journal, show that the multinational was aware that its activity has a toxic impact on society but in most cases did nothing to remedy it.

“There were conflicts of interest between what was good for society and what was good for Facebook,” Haugen explained on Sunday on the prime-time US program 60 Minutes, on CBS. “Facebook, time and time again, chose to optimize its own interests, such as making more money,” she said.

This former Facebook employee, a 37-year-old computer engineer, was part of the company’s “Civic Integrity” team, whose mission was to ensure that the company played “a positive role in society.” She decided to become Facebook’s Deep Throat because the multinational had become the complete opposite of that.

The documents leaked by Haugen reveal that Facebook knows Instagram is toxic to many teenagers, making them feel worse about their bodies, but does not act to prevent it. They also show that employees of the social network discovered it was being used for drug and human trafficking in developing countries, reported it to top management, and received a “lukewarm response.” And that algorithms meant to reduce polarization and hate speech did the opposite, but Facebook didn’t change them because they were so effective at capturing users’ attention.

Facebook has created an incentive system that is dividing people

One of those reports pointed directly to the impact of the algorithms on European political discourse. According to the former Facebook worker, political parties complained that the social network forced them to make their propaganda more and more aggressive in order to gain visibility on the platform.

The same dynamic also polarized the content of some media outlets. “When you have a system that you know can be hacked with anger, it’s easier to provoke people’s ire. And publishers say, ‘Oh, if I make more angry, polarizing and divisive content, I get more money.’ Facebook has created an incentive system that is dividing people.”

“Every new language costs more money”

In the 60 Minutes interview, the former Facebook worker went on to discuss more of the reports that incriminate the company. One document addressed to its managers, revealed this Sunday on the program, concludes that “the current set of financial incentives generated by our algorithms does not seem to be aligned with our mission as a company.”

However, part of her interview focused on explaining how the social network especially harms the societies it treats as secondary markets: those whose conversation does not take place in English.

“It’s really important to remember that Facebook makes different amounts of money in every country in the world,” Haugen said. “Every time Facebook expands into a new linguistic area, it costs as much, if not more, to build the safety systems for that language as it does for English or French. Each new language costs more money, but brings in progressively less. And so, for economic reasons, it does not make sense for Facebook to make the platform safe in many of these parts of the world.”

For economic reasons, it does not make sense for Facebook to make the platform safe in many of these parts of the world

The most serious consequence of the practice Haugen describes, which leaves the platform with little or no supervision in minority languages, occurred in Myanmar. UN inspectors investigating the genocide against the Rohingya noted that Facebook played a key role in the calls for violence. The company ended up acknowledging that it did not have enough Burmese-speaking moderators to detect what was going on.

The problem repeats in many other parts of the world when misinformation comes into play. During the interview, the whistleblower explained that in many countries, especially those without teams of journalists specialized in verifying content, “disinformation on Facebook is directly related to deaths.” “Imagine what happens in a village in Africa where a photo of a massacre is shared with the claim that it happened in the next village and that people need to arm themselves.”

Watching over the world

Haugen’s statements coincide with those of other former colleagues who have decided to tell what their day-to-day at Facebook was like. They particularly recall those of Sophie Zhang, a data scientist who was part of the team in charge of removing botnets and other malicious activity from the social network. According to her testimony, the unit was completely overwhelmed by the size and significance of its work: “I have made decisions that affected national presidents without any supervision,” she warned. The documents she leaked highlighted her team’s performance in Spain, where it took weeks to block some 700,000 fake accounts that contaminated the political conversation during the pandemic.

Haugen’s revelations will go even further. In addition to the Wall Street Journal, she also sent the documents she had access to to members of the US Congress, which has an open investigation into the large platforms as a result of the Cambridge Analytica case. The engineer will appear before lawmakers this week.

Facebook has sent the media a statement on the Journal’s publications and Haugen’s interview in which it denies the inaction reported by its former employee. “Every day our teams must balance protecting the right of billions of people to speak out with the need to keep the platform a safe and positive space. We continue to make significant improvements to curb the distribution of misinformation and harmful content. To suggest that we promote bad content and do nothing is not true.”

www.eldiario.es