The Instagram algorithm does not like the account of the Movimiento Marika de Madrid (MMM). This LGTBI activist group, which has some 7,500 followers on the platform, has had its profile blocked by the social network's automatic moderation four times in three months. The first three times, the group requested a review of the machine's decision and regained control of the profile within hours. But the latest deactivation lasted nearly two weeks, and the group feared it would be permanent.
Instagram reactivated the account after elDiario.es asked about the reasons for the block. The social network acknowledges that it was "an error" and apologizes to the group for the inconvenience. However, when questioned by this outlet, company sources declined to explain why its algorithms repeatedly flag a markedly activist profile whose activity over the last two months has focused on calls to protest the surge in LGTBIphobic attacks.
The social network has not given the group any explanation either. "As on previous occasions, they have not explicitly disclosed the reason for closing the account, and this time they even threatened to close it permanently if we continued to ask," the group reveals. The Movimiento Marika de Madrid suspects, however, what the cause may be: its reappropriation of the word "marika", which Instagram includes on its list of hateful terms.
The MMM recalls that one of the algorithmic blocks on its account came immediately after the viralization of a poster calling for a protest against Samuel's murder. It read "against the Nazis, marikonazos." The fear of losing their Instagram account led them to self-censor and remove the poster, which remains posted on their other networks, such as Twitter.
"It is increasingly evident that the Instagram algorithm has a bias that is proving counterproductive for people belonging to oppressed groups. In the last year we have seen in Spain how other queer collectives have also had their accounts closed several times," the group denounces. "While accounts of individuals and groups that use Instagram to denounce situations of racism, machismo, transphobia, homophobia, fatphobia and other structural oppressions are constantly being closed, the accounts that are actually producing hate speech (such as those that called the racist and homophobic demonstration in Chueca a few weeks ago) roam the social network freely."
The other possible trigger for the block, the MMM suggests, is the mass-reporting campaigns organized by the far right. "If a mass denunciation of our account has been orchestrated by trolls exploiting the algorithm's bias, Instagram should be able to recognize the nature of our account and not fall into those trolls' trap, vetoing and censoring content that is so valid and necessary in times like these," the group says in a statement to be published this Thursday, to which elDiario.es has had access.
The algorithm and the gray areas
The platform's rules, however, come with a warning: "Our policies are designed to give space to these debates, although we require that the intent be clear. Otherwise, we may remove the content," they state. The big problem is that the moderation algorithms used by Facebook, Instagram and the rest of the social networks are not precise when they must operate in that gray area, experts warn.
This debate resonated strongly in Brussels two years ago. When the EU adapted its copyright rules to the current digital context in 2019, the Commission and a group of MEPs proposed forcing large platforms to install filters that detect whether content uploaded by users is protected by copyright and, if so, block it. Technologists and digital rights organizations warned that this recognition technology is not mature enough to evaluate the context of a piece of content.
At the time, the complaint was that the algorithms would be unable to detect whether protected content had been uploaded for a parodic or critical purpose, uses that copyright law cannot block. Parody and criticism, like the reappropriation of hate speech, fall within that gray area, and algorithms fail to recognize them. A recent example of their fallibility in this terrain came when YouTube blocked a parody video about COVID denialists by the comedy duo Pantomima Full.
The duo made parodic claims about the effectiveness of masks, vaccines, and the existence of a "plandemic." The YouTube algorithm determined that the video "explicitly questions the effectiveness of the guidelines recommended by the WHO or the health or local authorities" and blocked it. The Pantomima Full account requested a review, which was again rejected. Finally, after several hours and notable media controversy, YouTube staff overruled their machines.
"Due to COVID-19, we reduced staff in certain places and as a result, we have been relying heavily on machines to identify content policy violations. We know this is increasing the number of videos removed, including videos that do not violate our policies," the video platform said in a statement justifying the decision.
The content filters ended up being approved in Brussels, although Spain has yet to transpose them into its legislation. The Movimiento Marika de Madrid is calling for public reflection on how social network algorithms can shape public discourse. The closure of its activist Instagram account has also coincided with the appearance before the US Senate of Frances Haugen, a former Facebook employee who has raised alarms along the same lines.
"We insist that the debate on moderation on Instagram must be opened, because this reopening of the account is 'bread for today and hunger for tomorrow' and, at any moment, it could be closed again, for us and for other activist groups or individuals," the MMM emphasizes. "We cannot depend on a biased algorithm and on possible troll attacks exploiting this situation."