Monday, September 20

Facebook suspends its AI for applying racist biases | Digital Trends Spanish

Facebook has disabled an AI-based recommendation feature after a serious racist incident. According to The New York Times, the system labeled a video from the British newspaper the Daily Mail, in which Black people appeared, as a video of “primates.” Specifically, the social network displayed an automated prompt on the video asking users whether they wanted to keep seeing videos of primates.

Former Facebook employee Darci Groves sounded the alarm last week by tweeting about the bug and sharing a screenshot:

Um. This “keep seeing” prompt is unacceptable, @Facebook. And despite the video being more than a year old, a friend got this prompt yesterday. Friends at FB, please escalate. This is egregious.

Darci Groves (@tweetsbydarci) September 2, 2021

According to the Times, the social network has deactivated the artificial intelligence system, noting that the message was not written by any employee but was generated entirely by the AI. The platform is conducting an investigation and will not reactivate the system until it is sure the problem will not recur.

“We apologize to anyone who has seen these offensive recommendations,” the company said in a statement. “We disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent it from happening again,” it added.

This is not the first racist incident involving an artificial intelligence system. In 2015, the Google Photos algorithm made the serious mistake of labeling images of Black people under the category “gorillas.”

To date, Google has not managed to fix that error; its only remedy has been to remove the “gorillas” category, along with other primate-related labels, so that a similar mistake cannot be repeated.

