Wednesday, July 6

Labor launches a tool to facilitate the transparency of labor algorithms

The Ministry of Labor presented a tool this Friday to make it easier for workers to request information from companies about the algorithms that affect them and how those algorithms' decisions influence them. It is a right introduced by the "rider law" that any works council can already exercise, but unions have warned of problems in applying it.


With the new tool – designed in coordination with a group of civil society experts – the department headed by Yolanda Díaz aims to set up a mechanism that clearly establishes what data workers’ representatives can request and how companies must provide it. It also advises employers on what factors to consider when implementing this type of technology and what to require from their algorithm providers.

The transparency tool arrives at a time of growing worker complaints about automated systems that rate their productivity and performance, assign schedules or workloads, or decide whether they are eligible for promotion.

“We already have algorithmic bosses, but the use of this technology has consequences for flesh and blood people,” explained the Vice President of the Government and Minister of Labor, Yolanda Díaz.

“We must use mathematical formulas to free ourselves from excessive fatigue and to improve the quality of our work,” she stated. “But the use of these formulas can have absolutely perverse effects on workers. This tool tries to correct those perversities, some of which we already know.”

Facilities to request information

The new tool consists of a questionnaire that “aims to specify and systematize companies’ information obligations” regarding the algorithms that affect workers, allowing them to “know the most relevant characteristics and technical details” of these systems.

Among the data that works councils can request is information on which artificial intelligence systems used by the company can influence or evaluate workers, how they have been implemented, and why. This section also covers which human decisions are based on information provided by algorithms, what type of technology the systems use and, if they have been purchased from third parties, which company designed them.

Companies’ answers must include “significant, clear and simple information about the logic and operation of the algorithm, including the variables and parameters used,” the Ministry of Labor emphasizes.

Support for companies too

In addition to the tool, the material presented this Friday by Labor includes a practical guide for both employees and companies on everything related to labor algorithms, as well as a report on how companies around the world are using this technology and the repercussions that have been detected. The latter was produced by Eticas, one of the first consultancies specialized in auditing algorithms.

“We wanted to put on the table that there is already a legal framework for exercising people’s rights with respect to algorithms; what we need is to specify how to use it,” explained Gemma Galdón, director of Eticas and coordinator of the group of experts who collaborated with the Ministry on the design of the tool and the material it is based on.

“What we wanted is for it to talk not about principles but about practices: if we want algorithms to be transparent, how do we make them transparent,” the expert insisted. “Dealing with very specific things that will be useful from now on.”

Another of the group’s objectives, its members have revealed, has been to make it easier for companies to meet these transparency obligations and to give them more information about how this sector works.

“There is a business model in the artificial intelligence industry that consists of inflating the apparent complexity of algorithms in order to raise their price,” revealed Javier Sánchez, a professor at the University of Granada. “This guide can also help companies avoid paying for systems that aren’t really worth it.”
