Wednesday, May 18

WHO puts under the magnifying glass the use of AI in health | Digital Trends Spanish

On June 28, the World Health Organization (WHO) delivered a full 165-page report on the use of artificial intelligence (AI) in the medical field.

Among the positives, support for diagnosis and streamlined processes stand out; among the negatives is the extensive use of patients' personal data, which could threaten privacy.

The WHO highlights six main points to consider about AI in health:

  • Protect human autonomy
  • Promote human well-being, safety, and the public interest
  • Ensure transparency, explainability, and intelligibility
  • Foster responsibility and accountability
  • Ensure inclusiveness and equity
  • Promote responsive and sustainable AI

Positives of AI highlighted in the report:

“AI is being considered to support diagnosis in a number of ways, including in radiology and medical imaging. These applications, although more widely used than other AI applications, are still relatively novel, and AI is not yet used routinely in clinical decision-making,” the report states.

Other highlights include:

  • Improved diagnostic accuracy
  • Improved record keeping
  • The hope of dramatically improved outcomes for patients with stroke, heart attack, or other conditions where early diagnosis is crucial
  • The potential for machine learning technologies in healthcare to help predict the spread of disease and possibly even prevent future epidemics

Concerns about AI in health:

The WHO’s concerns about artificial intelligence mainly involve patient privacy and the huge amount of data handled by these algorithms, which could lead to massive breaches and even misdiagnoses when data is poorly cross-referenced.

“The collection of data without the informed consent of individuals for its intended uses (commercial or otherwise) undermines the agency, dignity and human rights of those individuals; however, even informed consent may be insufficient to compensate for the asymmetry of power between data collectors and the people who are the sources,” the report concludes.
