How objective is an algorithm?

At the screening of "Coded Bias," students discussed the limits of neutrality and objectivity of artificial intelligence and algorithms.

On November 9th, the MINT Coordination, Hessen-Technikum, and Café 1 hosted a movie night: approximately 35 students and technical interns gathered to watch the documentary "Coded Bias." Today, algorithmic systems and artificial intelligence control and support many processes in our daily lives and in the professional world. For example, based on algorithmic calculations, women are assigned lower creditworthiness than men in similar financial situations, and Black individuals with criminal records are subjected to stricter police scrutiny than white individuals with comparable records. AI also plays a significant role in the surveillance of public spaces, raising the question of how democratic and personal rights may come into conflict with algorithmic systems.

At the same time, such algorithmically supported results appear to be neutral and objective, shaping perceptions and practices across various domains. How can we deal with these logics and effects? How can the requirements and needs of different groups be considered in the development of algorithmic systems? In which areas do we need these systems, and are there limits to their use, especially with regard to self-learning systems?

The film raises many questions about how our world and our interactions are processed, represented, and, sometimes unequally, shaped by data and software. Afterward, we discussed these questions with students from departments 1 and 2, the aspiring technology developers. Does facial recognition in public spaces provide more security, especially for women? If the use of AI is endorsed in democratic states but viewed critically in authoritarian systems, how can its use remain consistently safe, even through changes of government?

It is clear that there are no simple answers. Asking such questions and developing a professional stance as an engineer are, alongside expertise in mathematics, programming, and related fields, necessary competencies for the future of our society.


The Equal Opportunities Officers of Frankfurt UAS
Last updated: March 19, 2024