Artificial Intelligence and Gender: Lecture by Gina Neff
Does artificial intelligence have gender? Why do we give robots female names and human appearance? Is it possible that we reproduce gender stereotypes through AI? How many women actually create the AI systems affecting healthcare, policing, and education?
These questions and many others will be addressed by the exclusive guest of the Centre for Gender and Science, Professor Gina Neff, a Senior Research Fellow and Associate Professor at the Oxford Internet Institute and the Department of Sociology, University of Oxford, who specialises in new technologies and innovation. The lecture takes place as part of the Week of Science and Technology on Monday, November 11 at 17:00 at the Faculty of Civil Engineering of the Czech Technical University in Prague. In her lecture, Professor Neff will reflect on the social and political preconceptions encoded in the data used by AI, and show examples of how these systems incorporate human biases about women and their role in society.
“I see Gina Neff’s work as pioneering and absolutely necessary for understanding artificial intelligence in its social context. We place great emphasis on the success of artificial intelligence in solving tasks, but we rarely consider who creates the solutions and on what principles and data they are based. I firmly believe that this is why the foundation must be open research and the greatest possible diversity among researchers,” says Tereza Bartoníčková, founder and president of the Prague Internet Institute, who will introduce Professor Neff’s lecture to the audience.
Gina Neff is the author of three books and over three dozen research articles on innovation and the impact of digital transformation. In 2012, she published the book Venture Labor: Work and the Burden of Risk in Innovative Industries, which won the 2013 American Sociological Association Communication and Information Technologies Best Book Award. The lecture will be held in English without translation. Find more about the event here.