Coded Bias (Black Boxes)

Every minute of Coded Bias held a new surprise for me. My mind struggled to grasp how bias is everywhere, and I could not stop wondering what our future will look like if this bias keeps growing. The documentary follows a group of women investigating bias in algorithms, after M.I.T. Media Lab researcher Joy Buolamwini uncovered flaws in facial recognition technology.

First, I want to share a very interesting idea from the film: data is what we use to teach machines how to learn different kinds of patterns, so if you have largely skewed data sets, the AI built on them will be skewed too. AI is based on data, and data reflects our history, so the past dwells within our algorithms. Cathy O'Neil, the author of the book "Weapons of Math Destruction," said that mathematics was being used as a shield for corrupt practices. She describes AI simply as using historical information to make a prediction about the future; machine learning is a scoring system that scores the probability of what you are about to do. Another very interesting thing she said in a talk was that it is all about who has the code, because the person who has it can deploy it, and people suffer algorithmic harm without being told what is happening.

There is a strong connection between the movie and this course, because in this course we are searching, testing, and trying to understand what is happening around us, especially when it comes to technology. The movie showed how technology and AI can affect our lives and how bias in such systems could harm us. Bias against accepting women into jobs could affect my life personally, and the life of every woman interested in working at a company that uses AI to screen its applicants. Many things will not look the same to me after watching this film, and one of the main questions it raises is: how do we train public interest technologists?
Who is going to fight for justice in the new digital world? What is needed most to protect the public interest? As Buolamwini said, the training of computer scientists must change so that it instills a sense of responsibility; changing how we teach computer scientists will be a huge part of it.

In addition, in the Harvard panel "Race, Technology & Algorithmic Bias," we learn more about the objective measurements, evaluations, and systems that are supposed to move us from injustice in the analog world to justice in the new digital world. Here are some questions that were asked during the talk: How can we have justice? How do the effects of racism manifest in simple exercises like aggregating names? Latanya Sweeney, Professor of Government and Technology in Residence in the Department of Government at Harvard University, talked about how the pursuit of technology is not exempt from the same ills that we find in other parts of our society, and may be even more potent today, and about how technology was enabling very specific types of fraud and very specific ways to disenfranchise people. Sweeney also showed how data on the internet can affect us, especially because people trust the information they find online: when we enter a community that feels more like us, we trust it more, and that is where huge frauds happen. Another very interesting aspect was the class Professor Sweeney teaches, called Tech Science to Save the World; the name alone makes me want to take it.

The Algorithmic Justice League, founded by Joy Buolamwini, is a digital advocacy organization based in Cambridge; AJL aims to raise awareness of the social implications of artificial intelligence through art and research. At the end of the talk, I learned that it is important for all of us to apply laws and regulations to technology and to help technologists do their job better.

Links:

Coded Bias from Netflix 

Race, Technology, and Algorithmic Bias (Vision and Justice)

https://youtu.be/Y6fUc5_whX8

Photo link: https://www.vpr.org/sites/vpr/files/styles/x_large/public/202103/coded-bias-documentary-ai-artificial-intelligence-technology-algorithms-20210305.jpg
