Friday, July 31, 2020

Interview: Joy Buolamwini

I mentioned Joy in a blog post a few weeks ago, when she won the $50,000 grand prize in the Search for Hidden Figures contest. I've been a huge fan of Joy's work ever since I read her blog post on inclusive coding, or "InCoding." She's a Fulbright Fellow and a Rhodes Scholar, and now she's at the Center for Civic Media, working on projects that empower communities via tech education and highlight bias in algorithms.

I had the distinct privilege of chatting with her last week, and we had a fun conversation about algorithmic bias, surveillance, and her work at the Media Lab. Check it out:

* * * * *

Learn more about:

the ALGORITHMIC JUSTICE LEAGUE
the CENTER FOR CIVIC MEDIA
the MEDIA LAB

* * * * *

Some snippets:

(on algorithmic bias, and how facial recognition software won't detect her face unless she's wearing a white mask)

This was an issue I'd run into when I was an undergraduate. I went to Georgia Tech and did my bachelor's in computer science, and I used to work in a social robots lab. Some of the projects I did there involved computer vision, and in that context I had a hard time being picked up [by facial recognition software]. I would end up borrowing my roommate's face to get the job done. So I started exploring that a little bit. Why was this happening? Was it just about my facial features, was it about the illumination, was it about the pose, what was going on?

And then I read this report called the Perpetual Line-Up, which talks about the unregulated use of facial recognition in the United States. It showed that one in two adults in the US, 117 million people, have their faces in facial recognition databases that can currently be searched without a warrant, using algorithms that haven't been audited for accuracy.

Here's the catch: in some initial tests of the demographic accuracy of some of these facial recognition systems, they saw that the systems performed worse for women overall, worse for people who were considered younger (under 24), and worse overall for people of color. So it's not just me, having one bad incident! This is something a bit more systematic.

(on why it matters)

So let's look into the realm of law enforcement, because that's where you have the potential issue of civil liberties being breached, and where you also start going into the realm of disparate impact. Is this technology that we're using being targeted at a specific demographic more than another? What I was very concerned about was misidentification. If I'm tagged as somebody else [on Facebook], okay, maybe it's funny, maybe it's offensive, but perhaps it's not as high-stakes as if I'm misidentified as a criminal suspect.

(on being at the Center for Civic Media)

I do not know if the Algorithmic Justice League would have existed had I not been a part of the Center for Civic Media, just having the freedom to explore ideas that might not always resonate with the core technical leanings of a space like MIT. We're definitely looking at the social impact of technology, so it's not "Can we make it bigger, faster, stronger, more efficient, smaller?" but starting at "Should we be doing this in the first place?" Starting at issues of power: who has control? Who gets to decide, and why? And what does that tell us about society?
And so in exploring this, I could have viewed my face not being consistently detected as "Oh, this is a technical challenge." But being in the space of the Center for Civic Media definitely orients me to [say], "This is not just a technical challenge; this is as much a reflection of society as other spaces where you see inequities that need to be addressed."

Post tagged: #admissions interview #Center for Civic Media #MIT Media Lab #sound recording