Penn Art of Web S22 Reading Reflections
Please write a comment, reflection, or question about the reading at the appropriate section. (1 to 2 sentences max)

Week 14 – New Ways of Seeing, Episode 3 – Digital Justice – James Bridle (Rachel)


I took an electronics course over the summer, and found it odd and uncomfortable talking about “male” and “female” plugs and pins while sticking them into each other, but left it at that. I think it’s super cool to see an example of someone actually going to RadioShack and creating a vision for what genderless plugs could look like, packaging and all. I think this example shows the difference between art, which takes you into a new world, and design, which is grounded in the world we live in today.

I find it interesting that this sort of bias for computer science and computers being associated with the male gender started partially because of how the home computer was culturally male-coded from the get-go, as well as video games being marketed to boys. I’m guessing that, at the time, these weren’t conscious decisions, but they have such a huge impact on how the field of computer science is viewed currently. 

I find it crazy how big a part algorithms play in our lives, in areas like recruiting, hiring, and mortgage lending. What’s even more dangerous is that the people developing these algorithms can make them biased against certain groups.

There is an interesting link between this reading and our previous reading by Catherine D'Ignazio and Lauren Klein. James Bridle interviews several speakers who mention how prejudice and bias are built into various technologies and communities. This ties back to the challenging power chapter in a way that amplifies the idea that bias is built into larger societal problems, not just the code of the product.

I found it interesting that so much of what dominates cultural thinking (e.g., gender roles, male-centric thinking, racial bias) is ingrained into algorithms and technology. You would think that it wouldn’t have as much of a presence as it does, because technology doesn’t have characteristics like gender or race. The fact that it does highlights how humanity is reflected in technology.

The incident with Amazon and its prejudiced, outdated CV scanning machine learning protocols particularly struck me — especially the quote “The biggest automation company in the world, still could not automate equality.” With all of our past readings, I’ve really come to realize the extent to which prejudice and biases are ingrained in the technology that we use, as well as how technology today can still unintentionally perpetuate these unfair biases if relied upon too heavily.

I think it’s really interesting to see how much society’s own bias is imprinted onto technology. Even though algorithms seem like something that should not have flaws like this, it seems impossible for them to not contain prejudices.

Thinking back to last reading, it’s surprising to realize how much subjective thinking and societal prejudices influence technology in all aspects, from biased data to gendered electrical plugs. I wonder if it’ll ever be possible to fully mitigate these digital injustices, and how much time that might take us.

Week 13 – Data Feminism Chapter 2 (Cecily)

Reading about data feminism as well as Ruha Benjamin’s terminology: the “New Jim Code” was really eye-opening about the existing biases and harmful stereotypes that inherently manifest themselves in technology, algorithms, and code. I had never thought about how “predictive” software can use past data, which is often affected by historical perspectives and social inequities, to limit the futures of marginalized demographic groups.

After reading this article, I realized the connection between social issues and data/algorithms. With large amounts of data, one can determine whether there was social bias, as shown in the redlining map.

The article was inspiring, especially the highlight of the Local Lotto. I personally used to work at a large tech company, where I led efforts covering concepts that secure power. The work I focused on was identifying and mitigating bias in data sets and algorithms from a technical perspective, although many of the solutions we built were a bandaid on a larger societal problem that surfaced as a result of the matrix of domination existing off the company’s platform. The problems that arose were a result of the past determining the future. As a result, I entirely understand and value the importance of challenging power. Challenging the power dynamic that results from the matrix of domination seems to be the start of a solution that will then permeate up the chain toward securing power for others, although it remains unclear how to do this at scale. Maybe it will take as long to dismantle, piece by piece and community by community, as it took to create.

This article was incredibly interesting given the way it talked about how important the data was, but also how hard it was to actually collect, since it was something no one had been tracking. I think it is interesting how much “meaningless” data is kept track of, yet how little we track more important data that could help us understand patterns and problems in our society. I thought it was particularly interesting when it discussed the power of who collects the data and what data they collect, in terms of identifying data bias and the reasons why we have a lot of data about certain topics and not others. How do the larger ties between data and societal/social/political problems impact how we visualize data, or how we should?

I found this to be the most engaging reading, highlighting the importance of including real people & communities in data design and broadly touching upon a myriad of oppressive American systems upheld by unjust data. The Local Lotto stood out to me, as it was an example of tackling some of these root issues through education. While I found the data science pedagogy valuable, I was concerned about race and ethnicity aspects not being well integrated into the curriculum, especially in the wake of a current national pushback against important historical instruction and the negative stigma surrounding critical race theory. How can counter-data, which quantifies and visualizes structural oppression, be used to further education on social inequalities and data collection itself?

This report reminds me of when I read about supposedly objective AI technology that could profile criminals or risk factors based on facial recognition, but ended up disproportionately targeting people of color. Even though we tend to view technology and algorithms as inherently unbiased, they will always be trained on biased human data, and it’s incredibly important that we actively work to counter discrimination in tech and also remain critical of data we see.

I loved the example of the Local Lotto. It was such a thoughtful way to take abstract concepts, like data collection, mathematics, and equity, and bring them alive in a kid’s neighborhood through something they interact with and would have the opportunity to see differently given this new lens. I also really appreciated how the authors shared the limitations of the Local Lotto, too, pointing out that the organizers didn’t feel prepared to talk about race and were critiqued as outsiders by the kids. Overall, I loved the amount of research in this piece and how it came from so many different angles — history, philosophy, technology ethics, data science — and how each insight was woven into super engaging stories and anecdotes. Great writing.

I think this article was a really amazing analysis of how algorithms are not neutral and work to uphold structural inequalities in society. I really liked their point that although fighting bias can be helpful, the only way to truly fix this issue is to challenge the root of the problem: current power structures. You can really tell that the authors of this book are passionate about the subject; the chapter is so thorough and pulls on so many different examples. It is a really compelling piece of writing.

This reading emphasized how data visualization can be used as a means to highlight social problems and call attention to a specific cause. I found it really interesting that despite a lack of data, the creators were still able to get their point across through data visualization. A question that came up for me about data visualization was if counterdata should be distinguished from data in projects involving the two.

I’ve definitely heard a lot about biases in machine learning and data from other articles and sources, but they never really offered a way to push the conversation forward. I appreciate how this article discusses ways to begin thinking about solutions, and how to bring attention to these issues in ways that most people can actively engage with.

Week 12 – The Hidden Life of an Amazon User (Cindy)


This reading made me think about how large corporate websites can track user data and gain lots of information by running large volumes of code. It is crazy how they can collect so much user data without the user knowing. Most importantly, large companies benefit from this: they can use ML algorithms to make their products better, profiting just from receiving customer data.

I am inspired by Joana Moll’s data visualizations, which sit at the intersection of art, technical data mining, and environmental justice. She is able to look past the facade that the web presents to us and dig deeper to understand how the seemingly inconspicuous affects us and the world we live in. Her untraditional approach to data viz is making me think differently about our next project and how I should not only present my data but also consider what its effects are.

At first I didn’t realize that you could scroll to view all the scripts running in the background on each page of the website. I had always half-realized that Amazon loaded somewhat slowly, but this visualization opened my eyes to the sheer number of things that go into tracking every single click you make on their site.

This reading made me think about how much the web conceals from users in interactions as simple as buying a book. I’m torn on whether this is a good or bad thing: on the one hand, users would be overloaded if they were presented with all of this information when interacting with a website, but on the other hand, they also have a right to their own privacy. I think this is why it’s important to ensure that tech companies are held responsible, and data visualization is a great tool that can be used to do so.