Penn Art of Web F21 Reading Reflections
Please write a comment, reflection, or question about the reading under the appropriate section (1 to 2 sentences max).

Week 14b – New Ways of Seeing, Episode 3 – Digital Justice – James Bridle (Becky)

I think people often assume that AI and technology are unbiased and will treat all individuals equally, but forget that behind every algorithm or piece of technology are humans who design and create it. The topics brought up in this podcast are very important for all designers, engineers, etc. to keep in mind as we move more and more toward a technology-powered world.

The most surprising/novel part of the podcast for me was the fact that electrical plugs are gendered. I hadn't known that this was the case before and appreciate how Blas critiqued this aspect of society in his work. 

The most intriguing information to me from the podcast was the example of Amazon’s algorithm used to filter and assess resumes and cover letters. I found it interesting how the smallest difference in language could shut women out of high-level jobs. If society becomes more and more dependent on AI technology, could that technology ever be contextual without being biased, and without the need for human intervention?

I do not find the amount of bias in AI, and in technology in general, particularly surprising, because, as mentioned in the previous reading, all of it is built on a structure that is biased and unjust at its core. It is incredibly concerning, however, that these unjust technologies can play such a prevalent role in our lives and could impact them so greatly. It just makes me think about how important it is that we correct these things.

I find it ironic (and super cool) that women were the original “computers” in the 1950s and 1960s and contributed much of the early code to IBM’s mainframes and NASA’s space missions. It makes me wonder what technology today would look like if men had never dominated the tech field. Would AI and technology still be biased towards men? Or would it find another way to exclude women and minorities?

I think the most dangerous part of this digital injustice is that, as tech has become such a big part of society and of our daily lives, we often encounter biased technology without even noticing. Thinking about how these biased technologies can affect and shape the way we think, I believe it is important for all engineers and designers to be mindful and to create a more inclusive and equal culture in technology.

The example of a new AI created with data from women reminds me of another class I’m taking, DSGN 300. In that class, we go back through the history of things and address their biases in order to redesign them for the future. That said, I don’t think the risk of bias in AI will prevent us from becoming more reliant on it in the future; it seems like something we will come to accept and live with, just like any other design that excludes groups of people, intentionally or not.

Again I am fascinated and disgusted by the blatantly discriminatory methods on which technological advancements are based and by which technological practices are transformed. Yesterday in my other design class, we were discussing the book Design Justice, and the presenter shared that while he was interning at a tech company, someone approached his team asking for an algorithm to differentiate between races (maybe for screening resumes?). So it’s important to acknowledge that these practices are still happening, and that we (both consumers and producers) must be active in recognizing and fighting against them.

Learning about algorithms biased against women, I was interested in exploring the philosophy of big tech: to what extent do tech companies think about what they are doing and what justifies it? In particular, with the example of Amazon’s software to help companies identify the most promising job candidates, which penalized resumes that included the word “women’s” and downgraded graduates of all-women’s colleges, I wonder whether Amazon anticipated these biases or attempted to account for them while developing the algorithm, and who designed it.
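To make this mechanism concrete, here is a minimal, hypothetical sketch in Python — invented toy data, and in no way Amazon’s actual system — of how a classifier trained on biased historical hiring outcomes can learn to penalize a single word:

```python
# A hypothetical sketch, not Amazon's actual system: a classifier trained on
# biased historical hiring outcomes learns a negative weight for one word.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented toy "historical" resumes and past outcomes (1 = hired, 0 = rejected).
# The bias lives in the labels, not in the learning code.
resumes = [
    "captain of chess club, software engineer",
    "captain of women's chess club, software engineer",
    "data analyst, hackathon winner",
    "data analyst, women's coding society, hackathon winner",
]
hired = [1, 0, 1, 0]

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight for the token "women" comes out negative: otherwise
# identical resumes are scored lower just for containing that word.
idx = vec.vocabulary_["women"]
print(model.coef_[0][idx])
```

The learning code itself is neutral; the discrimination is inherited entirely from the historical decisions used as training labels, which is exactly why “the algorithm did it” is never a complete answer.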

I think this podcast, like other work we’ve seen, highlights the fact that big tech will need to be held accountable for its influence on society at some point. Right now, it seems that while there is some awareness of how big tech can negatively impact people (ex. Facebook), there is no concrete regulation holding these companies in check. As time goes on, I believe such regulations will be necessary to fight the kind of digital injustice that occurs in technology.

In addition to the biases being programmed into our technology, I think that another point this podcast really drives home is how much bias exists in our naming conventions and the language we use to describe the world. After listening, I googled “gender changers” and got this as one Google result: "Gender changers are devices that change the end of a cable into another type, allowing two cable assemblies with the same or different genders to mate.” Did it have to be worded this way? Why do they have to “mate” specifically? The language really makes a difference.

Week 14 – Data Feminism – Catherine D'Ignazio and Lauren Klein (Lindsay)

I really appreciated how D’Ignazio and Klein connected a wide range of structural issues that affect minorities to our everyday technologies. The reading reminded me of the fairly recent facial recognition initiative being pushed by police departments in the United Kingdom. In this surveillance tactic, cameras stationed around major cities are used to catch or track people wanted for crimes. The only issue is that over 3,000 people have been wrongly identified by these cameras, and people of color are misidentified at higher rates than white people. https://bigbrotherwatch.org.uk/campaigns/stop-facial-recognition/
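To see what “misidentified at higher rates” means in practice, here is a tiny sketch with invented numbers (not the UK deployment’s real statistics) comparing per-group error rates:

```python
# Invented illustration numbers, not the UK deployment's real statistics:
# compare what share of "matches" in each group were actually wrong.
flagged = {"group_a": 200, "group_b": 200}     # people flagged as matches
false_flags = {"group_a": 40, "group_b": 130}  # flagged but not on any watchlist

for group, total in flagged.items():
    rate = false_flags[group] / total
    print(f"{group}: {rate:.0%} of matches were false")
# A system can look accurate "overall" while failing one group far more often.
```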

The idea that data visualizations can sometimes construct a biased and inaccurate narrative of certain demographics reminded me of a conversation we had in DSGN 300 about photography. Like the creators of data visualizations, photographers face the challenge of ensuring they portray their subjects in an unbiased manner and with full context to avoid contributing to deficit narratives.

The use of data visualization to present counterdata, while taking into account where the data comes from, which data is chosen to be represented, and who the creators behind the visualizations are, reminded me of the sociologist W.E.B. Du Bois’s visualizations of the Black experience in America. These graphs and charts were created from census data to show how Black Americans were able to overcome and persevere despite the United States’s long history of slavery.

This semester I learned a bit about racism in technology and also the complex history of how our society is constructed on the basis of discrimination. I was shocked by how unaware I was of this issue, and it is clear to me now how vigilant we must be in gathering, analyzing, and presenting data (and also in how we critically view these presentations). It is refreshing to read about people who are fighting against modern problems, such as automated systems and the refusal to validate “invisible” experiences.

It’s always so shocking to think about how deeply injustice penetrates every aspect of our everyday lives. As designers, it’s really important to dive deeper and realize how every decision has the ability to create large negative impacts.

It is really interesting to connect design to ideas surrounding social justice and feminism. This demonstrated to me how difficult it is to present truly unbiased data, and the reading leaves me wondering how we as a society can overcome this and make data visualization, and design in general, less biased and discriminatory.

As we become more reliant on algorithms and AI, it’s important to understand that they aren’t objective; they perpetuate the opinions held by their creators. Reading about how we can teach this awareness and actively counter it through “data feminism” was very inspiring, and I hope it really does leave an impact on the next generation.

This reading reminds me of a discussion I had in another class: though seemingly objective, even maps that delineate geographic or territorial boundaries (ex. states) are a means of reinforcing power inequalities over minority groups such as Native peoples (who may not share this concept of ownership over land). Examples like these serve as an important reminder to be critical of our data, not only with regard to the information being presented but also with regard to the implicit biases involved in the data collection itself.

D’Ignazio and Klein’s reading reinforced ideas and provided examples that I had learned about previously. It reminded me of a data algorithm used in the housing market that identifies neighborhoods with cheap housing prices where property can be bought up. Penn has used this open-source tool before, and the algorithm has helped enable gentrification. From the reading, though, my question is: can counterdata lead to policy or a code of ethics that prevents further biases and discrimination?

This reading reminded me of a lot of similar discussions we have had this semester about the ethical implications of technology and whether algorithms can be objective if they have human creators behind them. I think this issue will continue to persist as big data and algorithmic approaches to problems become increasingly popular, and I feel it’s something designers need to be aware of at the intersection of data and tech.

This reading reminded me of the discussion we had last week about data viz. Having learned that a data visualization usually carries a message or a story for its users/readers, and that it is often impossible to present fully objective, unbiased data, this reading is another reminder of the importance of being mindful of what story the data is meant to tell, where it comes from, and who created it.

I think there are a lot of really significant points discussed in this reading. The main thing that stuck with me is that we need to go beyond the surface in order to incite meaningful change in society. Of course it is important to combat bias, but if we think about deconstructing the whole system that created these biases and injustices, we will have something substantially more meaningful. It seems daunting that this is what’s required, because I’m not really sure how we as a society can effectively go about it, but regardless it is worth trying.

Week 11 – What a Line Can Say – Investigating the Semiotic Potential of the Connecting Line in Data Visualizations (Jihu)

The line is a graphic tool that feels so intuitive and seems so basic, yet, as Verena Lechner describes, it carries many meanings and serves many functions. The elaboration on the narrative power lines hold was fascinating to read about, especially because one point emphasized in this class is the development of a story through design.
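A quick sketch of that narrative power: the Python/matplotlib snippet below (made-up data) plots the same five points twice, and simply adding a connecting line turns isolated measurements into an implied trend:

```python
# Made-up data: the same points read very differently with and without a line.
import matplotlib.pyplot as plt

x = [1, 2, 3, 4, 5]
y = [3, 7, 4, 8, 6]

fig, (left, right) = plt.subplots(1, 2, figsize=(8, 3))
left.scatter(x, y)            # discrete points: no implied relationship
left.set_title("Points only")
right.plot(x, y, marker="o")  # the connecting line implies continuity and trend
right.set_title("Points joined by a line")
plt.tight_layout()
plt.show()
```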