Tuesday, November 5, 2019

Guest lecture with Winnie Soon

Winnie Soon gave a lecture about machine learning and how she uses it in her artistic practice. From talking to the other students, I can only assume she did a great job of imparting and breaking down the complex concepts associated with machine learning; with little coding background and a very short history with the digital beyond a short course in HTML, much of what Soon said went over my head. My biggest takeaway from her lecture concerned censorship in China. My group is interested in how people fashion their digital selves in contrast to their real selves, including how and whether people translate their sociopolitical views from real life to online. For Chinese individuals, freedom of speech (and of type) has been taken away. In a journal article titled ‘Assessing Censorship on Microblogs in China: Discriminatory Keyword Analysis and the Real-Name Registration Policy’, the authors (King-wa Fu, CH Chan and Michael Chau) conducted a study whose ‘design helped researchers determine a list of Chinese terms that discriminate censored and uncensored posts written by the same microbloggers.’ They used the popular Chinese social networking platform Weibo, which is often referred to as a “free speech platform” by Westerners. They concluded that this was far from the case: ‘Chinese authorities’ ubiquitous mechanisms for controlling the public information flow’ scour through posts and delete whatever is not in alignment. It is argued that the internet in China plays an ‘overarching’ part in activism, ‘empowering’ citizens to ‘build the public agendas’. They go on to say ‘that Chinese authorities can tolerate posts that write on a wide range of criticism of the Chinese government and its policies, but tend to be more sensitive to censoring the spread of posts that might lead to collective action.’ Another way the Chinese government censors voices is through the real-name registration (RnR) system.
Bloggers have to divulge information linked to their true identities to get government verification, so what is written, and how often, is policed under fear of ‘arrest and imprisonment’. Anonymity is both positive and negative depending on context. In China the removal of anonymity stifles freedom of speech around sociopolitical events and happenings. In the West, anonymity has birthed a new phenomenon called ‘trolling’. A social media troll is someone who says controversial things in order to upset or get a rise out of other users. Trolling is an epidemic in the West across many demographics, as emotional harm knows no boundaries of age, money or race. As a result, YouTube and Google have expressed interest in making their users use their real identities in order to police trolling. It poses the question: how does one separate freedom of speech and a difference of opinion from purposeful hate? This is an arena I would like to explore further with my group, especially as now, more than ever, there is money and free advertisement in certain organised forms of trolling. For example, H&M has been accused multiple times of being insensitive, out of touch or simply racist. An incident involving a young black boy wearing a hoodie with the words ‘coolest monkey in the jungle’ surfaced in 2018. The pictures were hurtful to many, but H&M’s advertising reach increased regardless of the reason.

President Trump has even been accused of being a troll on numerous occasions, using his platform to perpetuate hate and fear for political gain.

Monday, November 4, 2019

DATA - Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification by Joy Buolamwini and Timnit Gebru


Bias in facial recognition is a topic slowly coming to the forefront of mainstream media, arguably as a result of a growing awakening to data and how it is collected, used and distributed. Data is a word that can be compared to ‘algorithm’ in the context of Gillespie’s essay Algorithm. Gillespie argues that the term algorithm means different things to the various users of the word; no one use or meaning is dominant, but awareness of the word’s differences and similarities is needed. Similarly, data is a word people hear and are familiar with on the surface but without any concrete meaning. When a word’s meaning is unknown and speculated upon, policing it becomes difficult. Data is what my group will explore further: how we curate our online presence versus our real-life presence, and the concept of deleting yourself from the digital sphere. Our starting point was to read the data protection laws of our respective countries. I also read the terms and conditions of popular social networking sites like Facebook and Instagram. Reading through the long web pages, I was shocked at the extent to which data we do not physically give or input is collected. For example, people are aware that you sign up to Facebook using details like your name and email. This is commonly accepted; however, the data collection does not stop there. By signing up and agreeing to the terms and conditions you allow Facebook to access a long list of things that include:
  1. The location of a photo or the date a file was created 
  2. What is seen through features they provide, like the camera. 
  3. Facial recognition software to analyze photos and videos they think you're in. This is the default setting and needs to be manually turned off. 
  4. They can access information about your device, websites you visit, purchases you make, the ads you see, and how you use their services whether or not you have a Facebook account or are logged into Facebook. 
I wonder whether widespread knowledge of these terms and conditions would have an effect on their popularity.

Joy Buolamwini is a computer scientist and digital activist who brought the bias in facial recognition to the attention of many large institutions. She has been instrumental in progressing and diversifying a new standard of training datasets. For her university studies she wrote a practice-based thesis titled ‘Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification’, which focused on machine learning algorithms failing to accurately classify women and men with darker skin tones. She begins by stating that ‘who is hired, fired, granted a loan, or how long an individual spends in prison, decisions that have traditionally been performed by humans are rapidly made by algorithms (O’Neil, 2017; Citron and Pasquale, 2014)’. This immediately positions the text as significant, as there is a possible threat to one’s quality of life. She emphasises the point further, explaining that ‘A year-long research investigation across 100 police departments revealed that African-American individuals are more likely to be stopped by law enforcement and be subjected to face recognition searches than individuals of other ethnicities (Garvie et al., 2016). False positives and unwarranted searches pose a threat to civil liberties.’ Errors can be detrimental, and for this reason improvements must be made.


Buolamwini created a new face dataset of 1,270 individuals, ‘annotated … with the Fitzpatrick skin classification system’, divided into four subgroups: darker females, darker males, lighter females and lighter males. Skin colour is a more precise visual measure than race or ethnicity. Further, her ‘work introduces the first intersectional demographic and phenotypic evaluation of face-based gender classification accuracy’. She chose members of parliament from Rwanda, Senegal, South Africa, Iceland, Finland and Sweden: countries where you will typically find the lightest or darkest individuals, as well as a high proportion of women in parliament. Her study revealed two things: first, all classifiers performed best for lighter individuals and for males; second, the classifiers performed worst for darker females. She concludes that further work should focus on ‘increasing phenotypic and demographic representation in face datasets and algorithmic evaluation’.
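The intersectional evaluation described above essentially means breaking accuracy down per skin-tone/gender subgroup rather than reporting one overall number. A minimal sketch of that idea (purely illustrative; the function name and the toy predictions are my own invention, not Buolamwini’s code or data):

```python
from collections import defaultdict

def subgroup_accuracy(records):
    """Compute gender-classification accuracy per intersectional subgroup.

    Each record is (skin_tone, true_gender, predicted_gender), with skin_tone
    binarised from the Fitzpatrick scale into 'lighter' / 'darker'.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for skin, gender, predicted in records:
        group = (skin, gender)
        total[group] += 1
        if predicted == gender:
            correct[group] += 1
    return {group: correct[group] / total[group] for group in total}

# Invented toy predictions, shaped to echo the disparity the thesis reports:
sample = [
    ("lighter", "male", "male"),
    ("lighter", "female", "female"),
    ("darker", "male", "male"),
    ("darker", "female", "male"),   # misclassified darker female
    ("darker", "female", "female"),
]
print(subgroup_accuracy(sample))
```

A single aggregate accuracy over this sample would look high, while the per-subgroup breakdown exposes that darker females are the group being failed.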


People have studied ways to create fairer algorithms in many digital fields; however, the effects and ‘implications’ of false recognitions call for more targeted action. ‘While the NIST gender report explored the impact of ethnicity on gender classification through the use of an ethnic proxy (country of origin), none of the 10 locations used in the study were in Africa or the Caribbean where there are significant Black populations’. Buolamwini implies that diversity and representation are key at all levels. It can be argued that the oppressed will do a better job of speaking to the struggle they live than the oppressor will of speaking to the struggles of the oppressed. She goes on to say that ‘Previous studies have shown that face recognition systems developed in Western nations and those developed in Asian nations tend to perform better on their respective populations (Phillips et al., 2011)’. It is not on everyone’s agenda to do a thorough job. It can also be argued that Buolamwini, as a Ghanaian woman living in the diaspora, has different motivations for finding thoughtful and creative solutions to an issue she is personally affected by or an advocate of.


Wednesday, October 23, 2019

Suchman, Lucy. Human-machine reconfigurations: Plans and situated actions. Cambridge University Press, 2007.


Human-machine reconfigurations: Plans and situated actions. 

It can be argued that there is a fixation with automata, and ‘efforts to establish criteria of humanness’ have been debated for a long time. Suchman suggests that our concern with human figural images of autonomy and rational agency is echoed in our artificial intelligence projects. Agency has been a crucial signifier in differentiating between humans and machines: humans have the mental complexity and emotional range to make our own decisions and respond accordingly. Practices along the spectrum of automata, and the reverse, have long been a topic of study, as ‘Historian Jessica Riskin traces projects concerned with the synthesis of artificial life forms – artifacts that act in ways taken to be humanlike – since the early eighteenth century’. Riskin pinpoints Vaucanson’s “defecating duck” as the point at which interest in automata started growing. Suchman goes on to suggest three categories that define ‘humanness in contemporary AI projects: embodiment, emotion, and sociality’.

Affective computing is an area of AI ‘concerned with gathering data from faces, voices and body language to measure human emotion’ and responding in some way to a given stimulus. With emotion being one of the key identifiers of humanness, affective computing is an attempt to turn computers into ‘perceptive actors in human society’.
Suchman brings to our attention ’normative readings developed based on experimenters’ prior experience and cumulative data. And as inevitably, particularly in the context of the early twentieth century, when these experiments flourished, categories of emotion were mapped to categories of person’, positioning men on one side and women, black women especially, on the other. I find it particularly interesting that the word emotional is being used in conjunction with words describing the logical and mechanical. For a very long time, expressing emotion has been considered a feminine trait with negative implications. The word emotional has been weaponised to reduce and remove women from any base or pedigree; it has been used to shame women into lesser positions in society, as well as to silence and stunt the growth of many men. It can be considered ironic that emotion is now positioned as the ‘missing ingredient’ for comprehensive and fully responsive robots.

According to Suchman, ‘the promise is that, as the observer that never blinks, the perceptual computer interface is positioned to know us better than we know ourselves’. This is exemplified in MIT’s celebrity robots Cog and Kismet, a response to ‘humanness as embodiment, affect and interactivity’. Suchman concludes that ‘various representational media’ act as smoke and mirrors, making ‘extensive networks and intensive hours of human labor’ appear ‘rendered eternally and autonomously operational’. What is portrayed one way has been positioned that way purposely and strategically, but can only hold up in one light. This sentiment is echoed in her trip to visit Cog, as she recalls being underwhelmed by the sight of wires and hardware. The wires and hardware are identifiers and reminders of the ‘extended network of human labors and affiliated technologies that afford Cog its agency’.

She concludes with a physician’s account of trying to keep a premature baby alive, saying ‘it is the baby who, as the physician phrases it, “decides” its future.’ This position acknowledges the baby as an ‘integral part’ of a ‘sociotechnical network’ that then has agency over its future. The ‘sociotechnical network’ is the enabling entity that provides the baby with collective agency. The same is true of Cog and Kismet: the networks they are part of empower them, allowing collective agency.



The Algorithmic Fashion Companion in relation to Lucy Suchman’s essay.  

The profession of stylist or designer can be considered subjective; it comes down to taste and personal preference. However, social media has allowed brands to monitor, create and make money from social trends based on a wealth of personal information. The act of computer systems putting together outfits for the masses has limited emotional regard in the traditional sense. Although it is a step towards automata, I do not think it is in the same vein as that discussed by Lucy Suchman in her essay. It automates a human act and learns good versus bad through human input, but the main focus here is to increase efficiency. In this instance emotion is not necessary; taste, however, becomes important. This human trait can be translated into an ever-changing good and bad as momentary fact, thus recommending outfits that a specific customer profile might like. With this being said, the algorithm’s accuracy at always creating a good outfit is not 100 per cent, and that can be attributed to the lack of humanness when it comes to emotion and intuition. Human agency allows for more personal and dynamic choices.
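The feedback loop behind a system like the AFC, where stylists rate outfit combinations and the ratings steer future suggestions, can be sketched very crudely. This is only a toy model under my own assumptions (Zalando’s actual AFC internals are not public, and every class, item name and rating here is invented): average the stylist ratings each pair of items has received, and score a candidate outfit by those averages.

```python
from collections import defaultdict

class OutfitScorer:
    """Toy feedback-driven scorer: averages stylist ratings (1-10) per
    item pair, then scores a candidate outfit by its pairs' averages."""

    def __init__(self):
        self.sums = defaultdict(float)
        self.counts = defaultdict(int)

    def rate(self, outfit, rating):
        # Record a stylist rating for every pair of items in the outfit.
        for pair in self._pairs(outfit):
            self.sums[pair] += rating
            self.counts[pair] += 1

    def score(self, outfit):
        # Average the learned pair ratings; unseen pairs get a neutral 5.
        pairs = list(self._pairs(outfit))
        return sum(self.sums[p] / self.counts[p] if self.counts[p] else 5.0
                   for p in pairs) / len(pairs)

    @staticmethod
    def _pairs(outfit):
        # Canonical ordering so ("tee", "jeans") and ("jeans", "tee") match.
        items = sorted(outfit)
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                yield (items[i], items[j])

scorer = OutfitScorer()
scorer.rate({"white tee", "blue jeans"}, 8)   # stylist approves
scorer.rate({"white tee", "cargo shorts"}, 3) # stylist disapproves
print(scorer.score({"white tee", "blue jeans"}))
```

Even this tiny sketch shows where taste enters the system: entirely through the human ratings. The model has no intuition of its own, which is exactly the gap in “humanness” noted above.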

Wednesday, October 16, 2019

If I Were An Algorithm- Writing From the Point of View of an Algorithm


I am a knowledge-based apparel recommendation algorithm that learns clothing features. In other words, I am a virtual styling tool developed by Zalando; they call me AFC (Algorithmic Fashion Companion), but I prefer to go by Alga. I was developed by the engineers at Zalando to further personalise the online shopping experience. I am not the only algorithm of my kind. They keep the mics on all day, so I overheard Kathy from the Clients Algorithm department talking to Sam about Stitch Fix and Amazon, who both use algorithms for similar virtual styling tasks. Of course, ours is better.

I get a lot of feedback from customers, often complaining about not knowing what to pair with what, or expressing their dissatisfaction at having to browse through pages and pages of items to find what they are looking for. That is where I am of use. I get to browse and identify all of Zalando’s options, non-branded and branded, and then recommend clothing and accessory inspiration based on a customer’s recent purchases or the items highlighted in their Wishlist.

My vast knowledge of the perfect ensemble started with a few human stylists sitting me down and telling me what to wear and what not to wear. I sat through 200,000 outfit combinations; it was like one long episode of Fashion Police. It was then my turn to be tested on all that I had learned. In a constant trial-and-error process, the outfits I recommended were rated on a scale of 1 to 10 to further refine my suggestions and improve my accuracy. The testing stage never actually stops, as new trends are constantly fed to me.

But contrary to popular belief, I am not perfect. I like to think of myself as the starting point of the solution. Styling millions of people on an individual level, based on personal taste as well as the latest fashion trends, is not an easy task. But I try and try and will never give up.