Tags: Artificial Intelligence, Big Data, Coloniality of the Algorithm, Surveillance, Taxonomies

A few observations about Amazon being urged not to sell facial recognition tool to police

May 23, 2018


From AP News:

Amazon’s decision to market a powerful face recognition tool to police is alarming privacy advocates, who say the tech giant’s reach could vastly accelerate a dystopian future in which camera-equipped officers can identify and track people in real time, whether they’re involved in crimes or not.

What is Amazon Rekognition Image?

Rekognition Image is a deep learning powered image recognition service that detects objects, scenes, and faces; extracts text; recognizes celebrities; and identifies inappropriate content in images. It also allows you to search and compare faces. Rekognition Image is based on the same proven, highly scalable, deep learning technology developed by Amazon’s computer vision scientists to analyze billions of images daily for Prime Photos.
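To make concrete what is actually being sold here, below is a minimal sketch of what calling this service looks like from the outside through boto3, Amazon’s Python SDK. The file names and similarity threshold are hypothetical; this illustrates the public API, not how any police deployment would actually be wired up:

```python
import boto3

# Rekognition is exposed as a regular AWS API; any customer with an account can call it.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("face.jpg", "rb") as f:        # hypothetical probe photo
    probe_bytes = f.read()
with open("watchlist.jpg", "rb") as f:   # hypothetical reference photo
    reference_bytes = f.read()

# Detect faces and the attributes Rekognition assigns to them (gender, age range, etc.).
faces = rekognition.detect_faces(Image={"Bytes": probe_bytes}, Attributes=["ALL"])
for face in faces["FaceDetails"]:
    print(face["Gender"], face["AgeRange"], face["Confidence"])

# Compare the probe face against the reference image and return similarity scores.
matches = rekognition.compare_faces(
    SourceImage={"Bytes": probe_bytes},
    TargetImage={"Bytes": reference_bytes},
    SimilarityThreshold=80,  # hypothetical threshold
)
for match in matches["FaceMatches"]:
    print(match["Similarity"], match["Face"]["BoundingBox"])
```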

So, let’s get this straight: Amazon trained this algorithm with customers’ photos (Amazon provides a “free” photo storage service to Prime customers) and now they are planning to sell this technology to the police. I often talk about consent in relation to technology, so here are some issues:

  • customers must have consented (through an EULA, probably) to have their photos used to improve the algorithm
  • customers did not, however, consent to having technology built from their photos as material resources (ie the training material through which the algorithm learns) sold to the police
  • people whose family members, friends, acquaintances etc took photos and uploaded them to Prime did not consent to having their image used to train these algorithms
  • people whose photos were taken in public spaces and eventually uploaded to Prime did not consent to having their image used for corporate profit (further reading, my essay from 2016: Private Internet, Public Streets)
  • Amazon will obviously profit from this sale but, as is customary, the people who provided the resources for the training will not see a dime of this profit
  • Amazon will not necessarily consent to scrutiny of its algorithm to inspect how it handles facial recognition in relation to race
  • people whose photos were stored and who might object to having them used as part of the surveillance apparatus will have no say in the final sale

Non-consensual data extractivism as the basis of surveillance structures.

Also interesting is this tidbit from Amazon Rekognition’s page:

Rekognition Image enables you to detect explicit and suggestive content so that you can filter images based on your application requirements. Rekognition provides a hierarchical list of labels with confidence scores to enable fine-grained control over what images you want to allow.

This is especially notable in the context of the precedent of algorithms automatically banning all female nudity regardless of context. Who created Amazon’s “hierarchies”? (A taxonomy by any other name, etc.)
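Just to make that “hierarchical list of labels” concrete, here is a minimal sketch of what the moderation call returns through the same boto3 SDK (the file name and confidence cut-off are hypothetical). Each label comes back with a confidence score and a ParentName, i.e. the slot Amazon’s own taxonomy has assigned it:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical image
    image_bytes = f.read()

# Ask Rekognition which of its moderation categories it thinks the image falls under.
response = rekognition.detect_moderation_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # hypothetical cut-off
)

# Each label carries a confidence score and a ParentName: the parent is the
# higher-level category in Amazon's own taxonomy of "inappropriate" content.
for label in response["ModerationLabels"]:
    parent = label["ParentName"] or "(top level)"
    print(f'{parent} > {label["Name"]}: {label["Confidence"]:.1f}%')
```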

I keep going back to the issue of taxonomies because they are foundational to the idea of algorithms, and in this case I am particularly interested in how the taxonomies have been built in regards to race, gender, body language, etc., especially given the possibility of police using this application to determine who is a potential criminal. Has this algorithm been “taught” what a woman looks like by being trained with stereotypical images of cis women? Has this algorithm been trained to recognize darker skin and, if so, in what context? With the levels of police violence directed at Black people, what safeguards did Amazon take to prevent this tool from being used in a way that negatively impacts this community?

A couple of past reflections for further context:

This thread: “this idea of Big Data as ‘a mythology’ (in the Haraway sense). What does this mythology say about relationships of power and domination in regards to this ‘right to name, designate, categorize etc’ and who wields it over us?”

UPDATE: Amazon has released a statement: “Imagine if customers couldn’t buy a computer because it was possible to use that computer for illegal purposes? Like any of our AWS services, we require our customers to comply with the law and be responsible when using Amazon Rekognition.”
