A skin-tone bias lingers from the early days of photographic technology

Summary

  • Old orthochromatic photo films had a technical flaw whose handling reflected a socio-cultural context that favoured lighter skin. Today, AI may help resolve a bias that has survived into the digital era of facial recognition.

As an amateur photographer, I have long been interested in the interplay of light while making images, especially of the human face. I had long thought that the medium of film, which I used from the 1980s through the first decade of the 21st century, and the digital sensors I used thereafter were unbiased recorders of light (or the lack of it) and of colours and how they interact with light.

Today’s camera sensors use a Bayer filter (named after its inventor, Bryce Bayer) to register each pixel as a shade of red, green or blue (bit.ly/3SOJJQk). The camera then combines each pixel’s value with those of its neighbours, a process called demosaicing, to produce photographs that represent colour accurately. White, which reflects almost all the light that falls on it, registers as white, and black, which absorbs most of the light, registers as black. The logarithmic midpoint of this tonal range (black to white) is 18% grey, the standard to which most photographic light-meters are calibrated.
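To see why that midpoint is 18% rather than 50%, here is a minimal Python sketch; the reflectance figures (about 90% for bright white, 3.6% for deep black) are common illustrative approximations, not fixed standards:

```python
import math

# Illustrative reflectances: a bright white surface reflects about 90%
# of the light falling on it; a deep black surface about 3.6%.
white = 0.90
black = 0.036

# Human vision responds roughly logarithmically to light, so the
# perceptual midpoint of the range is the geometric mean of the two
# extremes, not their arithmetic average.
middle_grey = math.sqrt(white * black)

print(f"Perceptual middle grey: {middle_grey:.0%}")  # -> 18%
```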

I have written before about the problems of facial recognition technology, especially with respect to intentional biases (such as those perpetrated by China’s government on its own citizens) as well as unintentional ones that have found their way into it. These problems proved so intractable that in 2020, three American Big Tech companies announced that they would pull back their facial recognition programmes. Amazon, IBM and Microsoft all said that they would either cancel their initiatives or place moratoriums on police departments’ use of their algorithms to identify people.

Many observers welcomed the move at the time. As the author of an article in Forbes (bit.ly/3UsVugl) put it: “I don’t think we can overstate the importance of IBM, Microsoft and Amazon and their roles in influencing other tech companies to take a stronger stand on human rights and anti-discrimination…. Let’s hope other tech companies start screening their technology through similar human rights and anti-discrimination glasses and follow their lead.”

A review of that technology was prompted by a startling set of findings. In mid-2018, for instance, Rekognition, Amazon’s open application programming interface (API) for facial recognition, made news with its results on a test run by the American Civil Liberties Union (ACLU). The test scanned the faces of all 535 members of the US Congress against 25,000 public mug-shots (of arrested people and/or criminals). None of the members of Congress was in that set of photographs, yet Amazon’s system threw up 28 false matches, with obvious implications. At the time, Amazon said that the ACLU’s tests were run at the system’s default confidence threshold of 80%, and not at the 95% it recommends for law enforcement, where false identification can have serious consequences. Such nuanced arguments fell by the wayside: Rekognition and others like it were withdrawn for a while.
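To see why that threshold matters, consider a small, entirely hypothetical sketch in Python; the scores below are invented and this is not Rekognition’s actual API, but it shows how lowering the cut-off from 95% to 80% lets through many more spurious matches:

```python
# Hypothetical similarity scores (0-100) that a face-matching system
# might assign when comparing scanned faces against a mug-shot database.
# In the ACLU scenario, none of the people scanned was actually in the
# database, so every "match" reported below is a false positive.
scores = [72, 81, 84, 88, 91, 96]

def reported_matches(scores, threshold):
    """Return the scores the system would report as matches."""
    return [s for s in scores if s >= threshold]

print(reported_matches(scores, 80))  # [81, 84, 88, 91, 96] -- five false matches
print(reported_matches(scores, 95))  # [96] -- one false match
```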

My impression was that these biases were due to faulty data sets fed to facial recognition systems and, in the case of racial discrimination, programmer bias against non-Caucasians. To me, the camera, whether film or digital, was an impartial observer that simply recorded the truth. But I was startled to find recently that I was wrong. Richard Dyer, in his 1997 book White (Routledge), includes an essay on how the development of photographic film and early lighting techniques contributed to this bias right from the early days of photography.

Initially, photographic films were orthochromatic: sensitive to blue and green light but blind to red. Since “white” skin isn’t really white but a form of pink, this lighter tone of red was not reproduced faithfully on orthochromatic film. Early photographers found ways to correct for this, including heavy make-up on their human subjects and carbon arc lights instead of natural light, which made fair complexions appear “whiter.” Darker skins received no such compensation and appeared unnaturally dark, lacking detail.

Dyer highlights how these early photographic biases were not merely technical flaws, but were embedded in a socio-cultural context that favoured lighter skin. He also attributes a colonial bias to it, with non-Caucasians seen as “the other.” This was perpetuated by film development processes that catered to Caucasian subjects: “Shirley cards,” used by photo labs to calibrate skin tones, featured a Caucasian woman to set standards for colour balance and contrast, and did not account for a diverse range of skin tones.

The implication is clear: Photographic technology was developed with a specific demographic slice in mind.

Orthochromatic film was in general use from roughly 1873 till about 1906, when panchromatic black-and-white film (which could also register red light) was invented. It took some years before panchromatic films took hold and the partially red-blind version became a thing of the past in black-and-white photography. The bias crept back in during the 1960s, when colour film came into widespread use but the standards for its product development were again focused on the skin tones of Caucasian subjects.

Dealing with a bias this insidious is no small task. Merely stopping work on facial recognition is not the answer. We need some combination of regulation, data-set cleaning and inclusivity-oriented technological advances to eliminate this bias (hopefully with the help of Artificial Intelligence).
