For women of color, she's right. Instagram users can choose from over 20 filters, but as subjects, we don't have a choice in how our images are processed once a filter is in place. In the name of enhancing or beautifying our photos, filters inevitably alter our appearances beyond recognition.
People often think of technology as inherently unbiased, but photography has a history of racism. In Technologies of Seeing: Photography, Cinematography and Television, British academic Brian Winston writes, "Colour photography is not bound to be ‘faithful' to the natural world. Choices are made in the development and production of photographic materials." In other words, what you see in a photo is never pure reality—it's the world as someone has chosen to depict it. And for the first hundred or so years of filmmaking, camera technology chose to ignore people of color entirely, leaving photographers' tools with built-in biases.
The way that racism operates aesthetically is to neglect or, in extreme cases, erase whoever is not white. In the 1950s, for example, Kodak measured and calibrated skin tones in still photography using a reference card featuring "Shirley," a white model dressed in high-contrast clothing. Shirley became the standard for image processing in North American photography labs. It didn't matter if the photo in question contained only black people; Shirley's complexion was still treated as the ideal.
Kodak's film was so bad at capturing the different hues and saturations of black skin that when director Jean-Luc Godard was sent on assignment to Mozambique in 1977, he flat-out refused to use Kodak on the grounds that its stock was "racist." Only when the candy and furniture industries began complaining that they couldn't accurately shoot dark chocolate and brown wood furniture did Kodak start to improve its technology.
What was the problem, exactly? London-based artist Adam Broomberg, who co-produced a 2013 show of photos taken with old Polaroid film, explained to The Guardian that black skin absorbs 42% more light than white skin. Picture a photograph of two women, one black and one white. If the photographer adjusts the light so that the black woman doesn't resemble a dark blob with white teeth, the white woman will end up so overexposed that she appears almost blindingly bright.
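Broomberg's figure can be translated into photographic terms with a little arithmetic. The sketch below is illustrative only: it assumes that "absorbs 42% more light" means dark skin reflects roughly 58% as much light back to the camera as white skin, and converts that ratio into exposure stops, the unit photographers actually work in.

```python
import math

# Illustrative assumption from the figure quoted above: if dark skin
# absorbs 42% more light, take it to reflect about 58% as much light
# back to the camera as white skin does.
reflectance_ratio = 0.58

# Exposure is measured in stops; each stop is a doubling or halving
# of the light reaching the film or sensor.
stop_difference = math.log2(1 / reflectance_ratio)

print(f"Exposure gap: about {stop_difference:.1f} stops")
# prints: Exposure gap: about 0.8 stops
```

A film stock calibrated to render one tone correctly will be off by roughly that much on the other, which is why exposing for one subject can bury or blow out the other.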
Fortunately, photographers and cinematographers today are working to better represent people of color. In a BuzzFeed essay, Syreeta McFadden details how she primarily photographs in color, cross-processing slide film in order to subvert the light-dark skin bias. And in a Washington Post article about lighting black skin in the movies, Montré Aza Missouri, an assistant professor of film at Howard University, explains that she teaches her students that "the tools used to make film, the science of it, are not racially neutral." Missouri says there are ways to illuminate the complexity of darker skin tones without over-saturating them, such as opening the camera's aperture one or two stops to let in more of the light reflecting off the subject's skin.
Still, most American film stocks were not built by or for people of color, so what does this mean for Instagram, which prides itself on letting users filter photos so they look as if they were shot on Polaroid or Kodak film? Kodak is now bankrupt, but is Instagram carrying on its predecessors' myopic vision?
To put this to the test, I gathered a few models of various ethnicities to see how the filters affected their skin tones and changed their appearances. When I asked T, a blonde white woman, to send me a selfie, she happened to choose one in which she's wearing a black dress that offers high contrast to her pale skin, much like "Shirley." I applied the Reyes filter, which Instagram describes as bringing a "dusty, vintage look to your moments." When I showed T the side-by-side collage, she said, "Oh, it lightens me more than I realized!" But nothing about the alteration startled her. As a white woman, seeing her skin lightened doesn't carry much cultural baggage or threaten her privileged place within society.
However, when M, an African-American woman, applied the Reyes filter to her original image, the result was astounding. When I asked M what she thought of her filtered image, she replied, "Ew. This is completely white-washed. The colors of my lipstick and dress are very muted, and I look entirely too bright. If someone didn't know me, they could mistake me for being much more fair skinned than I am. I don't like it."
What the Instagram filter fails to grasp is the different shades of brown in and around M's forehead, down her cheeks, and above her top lip. Instead, it erases these shades and mutes them so that the final product is an image that leans more towards whiteness. But then again, isn't that the objective? Instagram did in fact state that the Reyes filter gives "a dusty, vintage look." Since traditional photography used a white woman as the prototype, M's "new" image is a direct reflection of this exaltation of the white body, and the consequent denigration of the black body.
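One way to see why a "dusty, vintage" wash reads so differently on dark skin is with a toy model. Instagram has not published how Reyes works, so the curve and pixel values below are hypothetical; the point is only the arithmetic pattern M describes: a wash that pulls every pixel toward white moves dark tones furthest and flattens distinct brown shades together.

```python
# A toy model of a "fading" filter -- NOT Instagram's actual Reyes
# algorithm -- that pulls every 8-bit channel value a fixed fraction
# of the way toward pure white (255).

def fade_toward_white(value, strength=0.4):
    """Move an 8-bit channel value 40% of the way toward 255."""
    return round(value + (255 - value) * strength)

light_skin = 200               # hypothetical channel value for pale skin
brown_shades = [70, 90, 110]   # hypothetical distinct brown shades

print(fade_toward_white(light_skin))                  # 200 -> 222
print([fade_toward_white(v) for v in brown_shades])   # [144, 156, 168]
```

The pale value shifts by 22, while the darkest brown shifts by 74, more than three times as far, and the spread between the three brown shades shrinks from 40 to 24: distinct tones are compressed toward the same pale value.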
Z, a woman of Haitian descent, appreciated how the filters cleared the dark circles and blemishes on her face, but she wished her skin color had been retained. "Nothing about looking at these pictures surprises me," she said. "I don't really think filters give you much of a choice but to be at least a little lighter."
Lark is another relatively new Instagram filter, meant to "brighten and enhance" photos. True, Lark did brighten the tones of another black woman, J, but the enhancement is debatable. J's skin looks much fairer, with golden undertones, recalling the 2008 L'Oréal ads in which the beauty company was accused of whitewashing Beyoncé, or the 2013 India Arie promo in which the singer was accused of changing her skin color. The same thing happened to D, an Indian-American woman, when she used the Amaro filter. It's hard not to look at both photos and recall the history of American discrimination against those with darker skin, and the long legacy of "lightening" celebrities via Photoshop.
With only five subjects, my experiment might not be conclusive, but it's shown me that when Instagram filters brighten skin tones, those changes have both racial and cultural implications.
So what is the next step? Should we forgo filters altogether? Research suggests it's not that simple. We look to likes and comments on our Instagram photos as proof of interest and engagement in our lives, and according to Georgia Tech and Yahoo! Labs researchers, filtered photos are 21% more likely to be viewed than unfiltered photos, and 45% more likely to receive comments.
When it comes to Instagram, the solution should not be to remove filters altogether but to make them more accommodating to people of color. There's a big difference between a filter that blurs blemishes or other skin aberrations and one that completely changes skin color in the name of "enhancement" or "refinement." After all, what good is it to fine-tune a photo so much that the result is only a shadow of the person in the image?