Andrew Nakamura/Daily.

65. The atomic number of the element terbium. The number of known Euler’s idoneal numbers. The traditional age for retirement in America. And the grade my small chin, wide interocular distance and asymmetrical face received on an online beauty calculator that produces a score from 0-100% based on the measurements of the user’s face. The user uploads a headshot and plots various coordinates on the face to measure the size of each feature and the distances between them. Once the algorithm has obtained enough data to outline each facial feature, it produces a score, along with a list of pros and cons for the face. I tried the website out of curiosity, and amid the long list of cons, one flaw stood out to me: “wide nose.”
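The landmark-based scoring the site describes can be sketched in a few lines. Everything below is a hypothetical illustration, not the site’s actual formula: the landmark names, the 0.80 cutoff and the 20-point penalty are all invented for the example.

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) landmark coordinates."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def score_face(landmarks):
    """Toy beauty-calculator sketch: compare a facial ratio to a fixed
    'ideal' cutoff and flag deviations as cons. The cutoff and the
    per-flaw penalty are made-up values for illustration."""
    interocular = dist(landmarks["left_eye"], landmarks["right_eye"])
    nose_width = dist(landmarks["nose_left"], landmarks["nose_right"])
    cons = []
    # The bias enters here: whoever picks this cutoff decides
    # which noses count as "wide."
    if nose_width / interocular > 0.80:
        cons.append("wide nose")
    # Score drops a fixed amount per flagged "flaw" (toy rule).
    return max(0, 100 - 20 * len(cons)), cons

# Hypothetical plotted coordinates from an uploaded headshot.
landmarks = {
    "left_eye": (120, 140), "right_eye": (200, 140),
    "nose_left": (130, 200), "nose_right": (200, 200),
}
score, cons = score_face(landmarks)  # ratio 70/80 = 0.875 -> flagged
```

The point of the sketch is that the “complex mathematical calculation” reduces to measuring distances and comparing ratios against thresholds someone chose.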

More than hurt or outraged, I was confused, because it had never occurred to me that my nose was a negative physical trait. What makes a wide nose flawed? Who gets to decide whether one is flattering or not? Nose shapes are often used as examples to illustrate the divide between the natural features of people of color and people who fit the Eurocentric beauty standard because, overall, ethnic noses are wider and flatter than European noses. By identifying features prevalent in people of color as ugly, the calculator exposes itself as software that upholds racial biases and Eurocentric beauty standards. To confirm my suspicion, I ran a few more tests through the calculator using stock images of men, each of a different race. Sure enough, every image except the white man’s yielded a “wide nose” result.

Interestingly, Googling the term “ethnic noses” yields a plethora of resources regarding corrective plastic surgery, whereas the search term “European nose” produces far fewer articles related to surgery. These websites portray ethnic noses as too large, too wide or too flat, and therefore in need of correction. I would never blame someone for getting plastic surgery to feel beautiful in this soul-crushing society. However, I take immense issue with the pressure placed on people of color to alter their appearance in order to feel beautiful in accordance with obsolete Eurocentric beauty standards.

As I returned to the homepage after repeating the test multiple times, a stream of tiny text at the bottom of the site caught my eye: “Results are based on complex mathematical calculations performed by a blind computer beauty calculator and could be incorrect.” The framing of the algorithm’s process as “complex mathematical calculations” seems to lend it a sense of authority, as if the computer’s results were purely objective and thus indisputable. Although studies have shown that humans already trust algorithms more than their fellow humans, computers are not always suited to make subjective decisions. Machines do not exist in a vacuum; facial recognition algorithms are trained on data fed to them by the humans developing the software. Whether or not it was intentional, the calculator upholds Eurocentric beauty standards as the ideal by flagging wide noses as a flaw. There are plenty of other examples of racial bias in facial recognition technology, such as artificial intelligence’s misidentification of Asian faces and Twitter’s cropping algorithm favoring white faces. Unless we carefully confront our unconscious biases, they will infiltrate our AI, further harming people of color as technology becomes ever more important in our society.
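The training-data point can be made concrete with a toy example: if an algorithm derives its “normal” range for a measurement from a sample dominated by one group, faces outside that group get flagged disproportionately. The ratios below are invented numbers for illustration only.

```python
import statistics

# Hypothetical nose-width-to-interocular ratios in a training set
# drawn overwhelmingly from a single population.
training_ratios = [0.68, 0.70, 0.71, 0.72, 0.73, 0.74, 0.75]

# The software learns "normal" as mean + 2 standard deviations
# of whatever data its developers happened to feed it.
mean = statistics.mean(training_ratios)
cutoff = mean + 2 * statistics.stdev(training_ratios)

def flags_wide_nose(ratio):
    """Faces resembling the training set pass; ratios common in
    underrepresented groups fall outside the learned range and
    get labeled a flaw."""
    return ratio > cutoff
```

Nothing in the math is malicious; the bias lives entirely in which faces were counted as data in the first place.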

The sheer volume of racism pervading our technology stems from the lack of diversity in the computer science field. Many companies in Silicon Valley admit that they “have a lot of work to do” in terms of diversifying their staff, but are they really doing the work? Companies like Google promote Black History Month for engagement, yet Black employees make up only 3.7% of Google’s global workforce. The lack of Black, brown and Indigenous employees limits the number of connections available to underrepresented minorities, especially in a career path that relies on referrals. While white executives may vocally support people of color, they fail to use their immense power to open up more opportunities for BIPOC engineers. Our technology is more than capable of conceptualizing and constructing facial structures, yet it becomes a manifestation of its programmers’ conscious or unconscious biases. Until computer science spaces actively work to include underrepresented minorities, racist algorithms will always permeate our technology right under our noses.

MiC Columnist Andrew Nakamura can be contacted at