Culture

AI algorithm scores are only worsening our obsession with beauty

Facial assessment tools rely on exploiting some of our deepest vulnerabilities, all in the name of letting us buy our way to beauty.


Facial analysis tools are all over the place. These apps claim to give an artificial intelligence-enabled "assessment" of your face without sending you to a dermatologist. In practice, they hand out beauty scores based on opaque algorithms riddled with biased notions of race, age, weight, and other attributes.

MIT Technology Review’s deep dive into the world of beauty scores shows how obsessively tools like Face++ and companies like Qoves Studios are assessing people's “nasolabial folds,” under-eye contour depressions, discoloration, puffy eyelids, jawline sharpness, and other oddly specific details to give them a beauty score.

After preying on a user's anxieties, these tools give friendly recommendations about "surgical intervention" and other cosmetic "solutions" for which there are doubtless kickbacks. It’s a noxious mix of surveillance, prejudice, and narrowed notions of what “beautiful” looks like. In other words, all of the worst of the beauty-industrial complex, with an extra dose of capitalism.

Beauty scores in a beauty-obsessed culture — According to the MIT Technology Review report, facial assessment tools run on convolutional neural networks. These models are trained on sample data of people’s faces and designed to record skin texture and discoloration, bone structure, symmetry, blemishes, whether there are dark circles under the eyes, and much more. Then they prescribe market “solutions” like “anti-aging” creams and even surgical procedures ranging from budget-friendly to high-end options for “problems” that, in reality, are innocuous and natural phenomena that often come down to simple aging or genetics.

In some instances, the assessments extend from people’s faces to their arms, legs, fingers, and even their eyelashes. A model’s determination of attractiveness rests on the data used to train it and on human ratings of that data. That means the training absorbs the biases and prejudices of the people doing the rating, which then surface in the “scores” given to users.

Because they are trained on sample data skewed toward lighter skin tones and European features, the MIT report found, beauty tools like Face++ rank women with darker complexions and wider noses as less attractive than women with Eurocentric features, lighter complexions, and smaller noses.
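The mechanism behind that finding can be illustrated with a toy model. The sketch below is pure Python with entirely made-up features and ratings, and a trivial nearest-neighbor regressor standing in for a real convolutional network; it shows only the general principle that a model fit to biased human labels reproduces that bias, not any vendor’s actual system.

```python
import math

# Each hypothetical "face" is a feature vector: (skin_tone, nose_width),
# both scaled 0..1. Our imagined training raters systematically scored
# lighter skin and narrower noses (lower values) higher -- the bias
# lives entirely in these labels, not in the faces themselves.
training_faces = [
    ((0.1, 0.2), 9.0),
    ((0.2, 0.3), 8.5),
    ((0.8, 0.7), 4.0),
    ((0.9, 0.8), 3.5),
]

def predict_score(face):
    """1-nearest-neighbor regressor standing in for a CNN:
    return the human rating of the most similar training example."""
    nearest = min(training_faces, key=lambda item: math.dist(item[0], face))
    return nearest[1]

# A new face near the highly rated cluster inherits a high score,
# and one near the low-rated cluster inherits a low one -- purely
# because of what the biased labels encoded.
print(predict_score((0.15, 0.25)))
print(predict_score((0.85, 0.75)))
```

However the architecture is swapped out, the lesson is the same: there is no objective notion of beauty inside the model, only a compressed copy of its raters’ preferences.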

Old dog, new tricks — None of this is to say that the obsession with beauty didn't exist before these AI tools showed up in the market and on social media. For example, in 2020, TikTok owner ByteDance came under fire for a memo calling to suppress videos featuring people who were deemed “ugly” because of their age, weight, or skin texture. But these scores use algorithms to exacerbate an existing problem that already burdens countless people around the world with unrealistic expectations for — and standards of — attractiveness.

Scholars have written about this fixation and warned about what it does to people’s self-esteem and mental health, especially teenagers’, but none of it seems to persuade the engineers behind such technology to weigh its social impact. By exploiting people’s deepest insecurities, the beauty score industry hides ugly cruelty under the guise of “self-improvement,” further incentivized by the promise of lucrative referral or affiliate fees for the companies peddling these rating mechanisms.