The Bold Glamour TikTok filter (TikTok/joannajkenny)

AI TikTok filters are causing errors in our brains

The uncanny new generation of filters uses machine learning GAN technology to regenerate every pixel on a face

A version of this article was published 27 April 2023

If you’ve been on TikTok recently, you might have come face-to-face with a scarily yassified version of yourself. The Bold Glamour TikTok filter, which has now been used in 46.8 million videos, is a heavy contour make-up filter that changes in response to who’s using it, and never glitches. It is just one example of the new generation of filters which use AI and machine learning to be eerily lifelike and undetectable. Others include the Teenage Look filter, which de-ages users, and the Lite Foundation filter, which removes moles, beauty marks and acne spots, giving people an unnervingly unblemished facade. 

Unlike cookie-cutter AR filters that overlay a one-size-fits-all effect onto a face, the technology believed to be behind these newer filters – Generative Adversarial Networks (GANs) – is a deep learning technique that regenerates every pixel on a face based on a dataset of images. (TikTok’s parent company, ByteDance, hasn’t revealed the exact technology behind these new filters, but many filter creators, technologists and users have speculated that this is what powers them.) People who have used Bold Glamour noticed it changed in response to different hair colours, applied more or less make-up based on how masculine or feminine it deemed one’s bone structure, and adjusted in real time depending on whether or not they were hiding their eyebrows.
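
For readers curious about the mechanics, below is a minimal sketch of the adversarial training loop that defines a GAN, in PyTorch-style Python. It is purely illustrative of the general technique speculated about above – not ByteDance’s implementation – and the model sizes, learning rates and the stand-in “dataset” are all assumptions made for the example.

```python
# Illustrative only: a toy GAN training loop (not TikTok's/ByteDance's code).
# A generator learns to produce images that a discriminator can't tell apart
# from real ones; applied to faces, this is how every output pixel ends up
# being regenerated rather than overlaid.
import torch
import torch.nn as nn

latent_dim, img_dim = 64, 32 * 32 * 3  # assumed sizes for the sketch

generator = nn.Sequential(            # maps random noise -> fake image pixels
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, img_dim), nn.Tanh(),
)
discriminator = nn.Sequential(        # maps image pixels -> "is this real?" score
    nn.Linear(img_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

real_batch = torch.rand(16, img_dim) * 2 - 1  # stand-in for a dataset of face images

for step in range(100):
    # 1) Train the discriminator to separate real images from generated ones.
    fake_batch = generator(torch.randn(16, latent_dim)).detach()
    d_loss = loss_fn(discriminator(real_batch), torch.ones(16, 1)) + \
             loss_fn(discriminator(fake_batch), torch.zeros(16, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to fool the discriminator.
    fake_batch = generator(torch.randn(16, latent_dim))
    g_loss = loss_fn(discriminator(fake_batch), torch.ones(16, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

The key point is that the generator’s output is built from scratch, pixel by pixel, which is why such filters can track a face rather than sitting on top of it like a sticker.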

“AR filters are like watching an old movie with outdated special effects – you know it’s not real,” says Daniela Gustafson, 27, an Atlanta, Georgia-based beauty enthusiast, who recently posted a TikTok demonstrating how the Lite Foundation filter removes her mole, pixel by pixel. “But just as special effects improved in film, so have the filters on TikTok.” Since Snapchat first introduced AR filters in 2015, we’ve seen them evolve from puppy filters and flower crowns to something more insidious. Many set a new unreachable beauty standard, perpetuating colourism and Eurocentric ideals. The result has been unprecedented surges in plastic surgery and cosmetic procedures, and a subsequent global mental health crisis. Experts warn that the realistic nature of this new generation of filters poses other threats, which will only compound current issues.

Early education specialist Rory Gascoigne tells me over the phone that, according to a new theory in neuroscience known as predictive processing, our brains make sense of the world by predicting what we will see and then updating these predictions as a situation changes. Our brains build predictive models about everything: as children, for example, we build predictive models about the laws of physics by bouncing a ball against the ground. This allows us to interact with the world around us successfully and to anticipate future occurrences from patterns.
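
In pseudocode terms, predictive processing boils down to a simple feedback loop: hold an estimate, compare it with what the senses report, and nudge the estimate by a fraction of the prediction error. The toy Python below is a generic textbook-style illustration of that loop, not Gascoigne’s model; the learning rate and numbers are invented for the example.

```python
# Toy illustration of predictive processing: an internal estimate is repeatedly
# corrected by a fraction of the prediction error (observation - prediction).
def update_belief(belief: float, observation: float, learning_rate: float = 0.1) -> float:
    prediction_error = observation - belief
    return belief + learning_rate * prediction_error

belief = 0.0                                # internal model, e.g. "how the ball bounces"
observations = [1.0, 1.0, 1.0, 1.0, 1.0]    # repeated sensory evidence

for obs in observations:
    error = obs - belief                    # prediction error before updating
    belief = update_belief(belief, obs)
    print(f"prediction error = {error:+.3f}, updated belief = {belief:.3f}")

# Over time the belief converges on whatever is repeatedly observed -- which is
# the article's point: a filtered face, seen often enough, starts to feel "real".
```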

“The brain’s primary focus is to avoid prediction errors,” he explains. When prediction errors occur, we struggle to comprehend the difference between reality and our perception of reality. “It’s very similar to how eating disorders manifest. Someone may have an internal model that sees themselves as fat. So they look in the mirror, and that's what they see, even though they may be anorexic because the brain is able to gloss over the details that your mind doesn’t want to see.”

Similarly, if you spend enough time using filters like Bold Glamour, you may begin to develop a predictive model of yourself where your brain thinks you look like that. When you see your actual face in real life or in photos, your brain will experience a prediction error that isn’t just viscerally uncomfortable but can lead to mental health issues, such as dysmorphia or depression. “Your brain is constantly engaged in the process of creating feedback loops with the environment where you build a model of reality,” he explains. “So if you start having an idea of yourself that shifts your reality in one direction or the other, you begin to behave in ways that seek validation for that reality.” That might include only ever sharing photos of yourself using filters or going as far as emulating your filtered self in real life. 

Gascoigne points out that, before filters, people would bring photos of celebrities into a plastic surgeon’s office. Now they bring in filtered images of themselves. “This happens because they’re starting to build this internal model that states that this is the only valid way to look. When they see themselves without the filter, they’re getting further and further and further away from something that matches their reality.”

He isn’t sure what the ultimate result of this will be but says “the extent to which these filters are used by young people at a time in their life when they are first building these models of self-identity is worrying. If they’re corrupted to think that they should look like a filter, that’s setting them up for a difficult time in life.” This cognitive dissonance between what we think we should look like and what we actually look like leaves people stuck in a loop of hopelessness and feelings of failure.

TikTok creator Laura Gouillon leverages her background in computer science, creative direction and film to create viral filters. She has made over 100 effects on TikTok, garnering over 11.4 billion views worldwide. Gouillon is excited by the technological advancements we’re witnessing and doesn’t think the technology itself is necessarily dangerous. “I think the challenge for existing and future social media platforms will be providing context on the presence of photo and video manipulation,” she says, “in an effort to reduce the potential negative psychological impact on users.”

But are disclaimers about who is using a filter good enough? France seems to hope so: this month the country’s lower chamber passed a bill to make it mandatory for influencers to label when images use filters or Photoshop. In a Dazed article, however, beauty culture critic Jessica DeFino posited that transparency isn’t a net positive, pointing to a study which found that Photoshop disclaimers on advertising and marketing images were ineffective.

At the end of March, leaders in AI raised grave concerns about the rate at which AI is developing. In an open letter with more than 1,100 signatories – including Elon Musk, Apple co-founder Steve Wozniak, and Skype co-founder Jaan Tallinn – they called for a moratorium on state-of-the-art AI development. They warned that “AI systems with human-competitive intelligence can pose profound risks to society and humanity”.

“I am worried that with such great advances in machine learning filters, we might be headed towards a proliferation of deep fakes where users will be able to ‘wear’ the digital face of another person in a hyper-realistic way, which would make it harder to combat catfishing and disinformation,” says Maria Thuý Hiên Than, a filter creator, creative technologist and activist.

Despite these fears, however, Gascoigne assures me that there is some hope. “The AI itself is neutral. How we use it as a social or cultural artefact causes the problems.” Thus to use it properly, we must first destroy our biases, conquer patriarchal beliefs about what and who is beautiful, and eradicate the white supremacy etched into the use cases of our technology. 

For Than, the solution to combating this is simpler than ridding the world of prejudice – it’s human creativity. “I want to see the crazy shit,” she tells me. “I know some people are already using generative AI to look at more creative, ethereal and imaginative applications of AI filters; I want to see more of that. The more tech advances, the more technical boundaries we’re breaking, pushing us to think outside of the box visually – and that’s what is so exciting about all this.”
