You’ve seen them. The eyes that look a little too sparkly, the skin that’s suspiciously poreless, or maybe that one stray earring that melts into a cheek like some sort of glitchy Dali painting. When you visit This Person Does Not Exist, you aren't looking at a database of headshots. You are looking at a mathematical hallucination.
It’s weird. Honestly, it’s a bit haunting.
The site launched back in early 2019, and it basically broke the internet for a week. We weren't used to AI being that good at faces. Software engineer Phillip Wang created the site to show off what a specific type of AI—the Generative Adversarial Network—could actually do. He didn't build the algorithm himself; he used the StyleGAN code released by NVIDIA. But by putting it behind a simple "refresh" button, he turned a complex research paper into a cultural moment.
The "Two AIs Fighting" Trick
Most people think the site just mashes photos together. It doesn't. There is no library of noses and eyes sitting on a server somewhere in California. Instead, you have two neural networks locked in a constant, digital boxing match.
Ian Goodfellow, then a PhD student at the Université de Montréal (he later worked at Google and Apple), pioneered the GAN concept in a 2014 paper. Think of it like an art forger and a detective. The "Generator" tries to create a face from pure random noise. At first, it's just a blob of gray pixels. It's terrible. The "Discriminator" is the critic. It looks at the fake, compares it against a massive dataset of real humans—specifically the Flickr-Faces-HQ dataset—and tells the Generator, "Nope, that's fake. Humans don't have three nostrils."
The Generator goes back to the drawing board. It tries again. Millions of times. Eventually, the Generator gets so good at mimicking the statistical patterns of a human face that the Discriminator can't tell the difference anymore. That’s when you get the image on your screen.
It’s math. Just layers and layers of linear algebra and probability.
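If you want to see that forger-and-critic loop as actual math, here's a minimal sketch. This is a toy, not anything from the site's real codebase: the "face" is a single number, the real data is a bell curve centered at 4.0, the Generator is a two-parameter linear model, and the Discriminator is a logistic classifier. Every name and hyperparameter below is made up for illustration; the gradients are worked out by hand because the models are tiny.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# The "real" distribution the forger must imitate: a bell curve at 4.0.
def sample_real(n):
    return rng.normal(4.0, 1.25, n)

# Generator ("forger"): g(z) = a*z + b, fed pure noise z ~ N(0, 1).
a, b = 1.0, 0.0
# Discriminator ("critic"): D(x) = sigmoid(w*x + c), probability x is real.
w, c = 0.0, 0.0

lr, batch = 0.02, 64
for step in range(4000):
    z = rng.normal(size=batch)
    fake = a * z + b
    real = sample_real(batch)

    # Critic update: binary cross-entropy pushing D(real) -> 1, D(fake) -> 0.
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    w -= lr * (np.mean((d_real - 1.0) * real) + np.mean(d_fake * fake))
    c -= lr * (np.mean(d_real - 1.0) + np.mean(d_fake))

    # Forger update: minimize -log D(fake), i.e. try to fool the critic.
    d_fake = sigmoid(w * fake + c)
    dloss_dfake = -(1.0 - d_fake) * w   # chain rule through the critic
    a -= lr * np.mean(dloss_dfake * z)  # fake = a*z + b
    b -= lr * np.mean(dloss_dfake)

samples = a * rng.normal(size=10_000) + b
print(f"fake samples: mean={samples.mean():.2f}, std={samples.std():.2f}")
```

After training, the forger's output distribution drifts toward the real one, even though the forger never sees a single real sample directly—only the critic's gradient. StyleGAN plays exactly this game, just with millions of parameters and images instead of single numbers.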
Spotting the Glitches in the Matrix
Even though the tech has improved, these "people" still have tells. The AI is great at the "main" features—eyes, nose, mouth—but it struggles with the stuff it considers background noise.
Check the ears. Often, one ear will have a stud and the other will have a hoop, or the earlobe will just... vanish. It’s because the AI doesn't understand "symmetry" as a rule; it just knows that ears usually go on the sides of heads. If you look at the background, you’ll see what people call "space demons." These are blurry, distorted shapes that look like half-formed people or melting furniture.
The hair is another giveaway. While the strands look real, they often don't follow the laws of physics. A lock of hair might sprout directly out of a forehead or merge into a sweater. Glasses are also a nightmare for the AI. The frames might be thick on the left side and paper-thin on the right, or they might not actually wrap around the ear.
Why This Actually Matters for the Real World
This isn't just a fun toy for making fake Tinder profiles. The implications are actually pretty heavy. On the positive side, game developers and digital artists use this tech to create crowds of background characters without having to hire 5,000 extras or spend months modeling unique faces. It saves a massive amount of money.
But there’s a dark side.
We've already seen "non-existent" faces used for bot accounts on LinkedIn and X (formerly Twitter). In 2019, an AP report highlighted a profile for "Katie Jones," who claimed to work at a top think tank. She didn't exist. Her face was AI-generated. She was being used for a suspected espionage effort to connect with high-level political figures. When anyone can generate a believable human face in 0.1 seconds, the "eye test" for verifying who you're talking to online becomes totally useless.
Then there’s the bias issue. Because the AI learns from a dataset (Flickr-Faces-HQ), it inherits all the biases of that data. If, say, 70% of the dataset is white people, the AI is going to be much better at generating white faces than faces of color. Researchers like Joy Buolamwini have spent years sounding the alarm about exactly this. If these generated faces are used to train facial recognition software, and that software is biased, we end up with real-world problems in policing and security.
The Tech Behind the Curtain
The specific model used by This Person Does Not Exist is StyleGAN2. NVIDIA researchers Tero Karras and his team figured out how to separate the "styles" of a face. They realized they could control the "coarse" features (pose, face shape), "middle" features (facial features, eyes), and "fine" features (skin color, hair texture) independently.
If you want to see the math, it involves a mapping network that takes a latent code $z$ and transforms it into an intermediate latent space $W$. Decoupling the two spaces disentangles the features, which helps the AI avoid the "average face" problem where everything collapses into a blurry composite. Instead, it creates sharp, distinct identities.
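Here's a minimal sketch of that mapping network idea, with random, untrained weights standing in for the learned parameters. The dimensions (512-d latents, 8 fully connected layers, leaky-ReLU activations) follow the StyleGAN paper; everything else is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

Z_DIM, W_DIM, LAYERS = 512, 512, 8  # sizes per the StyleGAN paper

def normalize(z):
    # StyleGAN normalizes the latent z before the mapping network.
    return z / np.sqrt(np.mean(z * z, axis=-1, keepdims=True) + 1e-8)

# Random weights stand in for the learned 8-layer MLP  f: Z -> W.
weights = [rng.normal(0.0, np.sqrt(2.0 / Z_DIM), (Z_DIM, W_DIM))
           for _ in range(LAYERS)]

def mapping_network(z):
    x = normalize(z)
    for W_layer in weights:
        h = x @ W_layer
        x = np.where(h > 0.0, h, 0.2 * h)  # leaky ReLU, slope 0.2
    return x  # the intermediate latent w

z = rng.normal(size=(1, Z_DIM))
w_latent = mapping_network(z)
print(w_latent.shape)
# In the full model, w is fed (via learned affine "style" transforms) into
# every synthesis layer: early layers set coarse traits like pose and face
# shape, later layers set fine ones like skin tone and hair texture.
```

The point of the detour through $W$ is that the network is free to warp the neat Gaussian $z$ space into whatever shape actually matches the tangled statistics of real faces, which is what makes the coarse/middle/fine style controls possible.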
Actionable Steps for Navigating an AI-Generated World
The reality is that we are moving into an era where "seeing is believing" is a dead concept. You have to be more skeptical than your parents were. Here is how you can actually handle this:
1. Check the edges. If you suspect a profile photo is fake, don't look at the eyes. Look at the hairline and the background. If the background looks like a psychedelic oil painting or the hair blends into the neck, it’s a GAN.
2. Use reverse image search. While This Person Does Not Exist creates unique images, many scammers reuse the same "good" ones. Tools like PimEyes or Google Lens can sometimes catch these if they’ve been scraped and used elsewhere.
3. Test the "context." AI-generated faces are static. If you’re talking to someone and you’re suspicious, ask them to take a photo holding a specific object or making a specific weird face. The AI can't do that in real-time—yet.
4. Understand the copyright. Interestingly, images from This Person Does Not Exist are widely treated as free to use: under current U.S. Copyright Office guidance, works generated without human authorship can't be copyrighted at all. If you need a face for a presentation or a mock-up, they're a safer bet than lifting a real person's photo from Instagram.
The tech is only going to get faster. We’ve already moved from static images to "This Video Does Not Exist" (using Sora or Kling) and "This Voice Does Not Exist." The uncanny valley is getting shallower every day, and eventually these fakes will climb out of it entirely.