You’re walking down a busy street in Manhattan. Maybe you’re grabbing a coffee or just rushing to catch the L train. You don’t know the person standing across from you, and they don’t know you. At least, that’s how it used to work. Now, if that stranger has the right app, they can snap a photo of your face and, within seconds, see your LinkedIn profile, your Instagram posts from three years ago, and maybe even your home address. This isn't a Black Mirror pitch. It's the reality documented in Kashmir Hill’s investigative masterpiece, Your Face Belongs to Us.
Honestly, the book is a wake-up call that most of us are hitting the snooze button on. It traces the rise of Clearview AI, a scrappy, somewhat mysterious startup that did what Google and Facebook were too afraid—or too ethically constrained—to do. They scraped billions of photos from the public internet to create a face-search engine.
The title of Hill's book, Your Face Belongs to Us, isn't just catchy. It's a literal description of a power shift. For the first time in human history, you can no longer hide in a crowd.
The Scrappy, Secretive Rise of Hoan Ton-That
Clearview AI didn’t start in a shiny Silicon Valley campus. It started with Hoan Ton-That, an Australian-born techie with a knack for making things go viral, and Richard Schwartz, a man with deep ties to Rudy Giuliani’s political circle.
They realized something big.
While tech giants like Google had the capability to build facial recognition, they held back because of the massive privacy backlash they knew would follow. Eric Schmidt, Google’s former CEO, famously said back in 2011 that facial recognition was the one technology Google built and then decided to withhold. Ton-That didn't have those qualms. He basically decided that if a photo was "public" on the internet, it was fair game.
He wrote a scraper. It started crawling.
It didn't just crawl the easy stuff. It went into the corners of Venmo, YouTube, Facebook, and even the "Wayback Machine." By the time the world realized what was happening, Clearview had a database of billions of faces.
Why This Is Different From Your iPhone's FaceID
It's easy to get confused. You use your face to unlock your phone, right? That's 1-to-1 matching: your phone just checks whether the person holding it is the enrolled owner, and the faceprint never leaves the device.
Clearview is 1-to-many.
It takes a probe image and compares it against a massive, ever-growing index of everyone else. Think of it like a Google search, but instead of typing "best pizza in Brooklyn," you’re uploading a grainy CCTV still of a person in a park.
The accuracy is what's truly haunting. Even if you're wearing a hat, or if the photo is twenty years old, the algorithm is often scarily good at finding the match. Early facial recognition systems measured the geometry of the face directly: the distance between the eyes, the bridge of the nose, the contour of the cheekbones. Modern systems use deep neural networks to distill a face into a numerical "faceprint" that captures the same underlying structure. Either way, the structure of a face doesn't change as much as we'd like to think.
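To make the 1-to-many idea concrete, here is a minimal sketch of how a face-search engine ranks matches. It assumes faces have already been converted into numerical embedding vectors (real systems derive these from a neural network; here they are random toy vectors), and compares a probe vector against every entry in the index by cosine similarity. The function names, dimensions, and threshold are illustrative, not Clearview's actual design.

```python
import numpy as np

def cosine_similarities(probe, index):
    """Cosine similarity between one probe vector and each row of an index matrix."""
    probe = probe / np.linalg.norm(probe)
    index = index / np.linalg.norm(index, axis=1, keepdims=True)
    return index @ probe

def search(probe, index, names, threshold=0.8):
    """1-to-many search: return every indexed identity scoring above the threshold,
    best match first."""
    scores = cosine_similarities(probe, index)
    order = np.argsort(scores)[::-1]
    return [(names[i], float(scores[i])) for i in order if scores[i] >= threshold]

# Toy 128-dimensional "faceprints" for 1,000 people. Real embeddings come from
# a trained network; random vectors stand in for them here.
rng = np.random.default_rng(0)
index = rng.normal(size=(1000, 128))
names = [f"person_{i}" for i in range(1000)]

# A probe that is a slightly perturbed copy of person_42's vector, analogous
# to an older or lower-quality photo of the same face.
probe = index[42] + rng.normal(scale=0.05, size=128)

matches = search(probe, index, names)
print(matches)  # person_42 should rank first; unrelated faces fall below the threshold
```

The key property this illustrates is that small perturbations (age, lighting, a grainy camera) barely move the vector, so the true identity still scores far above strangers. Scaling this from 1,000 faces to billions is an engineering problem (approximate nearest-neighbor indexes), not a conceptual one.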
Law Enforcement’s New Addictive Tool
The book Your Face Belongs to Us details how Clearview bypassed the usual bureaucratic channels of selling to the government. Usually, tech companies spend years in "Requests for Proposal" (RFPs) and security audits.
Clearview went rogue.
They gave away free trials to individual police officers. It was a bottom-up strategy. An officer in a small precinct would use it to solve a cold case or a shoplifting incident, and suddenly, the whole department was hooked. They didn't need a warrant to search a database of "public" photos.
Kashmir Hill highlights a case where an officer used it to identify a man who had stolen a sweater. In another instance, it helped identify someone who had been deceased for years. On one hand, it’s a miracle tool for investigators. On the other, it’s a massive dragnet operating without any federal oversight.
There are stories of officers using it on their kids' boyfriends or to identify people at protests. The potential for "mission creep" is basically infinite. If the police can identify everyone at a Black Lives Matter protest or a January 6th riot within minutes, the "chilling effect" on free speech becomes a very real problem.
The Myth of Consent in the Digital Age
"But my profile is private!"
Is it, though? Even if your Facebook is locked down, did a friend ever post a photo of you at a wedding and tag you? Did you appear in the background of someone else’s YouTube vlog?
The core argument in Your Face Belongs to Us is that we have lost control over our biometric data. Unlike a password, you can’t change your face. If your credit card gets hacked, you get a new one. If your face becomes a digital key that links to every stupid thing you did in your twenties, you’re stuck with it forever.
We’ve traded anonymity for convenience.
We wanted to share photos with grandma, and in the process, we gave companies the raw material to build a surveillance state. The weirdest part is that most of this scraping technically violates the Terms of Service (ToS) of sites like Instagram. But Clearview’s stance was basically: "Sue us." And while some have, the data is already out there.
The Global Arms Race for Your Biometrics
While Clearview AI is the American face of this problem, it’s a global phenomenon. In Russia, FindFace allowed people to identify strangers on the street via the social network VKontakte. In China, facial recognition is integrated into everything from paying for groceries to "shaming" jaywalkers on giant public screens.
What makes the story in Your Face Belongs to Us so compelling is the realization that this isn't just about "bad" governments. It’s about the democratization of surveillance.
When a private company holds this power, they can sell it to anyone. They could sell it to authoritarian regimes, or they could sell it to billionaire real estate moguls who want to keep "unwanted" people out of their stadiums (which actually happened at Madison Square Garden).
James Dolan, the owner of MSG, used facial recognition to identify and boot out lawyers who were part of firms suing him. He didn't wait for a law to say he could. He just did it. That's the terrifying "wild west" of biometrics.
Can We Actually Fight Back?
There are a few glimmers of hope, though they feel small compared to the scale of the problem.
- BIPA (Illinois Biometric Information Privacy Act): This is one of the toughest laws in the U.S. It requires companies to get explicit consent before collecting biometric data. It’s why some companies won't even offer certain features to Illinois residents.
- The European Union’s AI Act: The EU is trying to ban "real-time" facial recognition in public spaces, though there are plenty of loopholes for national security.
- Opt-out requests: You can actually go to Clearview’s website and request to see what they have on you, and in some jurisdictions, ask them to "de-index" you. But it's like trying to put toothpaste back in the tube.
The truth is, once the math for these algorithms became widely available, the cat was out of the bag. You don't need a supercomputer to run facial recognition anymore. You just need a decent GPU and a big enough dataset.
The Psychological Toll of Being "Known"
We don't talk enough about what this does to our brains.
There is a specific kind of freedom in being a stranger. It allows for growth, for mistakes, and for the ability to reinvent yourself. If Your Face Belongs to Us teaches us anything, it’s that the "permanent record" teachers used to threaten us with is finally real.
Imagine a world where a first date can scan you under the table and see your entire litigation history or every tweet you’ve ever liked. It creates a society of "performative perfection." If everyone is watching, everyone is acting.
It’s exhausting.
Practical Steps to Protect Your Identity
Look, you can't wear a mask 24/7. And most of those "anti-facial recognition" glasses with infrared lights don't actually work against modern AI. But you can be smarter about your digital footprint.
First, go to your social media settings and turn off "tag suggestions" or facial recognition features. Facebook actually shut down its own internal facial recognition system and deleted a billion faceprints a few years ago because the legal heat was too high, but other platforms are less scrupulous.
Second, find out where your face already appears. Reverse image search engines like PimEyes work a bit like Have I Been Pwned, but for photos instead of passwords. Be warned: searching for yourself on these sites can be a bit of a horror show. You'll see photos you forgot existed. But knowing what's out there is the first step to managing it.
Third, support legislative efforts like the Fourth Amendment Is Not For Sale Act. This aims to stop the government from buying data from brokers (like Clearview) that they would otherwise need a warrant to obtain.
The Reality of a Post-Privacy World
Kashmir Hill’s reporting suggests we are at a tipping point. The era of the "anonymous stranger" is ending. Whether we like it or not, our faces have become our universal ID cards, and we didn't get a say in who gets to scan them.
The title Your Face Belongs to Us serves as a grim prophecy. The "us" in that sentence isn't the public; it’s the small group of engineers and investors who realized that our most personal data was just sitting there, waiting to be harvested.
Actionable Insights for the Average User
- Audit your "Public" Photos: Google yourself and use the "Images" tab. If you see photos from old blogs or defunct social media sites, try to get them taken down. Use the "Remove Outdated Content" tool from Google if the source site is already gone.
- Privacy Settings are Not Enough: Understand that "Friends Only" doesn't stop a scraper if your friend’s account is compromised or if the platform itself has a leak. Assume any photo uploaded to the cloud is potentially permanent.
- Use "Cloaking" Tools: If you are an artist, look into Glaze (developed at the University of Chicago) to protect your work from style mimicry; the same lab's Fawkes tool was designed to do the equivalent for personal photos. These tools add tiny, nearly invisible pixel-level changes that "cloak" images from being correctly read by AI models.
- Demand Transparency: If you walk into a venue and see a sign about facial recognition, ask what the data retention policy is. Most people don't ask. If enough people do, it becomes a PR liability for the company.
- Check Local Laws: Know if your state has biometric protections. If it doesn't, write to your local representatives. This is one of the few areas where bipartisan support for privacy actually exists.
The technology isn't going away. The algorithms are only getting faster, cheaper, and more accurate. The question isn't whether facial recognition exists—it’s who gets to hold the remote control. Right now, that remote is in the hands of a few companies that believe your face is just another piece of raw data to be mined. It’s time we started treating our biometrics like the high-stakes assets they actually are.