Yuval Noah Harari has a way of making you feel like a tiny, insignificant ant in the gears of a cosmic machine. If Sapiens was the story of how we became the masters of Earth, his follow-up, Homo Deus: A Brief History of Tomorrow, is the unsettling roadmap of where we go next. It isn't just a book about gadgets. It’s a philosophical autopsy of the human soul in an age where algorithms might know us better than we know ourselves.
We’ve spent thousands of years worrying about famine, plague, and war. For the first time in history, more people die from eating too much than from eating too little. More people die of old age than from infectious diseases. More people commit suicide than are killed by soldiers, terrorists, and criminals combined. Harari argues that since we’ve mostly "solved" these ancient problems, humanity needs new goals. Those goals? Immortality, happiness, and divinity. Basically, we want to upgrade ourselves into gods.
It sounds cool. Until you realize what we might have to give up to get there.
The Death of Liberalism and the Rise of the Algorithm
Most of us grew up believing in the "liberal story." You know the one: the customer is always right, the voter knows best, and beauty is in the eye of the beholder. We’re taught that our free will is the ultimate authority. But Homo Deus: A Brief History of Tomorrow takes a sledgehammer to that idea.
Harari points out that the life sciences increasingly view organisms as biochemical algorithms. Your "gut feeling" is really just millions of neurons calculating probabilities based on past experiences. So if an external algorithm (like the ones used by Google or Meta) can monitor your heart rate, your hormone levels, and your browsing history, it doesn't just react to what you do. It can predict what you'll do before you've even felt the impulse.
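To make the "organisms are algorithms" claim concrete, here's a deliberately crude sketch. It's nothing from the book or from any real platform: every feature name and weight below is invented for illustration. A hand-tuned logistic model scores a handful of biometric and behavioral signals and spits out a probability.

```python
import math

# Toy illustration only: a hand-tuned logistic model standing in for the
# "external algorithm" described above. Features and weights are invented;
# real recommender systems learn millions of parameters from data.

WEIGHTS = {
    "resting_heart_rate": 0.03,    # assumed: arousal nudges impulse decisions
    "hours_since_last_meal": 0.45,
    "late_night": 0.80,            # 1 if it's after 10 p.m., else 0
    "past_orders_this_week": 0.25,
}
BIAS = -3.0

def p_orders_takeout(features: dict[str, float]) -> float:
    """Probability the user orders takeout in the next hour (toy model)."""
    score = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-score))  # logistic squash into [0, 1]

# The system can score you before you've consciously decided anything:
tonight = {
    "resting_heart_rate": 72,
    "hours_since_last_meal": 6,
    "late_night": 1,
    "past_orders_this_week": 3,
}
print(f"P(order) = {p_orders_takeout(tonight):.0%}")  # ~97%
```

Scale that up to millions of learned weights and a live feed of behavioral data, and "it knows before you do" stops sounding like science fiction.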
The scary part? We’re already handing over the keys.
Think about how you navigate a new city. You don't look at the stars or a paper map. You follow the blue line on GPS. If the GPS tells you to turn left into a lake, you might just do it because you’ve stopped trusting your own senses. This shift from "trusting yourself" to "trusting the data" is what Harari calls Dataism. In this new religion, the most important thing is the flow of information. If something isn't recorded or shared, it didn't really happen.
The "Useless Class" and the Biological Divide
One of the most controversial takes in the book involves the job market. We’ve seen automation replace physical labor before. The Industrial Revolution didn't kill jobs; it just moved people from farms to factories. But this time is different. AI isn't just replacing muscles; it’s replacing brains.
Harari suggests we might see the rise of a "useless class." This isn't an insult. It refers to people who are not just unemployed, but unemployable because they lack the skills to compete with AI in any meaningful way. If an AI can diagnose cancer better than a doctor or write code faster than a software engineer, what’s left for the average human?
While the masses might struggle to find purpose, the elite—those with the capital—might use biotechnology to upgrade their DNA. We’re talking about a future where the rich aren't just richer; they are biologically superior. They could live longer, be smarter, and be more resistant to disease. Throughout history, the gap between rich and poor was legal or economic. In the world of Homo Deus: A Brief History of Tomorrow, that gap becomes biological. Once you have a species split where one group is literally "more human" than the other, the liberal idea of equality completely collapses.
Silicon Valley's New Religion
Silicon Valley doesn't just build apps. It builds worldviews. Harari describes Dataism as the first genuinely new religion since the Enlightenment gave us humanism. In traditional religions, God’s plan was the secret to everything. In Dataism, the global flow of data is the new source of authority.
The mantra is simple: record it, upload it, share it. If you go for a run but don't track your heart rate or post it on Strava, did it even count? If you see a beautiful sunset but don't take a photo, did you really experience it?
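You could compress the whole Dataist value system into a few lines of logic. The sketch below is a joke with a point, not anything from the book, and every name in it is invented. Notice where the value lives: in the flow, never in the experience itself.

```python
from dataclasses import dataclass

# Playful sketch of the Dataist value function. All names invented;
# this is satire-as-code, not a real API.

@dataclass
class Experience:
    description: str
    recorded: bool = False
    uploads: int = 0   # copies pushed to platforms
    shares: int = 0    # times other people reshared it

    def dataist_value(self) -> int:
        """Record it, upload it, share it — or it never happened."""
        if not self.recorded:
            return 0  # unrecorded experience: invisible to the system
        return self.uploads + self.shares

run = Experience("10k run at sunrise", recorded=True, uploads=1, shares=42)
sunset = Experience("quiet sunset, no phone")  # never recorded

print(run.dataist_value())     # 43 — the system approves
print(sunset.dataist_value())  # 0 — did it even happen?
```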
We are becoming chips in a giant system that we don't fully understand. We feed the system data, and in return, the system gives us convenience. But the system doesn't care about our individual experiences. It cares about patterns. This is a massive shift from the human-centric world we’ve inhabited for the last few hundred years. Honestly, it’s a bit of a bummer if you value things like "the human spirit" or "mystery."
Why the Predictions Aren't Set in Stone
It's easy to read this book and want to crawl into a hole. But Harari is very clear about one thing: he isn't a fortune teller. He’s a historian. History is the study of how things change, not a script for what must happen. By pointing out these possibilities, he’s giving us a chance to change direction.
If we don't like the idea of a "useless class," we can change how we tax robots or fund education. If we're worried about algorithms controlling our votes, we can regulate how data is used in politics. The future is a set of possibilities, not a single destination.
Critics often point out that Harari oversimplifies complex biological processes. Biologists like PZ Myers have argued that treating humans purely as "algorithms" ignores the messy, non-linear reality of how cells and consciousness actually work. Others argue that Harari underestimates the resilience of human culture and our ability to resist technological encroachment. These are valid points. The book is a provocation, not a textbook.
Practical Steps for Navigating a Data-Driven Future
The world described in Homo Deus: A Brief History of Tomorrow is already arriving in bits and pieces. You don't have to wait for a brain-computer interface to start thinking about your relationship with technology.
- Audit your dependencies. Pay attention to how often you outsource your decision-making to an algorithm. From what you eat (Yelp) to who you date (Tinder), try making a few choices "offline" just to keep your intuition sharp.
- Invest in "Human-Only" skills. AI is great at optimization but currently sucks at genuine empathy, complex ethics, and physical tasks in unpredictable environments. Skills like nursing, high-level negotiation, or artisanal craftsmanship are likely to hold their value longer than data entry or basic analysis.
- Guard your biological data. Your DNA and biometric data are the most valuable assets you own. Be skeptical of "free" services that require you to hand over your genetic profile or constant health monitoring.
- Read the fine print on "upgrades." As we move toward wearable and eventually implantable tech, the line between "healing" and "augmenting" will blur. Always ask what happens to your data when it's being streamed from inside your body.
- Cultivate focus. In a world that wants to fragment your attention into data points, the ability to focus on a single, deep task is becoming a superpower. Practice being "useless" to the algorithm by spending time in ways that can't be tracked or monetized.
The future isn't about a robot uprising with laser guns. It's about a quiet, convenient slide into a world where we are no longer the protagonists of our own stories. Whether that’s a tragedy or a triumph depends entirely on the choices we make today.