In 1982, video games were untouchable. They were everywhere. You couldn't walk into a grocery store or a pharmacy without hearing the digital chirps of a Pac-Man machine tucked into a corner. It was a gold rush. But by 1985? The industry was a ghost town. Revenues had plummeted nearly 97 percent, dropping from a peak of roughly $3.2 billion to a measly $100 million. People call it the 1983 video game crash, and if you ask the average person what happened, they’ll probably point a finger at a pixelated alien stuck at the bottom of a pit.
They’re wrong. Well, mostly.
The story of the 1983 video game crash isn't just about a bad movie tie-in buried in a New Mexico landfill. It’s a messy, complicated tale of corporate greed, a flooded market, and the rise of the home computer. It’s a lesson in what happens when a business forgets that quality actually matters.
The "E.T." Scapegoat and the Landfill Myth
Let’s get the big one out of the way. Everyone loves to blame the crash on E.T. the Extra-Terrestrial for the Atari 2600. It’s a great story. Atari paid a fortune for the movie license, crammed development into a staggering five-week window so the game would be ready for Christmas, and then manufactured millions of cartridges, far more than the market could possibly absorb.
It was a disaster. The game was confusing and, frankly, boring.
But E.T. didn't kill the industry alone. It was just the most visible symptom of a much deeper rot. By 1983, the market was absolutely suffocating under a mountain of "shovelware." Because third-party developers like Activision had survived Atari's lawsuit and won the right to make games for the Atari 2600, every company on the planet thought it could make a quick buck. You had Quaker Oats (yes, the oatmeal company) publishing games like Sneak 'n Peek through its U.S. Games subsidiary. Ralston Purina commissioned Chase the Chuck Wagon to sell dog food. It was pure chaos.
There was no "Seal of Quality." There was no barrier to entry. If you had a basement and some coding knowledge, you could ship a game.
Retailers were overwhelmed. They had shelves full of titles that nobody wanted, and because there were so many consoles—the Atari 5200, ColecoVision, Intellivision, Magnavox Odyssey²—the average consumer had no idea what to buy. It was a classic "Paradox of Choice." When everything looks like junk, people stop buying everything.
Why the 1983 video game crash was actually a business failure
Atari was the king, but the king had no clothes. Ray Kassar, the CEO at the time, came from the textile industry. He viewed games as "software" in the most literal, commodity sense. He didn't understand the creative spark required to make something people actually wanted to play.
The numbers are staggering. In 1982, there were maybe 20 different consoles on the market. Imagine trying to explain to your parents today why you need an Xbox, a PlayStation, a Nintendo Switch, and 17 other boxes that all do basically the same thing but worse.
Price wars eventually broke out.
Commodore, led by the aggressive Jack Tramiel, started slashing prices on the Commodore 64. He famously said, "Business is war." He wanted to kill the competition, and he did. When a home computer that can do your taxes and help with homework costs the same as a dedicated "toy" like an Atari, the toy loses every single time.
The PC vs. Console War
- The Computer Advantage: Machines like the Apple II and the C64 offered depth. You could program on them.
- The Console Weakness: They were viewed as static. You bought a box, you played a game, you got bored.
- The Price Point: By late '83, you could get a C64 for around $200. The consoles couldn't compete with that utility.
The role of the "Great Surplus"
By the time the 1983 video game crash hit full force, retailers were desperate. They started "dumping" stock. Games that originally retailed for $35 or $40 were being tossed into bargain bins for $2.
Think about that.
If you are a developer trying to sell a new, high-quality game for $40, but the bin next to you is overflowing with $2 titles, you’re dead in the water. The perceived value of a video game hit zero. This created a death spiral. Developers couldn't make money, so they went bankrupt. Retailers lost money, so they stopped dedicating floor space to games. By 1984, many toy stores had moved video games to the back of the shop, right next to the clearance items that nobody wanted.
It looked like the end. Seriously. Analysts at the time genuinely believed video games were a fad, like Hula Hoops or Pet Rocks. They thought the "video game era" was over.
How Japan saved the industry (and changed the rules)
While the US market was smoldering, a company in Kyoto was watching very, very closely. Nintendo had seen the wreckage of the 1983 video game crash and studied every single lesson that Atari had learned the hard way.
When they brought the Famicom to America as the Nintendo Entertainment System (NES) in 1985, they didn't even call it a video game console. They called it an "Entertainment System." They designed it to look like a VCR. They bundled it with a robot (R.O.B.) to convince toy stores it was a "toy," not a "video game."
But their smartest move? The lockout chip.
Nintendo developed the 10NES lockout system: a "lock" chip inside the console and a matching "key" chip inside every licensed cartridge, each constantly checking the other. If the handshake ever failed, the console held itself in reset and refused to boot. This ended the era of Quaker Oats making video games. Nintendo introduced the "Official Nintendo Seal of Quality." They controlled the inventory. They made sure the market was never flooded again.
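For the curious, here's a rough sketch of that lock-and-key idea in Python. To be clear, this is not the real 10NES program: the actual chips were tiny 4-bit microcontrollers running a proprietary routine, and everything below (the seed, the toy number generator, the cycle count, the function names) is invented purely to illustrate the concept of two chips computing the same secret stream, with the console refusing to start on any disagreement.

```python
from itertools import repeat

# Conceptual sketch of a lock-and-key lockout handshake, in the spirit of
# the 10NES. NOT the real algorithm: the seed, the toy number generator,
# and the cycle count are all made up for illustration.

def keystream(seed: int):
    """Both chips run the same deterministic program from the same seed."""
    state = seed
    while True:
        state = (state * 1103515245 + 12345) & 0xFFFF  # toy generator
        yield state & 0xF  # the chips exchange a few bits per cycle

def console_boots(cartridge_responses, seed: int = 0xA5, cycles: int = 16) -> bool:
    """The console's "lock" chip compares its own stream against the
    cartridge's "key" chip. One mismatch and it holds the CPU in reset."""
    lock = keystream(seed)
    for _, expected, got in zip(range(cycles), lock, cartridge_responses):
        if expected != got:
            return False  # unlicensed cartridge: system never leaves reset
    return True  # streams agree: the game is allowed to run

print(console_boots(keystream(0xA5)))  # licensed cart with the key chip: True
print(console_boots(repeat(0)))        # bootleg with no key chip: False
```

Incidentally, this design cut both ways: a failed handshake kept the system locked in reset whether the cartridge was a bootleg or just a licensed game with dirty contacts, which is why an original NES blinks its power light at you.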
Actionable insights for the modern era
We often think we are past the dangers of the 1983 video game crash, but history has a funny way of repeating itself. Today’s digital storefronts are often flooded with low-effort "asset flips," and the rise of microtransactions has some fans feeling the same fatigue that gamers felt in '82.
If you are a developer or a business owner in the tech space, here is how to avoid the "Atari effect":
- Curation is king. Don't assume that more choice is better for the consumer. Too much noise leads to total disengagement.
- Protect your brand value. Once you let your product become a $2 bargain-bin item, it is incredibly difficult to convince people it is worth a premium price again.
- Hardware is nothing without software. Atari had the name recognition, but they didn't have the "Must Play" titles to sustain the interest once the novelty wore off.
- Watch the platform shifts. Just as the PC disrupted the console in '83, mobile and cloud gaming are shifting the ground today. Stay agile.
The crash wasn't the end of gaming. It was the end of the "wild west" era. It forced the industry to grow up, to implement quality control, and to realize that gamers aren't just consumers—they’re an audience that demands to be respected. Without that $3 billion collapse, we might never have gotten the refined, narrative-driven, and high-quality industry we have today. It was a painful, necessary evolution.