If you walked into a Sears or a J.C. Penney in late 1982, you would have seen a wall of colorful boxes and joysticks. It looked like a gold mine. Atari was the fastest-growing company in American history. People were obsessed. But by 1983, those same stores were clearing out their inventory for five dollars a pop. The video game crash of 1983 wasn't just a bad quarter for business; it was a total systemic failure that nearly wiped gaming off the map.
It’s easy to look back and blame a single bad game about a cute alien. That’s the "popular" history. But honestly? The real story is way messier and involves a lot of corporate greed, a flooded market, and a complete lack of quality control.
The Myth of E.T. and the Reality of Too Much Trash
Let's get this out of the way. Everyone talks about E.T. the Extra-Terrestrial on the Atari 2600. They talk about the New Mexico landfill. They talk about Howard Scott Warshaw, the programmer who had to crunch the game out in five weeks. And yeah, that game was a disaster. It was buggy, frustrating, and ended up being a literal symbol of failure. But E.T. didn’t kill the industry by itself. It was just the straw that broke the camel’s back.
The real problem was "shovelware."
Back then, there were no gatekeepers. If you had the money to manufacture a cartridge, you could put out a game. Companies that had absolutely no business making software started jumping in. Purina, the dog food company, made a game called Chase the Chuck Wagon. Quaker Oats had a gaming division. Imagine buying a game today made by a cereal brand and expecting it to be Elden Ring. It was nonsense.
Because there were hundreds of terrible games clogging the shelves, parents stopped trusting the brands. They’d spend $40—which was a lot of money in 1983—on a game that turned out to be unplayable garbage. The trust was gone. Once you lose the parents' wallets, you lose the kids' attention.
How the Market Actually Collapsed
Between 1983 and 1985, revenue for the North American home console market dropped by roughly 97 percent. Think about that number. It went from a $3.2 billion industry to around $100 million in just a couple of years. It was a bloodbath.
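If you want to check that percentage against the article's own figures ($3.2 billion down to about $100 million), the arithmetic is straightforward:

$$\frac{3.2 - 0.1}{3.2} \approx 0.969 \approx 97\%$$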
Companies like Magnavox, Coleco, and Mattel (who made the Intellivision) were bleeding cash. They couldn't give these consoles away. The "Great Videogame Glut," as Creative Computing magazine called it at the time, meant that there were more games than people who actually wanted to play them. Retailers were furious. They had no shelf space left for other toys because they were stuck with thousands of copies of Pac-Man ports that didn't even look like Pac-Man.
The PC Threat
While the consoles were dying, home computers like the Commodore 64 and the Apple II were rising.
"Why buy a toy that only plays games," the advertisements asked, "when you can buy a computer that helps your kid with homework and plays games?" It was a killer argument. Computers had more memory, better graphics, and—crucially—you could pirate the games easily. This made the $200 Atari look like an expensive paperweight.
The video game crash of 1983 was basically a perfect storm where the technology was being outpaced and the content was becoming worthless at the same time.
Nintendo Saved the Day (But With a Catch)
Most people think gaming just stayed dead until the NES came out in 1985. Not quite. It stayed dead in America. In Japan, the Famicom was doing just fine. But when Nintendo of America's Minoru Arakawa and Howard Lincoln tried to bring that machine to the U.S., American retailers basically laughed at them. They didn't want "video games." Video games were a fad that had ended, like hula hoops or pet rocks.
So, Nintendo got clever.
They didn't call the NES a video game console. They called it an "Entertainment System." They designed it to look like a VCR so it would fit in a living room setup. They even bundled it with R.O.B. the Robot just so they could market it as a "toy."
But the most important thing they did, the thing that prevented a second crash, was the "Official Nintendo Seal of Quality." They locked down the hardware with the 10NES lockout chip. You couldn't just make a game for the NES without their permission. They controlled the supply. They capped how many games a licensee could release in a year (the famous five-title rule). They essentially became the gatekeepers that Atari forgot to be.
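To give a feel for how that hardware lock worked: the console and every licensed cartridge each carried a chip running the same secret algorithm in lockstep, and if their outputs diverged, the console refused to boot. The Python sketch below is a toy model of that idea, not Nintendo's actual 10NES algorithm; the shared-secret hash stream and all the names here (check_stream, Cartridge, Console, LICENSED_SECRET) are illustrative assumptions.

```python
import hashlib

def check_stream(secret: bytes, length: int = 16) -> bytes:
    """Derive a deterministic check stream from a shared secret.
    Purely illustrative; the real 10NES runs a proprietary
    lockstep program on a 4-bit microcontroller."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(secret + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

# Hypothetical shared secret baked into every licensed cartridge's chip.
LICENSED_SECRET = b"nintendo-licensed"

class Cartridge:
    def __init__(self, secret: bytes):
        self.stream = check_stream(secret)

class Console:
    def __init__(self, secret: bytes):
        self.stream = check_stream(secret)

    def boot(self, cart: Cartridge) -> str:
        # Both chips generate the same stream in lockstep; any mismatch
        # and the console holds the CPU in an endless reset loop.
        return "boots" if cart.stream == self.stream else "reset loop"

console = Console(LICENSED_SECRET)
print(console.boot(Cartridge(LICENSED_SECRET)))  # licensed cart -> boots
print(console.boot(Cartridge(b"bootleg")))       # unlicensed   -> reset loop
```

That handshake is what gave the Seal of Quality teeth: without the chip, and the license that came with it, a cartridge simply wouldn't boot.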
Why 1983 Still Matters Today
You see echoes of 1983 all over the modern industry. When you see people complaining about "live service" fatigue or the absolute flood of low-effort assets on Steam, that's the ghost of the '83 crash.
We live in an era where thousands of games are released every month. The difference now is that we have digital storefronts and reviews. In 1983, you just had a box with cool art on it and a prayer that the game inside didn't suck.
The industry survived because it learned that quality matters more than quantity. But as we see massive layoffs and ballooning budgets in the AAA space today, some historians, like Chris Kohler, have pointed out that we might be reaching another "correction" period. Maybe not a total 97% collapse, but a shift.
What We Can Learn From the Rubble
If you’re a developer or just a fan, the takeaway is pretty clear: reputation is everything. Atari was the king of the world until people realized they were selling mediocrity in a box.
If you want to dive deeper into this era, there are a few things you should check out:
- Watch the documentary Atari: Game Over: It actually shows the excavation of the New Mexico landfill and debunks some of the more extreme myths.
- Read The Ultimate History of Video Games by Steven L. Kent: It gives a play-by-play of the boardroom meetings where these companies decided their own fate.
- Play the original Pitfall!: Contrast it with the bad games of that era. You'll immediately see why some companies survived while others folded; talent and polish were rare commodities back then.
The video game crash of 1983 wasn't the end of gaming. It was the end of the industry's childhood. It was a painful, expensive lesson in why you can't just treat your customers like a bottomless piggy bank.
If you're looking for actionable insights today, look at the "indie" scene. Small teams are currently doing what the pioneers of the early 80s did before the suits took over—innovating because they love the medium. Support the developers who prioritize the player experience over "monetization loops," because as history shows, when the loops become more important than the fun, the whole thing can come tumbling down.
Stay skeptical of hype. Demand quality. That’s how we keep the lights on.