When we talk about the late, great Stephen Hawking, the image is burned into our collective memory. The black wheelchair. The glowing screen mounted to it. That iconic, robotic voice that sounded more like the future than the future does now. But there's a specific piece of the puzzle people often gloss over: how did he actually do things? I’m not talking about calculating the entropy of a black hole. I mean the "everyday" stuff. Turning on the TV. Dimming the lights. Opening a door.
Basically, for a man who couldn't move a finger for the latter half of his life, he had more control over his physical environment than most of us do while sitting on the couch with a standard remote.
The "remote control Stephen Hawking" setup wasn't a single plastic clicker you’d find at Best Buy. It was a sprawling, custom-built ecosystem of infrared sensors, radio links, and an incredibly patient piece of software called ACAT. Honestly, it’s one of the coolest hacks in the history of human engineering.
The Infrared Magic in His Glasses
You might have noticed a little black box dangling from the frame of Hawking's glasses. That wasn't a fashion choice. It was an infrared (IR) sensor.
Here is how it worked. The sensor projected a tiny, invisible beam of light onto his cheek. When Hawking flexed his cheek muscle—a movement he retained long after the rest of his body had gone quiet—the intensity of the reflected light changed. The computer interpreted this "twitch" as a mouse click.
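The twitch-to-click logic described above can be sketched as a simple threshold detector. This is a hypothetical illustration, not Hawking's actual firmware: the sample stream, the threshold value, and the re-arming behavior are all assumptions.

```python
# Hypothetical sketch: when the reflected IR intensity drops below a
# threshold (the cheek muscle flexes), emit one "click", then stay
# quiet until the signal recovers, so one twitch = one click.
def detect_clicks(samples, threshold=0.5):
    clicks = []            # sample indices where a click fired
    armed = True           # only re-arm after the signal recovers
    for i, level in enumerate(samples):
        if armed and level < threshold:
            clicks.append(i)
            armed = False  # debounce: ignore the rest of this twitch
        elif level >= threshold:
            armed = True
    return clicks
```

Feeding in a stream like `[0.9, 0.9, 0.3, 0.2, 0.8, 0.4]` yields clicks at indices 2 and 5: the sustained dip counts only once, which is exactly the behavior a single-switch input needs.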
One twitch. That was his entire input.
Think about that for a second. Every email, every book, every joke he told on The Big Bang Theory, and every command to his home appliances started with that one tiny muscle. It’s like trying to operate your entire life using only the spacebar on your laptop.
How He Controlled His House
Hawking didn't just use his computer to talk. He used it as a master hub for his entire environment. Through his wheelchair-mounted Windows tablet (usually a Lenovo ThinkPad for those curious about the hardware), he could access a "remote control" menu.
- The TV: He could scan through channels and adjust the volume.
- The Doors: His office and home were fitted with automated openers he could trigger.
- The Lights: He had a customized home automation system that predated the "Smart Home" craze by decades.
Because the computer was already doing the heavy lifting of interpreting his cheek twitches into digital commands, adding remote control functionality was just a matter of installing IR transmitters and radio frequency (RF) bridges. He was essentially a pioneer of the Internet of Things (IoT) because he had to be.
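Conceptually, once the twitch is a digital command, the "remote control" layer is just a lookup table mapping menu selections to the codes a transmitter should send. The sketch below is purely illustrative; the command names and hex codes are made up, not real device codes.

```python
# Hypothetical dispatch table: a selected menu item maps to a transport
# (IR blaster or RF bridge) and the raw code that bridge should send.
COMMANDS = {
    "tv_volume_up":  ("ir", 0x20DF40BF),   # made-up IR code
    "tv_channel_up": ("ir", 0x20DF00FF),   # made-up IR code
    "door_open":     ("rf", 0xA1B2),       # made-up RF code
    "lights_toggle": ("rf", 0xC3D4),       # made-up RF code
}

def dispatch(command):
    """Look up which transport and code a selected command maps to."""
    transport, code = COMMANDS[command]
    # A real system would hand this pair to IR/RF hardware; here we
    # just return what would be transmitted.
    return transport, code
```

The point of the table: adding a new appliance is a data change, not a software change, which is why bolting home control onto an existing communication system was comparatively easy.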
The Software: Intel’s ACAT
Intel worked with Hawking for about two decades, beginning in the late 1990s. The collaboration eventually produced the Assistive Context-Aware Toolkit (ACAT). This software was the "brain" that made the remote control aspect possible.
The screen would constantly "scan." A cursor would move across rows of letters or icons. When the cursor hit the thing he wanted—say, the "Media" icon—he’d twitch. The software would then dive into a sub-menu for the TV or the lights.
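That row-then-item scanning pattern can be captured in a few lines. This is a minimal sketch of single-switch scanning in general, not ACAT's actual selection code; the grid contents and the "steps before each twitch" encoding are assumptions for the example.

```python
# Hypothetical single-switch scanning: the cursor steps through rows;
# the first twitch picks the current row, then the cursor steps through
# that row's items and the second twitch picks one.
def scan_select(rows, presses):
    """`presses` = scan steps elapsed before each of the two twitches."""
    row = rows[presses[0] % len(rows)]     # first twitch: pick a row
    return row[presses[1] % len(row)]      # second twitch: pick an item
```

With a menu like `[["A", "B", "C"], ["Media", "Lights", "Doors"]]`, waiting one step and twitching, then twitching immediately, selects "Media". Every extra step waited is dead time, which is why scanning order and menu layout matter so much for speed.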
It was slow. Painfully slow. We’re talking about one or two words per minute at his slowest. But the predictive text (developed with help from SwiftKey) was world-class. If he typed "The black," the system would immediately suggest "hole." It learned his voice, his vocabulary, and his habits.
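The flavor of that prediction can be shown with a toy bigram model, trained on which word tends to follow which. This is a hedged sketch of the general technique, far simpler than SwiftKey's actual engine; the tiny corpus is invented for the example.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count, for each word in a (tiny) corpus, what follows it."""
    model = defaultdict(Counter)
    words = corpus.lower().split()
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def suggest(model, prev_word):
    """Suggest the most frequent next word, or None if unseen."""
    followers = model.get(prev_word.lower())
    return followers.most_common(1)[0][0] if followers else None
```

Train it on a few sentences of Hawking-flavored text and `suggest(model, "black")` comes back with "hole". Scale the corpus up to decades of one person's writing and the predictions start saving real keystrokes, which at one twitch per selection is everything.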
Why the "Robot Voice" Never Changed
Technically, Hawking could have upgraded his voice dozens of times. He could have had a voice that sounded like George Clooney or a soft-spoken British academic. But he refused.
The hardware for his speech synthesizer—the CallText 5010—was ancient by tech standards. When the original boards started failing, his team actually had to reverse-engineer the software and emulate it on modern hardware because Hawking identified with that specific, 1980s-era "robotic" sound. It was his voice.
What This Means for Us Now
What’s wild is that Intel eventually made the ACAT software open-source. Anyone with a webcam or an IR sensor can use the same code that Hawking used to navigate the world.
If you are looking to implement similar accessibility or "remote" tech for a loved one, here is what actually matters:
- Consistency over power: You don't need a supercomputer. You need a sensor that can reliably pick up the one movement the person can still make.
- Predictive power: The "remote" isn't the hardware; it's the software that predicts what you want to do next.
- Customization: Hawking’s GAs (Graduate Assistants) spent hours "tinkering." There is no out-of-the-box solution that fits everyone.
Hawking proved that "remote control" isn't about the device in your hand. It's about the interface between the mind and the machine. He lived in a body that was essentially locked, yet he reached out and touched the stars—and his living room light switch—with nothing more than a twitch of his face.
Next steps for exploring this technology:
- Check out the Intel ACAT repository on GitHub if you're interested in the actual code behind his communication system.
- Research eye-tracking sensors like Tobii, which have largely superseded the cheek-twitch IR sensors for modern ALS patients.
- Look into IFTTT (If This Then That) integrations for smart homes, which allow single-trigger actions to control complex chains of devices.
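To make the last bullet concrete: IFTTT's Webhooks service exposes a documented trigger URL, so a single switch event can fan out to whatever applets you've wired up. The sketch below only builds that URL; the event name and key are placeholders, and actually firing it would require a real Webhooks key and a network call.

```python
# Hypothetical single-trigger chain: one accessible switch event hits a
# named IFTTT webhook, and IFTTT applets fan it out to many devices.
def ifttt_trigger_url(event, key):
    """Build the Webhooks trigger URL for a named event (key is secret)."""
    return f"https://maker.ifttt.com/trigger/{event}/with/key/{key}"
```

In practice you'd POST to this URL (e.g. with `urllib.request` or `requests`) from whatever code detects the switch press, keeping the assistive input device blissfully ignorant of the smart-home zoo behind it.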