In 1992 I'd already been on the Internet for a couple of years, but most people had never heard of it. Personal computers with mice and windows were widespread, but the Web was just being invented, and didn't have images on it yet.
That year, the U.S. Public Broadcasting Service produced a five-part documentary series called The Machine That Changed the World, which was about the history and impact of electronic computers. I would have loved that show, but I never saw it at the time. Fortunately, Andy Baio of waxy.org has digitized the whole thing and made it available for download. I watched it over the past few days.
The three history episodes that begin the series are still very relevant, and particularly useful because they include interviews with many early computing pioneers, some of whom have died in the intervening 15 years or so. The fourth, about artificial intelligence, is both refreshing and dismaying, because in a decade and a half very little has changed in the field of AI—making computers think seems to be a harder and harder problem the more we learn.
You'd think that the final episode, about computer networks, would be the most out of date. After all, the '90s and our current decade are in many ways defined by the rise of the Internet (which gets a brief mention on that show, in the context of how it helped debunk cold fusion in the scientific community). Yet that show is intelligent and has good foresight about how electronic communications can change human societies.
Two things struck me most about the series: the concepts of pirates, malware, and cyber-attacks are hardly mentioned—although privacy concerns make up a big part of the final episode—and pretty much everyone types on clackety, loud, heavy-duty keyboards that many modern geeks would lust after.
What is clear from The Machine That Changed the World is that the computer industry had largely found its solutions to fundamental hardware and software problems by the early 1990s: the Macs and PCs everyone used by then are obvious close relatives of the computers we still use today. By contrast, personal computers 15 years before that, in the late '70s, would be unrecognizable to my kids, and even those were terrifically advanced compared to their room-filling vacuum-tube predecessors of the '40s and '50s.
What changed is how we use them. The final episode takes pains to explain what a modem is, and how it connects to a phone line. It reveals in amazement that "over a thousand" senior citizens in North America communicated by email over "SeniorNet," and that a modem user could connect to Japan, Estonia, and Norway in the course of ten minutes—today we could do that in a web browser with a few clicks, but more remarkably, we probably wouldn't know or even care where the computers we connect to are.
I watched most of the show on my iPod Touch, which was appropriate. Watching the early programmers wrestle with mazes of wire to set up ENIAC, I tried to imagine how much more computing power I was holding in my hand, but the multiplier was too large. Even more amazing, in a way, was that despite their familiarity, the hulking, multi-thousand-dollar desktop computers on the desks of people interviewed for the show were still less powerful than my little handheld media player.