I’m going to be building an [8-bit computer](https://www.tindie.com/products/semachthemonkey/rc2014-classic-homebrew-z80-computer-kit/) from the chips up for educational, therapeutic and of course fun reasons, and this reminds me of thoughts I have around alternate computing history.
While working on a film project inspired by some mid-century science fiction stories, I thought about how to build the computing systems those stories described using the technology available in the futures they imagined. The gist is: how would applications we now run on very powerful digital machines be recreated using lower-power computers combined with other technologies?
For example, consider digital audio and video. These work by converting an analog signal into digital data, which requires considerable computing power, network bandwidth and data storage to reproduce what can be done in the analog domain with a can wrapped in foil, or with a strip of film, a lamp and a lens (I’m oversimplifying, but hopefully you get the idea).
The digital techniques require complex computers, which in turn require complex design, manufacturing and maintenance, on both the hardware and software sides. The result is, in many cases, inferior to the best analog techniques, but it offers convenience and flexibility that are hard to match in the purely analog domain.
But computers don’t have to be this complex to be useful, especially when they are cheap, flexible and plentiful. The computers that emerged at the dawn of the personal computer revolution proved this by giving birth to such a vast range of applications that most of us know computers only as defined by these applications. But in 1970 the definition of a “real computer” was very different, and the people pursuing these tiny personal machines were dismissed by most of the “professional” computer industry.
I’m getting sidetracked here.
What I’m imagining is an alternate history where instead of making computers more complex and dragging everything into the digital domain, computers remain simple enough that a motivated individual could build one from basic electronics while still being capable of delivering the applications we’ve come to value. In many cases this isn’t too hard to imagine.
For example, when you consider how most people use computers to communicate, most of the information shared is encoded in small amounts of text. From a strictly Shannon perspective, Twitter, Facebook, etc. could mostly be implemented in text mode on an 8-bit computer.
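To make that Shannon-style point concrete, here’s a rough back-of-the-envelope sketch. The figures are illustrative assumptions of mine (a 280-character ASCII message versus one second of uncompressed CD-quality audio), not anything from a specification of those services:

```python
# Back-of-the-envelope comparison: a short text message versus
# one second of digitized audio. Figures are illustrative.

TWEET_CHARS = 280                # one full-length tweet
BYTES_PER_CHAR = 1               # 8-bit character codes (ASCII)

# CD-quality PCM audio: 44.1 kHz sample rate, 16-bit samples, stereo
CD_AUDIO_BYTES_PER_SEC = 44_100 * 2 * 2

tweet_bytes = TWEET_CHARS * BYTES_PER_CHAR
print(f"Full-length text message: {tweet_bytes} bytes")
print(f"One second of CD audio:   {CD_AUDIO_BYTES_PER_SEC} bytes")
print(f"Ratio: {CD_AUDIO_BYTES_PER_SEC // tweet_bytes}x")
```

One second of uncompressed audio costs hundreds of times more storage and bandwidth than an entire message’s worth of text, which is why text-mode social communication sits comfortably within an 8-bit machine’s reach while digitized media does not.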
But what about audio and video? These things are clearly beyond what 8-bit computers could handle.
This requires more imagination, and deeper thought into what it is that we actually enjoy about these applications. Audio, video and other high-bandwidth media are pervasive in contemporary computer applications, but I think this is as much a function of their availability as their necessity. If you think hard about when these media are worth their complexity, you come up with a much smaller number of applications, and from there you can get creative about how you might implement them in a way that is compatible with simpler computers.
Remember that all of these forms of media existed before digital computers, and for a long time the quality of their analog forms surpassed their digital forms (in some cases this is still true). So it’s not simply the reproduction of these media digitally that justifies complex computers. In the cases I’ve studied, digital forms have historically provided two distinct advantages over analog forms:
- They allow the consumer more control over their consumption
- They allow more people to produce and distribute media to a larger audience
If these two aspects can be re-imagined using simple computers, then most of the advantages of complex computers evaporate. The question then becomes: how could simple computers do this?
I believe the answer lies in hybrid analog-digital designs.
Like most ideas this isn’t completely new, and there are numerous historical examples of such machines. But as in any field of technology, some paths are followed while others drop off, and analog-digital hybrid computer development tapered off as computing capacity became available to pull analog signals into the digital domain. This doesn’t mean the all-digital path was superior, just that the forces guiding the choices pushed the work in the all-digital direction. These forces are not all technical, and don’t always take into consideration everything that is important to everyone.
So I’m revisiting the hybrid path, studying what remains of it and imagining where it might lead if we decide that limiting the complexity of computers is important.