Tales from the Chad Box

Reading about chad in the newspaper is for me like hearing a Beatles song
on the radio --- both stir deep memories of sights, smells, and sounds
that were part of my life 30 or 35 years ago. Even as the capacity of
computers for audiovisual entertainment has grown, computers themselves
have, by and large, lost their physicality. That is not a bad thing, since
the mechanical parts were the ones that broke down most often, but something
has been lost from the days when you could see a single bit with the naked eye.

Chad itself, first of all. I say "chad itself" because in the 1960s "chad"
was a collective noun, like "chaff." In fact, though the etymology is
obscure, I'll bet the word is related to "chaff," which a fistful of chad
resembles, except that the sharp edges of the little cutout pieces make it
sensible for the word to end with a dental rather than a fricative. But
nobody ever talked about a single chad. Some MIT friends have told me that
they called the individual pieces "chits," but I never heard that at
Harvard.

There is a charming folk etymology: that there was a kind of punch that
made U-shaped slits rather than holes, and it had been invented by a Dr.
Chadless, so chad was what a Chadless punch did not create. This is,
sadly, a better example of geek humor than of word history. No one named
Chadless is in the US or UK patent databases, or even in the LDS Church
family history records.

The punches that did create chad were marvelous machines. I grew up on the
PDP-1 and PDP-4 computers in Cruft and William James Hall. They used
1-inch paper tape. The punch created a line of 8 holes across the tape, at
a rate of 60 lines per second. So the computer would produce a lot of chad
in the course of a day. The chad fell from the punch into the chad box,
which you had to empty every now and then. The punch made a high-pitched
whining sound that varied somewhat with the pattern of holes punched. If
the sound became muffled, you'd forgotten to empty the chad box and the
chad was backing up into the punch --- you had to get there before the
sound changed to a grind!

Paper tape was fan-folded. The paper tape reader pulled it horizontally
across a row of sensor lights; the unread and already-read parts of the
tape were in fan-folded sections, standing vertically on the folds in bins
to the left and right of the sensors. There was a gentle rhythmic sound as
the segments unfolded from one bin and refolded in the other: sswhish,
sswash, sswish, sswash. Unless, of course, you didn't set the tape in the
take-up bin properly at the beginning, or there was more tape than the bin
was designed to handle; in that case the rhythmic sound dissolved and the
tape quickly formed an ugly mess of loops and squiggles.

There were no video terminals; the keyboards were big heavy Teletype
machines, adapted from telegraphy, with hundreds of ingeniously
interconnected levers and springs inside. During a three-year period I
moved up from KSR-28s to ASR-33s to ASR-35s. These had three different
kinds of hammering mechanisms; the fanciest had the type in a rectangular
array which shifted around in front of a fixed plunger going in and out to
make successive imprints. I can still hear the distinctive sound of each
of these devices, including the special whirring clunk when a 28 shifted
between letters and numbers modes. The Flexowriter was used for preparing
punched paper tapes by hand, or printing a punched tape on paper; it had
hammers like a mechanical typewriter, one per letter, and went thwapp!,
thwapp!, thwapp!, with a big smashing sound when the carriage returned.

No one got RSI in those days, and I am not sure why, as many of us worked
on these computers every minute we could, all night long if possible. I
think it may have been because you just couldn't type very long without
having to do something else. You had to load the paper tape reader, empty
the chad box, find another box of paper tape and load it in the punch, run
over to the Flexowriter and print a listing of the tape that was just
punched, get out a single-edged razor blade and splice in a few rows of
holes if you wanted to change one instruction in your program, etc. These
activities provided breaks and exercised different muscles.

All the moving around made it hard to keep track of different parts of the
machine; you needed tricks to guess what was going on at the console if
you were busy elsewhere. In those pre-miniaturization days, the ordinary
operation of the central processor used enough energy to generate some
radio frequency radiation. This meant you could put a radio on the console
and tune it in between stations; from the other side of the room, the tone
of the static indicated whether the machine had crashed or not.

We were constantly fussing with individual bits in memory. There were rows
of toggle switches which you set up or down to specify binary data, and
rows of lights in which the data in memory were displayed. You spent a lot
of time setting, reading, and interpreting those patterns; memory was so
scarce and programming tools so weak that programmers needed to fiddle
with numbers a lot. We worked in octal, taking three bits at a time. To
paraphrase the immortal Tom Lehrer, octal is like regular counting, if
you're missing two fingers: 1, 2, 3, 4, 5, 6, 7, 10. Setting switches
required mental and manual dexterity, and eventually I just thought in
octal. During those years I happened to be watching as my '63 Dodge went
over 77,777 miles, and for a sickening moment I felt that something was
wrong because the odometer didn't roll over to 100,000.
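
For anyone who wants to check the arithmetic behind that moment of
confusion, here is a small illustrative sketch, in Python (a luxury we
certainly did not have then), of how an odometer would behave if it
counted in octal the way I had come to:

    # Illustrative sketch only: an odometer that counts in base 8.
    def octal_reading(miles):
        # Render a mileage count as a five-digit (or longer) octal string.
        return format(miles, "05o")

    count = int("77777", 8)            # the mileage displayed as 77777 in octal
    print(octal_reading(count))        # prints 77777
    print(octal_reading(count + 1))    # prints 100000, the rollover I expected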

Before the integrated circuit, you could look at a core memory plane ---
little magnetic donuts strung at the intersections of a grid of thin wires
--- and see where the bits were stored. There were wires, lots of wires,
connecting things. And people had to patch wires all the time, replace
individual components, and plug things in and out. The manufacturers were
always releasing hardware fixes, and any research required some custom
hardware fiddling. Of all this I have olfactory memories: the ordinary
smell of hot solder, and of burning insulation, during routine repair work
or customization; and every now and then, the evil smell of some costly
piece of electronics frying inside the running computer --- on one
occasion a newly installed 64K memory module that had cost a fortune.

If you take a computer apart today, you won't find much inside; it all
looks the same. The 1960s were a sort of Cambrian era in the design of
devices, with an explosion of experimental forms and materials. I got
married straight out of college and went to work as a systems programmer
at a national research lab. This was a fool's heaven for an obsessive
hacker; if the machine wasn't working, no one could use it till I fixed
it, and I got to say whether the machine was working or not. During one
marathon stint I was fooling around with some components with liquid
mercury inside. When I came home late at night my wife asked me, "How was
your day? And just why are you wearing that silver ring rather than your
wedding ring, dear?" The mercury had leaked and formed an amalgam with the
gold. It was a nice project to figure out how to undo all aspects of this
situation. But the boiling point of mercury is less than the melting point
of gold, and a friend of mine had a jeweler's oven, so I was able to cook
the ring back to its original state. Ring and marriage are both still
intact.

Except for what one sees on the screen or hears from the speakers, one has
little sense of anything actually happening inside a computer today. The
integrated circuit changed the world in the 1970s, and the mechanical
marvels of the early years have given way to electronic or magnetic
substitutes. The abstraction of function from mechanism is a triumph of
computer engineering. But as the courts contemplate what's a 0 and
what's a 1 on those Florida punch cards, I'm raising a toast to the good
old days when bits were something you could see, touch, hear, and
sometimes even smell.

- Harry R. Lewis '68, PhD '74, Gordon McKay Professor of Computer
Science and Dean of Harvard College