
Praising "Code" by Charles Petzold

Last year (2013), I finished a most remarkable computer book titled Code. It's by Charles Petzold, who wrote the venerable Programming Windows 95, among many others. "Code" takes an esteemed position on my computer bookshelf, next to The Soul of a New Machine and my copy of K&R.

Code is a book about how computers work, but it's not a simplistic explanation of the major components of a computer. In today's world, many people have a passing familiarity with memory, disk, and CPU. Instead, "Code" is a thorough and deep look at the basic ideas that enable computing, and its presentation thrilled me.

The early chapters set up the idea that gives the book its title. People have always needed to encode text. From using Morse code to send messages by flashlight, to using Braille for people who cannot see, the ideas behind encoding have been around for a long time.
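To make the idea concrete, here's a quick sketch of my own in Python (not from the book) that encodes a word the way Morse code does, with a deliberately tiny table:

```python
# A tiny Morse-style encoding table; a real one would cover
# every letter, digit, and punctuation mark.
MORSE = {"C": "-.-.", "O": "---", "D": "-..", "E": "."}

def encode(text):
    """Encode text as Morse, separating letters with spaces."""
    return " ".join(MORSE[ch] for ch in text.upper())

print(encode("code"))  # -.-. --- -.. .
```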

Mr. Petzold relates encoding to counting, then counting to different number systems, then focuses the reader on the binary number system. As he fixes the idea of working in bits, he interleaves chapters on telegraph switches and basic electricity. It’s not a stretch to see these telegraphic pulses as ones and zeroes on a wire.
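Here's another small sketch of my own (again Python, not from the book) that reads a train of on/off pulses as a binary number, most significant bit first:

```python
def pulses_to_number(pulses):
    """Interpret a list of 0/1 pulses as a binary number, MSB first."""
    value = 0
    for bit in pulses:
        value = value * 2 + bit   # each new pulse shifts the value left
    return value

print(pulses_to_number([1, 0, 1, 1]))  # 11
print(bin(11))                         # '0b1011', the same bits back
```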

I was enthralled when he showed how these concepts can be put together to make logic gates. The book reminded me of all the ideas that fascinated me in my early exploration and formal study of computing: Boolean algebra, binary math, and basic circuits. These were the sticks and stones of early computing, and yet our computers today are still made from those same sticks and stones. Of course, the computer manipulates these sticks and stones in nanoseconds, which he writes "are much shorter than anything in human experience, so they'll forever remain incomprehensible."
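In the book, those gates are built from relays: switches wired in series behave like AND, switches in parallel like OR. Here's a sketch of my own in Python mimicking that behavior, with one small piece of Boolean algebra (De Morgan's law) checked exhaustively:

```python
def AND(a, b):
    return a and b   # current flows only if both switches close (series)

def OR(a, b):
    return a or b    # current flows if either switch closes (parallel)

def NOT(a):
    return not a     # a normally-closed relay inverts its input

# De Morgan's law holds for every combination of inputs:
for a in (False, True):
    for b in (False, True):
        assert NOT(AND(a, b)) == OR(NOT(a), NOT(b))
```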

Petzold ably guides the reader through assembling their very own personal computer. He does so not with prefabricated chips (although those come later), but with an ever-growing, ever-more-complicated maze of basic circuits. He shows how to make memory, how to make counters, and how to make the computer perform logic.
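The memory trick is feedback: two NOR gates wired into each other hold a bit. Here's a sketch of my own (Python again, not the book's diagrams) of such an SR latch, simulated by letting the feedback loop settle:

```python
def NOR(a, b):
    return not (a or b)

def sr_latch(s, r, q=False):
    """One bit of memory: cross-coupled NOR gates, settled by iteration."""
    q_bar = not q
    for _ in range(4):        # a few passes let the feedback stabilize
        q = NOR(r, q_bar)
        q_bar = NOR(s, q)
    return q

q = sr_latch(s=True, r=False)         # set the bit
q = sr_latch(s=False, r=False, q=q)   # hold: inputs off, bit remembered
print(q)                              # True
q = sr_latch(s=False, r=True, q=q)    # reset the bit
print(q)                              # False
```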

The book was at its most exciting for me here. Petzold's circuit diagrams are accurate: I built a few of them in a basic circuit simulation program. How exciting must it have been to see a machine count? Or to see a machine make a logic choice? It was a thrill to reacquaint myself with these ideas.
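You can get a taste of that machine-counting thrill without a breadboard. Here's a sketch of my own of a 4-bit ripple counter in Python: each flip-flop toggles, and the carry ripples upward whenever a bit falls back to zero:

```python
bits = [0, 0, 0, 0]  # least significant bit first

def pulse(bits):
    """One clock pulse: toggle bits until one stays high (no carry)."""
    for i in range(len(bits)):
        bits[i] ^= 1
        if bits[i] == 1:   # no overflow here, the carry stops rippling
            break

for _ in range(6):
    pulse(bits)
    print("".join(str(b) for b in reversed(bits)))
# 0001, 0010, 0011, 0100, 0101, 0110 -- the machine counts
```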

The book also gives great context for why the earliest computers were the size of houses: these simple circuits weren't small yet! When Petzold recounts the transition from vacuum tubes to the transistor, he recites a long list of the men and companies that formed the dawn of computing, among them John von Neumann, Claude Shannon, Bell Labs, William Shockley, Texas Instruments, Jack Kilby, and the Fairchild Semiconductor Corporation.

Hardware was king back then, but he writes: "If the hardware of one processor can do something another can’t, the other processor can do it in software; they all end up doing the same thing. This is one of the implications of Alan Turing’s 1937 paper on computability."
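That idea is easy to demonstrate. Suppose a processor has no multiply instruction, only add and shift; software fills the gap. A sketch of my own, in Python standing in for that minimal processor:

```python
def multiply(a, b):
    """Multiply non-negative integers using only add and shift."""
    product = 0
    while b:
        if b & 1:          # low bit of b set: add the shifted a
            product += a
        a <<= 1            # double a
        b >>= 1            # consume one bit of b
    return product

print(multiply(6, 7))  # 42
```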

The end of the book reminded me of an exhibit I saw in some museum. The exhibit was titled "the timeline of the universe". It was a display that started with "the big bang" and ended with "mankind". Epochal events included the formation of our sun, the planets, and the dinosaurs. You had to walk the length of the long room to get from "the big bang" to "mankind", and when you got to the end, the label for "mankind" was only an inch or two long. I remember being struck by how tiny "mankind" was, relative to the size of the room, relative to the timeline of the universe.

I felt the same way reading these last few chapters. Petzold presses hard on the fast-forward button. He covers the emergence of modern high-level languages, discussing ALGOL, Fortran, and C. He covers the rise of modern graphics, from its meager beginnings with ASCII and the "glass teletype" of the CRT. He relates graphics to scanners, PostScript, Windows, and OCR, then jumps to digitized movies and sound. He wraps up with our present: HTTP, and its rise from old-school BBSs and TCP/IP.

At the time I was reading it, I thought about how unfair it was to lump all these topics into two dense chapters, but as I look back on it now, I realize that it is like "mankind" on the timeline of the universe. The epochal event that is the computing age is happening right now. And when you finish this book, you realize how all of it is still a code of ones and zeroes pulsing on a wire.
