Sunday, October 21, 2012

In the Palm of Your Hand

Back when I was at the university, in the late ’60s, if you had told me that one day my telephone would also be a camera, a record player, a television set, a compass, a bookshelf, a newspaper, a reference library, a stock ticker, a weather report, something called a “video recorder,” something else called a “global positioning system” that could find my exact spot on a local map and guide me to all sorts of nearby stores and services, and, oh yes, also a “picturephone” like the one that was demonstrated at the New York World’s Fair in 1964 and played a cameo role in 2001: A Space Odyssey … I’d have said you were crazy.1

I wouldn’t have said this because I was a bumpkin with no imagination. I had been reading science fiction since grade school (the glory years of Asimov, Bradbury, and Heinlein), had watched every sci-fi-oriented television show (Captain Video, The Twilight Zone, The Outer Limits, and later Star Trek), and had listened to radio plays (X Minus One). While still in high school I wrote a full-length science fiction novel (not a very good one—I was a late bloomer). And as an undergraduate I took a three-credit course on the future (including various methods of prediction through analysis of historical, economic, and cultural trends). So I lived through my school years with one foot in the future, but I never saw the smartphone coming.2

It’s not hard to see why. At the time, all of these tools and services required vastly different technologies. The telephone ran on wires through a dedicated switching system. Cameras used film coated with a light-sensitive emulsion to produce flat images. Records created sounds by running a needle along wavering grooves impressed into vinyl disks. Television came through the air with pictures broken up into scan lines beamed across a phosphorescent screen. Compasses were metal needles mounted on pins. Books, newspapers, and libraries were heavy wads of paper. The stock ticker was essentially a telegraph, and weather had to be drawn out on large maps. Video recording was done in a studio with a camera the size of an ice chest and a tape machine as big as a desk. And locating places on the Earth by triangulating signals from satellites was, at best, a gleam in some defense analyst’s eye.3

If you had asked an engineer of the time to pack all these devices into one machine and then to make it portable, the result would have been a rig something like a one-man band’s. These self-styled musicians usually have a drum hanging off their shoulders, cymbals between their knees, harmonica on a bracket at mouth level, and washboard, guitar, and accordion slung across their chests, all held together with cinch straps and a belt. In the engineer’s vision of a 1960s “smartphone,” you would wear a helmet with earphones and a microphone plus a pair of rabbit ears for TV reception, a chest pack sporting various lenses for the cameras and spools or cassettes for film, a shelf at the back with a platter and tone arm for playing records, a cathode-ray tube bracketed at eye level for television, and a shelf with reading table across the stomach for those books and newspapers. It would trail long connecting wires that you could plug in for telephone service. Oh, and it would run on really big batteries or need to plug in somewhere else for power.

All of these devices and services—except for books and newspapers, and the film camera—had only one thing in common: electricity. But conceptually their use of that energy was still diverse: low-voltage diaphragms and carbon granules to run the telephone; electric motor and vacuum tubes to run the record player and its amplifier; high-voltage cathode-ray tube for the television; and so on. These were “electronics” in the primitive sense of analog signal propagation and simple amplifying circuits. And for the camera and the books, nothing.

The 1960s was still an analog world. Voices, sounds, and pictures were all captured in real time, stored physically, and recalled and played back as continuous waveforms. The rising and falling pitch of a human voice or the musical notes from an instrument were presented in infinitely variable gradations from loud to soft and from high to low. And these were captured as deeper and faster—or shallower and slower—zigs and zags in the endless spiral cut into a vinyl disk, or as stronger and more rapid—or weaker and more widely spaced—magnetic pulses in the iron oxide coating on a piece of tape. The light and dark areas that make up a picture were captured as more or fewer light-sensitive grains held in an emulsion on film and chemically altered—or not—when struck by photons from the visible spectrum. The light and dark of a television image depended on the strength of an electron beam that scanned across a glass screen coated with phosphorescent dyes.

Signal strength—and the various ways to capture it, record it, and play it back—ruled the day. And what makes the smartphone and its many functions now possible is, of course, digital technology. Or rather, all of the various digital technologies that have come together in the years since the 1960s.

Sounds, for example—whether telephone voice transmission, music, or anything a microphone can pick up—instead of being captured as an amplitude-dependent analog signal, are now chopped into digital values: a stream of numbers, each recording the signal’s strength at one instant. Those binary values4 can be transmitted through a variety of signal formats (wire, microwave, radio), or stored electronically in a memory chip or on a magnetic disk, or manipulated through mathematical algorithms to compress the speed of transmission, filter out background noise, or even completely alter the signal content. Images—whether photographs, maps, or video frames—instead of being captured as pulses representing analog signal strength in each of three primary colors (the red, green, and blue of emitted photons), are now interpreted as an array of picture elements coded for hue and brightness, with more codes for coordinating the line scans and refresh rates in video. Simple book text is stored and sent as binary codes, with more codes for coordinating typeface, treatments, layouts, placement of illustrations, display of footnotes, and the reader’s current position in the text stream.
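
To make the idea concrete, here is a minimal sketch in Python of the principle at work. It is not any real phone’s design; the 8,000-samples-per-second rate and 8-bit depth are assumptions chosen to approximate old telephone-quality audio.

```python
import math

# A toy analog-to-digital converter: sample a continuous "signal" (here a
# 440 Hz sine tone) at fixed intervals and round each sample to one of 256
# levels, as an 8-bit converter would.

SAMPLE_RATE = 8000   # samples per second -- roughly telephone quality
LEVELS = 256         # 8 bits per sample

def analog_signal(t):
    """Stand-in for a continuous waveform: a 440 Hz tone, ranging -1 to 1."""
    return math.sin(2 * math.pi * 440 * t)

def digitize(duration_s):
    """Return a list of 8-bit sample values (0..255)."""
    samples = []
    for n in range(int(duration_s * SAMPLE_RATE)):
        value = analog_signal(n / SAMPLE_RATE)             # measure at one instant
        quantized = round((value + 1) / 2 * (LEVELS - 1))  # map -1..1 onto 0..255
        samples.append(quantized)
    return samples

print(digitize(0.001))   # one millisecond of sound becomes eight plain numbers
```

Once the waveform is just a list of numbers, everything else in this paragraph—transmission, storage, filtering—reduces to arithmetic on those numbers.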

Computers existed in the 1960s, of course. And large amounts of data—mostly words and numbers—were already being interpreted, stored, and manipulated digitally. But the computers themselves were still huge installations, with core memories the size of walk-in closets and storage on banks of spinning tape drives or magnetic disks the size of washing machines. The cost and complexity of these assets restricted their use to large institutions, with access closely controlled and limited to highly trained operators.5 These users typed information on keyboards, which usually transcribed lines of data or code onto Hollerith or “punch” cards, which reading machines would then feed, in stacks, into the computer’s memory. Users received their output on reams of fanfold paper. Graphical user interfaces, with a high-resolution screen to display data and a mouse to select from it, were only just being invented. Even the science fiction of the time envisioned computers as massive, requiring dedicated rooms full of machinery. Robert Heinlein’s The Moon Is a Harsh Mistress is a classic example.

It wasn’t until the next decade that microprocessors—“computers on a chip”—brought computing power down to something you could pack into a typewriter frame (the Apple II, in 1977) or a suitcase (the Osborne 1, in 1981). The designers of these original computing chips were not aiming at general-purpose, portable, personal computers but rather at controllers for automated machinery. The computer that controls the fuel injection in a modern car or the clock functions of an automatic coffee maker6 is a microprocessor.

From the 1970s onward, we’ve seen digital technology advance and converge with the expanding capacity of computer chips, until today you can hold in your palm a telephone that’s also a camera, a television set, a record player, and so on. It all seems natural and logical in hindsight. But from the perspective of the 1960s, packing this combination of devices into one handheld machine was unimaginable. Even to contemplate it bordered on magical thinking—or madness.

The point of this meditation is simple. What other advances similar to the digitization of analog signals are out there, perhaps already on our intellectual horizon, that will make the world of 2050 unimaginable or magical to the average person of today? Up until this summer, I would have placed large bets on biological advances through manipulation of individual genomes and cellular functions. But with the announced discovery of the Higgs boson in July, I think we may also see a comparable leap in physics and possible applications through the manipulation of gravity, space, and time.

As I’ve noted before, scientific exploration and understanding have put humankind on an express escalator to a future we can’t even imagine. Fasten your seatbelts, it’s going to be a wonderful ride!

1. And if you had told me that one day I would like peanut butter on my grilled chicken and enjoy raw fish wrapped in cold rice and seaweed, I’d also have said you were crazy. But then, I was a suburban lad brought up on the East Coast, and the specialties of Thai and Japanese cuisine were still unknown to me.

2. Still, I and everyone else who read the Sunday supplements knew that by 1990 our cars would fly like George Jetson’s. We all seriously misunderstood the amount of energy and the sophistication of control required for that little trick.

3. The first global positioning satellite, Navstar 1, wouldn’t be launched until 1978.

4. Quick overview of the technology: All a computer knows and can handle is a two-value system—a switch at a certain location is either “on” (for “1”) or it’s off (for “0”). By selecting either 1 or 0 in each placeholder, and recognizing an orderly progression of placeholders, you can count up to any number, in the same way that you can select from among the ten symbols of Arabic numerals for 0 to 9 in one place, move a place to the left and count 10 to 99, move a place further and count 100 to 999, and so on.
       Binary counting, in base two, conventionally groups eight placeholders (that is, eight binary digits—or “bits”—each of which can be either “1” or “0”), enough to represent 256 distinct values, from 0 through 255. A group of eight bits is generally recognized as a “byte.” By assigning each of those 256 possible values as code for an uppercase or lowercase letter, Arabic numeral, punctuation mark, or other commonly agreed-upon representation, you can turn a string of eight binary digits into a letter, number, or symbol.
       You can then make groups of those bytes into words, sentences, and paragraphs. Or, with a different interpretation of the 1’s and 0’s, those bytes can become the sample values for a fragment of sound in a conversation or piece of music; or the color and intensity for a picture element (or “pixel”) in a photograph or drawing; or a location reference on a map; or anything else that you and a community of programmers can agree upon. The programmers can then write software that manipulates those values mathematically in any way that human beings can imagine and absorb, as the sketch below suggests.
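
       A minimal Python sketch of that two-value system in action. The specific codes here follow the ASCII convention that came to dominate, but the point is only the agreed-upon mapping:

```python
# The same eight bits mean nothing until a community agrees on an
# interpretation. (Illustrative only; real text encodings such as ASCII
# and Unicode elaborate on this basic scheme.)

bits = "01001000"            # eight placeholders, each "1" or "0"

value = int(bits, 2)         # read as a base-two number
print(value)                 # 72

character = chr(value)       # the same value, read as a text code
print(character)             # 'H' -- 72 is the agreed-upon code for H

# A run of bytes becomes a word once everyone reads them the same way:
message_bytes = [72, 101, 108, 108, 111]
print("".join(chr(b) for b in message_bytes))   # 'Hello'
```
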
       Side note: When I was growing up, a long-distance telephone call was very expensive. This was because telephones using analog signals required human operators or mechanical switches to align various lengths of wire between one place and another into a single circuit—a connected span of wire—all the way from the telephone of the person making the call to the phone of the person receiving it, and then hold that circuit open while the two people talked back and forth. It took a lot of effort to arrange a call, and it dedicated a lot of expensive wire to carrying that single conversation.
       These days, using digital technology, a computer in the telephone itself (or alternatively in the local switching office) chops the conversation into convenient digital packets, attaches an identifying address to each packet, and sends it out across the system, which is a vast web of circuits, some carried by wire, some by microwave, and some by radio. Automatic switches along the route pass the addressed packets to the receiving office or phone, where they are reassembled into human speech or whatever else went into the handset microphone. Those digitized packets are chopped, sent, and reassembled by computers operating so fast that the process is virtually undetectable to the humans participating in the conversation.
       Those packets can also be shunted and stored on a memory chip (“voice mail”) or sampled and stored by government agencies (“wire tap”) just as easily as they can be reinterpreted through a loudspeaker for the human ear. Digital technology opens vast possibilities for the use and abuse of human speech or any other signal.
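
       A toy Python sketch of that chop-address-reassemble cycle. The packet format here is invented purely for illustration; real networks add routing, checksums, retransmission, and much more:

```python
import random

# The sender chops a message into small payloads, tags each with an address
# and a sequence number, and sends them independently; the receiver sorts
# them back into order regardless of how they arrived.

PACKET_SIZE = 4   # characters of payload per packet -- tiny, for demonstration

def send(message, address):
    """Chop a message into addressed, numbered packets."""
    return [
        {"to": address, "seq": i, "data": message[i:i + PACKET_SIZE]}
        for i in range(0, len(message), PACKET_SIZE)
    ]

def receive(packets):
    """Reassemble the payloads in sequence order."""
    ordered = sorted(packets, key=lambda p: p["seq"])
    return "".join(p["data"] for p in ordered)

packets = send("Hello from 1969!", address="555-0123")
random.shuffle(packets)    # packets may arrive out of order, by different routes
print(receive(packets))    # 'Hello from 1969!'
```
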

5. My alma mater, Penn State, ran most of the university’s accounting and academic recordkeeping with an IBM System/360, located in the heavily air-conditioned basement of the Computer Center. System capacity at the time was laughable by today’s standards: core memory in the thousands of bytes and tape or disk storage in terms of millions of bytes per mount. But then, my entire personal file and transcript—name, gender, home address, campus address, telephone numbers, courses taken, grades received, and payment history—would probably fit into the space of 2,000 bytes. So storing and retrieving the files on 26,000 active students was entirely feasible, although high-resolution photographs were not included.
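
For scale, a back-of-the-envelope sketch in Python of the arithmetic behind that claim, using the figures assumed above:

```python
# Assumed figures from the text: ~2,000 bytes per student record,
# 26,000 active students.
bytes_per_student = 2_000
active_students = 26_000

total = bytes_per_student * active_students
print(f"{total:,} bytes")   # 52,000,000 bytes -- about 52 megabytes, within
                            # reach of storage measured in millions of bytes
                            # per mount
```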

6. Another question that someone from the 1960s would ask in wonder: “Why does my coffee pot need a clock?”
