
Home Computers: 100 Icons that Defined a Digital Generation (excerpt)

Alex Wiltshire

Whether or not you remember waiting for dial-up access, tiny screens, and green lines of text, you'll get a kick out of Alex Wiltshire's travel back in time to when computers came with wires. Enjoy this excerpt of Home Computers, courtesy of MIT Press, with nostalgia photography by John Short.

Home Computers: 100 Icons that Defined a Digital Generation

Excerpted from the Introduction to Home Computers: 100 Icons that Defined a Digital Generation by Alex Wiltshire, with photographs by John Short. Reprinted with permission from The MIT Press. Copyright © 2020. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

The computer has taken many forms over its long history. Antikythera mechanism; planimeter; arithmometer; Old Brass Brains. Each of these devices took in information, manipulated it according to a series of instructions, and then spat out results, helping to build structures, navigate space, reveal natural phenomena, and express concepts that would otherwise be beyond their creators' minds. They have quietly extended the limits of what humanity can do for centuries. But over a few short years at the end of the twentieth century, the computer experienced a revolution.

This was the moment in which the computer evolved from a tool designed to perform particular tasks for specialists into a general machine for all. It began taking part in everyday life, playing an essential role in homes and offices and changing the nature of work and leisure. It inspired generations of artists, engineers, and designers and helped form new fields of creativity, entertainment, and production. It made fortunes and took them, and it carved the essential foundation for an even wider digital revolution that was still to come.

During these tumultuous and defining years, the computer became electronic and digital, a humming, beige case that barged its way onto desks and trailed wires across rooms. It took over TV screens, presenting its users with the steady blink of an idle cursor and introducing them to arcane new languages that acted as an interface between human and machine.

The microcomputer was another step along the computer's journey from clanking calculating mechanism to ubiquitous digital device, the result of a series of technological advances which brought about a crucial fusion of miniaturization and mass production. But it was also the child of many steps of theoretical development which established ways of representing and transfiguring abstract numbers with minute pulses of electricity.

Apple II (Photography by ©John Short / courtesy of MIT Press)

The key work was done in the first half of the twentieth century by mathematicians such as Alan Turing, Walther Bothe, Akira Nakashima, and Claude Shannon. They began to expand on older theories, such as that of the seventeenth-century polymath Gottfried Wilhelm Leibniz, who, inspired by the I Ching, showed how binary numbers could perform logical and arithmetical functions. They went back to the papers of Charles Sanders Peirce, who had realized at the end of the nineteenth century that electrical circuits could perform logical functions. They explored ways in which logic gates could take binary inputs and produce outputs, and how binary information could be transported within the circuits of a machine.
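The idea those theorists formalized can be sketched in a few lines of modern code (an illustration of the concept, not material from the book): logic gates take binary inputs and produce binary outputs, and composing gates into adders is enough to perform arithmetic, exactly the fusion of Leibniz's binary numbers and Peirce's electrical logic described above.

```python
# Logic gates as functions on single bits.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    # Adds two bits: returns (sum bit, carry bit).
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    # Adds two bits plus an incoming carry, built from two half adders.
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

def add_binary(x, y, width=8):
    # Ripple-carry addition: chain full adders from the least
    # significant bit upward, exactly as a hardware adder does.
    result, carry = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add_binary(0b0110, 0b0011))  # 6 + 3 = 9
```

Nothing here is specific to software: each function corresponds to a physical circuit, which is why the same scheme worked whether the switches were relays, vacuum tubes, or transistors.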

The first culmination of their theory was realized by Max Newman and Tommy Flowers in 1943 as they completed Colossus, the first digital electronic computer, and in 1948 when the Manchester Baby became the first electronic computer that could store programs. But many more projects were developing across Europe and North America, frequently built on the mechanical computers that had directed weaponry and decoded communications during the Second World War: among them MIT's Whirlwind I, one of the first computers that could calculate in parallel, and ENIAC, the first general-purpose electronic computer, made for the US Army's Ballistic Research Laboratory.

It seems probable that once the machine thinking method had started, it would not take long to outstrip our feeble powers… They would be able to converse with each other to sharpen their wits. At some stage therefore, we should have to expect the machines to take control.
– Alan Turing

I visualize a time when we will be to robots what dogs are to humans. And I am rooting for the machines.
– Claude Shannon

En masse, these machines inspired a wave of new invention which further accelerated their development. In 1947, physicists working at Bell Labs invented the transistor, a tiny semiconductor which could control electronic signals. Replacing big, hot, and unreliable vacuum tubes, the transistor allowed electronics engineers to build ever more intricate circuits, packing more components closer together and raising their computational power.

By the 1960s, large companies such as IBM and Control Data Corporation had grown to design and build mainframes. Wardrobe-sized and stratospherically expensive, these large and powerful computers were capable of storing and processing vast sets of data such as population statistics and industrial outputs, but they were confined to corporate headquarters and university campuses. It would take another vital advance before the computer could make the jump to the human scale of the garage workbench, office desk, or the floor in front of the living-room TV.

Apple iMac (Photography by ©John Short / courtesy of MIT Press)

That jump was called the silicon gate. In the late 1960s, Federico Faggin at Fairchild Semiconductor in Palo Alto, California, tried exchanging aluminium control gates in transistors for ones made of polycrystalline silicon, and found they leaked less current, required less space, and worked more reliably. Suddenly, the multiple boards of components that comprised the innards of the previous generations of computers could be compressed into tiny integrated circuits. The microprocessor was born: a single chip which could perform multiple functions at a far lower cost of production.

The microprocessor enabled mass production for the mass market. The first to be commercially available was Intel's 4-bit 4004 in 1971. It held 2,300 transistors, its circuit lines were 10 microns wide, and it was capable of performing 92,600 instructions per second. Two other microprocessors also appeared around the same time: Garrett AiResearch's MP944, which was first used as part of the Central Air Data Computer for F-14 fighter planes, and Texas Instruments' TMS 1000. None of them was powerful – the 4004 could only really drive a calculator – and they couldn't remotely compete in pure performance with mainframes.

But they were just the vanguard. Three years later, Intel shipped the 8-bit 8080, which was much quicker, supported a greater variety of instructions, and could interface with other components more flexibly, and it powered the very first generation of kit microcomputers.

Kits comprised circuit designs, build instructions, and the components to make them, and they were the first computers that made their way into family homes. Requiring soldering skills and an understanding of electronics, not to mention a good deal of money, kits such as the Altair 8800 were very much the domain of hobbyists, enthusiasts who tinkered in their garages to explore what a computer could do. The act of building them lent insights into how they worked and gave opportunities to customize and augment them with better parts.

That self-built nature naturally led to dreams of running businesses: if you could make a computer for yourself, perhaps you could make them to sell? Especially in places like Silicon Valley, where so much computer research and development was going on, a cottage industry grew of manufacturers designing new kits and components. Magazines such as Popular Electronics rushed to support it, sharing circuit designs and program listings, reviewing products, and selling advertising.

Commodore (Photography by ©John Short / courtesy of MIT Press)

Clubs started up, the most emblematic being the Homebrew Computer Club, which was founded by Gordon French and Fred Moore in Menlo Park, California, in March 1975. Their mission was to help make computers accessible to anyone, exchanging ideas and know-how over beers. They also published a newsletter that became the voice of Silicon Valley, carrying such material as a February 1976 letter from Microsoft cofounder Bill Gates, 'An Open Letter to Hobbyists', which took the scene to task over fair pay for his new company's software:

As the majority of hobbyists must be aware, most of you steal your software. Hardware must be paid for, but software is something to share. Who cares if the people who worked on it get paid?

Is this fair? … The royalty paid to us, the manual, the tape and the overhead make it a break-even operation. One thing you do is prevent good software from being written. Who can afford to do professional work for nothing? What hobbyist can put 3-man years into programming, finding all bugs, documenting his product and distribute for free? The fact is, no one besides us has invested a lot of money in hobby software. We have written 6800 BASIC, and are writing 8080 APL and 6800 APL, but there is very little incentive to make this software available to hobbyists. Most directly, the thing you do is theft.

The Homebrew Computer Club achieved its aim. Inspired by being one of the thirty-two people who attended its first meeting, Steve Wozniak designed and built the Apple I. While showing it off at a later meeting he met Steve Jobs, and together they founded the American company which arguably did the most to explore and expand the microcomputer's potential.

Outside meetings, the newsletter helped to establish the language and shape of this new category of computers, establishing the concept of the 'personal computer': a machine designed for a person's everyday individual use. After all, for most of the 1970s, there was no agreed form to the microcomputer. It wasn't until a more formalized commercial industry began to grow that it started to come in cases or be supplied with integrated keyboards, speakers, or displays. Computers weren't designed, in the sense of being intended for a particular use. They were, more or less, just computers for computers' sake.

Until, that is, 1977. That year, three companies introduced new computers which were very much designed along the lines of discussions at the Homebrew Computer Club. Two out of the three – the Apple II and the PET 2001 – were specifically marketed as 'personal computers'; in other words, they were aimed at a new market of buyers who weren't looking to self-build or gain great insights into electronic circuitry. This new market, it was hoped, wanted off-the-shelf machines that came with every necessary component and only needed to be plugged into the wall before they worked.

In other words, it was time to popularize the microcomputer. As Jack Tramiel, CEO of Commodore, makers of the PET 2001, once put it: 'Computers for the masses, not the classes.'

Science of Cambridge (Photography by ©John Short / courtesy of MIT Press)

SDC Minivac (Photography by ©John Short / courtesy of MIT Press)

Sinclair Spectrum (Photography by ©John Short / courtesy of MIT Press)
