Spreading the Hacker Ethic to a wide audience is highly important, even if it means compromising some ideals to ensure broader adoption. The fact that the Silicon Valley “Hardware Hackers”—not the Cambridge “True Hackers”—were the spark for the explosion of the personal computer industry is compelling evidence for this viewpoint.
In theory, the Hacker Ethic purists’ vision of a utopia of freely shared information and software is appealing, and it still plays an important role in the modern world. These hackers believed there should be no barriers of intellectual property: people should be free to view, copy, distribute, and modify code as they saw fit. This worked well in small enthusiast communities; in the AI Lab, everyone stored tapes of their code in a common drawer, and tinkering on and improving others’ work was encouraged and expected. This decentralized collaboration suited systems programming, but it did little to spread computers beyond their own building.
The computing movement on the West Coast in the seventies started in much the same way, with the Homebrew Computer Club serving as an easily accessible forum and community for building computers and software. Although many of the members worked for large tech firms like Intel and HP, they had no problem sharing company secrets in the name of openness. The primary computing innovation during this period (the first half of Part 2 of Hackers) was the Altair 8800 microcomputer, which was crucial in lowering the barrier to entry. One no longer had to be affiliated with one of a few universities or companies to gain access to a computer; all it took was some technical know-how to assemble one’s own Altair for $400. However, “technical know-how” is a bit of an understatement: many customers struggled to assemble their machines properly (so much so that MITS was swamped with requests for assistance), and programming the memory by flipping front-panel switches was incredibly tedious and error-prone.
Steve Wozniak and Apple were instrumental in making personal computers truly personal. Intentionally aimed at regular households rather than professionals or hobbyists, the Apple II featured “a sleek, warm, friendly variation of a typewriter, futuristic in its low slope, but not so harshly angled that it looked menacing” (263–64). This was certainly a sign of things to come for Apple, whose products continue to be seen as highly fashionable. Wozniak, who got his start as a hacker, was initially hesitant about the business objectives Steve Jobs emphasized, though he was willing to set aside some of his idealism to keep Apple competitive.
The conflict between the business-oriented and the openness-oriented hackers is exemplified by the fight over Altair BASIC: Bill Gates and Paul Allen believed they deserved payment for each copy of their interpreter, while many Homebrew members believed such software should be shared freely. In principle, I believe it would have been more beneficial for Gates to release the software freely, but in reality people need money to survive, and companies need revenue to continue to innovate. It is simply impractical to expect everyone to spend hours and hours on a piece of software for little or no money. The open source movement has since shown that this is feasible in some cases, however: instead of collaborating on shared tapes passed around a Homebrew meeting, people now collaborate instantly all over the world via online tools.