January 24th, 1984: a seminal date in history, I think. I can so remember, two days before, watching Super Bowl XVIII, wherein the defending NFL champions, the Washington Redskins, were supposed to beat up on the ragtag Los Angeles Raiders. Joe Theismann, John Riggins, Dexter Manley and the boys were going to make short work of Jim Plunkett, Marcus Allen, Lyle Alzado and Howie Long (I was rootin' for the Raiders). Instead...the Raiders punished the previous year's Super Bowl champions and gave them a beating in front of the world. It's also the day that Ridley Scott's famous Macintosh debut commercial aired, heralding the arrival of the Macintosh on January 24, 1984 ("On January 24th, Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like '1984.'").
Of course, millions will remember all this more for the commercial and the significance of the media event than for the significance of the Macintosh itself. But for folks like me, the Macintosh fundamentally changed our lives. And for the better.
I wasn't a computer geek in 1984...at least, not in the classic sense. I had become a systematic and heavy user of computing for my college work, crunching statistics on an old Sperry Univac housed in the basement of Struve Hall (the Math Building) at UC Davis. You see, back then, if you studied Computer Science, you got a degree in Mathematics (at least at UC Davis...only the leading-edge universities in computing, like MIT or the University of Michigan, were offering CS degrees to undergrads back then). As an undergrad in Economics, I was crunching statistical calculations (polynomial regression, for those who care) in a field of Economics called Econometrics...simply put, the part of the field that calculates supply and demand curves. So, we used those big, refrigerator-sized computers that required big rooms full of air conditioning so they didn't overheat...and they provided less computing power than exists in your iPhone, shared between dozens of users at a time over a network. Everyone shared a big (yeah...like 5MB!) centralized storage unit (what we now call "the cloud"), and we were all delighted that we didn't have to enter data (or programs!) on punch cards. This was the world into which the Macintosh was introduced.
Yes, the Altair had introduced the notion of microcomputing. The Apple II made computing (not only the hardware, but usable software) available to the common man...and IBM was coming off the corporate mountain to help revolutionize the way computing was distributed (and sold) with the IBM PC. But it was the Macintosh, the manic dream of a team of misfits (misfits even within Apple Computer), that fully realized the dream of "an information appliance" (a tip of the hat to Jef Raskin for coining that term...oh yeah, and for heading up the original Macintosh project!). It put the power of computing into the hands of "everyone" or, at least, made it accessible to everyone, much like the Apple II...but a generation advanced. That seems almost trite today, but it was revolutionary 30 years ago.
Many will point to the GUI (graphical user interface) of the Macintosh as its key defining feature and the thing that "revolutionized" computing. I'd argue that really reflects the arguments of those driving the businesses of the time and how nascent the whole field of computing was. MS-DOS (the operating system that drove IBM PCs and, later, all the PC clones), CP/M (the other prevalent operating system for microcomputers at the time), MVS, UNIX, etc. were all command-line driven: completely text-based interfaces. In fact, all that computers dealt with at the time really was text and numbers. The Macintosh was designed, from the processor up, to deal with graphical information: rich information containing a variety of media. The popular argument about the GUI being so revolutionary sort of misses the whole point: it wasn't about being able to drop a pie chart into a newsletter displayed WYSIWYG ("what you see is what you get," a term the Macintosh team inherited from Xerox PARC), it was about managing, presenting, viewing and using information on a computer the way you did everywhere else. Force-fitting everything into a text-based frame (printed out by a dot-matrix or daisy-wheel printer) was not only contrived, but really inefficient. Creating, managing and using information in a fashion closer to how you did it outside of the computer really needed to be the next breakthrough.
Now, the naysayers claim that Steve Jobs and the Macintosh team ripped off the interface of the Alto (and later the Dandelion) computer from Xerox. But that's just bunk:
- GUIs were first presented not by Xerox on the Alto...but by Doug Engelbart, who brought the whole notion of GUIs to Xerox's PARC facility from the Stanford Research Institute, and who first demonstrated it in 1968 in the Mother of All Demos. This same demo also showcased the computer mouse, designed by Bill English (working from Engelbart's drawings) and developed by Bill Duvall (who wrote the software that handled the first packet sent over the ARPANET, developed what became the Macintosh Programmer's Workshop (MPW) and wrote the first C compiler for the Mac, Consulair C).
- Computer Science, like all engineering, is an evolutionary discipline that builds on the previous developments of others. To blatantly display my engineering bias, I'll state that by the time the accountants and lawyers can get the licensing in order to allow derivative works, folks like Andy Hertzfeld, Bill Atkinson and Steve Capps are shipping product to customers that fundamentally alters the course of history (sorry, I readily admit that excessive hyperbole dominates those of us who consider ourselves tied tightly to the early days of the Macintosh). In this context, "ripping off" is really a term about legal standing with respect to intellectual property, and it has no more relevance here than saying Scott Joplin ripped off his ragtime ideas from Jelly Roll Morton.
- The early GUIs, as demonstrated on the Alto and at SRI's Augmentation Research Center, were, at best, experimental. They strove to find their place in the emerging field of human/machine interaction. They were pioneering, seminal, brilliant efforts aimed at evolving the still-infant computer interface into something truly useful. The Mac, by contrast, was a mass-market computer aimed at general users. Its GUI, its operating system and its applications were designed with the totality of the user experience in mind. The interface was an integral part of the value proposition of the Macintosh, and it established icons, a windowing system, the mouse, drag-and-drop interactivity and the desktop metaphor as the core elements of the next generation of computing experience.