In a sad but expected follow-up to Chris's post from a little over a month ago, this entry marks the passing of Steve Jobs, co-founder of Apple, who has died at the age of 56.
While many renowned leaders of industry are remembered for one big thing, or can have their accomplishments summarized in a cohesive way, Jobs had a career that can hardly be characterized by one--or even two or three--major accomplishments. From the Apple II, to the original Macintosh, to Pixar, all the way to the iMac, iPod, iPhone, and iTunes Music Store, Jobs continually reinvented himself and created new technological landmarks. Perhaps the one thing these all have in common is the way they encourage us to take technology for granted, with the goal of elevating the user experience. As many pointed out during his lifetime, and continue to point out after his passing, this approach was a double-edged sword that often cleft away useful functionality right along with the cruft. But I think it is also undoubtedly something that will cement Jobs's legacy and importance to the field of computing long after his death. What do you think?
The Washington Post has a very lengthy and insightful obituary of Jobs, and the obituaries in The Guardian and the New York Times are also worth a look. It's interesting to compare them with the premature obituary of Jobs that was accidentally published in 2008. You can also read what Jobs himself had to say about his life, and a short but telling piece from an Apple fan on ZDNet.
Recently, the BBC reported that the London Science Museum plans to expand its history-of-computing collection by digitizing Charles Babbage's huge store of design notes for the Analytical Engine. Though the 19th-century Analytical Engine is often pointed to as a machine that presaged the modern computer, a working version was never fully built in Babbage's lifetime (although the notes on the potential machine resulted in the first computer program, written by Ada Byron, Countess of Lovelace). And historians have not been the only ones fascinated with this machine--alternate histories in which the Analytical Engine was successfully built form the bedrock of a significant amount of science fiction, particularly in the steampunk subgenre.
Those of you in or around NYC might be interested in the Silent Series, an exhibit series at the New Museum that aims to present interactions between contemporary art and technology.
Our Closing Plenary this year will ask panel participants and audience members to consider questions on the theme of Cultures & Communities in the History of Computing, with an eye to exploring where the field of the history of computing has been, and where it is going. We're very fortunate to have a great line-up of speakers, including Tom Misa of the Charles Babbage Institute as moderator, Alex Bochannek of the Computer History Museum, Nathan Ensmenger from the University of Texas at Austin, Eden Medina from Indiana University, Bloomington, Andrew Russell from the Stevens Institute of Technology, and Jeff Yost of the Charles Babbage Institute.
Each participant has been asked to provide a question in advance to jump-start discussion. Click on "read more" below to see the current questions. Feel free to leave comments if there are other issues you would like to see explored at the closing plenary--we hope to involve the audience as much as possible.
A recent Wired article on Khan Academy gave me a distinct sense of déjà vu. A pre-programmed set of lessons that are written once, and then can be used by kids anywhere in the country? They allow students to proceed at their own pace, simulating the advantages of one-on-one tutorial instruction? They ensure that a student has mastered a given concept before allowing him or her to move on to more advanced material? Data on student performance is automatically collected for analysis by educators? A claim that all of this is totally new and is going to revolutionize the staid old American education system? Where have I heard all this before...