Computerized Education: Déjà Vu?

Image from Parts of a cell, Khan Academy lesson

A recent Wired article on Khan Academy gave me a distinct sense of déjà vu.  A pre-programmed set of lessons that are written once, and then can be used by kids anywhere in the country?  They allow students to proceed at their own pace, simulating the advantages of one-on-one tutorial instruction?  They ensure that a student has mastered a given concept before allowing him or her to move on to more advanced material?  Data on student performance is automatically collected for analysis by educators?  A claim that all of this is totally new and is going to revolutionize the staid old American education system?  Where have I heard this all before...

Historical Computer Science

Java Logo

Like any well-trained Ph.D. student, I have come to see my own discipline as the master discipline, upon which all other forms of knowledge are based. For instance, I have repeatedly pestered my fiancée, who works in math education, with the idea of teaching mathematics historically. What better way (I enthuse) to teach, say, imaginary numbers than to understand why they were invented in the first place; the historical context that led to their emergence. Still convinced (perhaps quite foolishly) that this is a brilliant idea, I have begun to think recently about how the same concept might apply in computing--how, that is, the history of computing might be used to teach computer science.

Summer reading for historians of computing -- suggestions needed.

Please consider helping the community sharpen its engagement with new ideas. Back in graduate school I read feverishly in labor history, business history, history of technology, social history, organizational sociology, etc. in preparation for my oral examinations. My classes covered still more eclectic topics, ranging from a "greatest hits" of literary theory to nonparametric methods. Over the ten years since I physically left Penn I've been focused on an ever more specialized set of literatures, primarily the burgeoning history of computing field, which I know in ever more depth. In general I've also been doing more writing and less reading.

The Shock of the Old

Turntable and Record Photo, Licensed under Creative Commons Attribution 2.5 Generic, © 2004 by Tomasz Sienicki

A couple of weeks back I discussed Matthew Lasar's article on Ars Technica about the invention of the PC. Lasar has done it again this week with an excellent piece on the surprising persistence of old technologies.  Tech pundits, Lasar notes, are very quick to declare a technology dead or obsolescent when the latest hot thing comes along:

Sherry Turkle on Furbies

Robot Scrabble

WNYC's Radiolab is a show dedicated to making difficult scientific issues accessible and interesting for a popular audience. I sometimes assign segments of episodes to my history of technology students to reward them after particularly dry or difficult readings, so the recent episode on AI, called "Talking to Machines," caught my eye.

Living technological change

There was an article in the New York Times recently that summarized findings of scientists studying the effects of light on sleep/wake cycles. One of the most interesting findings, for historians of computing, was the fact that the bright, bluish light put out by modern computer screens very effectively suppresses the body's ability to generate melatonin, and therefore to sleep well and regularly. Disturbed sleep, however, was not the only effect observed.

Science Fiction and the History of Computing

Science Fiction and Computing: Essays on Interlinked Domains (Book Cover)

David Ferro recently posted on the SIGCIS mailing list about the release of his and Eric Swedin's new edited volume, Science Fiction and Computing. This is a sequel, of sorts, to a workshop at the Society for the History of Technology meeting in Tacoma last fall. I thought it would be appropriate to re-post this announcement here for further publicity and discussion, given the extent to which the book is a product of this community. As David wrote, the contributors (other than the editors themselves) include Thomas Haigh, Janet Abbate, Paul Ceruzzi, David A.

IBM Turns 100, and Creates a Stir

The IBM Personal Computer

Journalists across the Web (mostly) celebrated the 100th birthday of IBM last week, on June 16th. See, for instance, coverage at The New York Times, Wired, and Forbes. My history of computing colleagues at the IT History blog also covered the story, with a business history perspective from Joel West. As a former IBMer, I can't help but feel a small twinge of pride at this milestone.

Preserving individuals' on-line efforts at capturing bits of computer history

There are many excellent individually maintained websites that give bits of computer history, for instance,

E3 Gets Historical

Gamers Playing Atari at Classic Gaming Expo

The annual Electronic Entertainment Expo (E3) is where video game companies have congregated since 1995 to show off their forthcoming gadgets and games to the press. It is a bombastic celebration of the latest and greatest, the newest and shiniest. Any attention to the past is, for the most part, uncomfortably out of place there.
