Historical Computer Science


Like any well-trained Ph.D. student, I have come to see my own discipline as the master discipline, upon which all other forms of knowledge are based. For instance, I have repeatedly pestered my fiancée, who works in math education, with the idea of teaching mathematics historically. What better way (I enthuse) to teach, say, imaginary numbers than to understand why they were invented in the first place: the historical context that led to their emergence?

Still convinced (perhaps quite foolishly) that this is a brilliant idea, I have recently begun to think about how the same concept might apply in computing: how, that is, the history of computing might be used to teach computer science.

As a first stab at such an approach, I propose that the best way to really understand a programming language is to understand its history. Java provides a case in point. Java was invented by Sun engineers in the early 1990s. Its original intended purpose was as a language for interactive television, but it soon became evident that interactive TV was not taking off as hoped, while a new phenomenon known as the World Wide Web certainly was. So Sun repositioned Java as the language of the Web. This historical context, I propose, can explain many of the features of Java:

  • Why does it have C/C++-like syntax?  Because its creators were steeped in the Unix world of Sun, which was built primarily on C.  They also hoped to attract others like them with a familiar syntax.
  • Why does it run on a virtual machine? Because it was intended to operate in an on-line environment, where programs would be downloaded over the network onto a variety of different machines.  The virtual machine provided both cross-platform code compatibility and the potential for a secure 'sandbox' where untrusted code from the network could operate.  
  • Why is it object oriented? In part because object-oriented code fit well with a client-server model of networked computing: only the class files actually needed had to be sent to the client side.
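To make the first point concrete, here is a minimal sketch of what that familiar C-like syntax looks like in practice. The class and method names are my own, hypothetical ones; the point is only that the braces, semicolons, and C-style `for` loop would have looked immediately familiar to the C programmers Sun hoped to attract, while the enclosing class is the object-oriented layer Java adds on top.

```java
// Hypothetical example: Java's C-like syntax wrapped in a class.
public class HelloWeb {

    // Build a comma-separated greeting string using a C-style
    // for loop -- syntax any C programmer could read at a glance.
    static String greetTimes(int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) {
            sb.append("hello");
            if (i < n - 1) {
                sb.append(",");
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(greetTimes(3));
    }
}
```

Strip away the `class` wrapper and the `String` objects, and the control flow is essentially C: that continuity, not coincidence, was the design goal.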

The advantage of this approach would be to connect otherwise abstruse concepts like object-oriented programming with human motivations, which might make them easier to learn and understand.  It would also give a better sense of the openness and potential of computing.  Rather than simply seeing Java as a given (this is what we have to learn; this is what programming is), students might get the sense that there are many, many ways to design a programming language.

What do people think?  Is there anything to be said for using the history of technology to help teach technical disciplines?