Thursday, July 03, 2014

Computer Science History

Computer science tends to be a very forward-looking field. We are constantly looking at the newest things: the future of computers, of programming languages and paradigms, of applications, and of careers. That is natural given the rapid pace of computing, but we should also take care not to neglect the study of computing's past. The old saying that those who do not study the past are doomed to repeat it holds as true in computing as it does in other fields.
We can learn a lot about the future (and the present) by understanding how we got to where we are today and how technology transformed the way things were done in the past. For example, cloud computing is the trend for the future today, but it has parallels with several technologies of the past.
Cloud computing is in many ways similar to the earlier move to client/server architectures. In the past, the clients and servers may have been local to each other or even different pieces of software on the same computer. Today the client is a web browser and the server is somewhere out there, in a location you may not even know. The client/server model also has parallels to the early days of mainframe computing. In those days, data entered on block-mode terminals or card punch machines was sent to remote systems in climate-controlled rooms with limited physical access. Sounds a lot like the huge data centers of today, doesn't it?
There were also all kinds of issues, both technical and cultural, involved in each of these paradigms, and many of them are relevant to today's technologies. But are our students aware of them? Often they are not. The generation that has been involved in computing for many years is reaching retirement, and the institutional memory it holds is at risk if we do not pass it on to students today. The problem is that it is difficult to make time for computing history in existing curricula and courses. There is little room for it in an AP CS A course, for example. There probably is room in the CS Principles course and the Exploring Computer Science course, but we have to integrate it carefully. A dry memorization of dates, names, and events is dull. It doesn't communicate the richness of the history, and it bores students to no end.
There are many resources on the Internet for studying computing history. The Computer History Museum has numerous resources online. The ACM has also made all of the great Turing Award lectures available (in some cases as video), enabling computing pioneers to share what they know with future generations. And speaking of the voices of the pioneers, the Computer History Museum has an Oral History collection with interviews and talks by some of the greats.
How can we use these resources in interesting ways? I think it may work best in the context of discussions of the future. Students are very imaginative in their thinking about the future. It would be worthwhile to challenge them to look at previous technologies and developments for parallel changes, or to look at the work of pioneers and project those developments forward to today and beyond. Challenge them to think about the progression of technologies from the past into the future.
We add value to teaching when we go beyond the technology and into the consequences of technology. Students are interested in that. History can help them make sense of the present and the future. And that is something we need to do.
Note: This post is cross-posted from the CSTA blog, where I post something about every other month as part of my responsibilities as a member of the CSTA Board of Directors. I am cross-posting these posts in advance as part of my attempt to take a vacation from the Internet for a few days.

1 comment:

Ray said...

Alfred, Thanks for your post. I am a retired engineer and computer scientist and I am teaching math, engineering and robotics to low income rural kids in Mississippi. Keep up the interesting posts.

http://MississippiRobotics.org
http://FirstMicroprocessor.com

Ray