One of the people I really admire is Erik Meijer, who is an absolutely brilliant computer scientist and outstanding communicator. And a real fun guy. One of the things I have heard him say is words to the effect that if you want to know what the next big thing in computer science is, look at what was big 20 years ago. I think he is totally correct here. Of course, when we reinvent things 20 years later, one hopes that we learn the lessons from the previous incarnation. But that can't happen if people are unaware of previous innovations.
I'm pretty lucky in this regard, as I started in computer science over 35 years ago, so I've seen a couple of cycles already. But what about students today? They can't remember it from life experience, but are they learning the history either? It's not clear to me that they are. Oh sure, most students get a unit or three on the history of computers at some point, but how much sticks? And how much is more than cursory storytelling? I know that what I taught was pretty cursory. Does it matter?
Well, take cloud computing for example. Is it absolutely new? Not really. The early days of mainframe computers were basically the same thing as cloud computing in many respects. One had all sorts of remote terminals (think thin clients) that connected through a network (hardwired or leased phone lines rather than the Internet) to some system somewhere, managed by people you probably had no real contact with. Sounds a lot like cloud computing to me. Sure, there are differences, but the differences are the part that matters. What problems did mainframes have? Lack of user control over applications and data. Dependence on other organizations for management. There were reasons why first minicomputers and then PCs took over. How do we avoid that in the future without knowing the problems of the past?
And honestly, between you and me, I think we've lost some things from the past as well. While we have great and powerful databases, we seem to be short of simple, easy-to-use flat file systems. And command line interfaces used to be easier to use, but now we only let experts use them because they pretty much require expertise that they didn't use to. Most people I know seem to know only two or three operating systems (Windows, Mac, and/or Linux/UNIX). That seems pretty limited to me, because there was a time when I'd use four or five in the same day. I learned a lot in those days, but some of it has been lost.
The Computer History Museum is a great repository of hardware. If you ever get a chance, you should visit. And they do have information on software as well. (I like the computer software timeline on their website.) But it is hard to understand software without using it, or at least digging into the documentation. I'm not sure how we avoid losing this part of our history. Especially if, when someone says "back when I was programming in '76 …", everyone tunes out. Perhaps collecting oral histories is a start. Now if we can just get people to listen to them, learn from them, and move us all forward.