Tuesday, June 24, 2014

The More Languages Change The More They Stay The Same

One of the links in yesterday’s interesting links post was to a fun questionnaire for computer science people. Guess the Programming Language asks you to identify programming languages from a small snippet of code. Two of the languages in the questions were pretty far out and more joke than useful, so I’ll ignore them for the time being. But several were fairly recent, real programming languages that are being promoted as new and special. I have to say, though, that a lot of them look a lot alike.

At some point one has to work fairly hard to find the unique attributes or syntax that set these languages apart. Most of them seem to have a large volume of “stuff” that could be taken right from the definition of the C programming language. Other features, or rather syntax changes, seem to be attempts at obviating the need for letters. OK, none of them take it to the extreme that APL did. Am I the only one who remembers this sort of keyboard overlay for entering APL programs?
[Image: APL keyboard overlay (APL-keybd2.svg)]

Did no one learn from that?

It feels like the early days of programming saw languages that were really different from each other. No one would confuse a FORTRAN program with a COBOL program or a BASIC program. Let alone APL.  Java, C++, C# and more in the C family all look largely alike for the basics. One sort of wonders why people bother creating new languages if they are not really different.

Another thing I have noticed is that while there was once a goal of making programming languages easy for people to understand, it seems increasingly like one goal is to make them easier for compiler writers (or at least the people who write the parsers). The use of special characters seems to be going way overboard. Maybe that is a throwback to APL, or maybe people just didn’t learn from APL? I don’t know.

One thing this has done is encourage the use of block programming languages for beginners. While a (mostly) good idea, this makes the jump to more professional languages into a big step. What’s an educator to do?

2 comments:

  1. It seems to be a bigger jump to go from something like Alice or Scratch to a line-code language than it is to go between any two modern line-code languages. I think my Python kids will have no major problem working in Java next year. Going from block to line is like starting over, BUT the kids do understand the logic behind the programming. They do understand sequence, decisions, loops, error types and so on.

  2. Michael S. Kirkpatrick, 10:31 AM

    Yes, some are not designed to be practical. Brainf***, for instance, was actually designed as a thought exercise; it is believed to be the smallest language (i.e., fewest symbols) that is Turing complete.

    "Other features, or rather syntax changes, seem to be attempts at obviating the need for the use of letters." Actually, I would argue the opposite is true as a general trend. A lot of the newer languages add letters to replace symbols. For instance, many languages now are adding explicit "end if" or "end while" lines to eliminate the use of { }. Some that are derived from the functional paradigm add "fn" or "fun" or "func" keywords to explicitly declare something as a function. Another common trend is to replace the assignment "=" with either ":=" or "<-" or something similar; the goal here is to eliminate the "=" vs. "==" bug. Also, more languages are supporting loop structures that allow you to say "for x in my_array do" instead of relying on an index variable. So I see the opposite trend.
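    Several of the trends this comment describes show up together in Go, to pick one convenient newer language. The sketch below is a made-up example (the `sum` function is not from any language discussed here): a `func` keyword marking function declarations, `:=` for declare-and-initialize, and an index-free loop over elements.

    ```go
    package main

    import "fmt"

    // The "func" keyword explicitly marks a function declaration.
    func sum(values []int) int {
    	total := 0 // ":=" declares and initializes in one step
    	// Loop directly over the elements; no index variable to manage.
    	for _, v := range values {
    		total += v
    	}
    	// Note also that in Go assignment is a statement, not an
    	// expression, so the classic C bug "if (x = 0)" cannot compile.
    	return total
    }

    func main() {
    	fmt.Println(sum([]int{1, 2, 3})) // prints 6
    }
    ```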

    "Java, C++, C# and more in the C family all look largely alike for the basics." There are many reasons for that. First, C# is essentially Java for the .Net platform. Blame Microsoft for that. And C++ is an object-oriented superset of C. The reason so many other languages look like C is because of C's entrenched success. You're not going to get a bunch of experienced developers switching to your language if they have to learn Erlang-style syntax. Thus, if you deviate extensively from the C model, you're voluntarily relegating your language to second-tier status.

    "One sort of wonders why people bother creating new languages if they are not really different." The problem with this statement is that it conflates syntax and semantics. All of these new languages differ in a variety of ways, even if the syntax looks similar. One common feature is built-in models for concurrency; C, which relies on add-ons like POSIX threads, is AWFUL for this type of work. Languages like Scala, Rust, Go, Erlang, Haskell, etc., provide structures that eliminate thread management from the programmer's control. The languages also adopt different underlying features of OOP: operator overloading, type inheritance, dynamic typing, generics, object mutability, garbage collection, etc. Lastly, there's the question of the underlying execution mechanism: native compilation, interpreted, compiled to a portable interface (e.g., JVM), use of .Net/CLR, JIT compilation, etc.
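    The built-in concurrency point can be sketched in Go, again just as one representative of the languages named above (`sumOfSquares` is a hypothetical example). Compare this with the same task in C, which would need explicit pthread creation, a mutex around the shared sum, and joins:

    ```go
    package main

    import "fmt"

    // sumOfSquares fans the work out to one goroutine per value and
    // collects the results over a channel. Goroutines and channels are
    // language built-ins: there is no explicit thread creation, locking,
    // or joining as there would be with POSIX threads.
    func sumOfSquares(nums []int) int {
    	results := make(chan int)
    	for _, n := range nums {
    		n := n // capture the loop variable for this goroutine
    		go func() { results <- n * n }()
    	}
    	sum := 0
    	for range nums {
    		sum += <-results // each receive waits for one result
    	}
    	return sum
    }

    func main() {
    	fmt.Println(sumOfSquares([]int{1, 2, 3})) // 1 + 4 + 9 = 14
    }
    ```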

    "Another thing I have noticed is that while there once was a goal of making programming languages easy for people to understand it seems increasingly like one goal is to make them easier for compiler writers (or at least the people who write the parsers)." Going back to the first point, I see the opposite trend. A lot of the newer languages (I'm thinking Go, Rust, Scala, Python, Ruby, etc.) are pushing the "rapid prototyping" model that relies less on strict typing and casting than on dynamic type evaluation. E.g., don't force the user to think about how many bits their integer variable needs (no short vs. long vs. unsigned, etc.). One exception might be the functional paradigm (Erlang, Haskell, ML), as the designers of those languages like to think about things like expressiveness, purity, unambiguous grammars, consistent mathematical models, etc. And you're right: those things tend to make writing the compiler/parser easier and less error-prone (which is a good thing, though, because it eliminates a class of bugs that are REALLY HARD TO DIAGNOSE: compilers making bad optimization choices for your particular code). But note that those languages aren't new. They date back to the '70s and '80s.
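    One small illustration of the "don't make the user pick an integer width" point, with a caveat: Go is statically typed rather than dynamically typed, but its type inference gives the same ergonomics this comment describes — the programmer never writes short vs. long vs. unsigned. (`inferredTypes` is a made-up function name for the sketch.)

    ```go
    package main

    import "fmt"

    // inferredTypes shows that the programmer never declares a type
    // name or picks an integer width; the compiler infers everything
    // from the initializers.
    func inferredTypes() string {
    	count := 1000000   // inferred as int
    	ratio := 0.5       // inferred as float64
    	label := "widgets" // inferred as string
    	return fmt.Sprintf("%T %T %T", count, ratio, label)
    }

    func main() {
    	fmt.Println(inferredTypes()) // prints "int float64 string"
    }
    ```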
