Teaching only Java is bad?
Today I saw this blog post: http://blogs.msdn.com/tparks/archive/2005/12/30/508164.aspx, titled 'Java only is bad mmmkay?'. It's a blog post inspired by an article by Joel: http://www.joelonsoftware.com/articles/ThePerilsofJavaSchools.html.
I wondered: "Would that Microsoft employee have written 'C# only is bad mmmkay' if C# were the major language of choice at universities?" We'll never know, but I have the feeling we never would have seen the post. The reason is that the whole debate over which languages universities should use to teach students programming is one of those debates that shows who gets it and who doesn't. Before you point your finger at me and yell "Haha, you just painted yourself into a corner, bozo!!!11", let me explain.
Repeat after me: "It doesn't matter which language is used to teach programming principles".
OK, again, and now a little louder.
Great, your neighbours are once again very pleased to live next to you, but I hope you get the point. So if a university opts for Java, or any other language, it doesn't matter: it's not the language that's being taught, but the principle of writing software: going from the abstract definition of an algorithm to a physical representation of that algorithm in the form of working source code.
There's something else though, and it's plain, filthy marketing crap: the languages students work with at university are often the languages they pick first after they graduate and get real jobs. So every student who is exposed to Java is potentially a Java user for a long time. This is one of the reasons why universities are a battleground for language vendors like Sun and Microsoft. It's sad that regular developers participate in these marketing campaigns.
If people want to contribute to the science of software engineering, that's great, but they should start by leaving the marketing poop out of it. If the concepts of OO have to be taught, you can perfectly well use Java: the tools are free, there is a lot of source code available and, above all, a lot of articles and books. Many books about object-oriented software engineering also use Java to explain the concepts. What's wrong with that? The university I went to used Lisp to explain interpreters, Prolog to explain writing AI, Miranda to explain functional programming and C to teach basic programming techniques (and a dozen other languages to teach other principles). Why does a student have to learn C++ as well, if Java serves the purpose of teaching OO? (You can even reuse Java in later courses: because the students already know the language by then, they don't spend time learning syntax and can fully concentrate on the concepts being taught.)
To close, I'd like to comment on one quote I grabbed from Joel's article:
"Instead what I'd like to claim is that Java is not, generally, a hard enough programming language that it can be used to discriminate between great programmers and mediocre programmers."

What a pile of BS. Programming has nothing to do with the language you use, Joel. Implementing an algorithm so it can be run, on the other hand, does, but that is an activity performed after the programming phase is finished. The design of the algorithm, that's the real programming, and it's in that phase that you'll recognize who's a great programmer and who's a mediocre one. The phase after that, which I'd like to call the 'code generator from the stone-age phase' (I just made that up, but it drives the point home), is the phase where human beings try to be smart and type in text which they think represents the result of converting the abstract algorithm definition into executable source code. That phase could also be done by a program; it's just that the current state of algorithm-description languages isn't mature enough to outclass plain source code, but we'll get there eventually.
And why does a computer language have to be 'hard'? What kind of logic is that? "It's not hard enough, so I don't want to use it". As if writing software today is so easy that it's hard to make mistakes and introduce bugs, isn't it, Joel? You sell bug-tracking software, so you should know. So, to extend that logic, pick one of the most arcane languages out there and see if someone can write a solid piece of software in it. If s/he fails, it must be a mediocre programmer!
Use this rule of thumb: every once in a while these fun threads appear on programmer forums where everyone posts the most complex piece of code they've ever written. If you want to participate in such a thread but can't find any complex-looking code, because all the code you wrote is straightforward, easy to understand and clear to the point, you can congratulate yourself: you're a great programmer.