As Bjarne Stroustrup once reckoned, there are two kinds of programming language: those people bitch about, and those no-one uses. This is certainly true; do other professions have flame wars? "You're still using an Acme Brand spanner? Didn't you know that the founder of P. G. Mechanic's (who last built a working car nine years ago, and has been dining off it ever since) called them Blub spanners? That means you're crap! We use SuperAcme Brand spanners - no one knows how to use them properly, but we attract people who want to tighten nuts..."
Of course, all this programming language hype, FUD, etc. is nonsense. I have never seen a project fail for choosing the wrong programming language; I've seen them fail for every other reason, but not that. This doesn't mean that all programming languages are created equal - far from it; you would be insane to choose COBOL for a greenfield project in this day and age. But provided that certain requirements of platform support, tool support, available resources, etc. are met, the language does not matter (from a "Get it done!" point of view).
Everyone has a favourite, that's natural. Everyone has their dislikes, for the same reason. The anti-hype for Java, however, is something else; it's not normal banter, it defies simple explanation, and it says something key about the programmer psyche.
Java, despite the fact that everyone uses it, has never been popular. From the start until about 2000 the cry was "Java is Slow" (and there are some throwbacks who still cling to this belief). Long after that was proven wrong, the critics piped down (except for the sniping of PHP hackers, but no-one listened to them). Since then, however, Java's dominant position has made it the hate figure for every growing platform; notably, it has served as the central demon for the Ruby and Python communities.
The problem is that most of the anti-Java rhetoric is simply invalid; it involves a variety of logical fallacies, ego fluffing, appeasement of an all-powerful god, revisionist history, and (most often of all) downright lies.
The list of charges that have been laid against Java is too long to refute in full, so I'm not going to try. I'm going to highlight some of the more common and nonsensical, because these are the most interesting, but I will be concentrating on the positives: the things Java does best.
The twisted version of history presented by the anti-Java lobby is astounding; if nothing else it should guarantee these people a job in the propaganda department of any third-world dictatorship. So much has been said:
- "Java was designed for stupid programmers."
- "Java was designed to make programmers stupid."
- "Java would never have taken off if it wasn't for Sun pushing it."
- "Applets failed due to bloat."
- "J2EE was an attempt to work around platform shortcomings."
Java was designed to be simple, that is true. The language Java was aiming to beat was C++, which was famous (still is) for being difficult; the fact that a new language had, as a design goal, the aim of attracting a wider audience than the status quo is hardly a bad thing. Simplicity in design is hardly a cause for hate.
There's no denying the number of stupid programmers that exist, but they're not language related. If Python were the world's most popular language they'd still exist, making a hash of things like they always did. Only the C and C++ advocates could claim a certain amount of immunity: thanks to the sharp edges exposed by those languages, stupid programmers are almost certain to be found out. But not always - they find other ways of being ineffectual.
Java was initially designed as a platform-independent embedded language, a role it has since achieved via J2ME on mobile phones and inclusion in the Blu-ray spec. Sun, of course, also adapted it to other purposes; the timing of Java's arrival coincided with the rise of web browsers, and a bit of quick thinking later the applet was born. None of this is unethical practice: a company invests in a technology, then promotes it. Sun wasn't (still isn't) in any position to force Java anywhere other than Solaris. But, for some reason, people find this offensive...
The revisionist history of Java applets
It's the fake history of Java applets that is the most interesting from a psychological point-of-view.
First, the reality of the growth period (very briefly):

1. Sun had created a platform-independent, high-level programming platform (Java).
2. Netscape was the dominant browser, and was adding features all the time.
3. The two came to a favourable agreement very quickly.
4. Applets became a de-facto standard which was copied by all other browsers.
The revisionist version of the above:

1. Sun created Java as an evil ploy to cripple programmers and sell more hardware.
2. They used their immense power to force Netscape into adding support.
3. They forcibly kidnapped every early web designer and only let them go when they agreed to develop applets.
The reality of the decline:

1. Microsoft licensed Java and tried to make Windows-only extensions.
2. Sun sued Microsoft.
3. Microsoft stopped Java development.
4. Microsoft kept a broken Java implementation built into IE for five years.
5. Installing the latest Sun plugin meant disabling the IE default version as well (a hassle Flash didn't have), so it couldn't be relied upon and the broken version remained commonplace.
6. Sun was caught napping when the IE disability was finally removed, and didn't push to re-establish the ubiquity it once had.
The revisionist version, being as it is revisionist, makes no sense whatsoever. Prior to 1998 Java applets were commonplace. From 1998 to 2003 there was a distinct dark age in terms of rich internet applications; the IE/Netscape Java incompatibility meant that applets were banned in many web development organisations. It was only after a very large gap that Flash became common enough to rely on, and later still that AJAX became common. The death of applets had nothing to do with the language, or the platform, and everything to do with Internet Explorer (those were the days when Microsoft and IE were proactively evil, rather than a large legacy nuisance); the world wouldn't have given applets up without an equivalent if it had any choice, and certainly wouldn't have dropped an open platform for a proprietary one. But it doesn't stop there: the most rabid anti-Java advocates proclaim this to be evidence of some sort of divine intervention, which proves that: a) improving Java's performance, features, etc. is therefore blasphemy; and b) it is inevitable that the world will spontaneously drop Java from other areas, just because they think that's what happened with applets.
It's classic mythology: a semi-fictional enemy being defeated by a semi-fictional hero. The truth - that one large company got one over on another large company - is far too boring; it must have been evil Sun pushing it, and collective common sense that defeated it. It's a good story, but that's not what happened.
But enough of that, there's too much nonsense to deconstruct in one go, I'll save some more for later parts.
There are, in fact, many reasons why Java is still going, still going strong, and still a prime technology choice for new projects. (Many people prefer to paint Java as a legacy technology; the reality is rather different - Java is still the first choice for many types of applications in a wide array of industries.) Sometimes the reason Java is good is one of the very features that people dislike about it.
The first reason that Java isn't shit: static typing.
For the sake of allowing other people to know what I'm talking about, I will be using the established definitions of "static typing" and "dynamic typing", even though I would prefer other words. I reckon that most of the static-typing hate is down to the word "static". It's more of a controlled dynamism, which uses interfaces, inheritance, or generics to control what can be assigned to what.
Dynamic typing isn't a feature, it's the lack of a feature: it saves a few keystrokes, but it also loses valuable metadata. Quite why modern languages are all expected to be dynamic, I do not know; it makes little sense. While they all have modern features (well, not really modern, but only recently mainstream) like closures, continuations, metaprogramming, etc., they're expected to throw out an old, useful feature at the same time.
You see, in any part of a function, class, etc., a variable can only legitimately be a certain type (or family of types); you can't pass a String to a function called average and expect it to work. Java allows you to enforce this with ease. Why is getting rid of such a feature supposed to be a good thing? Many non-trivial dynamic-language projects seem to end up creating a large suite of unit tests to perform the same checking, at higher cost, rather than just letting a compiler do it instantly.
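To make that concrete, here's a minimal sketch of such an average function (the class and method names are my own, for illustration). The signature states exactly what is legitimate, and the compiler rejects anything else before the program ever runs:

```java
import java.util.Arrays;
import java.util.List;

public class AverageDemo {
    // The signature says it all: any list of numbers is fine,
    // a String (or a list of Strings) cannot even be passed.
    public static double average(List<? extends Number> values) {
        double sum = 0;
        for (Number n : values) {
            sum += n.doubleValue();
        }
        return sum / values.size();
    }

    public static void main(String[] args) {
        System.out.println(average(Arrays.asList(1, 2, 3)));   // 2.0
        // average("not a list");  // compile-time error, caught instantly
    }
}
```

The commented-out line is the point: in a dynamic language that mistake surfaces at runtime (or in a unit test, if you wrote one); here it never gets past the compiler.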
It seems that many people don't understand static typing, and even more don't want to understand it. It doesn't mean a plethora of near-duplicate functions (i.e. you'd write one function that accepted a Collection, rather than two - one for List and one for Set). The final weak point was closed with generics: before then you would have to cast items in and out of collections, but not any more. Generics also provide a way of ensuring consistency between parameters, variables, etc., so that it doesn't matter what the two types are, they just need to be the same type (or extend it, etc.).
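Both points can be shown in a few lines (again, the names here are invented for the example). One method written against Collection serves List, Set, and everything else; and a type variable ties two parameters together without caring what the concrete type is:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
import java.util.HashSet;
import java.util.List;

public class GenericsDemo {
    // One method covers List, Set, or any other Collection of Strings -
    // no duplication required.
    public static String longest(Collection<String> words) {
        String best = "";
        for (String w : words) {
            if (w.length() > best.length()) best = w;
        }
        return best;
    }

    // The type variable T ties the two parameters together: it doesn't
    // matter what T is, both arguments (and the result) must agree on it.
    public static <T> List<T> pair(T first, T second) {
        List<T> result = new ArrayList<T>();
        result.add(first);
        result.add(second);
        return result;
    }

    public static void main(String[] args) {
        System.out.println(longest(Arrays.asList("cat", "horse")));              // horse
        System.out.println(longest(new HashSet<String>(Arrays.asList("a", "bb")))); // bb
        System.out.println(pair("x", "y"));                                      // [x, y]
        // pair("x", 1);  // would infer T as Object - the mismatch is visible
    }
}
```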
Generics aren't easy, that much is granted (which again blows the "Java is designed for stupid programmers" argument out of the water), but they're not that hard either. The trap that people fall into is to think of them as equivalents of C++ templates; they are more akin to ML-style type variables.
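A bounded type variable shows the ML flavour nicely - this sketch (hypothetical names again) says "T can be any type at all, so long as it knows how to compare itself to another T", with no casts and no per-type duplication:

```java
import java.util.Arrays;
import java.util.Collection;
import java.util.Iterator;

public class MaxDemo {
    // An ML-style bounded type variable: one definition works for
    // Integers, Strings, or anything else comparable to itself.
    public static <T extends Comparable<T>> T max(Collection<T> items) {
        Iterator<T> it = items.iterator();
        T best = it.next();
        while (it.hasNext()) {
            T candidate = it.next();
            if (candidate.compareTo(best) > 0) best = candidate;
        }
        return best;
    }

    public static void main(String[] args) {
        System.out.println(max(Arrays.asList(3, 1, 4)));         // 4
        System.out.println(max(Arrays.asList("pear", "apple"))); // pear
    }
}
```

Unlike a C++ template, this isn't stamped out once per concrete type; it's a single, type-checked definition whose constraints are verified up front.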
Java's take on static typing is not perfect. Some sort of type inference is long overdue. I'd prefer a system where you'd specify types on non-private methods, together with instance and static fields; everything else should be deducible from that. That would offer a win/win approach: you still get the type safety, but you can cut out the redundant parts of the code.
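For what it's worth, a limited form of exactly this did eventually land in Java 10 as `var` for local variables: the types stay on the signature, and locals are deduced from their initialisers, with no loss of safety. A small sketch (names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class InferenceDemo {
    // The type still appears on the non-private signature...
    public static List<String> greetings() {
        // ...but the local's type is deduced from the right-hand side:
        // this is an ArrayList<String>, checked as such by the compiler.
        var result = new ArrayList<String>();
        result.add("hello");
        result.add("world");
        return result;
    }

    public static void main(String[] args) {
        System.out.println(greetings()); // [hello, world]
    }
}
```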
Dynamic typing seems to get sold on the basis that knowing what type a variable/parameter/etc. is, is somehow optional; it isn't. Dynamic typing leads to two unfortunate side-effects: runtime type errors, which are more difficult to track down than compile-time type errors; and the need to keep track of types in other ways, and to share that knowledge with other people working on the code.
And, of course, static typing allows other benefits, like making method overloading possible.
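Overloading works precisely because the compiler knows the static type of every argument, and so can pick the right method at compile time - something a dynamic language has to fake with manual type checks. A quick sketch (hypothetical names):

```java
public class OverloadDemo {
    // Three methods, one name: the compiler selects among them
    // using the static type of the argument.
    public static String describe(int n)    { return "int: " + n; }
    public static String describe(double d) { return "double: " + d; }
    public static String describe(String s) { return "string: " + s; }

    public static void main(String[] args) {
        System.out.println(describe(42));     // int: 42
        System.out.println(describe(2.5));    // double: 2.5
        System.out.println(describe("java")); // string: java
    }
}
```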
See also: Part 2 - Conservatism.
In future instalments I'll be tackling: Why the Java API is better than the standard library of most dynamic languages; why C# is barking up the wrong tree, for everyone except Microsoft; performance; and much, much more.