I moved through the thesis using a mixture of 90% Java, 7% Python, 2% BASH scripting and 1% C programming (not C++, but in my defence, I’ve always used the C99 standard – so I am not that old-fashioned).
When I first mentioned that I intended to do all the thesis prototyping in Java, reactions were overwhelmingly negative, with gloomy cautionary tales of insufferable slowness and unbearable memory footprint. I kept those criticisms in mind but, imbued with scientific scepticism, I prudently started performing some of the experiments in Java.
Soon I noticed that my prototypes were indeed slower, but not that much slower (certainly less than 10× slower, usually less than 2× slower), and that the memory footprint was indeed higher, but not so high that experiments became unfeasible. But mainly I noticed that my prototypes were much more reliable than the ones written by the C/C++ programmers (everyone else), and thus less prone to dumping core in front of the Ph.D. supervisor at meetings. (It’s not that I program better than my colleagues; it’s just the wonders of having a garbage collector.)
I was happy with my choice, and I even gently mocked my colleagues when they argued that real programmers do pointer arithmetic and manual memory deallocation (but they knew I was kidding when I said that C++ is a language whose only purpose is to print, at random, one of the messages “Hello, world!” and “Segmentation Fault: Core Dumped”).
* * *
But despite the success of that “scientific prototyping experience”, Java has never conquered my heart. All languages have their small annoyances, but my gripe with Java comes from quite an essential design choice: the separation between primitive and reference types. This is one of those fundamental character flaws, and it would only be forgivable if Java provided both autoboxing and a very clever optimising compiler. So far, only half of the solution is available: you can pretend at design time that your Integers are ints (and vice versa), but try doing it in a critical inner loop, and at runtime you’ll become the laughing stock of all the C++ programmers in the lab.
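To make the complaint concrete, here is a minimal sketch (class and method names are mine, not from the thesis code) of how the same inner loop looks with a boxed accumulator versus a primitive one. The boxed version must unbox, add, and re-box on every iteration, which is exactly the kind of hidden cost that autoboxing hides at design time but not at runtime:

```java
// Minimal sketch: the same summation loop with a boxed vs. a primitive
// accumulator. The results are identical; the allocation behaviour is not.
public class BoxingDemo {

    // Reference type: every "total += i" unboxes total, adds, and
    // allocates a fresh Long for the result.
    static long sumBoxed(int n) {
        Long total = 0L;
        for (int i = 0; i < n; i++) {
            total += i;
        }
        return total;
    }

    // Primitive type: plain register arithmetic, no objects involved.
    static long sumPrimitive(int n) {
        long total = 0L;
        for (int i = 0; i < n; i++) {
            total += i;
        }
        return total;
    }

    public static void main(String[] args) {
        int n = 10_000_000;
        long t0 = System.nanoTime();
        long a = sumBoxed(n);
        long t1 = System.nanoTime();
        long b = sumPrimitive(n);
        long t2 = System.nanoTime();
        System.out.printf("boxed: %d ms, primitive: %d ms, same result: %b%n",
                (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000, a == b);
    }
}
```

Timings will vary by JVM and JIT warm-up, but the point survives: the two loops are indistinguishable on paper and quite distinguishable under a profiler.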
I’ve recently flirted with C# (which apparently handles this problem in a better way), but the lack of a general multiplatform implementation has made me shy of adopting it. One of the wonderful things about Java is that I can develop on Windows, my desktop system, and then run on whatever system is available in the University infrastructure: Windows, Linux, Sun’s UNIX – the integration is absolutely seamless.
I have also considered taking the “XKCD Approach” and doing everything in Python, but though I really like Python for scripting, I shudder at the idea of relying exclusively on dynamic typing for anything larger than 500 lines of code.
* * *
In a way, this quest for perfection is all the fault of Bertrand Meyer, and of having read his books in my young, naïve years. I am still waiting for The Definitive Programming Language. The one with a static type system that makes theoreticians drool, yet is uncannily intuitive to use. The one with a huge Standard Library, yet a gentle learning curve. The one whose spoonfuls of syntactic sugar help the medicinal exception-handling code go down in a most elegant way.
I suspect I shall be waiting for a good while.