I don't know if you feel like me, but I think Sun is turning Java into a non-deterministic language. I'll try to explain...
I'm used to languages where one could say that a loop control structure is O(N). In the ol' BASIC days, a loop from 1 to N
took a time proportional to the value of N (at least when the code in the loop is constant and the compiler does not do too much aggressive code removal). There are other parameters, like the platform, but at least you could do an algorithmic analysis and optimize your source code on paper without even running it.
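That paper-and-pencil reasoning can be made concrete with a minimal sketch (class and method names are mine, purely illustrative): the number of steps a simple loop executes grows linearly with N, so on a predictable runtime the wall-clock time does too.

```java
// Illustrative only: an O(N) loop whose step count is exactly N,
// the kind of cost you could once read straight off the source code.
public class LinearLoop {
    // Returns the number of iterations executed for a given N.
    static long countSteps(int n) {
        long steps = 0;
        for (int i = 0; i < n; i++) {
            steps++; // constant work per iteration
        }
        return steps;
    }

    public static void main(String[] args) {
        // Doubling N doubles the work: the algorithmic analysis holds on paper.
        System.out.println(countSteps(1000)); // 1000
        System.out.println(countSteps(2000)); // 2000
    }
}
```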
Now, with the Java runtime, the performance of some code depends on so many runtime factors:
- The JVM type (Sun, IBM, BEA...)
- The JVM compiler type: HotSpot -server or -client. Has the loop run enough times to trigger the HotSpot compiler?
- The JVM parameters: -Xincgc and such non-portable -X options.
- And of course the platform: you don't write the same code for J2ME as for J2SE. On J2ME, you MUST manage your object allocations.
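The warm-up point above is easy to demonstrate with a rough sketch (the class and method names are mine, and the numbers are deliberately not asserted, because that is exactly the problem): timing the same method before and after the JIT has had a chance to compile it usually gives very different results, but nothing in the language guarantees either number.

```java
// A minimal sketch of why micro-timings are unstable on a JIT VM:
// the same method is timed cold (likely interpreted) and then warm
// (likely HotSpot-compiled after many invocations).
public class WarmupDemo {
    // Some constant-per-iteration work, the kind analyzed as O(N) on paper.
    static long work(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) sum += i;
        return sum;
    }

    static long timeNanos(int n) {
        long t0 = System.nanoTime();
        work(n);
        return System.nanoTime() - t0;
    }

    public static void main(String[] args) {
        long cold = timeNanos(100_000);              // first call: probably interpreted
        for (int i = 0; i < 5_000; i++) work(100_000); // give HotSpot a reason to compile
        long warm = timeNanos(100_000);              // same code, probably compiled now
        System.out.println("cold=" + cold + "ns, warm=" + warm + "ns");
        // On most HotSpot configurations warm is much smaller than cold,
        // but the ratio depends on the VM, its flags, and the platform.
    }
}
```

A serious measurement would use a dedicated harness rather than this hand-rolled timing, precisely because of the effects the list above describes.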
The programmer has fewer options for writing optimized code, and the performance depends on the runtime environment.
As a consequence, when you make a change in your code, you can't predict the impact it'll have on performance.
Remember when code was cluttered with object pools to compensate for the garbage collector's poor performance?
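For readers who missed that era, here is a sketch of the kind of pool that used to clutter Java code (the Pool class is my own illustration, not any standard API, and it is not thread-safe; StringBuilder stands in for whatever object was deemed too expensive to allocate):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative object pool: recycle instances by hand instead of
// letting the garbage collector reclaim and reallocate them.
public class Pool {
    private final Deque<StringBuilder> free = new ArrayDeque<>();

    // Reuse a pooled instance if one is available, else allocate a new one.
    StringBuilder acquire() {
        StringBuilder sb = free.poll();
        return (sb != null) ? sb : new StringBuilder();
    }

    // Hand the instance back to the pool instead of dropping it.
    void release(StringBuilder sb) {
        sb.setLength(0); // reset state before reuse
        free.push(sb);
    }
}
```

On a modern generational collector this pattern is usually a pessimization, which is exactly the point: the "optimization" was tied to one VM's GC behavior.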
Now I fear writing clean code that performs correctly on today's VM (with its tuned runtime parameters), only to discover in a few years that the code + tuning parameters are no longer valid on the then-current VM... Nowadays you take care not to create objects in a time-critical loop, but how will that code run when JVMs implement escape analysis or other techniques?
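The dilemma can be sketched in a few lines (the Point class and method names are mine, purely illustrative). The "clean" version allocates a short-lived object per iteration; on a VM with escape analysis that allocation may be optimized away entirely, while on an older VM it costs a GC-visible object each time. The hand-optimized version avoids objects altogether, trading clarity for behavior that no future VM can take away:

```java
// Illustrative: the same computation written "clean" and "allocation-free".
public class EscapeDemo {
    static final class Point {
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    // Clean style: a short-lived Point per iteration. It never escapes
    // the loop body, so an escape-analyzing VM may scalar-replace it,
    // making this just as cheap as the manual version -- or not.
    static long sumClean(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) {
            Point p = new Point(i, i + 1);
            sum += p.x + p.y;
        }
        return sum;
    }

    // Hand-optimized style for allocation-hostile VMs: no objects at all.
    static long sumManual(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) {
            sum += i + (i + 1);
        }
        return sum;
    }

    public static void main(String[] args) {
        // Both give the same answer; only their cost profile differs by VM.
        System.out.println(sumClean(1000) == sumManual(1000)); // true
    }
}
```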