Say I have a model with over 2373 vertices.
I would absolutely love to store my vertices in a linked list, since a lot of model files don't state their vertex count up front.
Anyway, so far my solution is inelegant and annoying.
In my main loop I do this...
if(this.arrayIdx == this.tmpArray.length)
...because I don't want to do over 2373 casts per primitive type, and most models use over 6 types of primitives.
6 * 2373 = 14238 casts if I use an ArrayList or a LinkedList (preferred in this case).
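For what it's worth, the inelegant pattern can at least be wrapped up in one place. Here is a minimal sketch of a growable primitive buffer that avoids boxing entirely (the class name IntList, the initial capacity, and the doubling growth factor are my own choices, not anything from the question):

```java
import java.util.Arrays;

// Hypothetical grow-on-demand buffer for primitive ints: the same
// idea as the if (arrayIdx == tmpArray.length) check in the main
// loop, but with no boxing and no casts on read.
public final class IntList {
    private int[] data = new int[16];
    private int size = 0;

    public void add(int value) {
        // Double the backing array when it fills up.
        if (size == data.length) {
            data = Arrays.copyOf(data, data.length * 2);
        }
        data[size++] = value;
    }

    public int get(int index) {
        return data[index]; // plain int, no unboxing
    }

    public int size() {
        return size;
    }
}
```

This trades the linked list's "no size needed" property for amortized O(1) appends, which is usually the better deal for vertex data anyway.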
So let's say I have 64 different entities I need to cache, all of them highly detailed (over 2373 vertices); now we have 14238 * 64 casts, which makes 911232 casts.
When will primitives stay primitives at runtime, instead of being unboxed as (int)((Integer)object)?
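To make the mechanics concrete: with erasure, a List&lt;Integer&gt; stores Objects, and every read compiles to a checked cast plus an unboxing call. A small illustration:

```java
import java.util.ArrayList;
import java.util.List;

public class UnboxDemo {
    public static void main(String[] args) {
        List<Integer> verts = new ArrayList<>();
        verts.add(42); // autoboxing: compiles to Integer.valueOf(42)

        // After erasure, get(0) effectively returns Object; the
        // compiler inserts a cast to Integer, and the assignment
        // then unboxes via intValue():
        int v = verts.get(0); // ~ ((Integer) verts.get(0)).intValue()
        System.out.println(v); // prints 42
    }
}
```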
Since each retrieval is really two casts, the actual amount of casting done is double:
911232 * 2 = 1822464.
Surely escape analysis can fix this?
If not, when will Sun fix this?
I'm not asking for the complete removal of type erasure in this thread, just for it to be dropped when dealing with Java's built-in primitives, because it is highly embarrassing for Sun not to optimise for their own primitives within collections.