One of my biggest bugbears with game rendering is aliasing: Shimmering pixels, crawling pixels, 'jaggies', etc. It just looks *wrong*.
I've been thinking a lot about how to solve this, and my current conclusion is that aliasing is essentially a static 'distortion' of the intended image, caused by not having enough bandwidth, where that distortion really ought to have manifested as noise instead.
I've tried to write the idea down in a lot more detail in this document. I'm no great writer, so if anything is unclear or is just plain wrong, let me know.
I've also made a small proof of concept, which simulates rasterisation issues by scaling a high-resolution image down to a much lower resolution (16 times fewer pixels).
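To make the 'static distortion vs. noise' idea concrete, here is a minimal toy sketch of my own (not code from the actual test program): downscaling a striped 1-D 'image' with a fixed point sample per block aliases the stripes into a constant, while jittering the sample position turns that same error into zero-mean noise that averages out over frames.

```java
import java.util.Random;

// Toy illustration: point sampling produces a fixed (aliased) error,
// jittered sampling produces a random error with the correct mean.
public class NoisyDownscale {

    // Point sampling: always take the first sample of each block.
    // A high-frequency stripe pattern aliases into a constant.
    static double[] pointDownscale(double[] src, int factor) {
        double[] dst = new double[src.length / factor];
        for (int i = 0; i < dst.length; i++) {
            dst[i] = src[i * factor];
        }
        return dst;
    }

    // Jittered sampling: pick a random sample inside each block.
    // The per-pixel error is now random, so over many frames it
    // averages to the block mean instead of a fixed wrong value.
    static double[] jitteredDownscale(double[] src, int factor, Random rng) {
        double[] dst = new double[src.length / factor];
        for (int i = 0; i < dst.length; i++) {
            dst[i] = src[i * factor + rng.nextInt(factor)];
        }
        return dst;
    }

    public static void main(String[] args) {
        // A 1-pixel-wide stripe pattern: 1,0,0,0,1,0,0,0,...
        double[] src = new double[64];
        for (int i = 0; i < src.length; i += 4) src[i] = 1.0;

        // Point sampling aliases the stripes into solid white.
        System.out.println("point[0] = " + pointDownscale(src, 4)[0]);

        // Averaging many jittered frames approaches the true mean 0.25.
        Random rng = new Random(42);
        double sum = 0;
        int frames = 10000;
        for (int f = 0; f < frames; f++) {
            sum += jitteredDownscale(src, 4, rng)[0];
        }
        System.out.println("jittered mean ~ " + (sum / frames));
    }
}
```

The downscale factor 4 here corresponds to the '16 times fewer pixels' of the test program (4x in each dimension), just flattened to one dimension for brevity.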
You can download this little test program here (executable jar), with the sources here. The test program was quickly hacked together, so by no means is it a nicely designed thing; it's just a proof of concept.
To use the program:
* Start the executable jar
* You see a downscaled image showing lots of aliasing artefacts. I've drawn a white circle around a particularly problematic area.
* Click on the image to make key events register.
* There are 7 rendering modes:
  1. The default rendering with no anti-aliasing.
  2. Added 'noisy rasterisation' (see document), halved colour depth, added noise. This mode already sort of 'solves' aliasing at half the bandwidth (because of the reduced colour depth), but the image is very grainy.
  3. Traditional 2xMSAA. It's already better than the default no-AA rendering mode, but there are still aliasing issues.
  4. Added 2x 'noisy' MSAA, and reduced overall noise while maintaining the halved colour depth. This should be approximately the same bandwidth as no-AA (the default mode).
  5. Traditional 4xMSAA. Better than 2xMSAA, but the circled part still shows aliasing issues.
  6. Added 4x 'noisy' MSAA, and reduced overall noise while maintaining the halved colour depth. This should be approximately the same bandwidth as traditional 2xMSAA.
  7. Added 4x 'noisy' MSAA, and reduced overall noise but using full 24-bit colour depth. This should be approximately the same bandwidth as traditional 4xMSAA. Compared to traditional 4xMSAA, the problematic circled area renders correctly, at the cost of noise.
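The 'halved colour depth plus noise' trade-off behind the noisy modes can be sketched in isolation (again a toy example of my own, not code from the jar): plain quantisation to fewer bits produces a fixed banding error, while adding one quantisation step of uniform noise before rounding makes the error zero-mean, so it averages out over frames.

```java
import java.util.Random;

// Toy illustration of dithered quantisation: quantising an 8-bit
// channel to 4 bits normally causes a fixed error (banding); adding
// random noise before rounding turns that error into zero-mean noise.
public class NoisyQuantise {

    // Plain quantisation to 'bits' bits: the error is deterministic.
    static int quantise(int value8, int bits) {
        int levels = (1 << bits) - 1;
        int q = (int) Math.round(value8 * levels / 255.0);
        return (int) Math.round(q * 255.0 / levels);
    }

    // Dithered quantisation: add uniform noise of one quantisation
    // step before rounding, making the error random instead of fixed.
    static int noisyQuantise(int value8, int bits, Random rng) {
        int levels = (1 << bits) - 1;
        double step = 255.0 / levels;
        double noisy = value8 + (rng.nextDouble() - 0.5) * step;
        int q = (int) Math.round(Math.max(0.0, Math.min(levels, noisy * levels / 255.0)));
        return (int) Math.round(q * 255.0 / levels);
    }

    public static void main(String[] args) {
        int v = 100; // a mid-grey value that 4 bits cannot represent exactly
        System.out.println("plain 4-bit: " + quantise(v, 4));

        // The dithered version flickers between neighbouring levels,
        // but its average over many frames recovers the original value.
        Random rng = new Random(7);
        double sum = 0;
        int frames = 20000;
        for (int f = 0; f < frames; f++) sum += noisyQuantise(v, 4, rng);
        System.out.println("dithered mean ~ " + (sum / frames));
    }
}
```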
* Press 'n' to toggle the added layer of noise (but not the 'noisy rasterisation') to see what the effect is.
Perhaps I've just reinvented the wheel somewhere, and I'm certainly no expert in the finer details of 3D rendering, but I'm quite enthusiastic about the results.
Essentially it sort of 'solves' aliasing by replacing it with 'natural' noise, where noise *should* be happening in real life.
Of course it's of limited use because in many cases one would actually prefer aliasing over noise (nobody wants a noisy Mario), but for naturalistic scenes I think it's quite effective.
What I would be very interested in is an OpenGL implementation of this. Is this even possible?
Adding a simple layer of noise is easy enough, but I wouldn't know how to add noise in the rasterisation process (essentially randomising aliasing artefacts).
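One speculative route (an assumption on my part, not something the test program does): rather than modifying the rasteriser itself, jitter the projection matrix by a random sub-pixel offset each frame, the classic accumulation-buffer trick, so edge coverage decisions get re-rolled per frame. A sketch of the offset computation (method names are my own, hypothetical):

```java
import java.util.Random;

// Speculative sketch: compute a per-frame sub-pixel jitter to apply
// as a translation to the projection matrix, so that rasterisation
// coverage varies randomly from frame to frame.
public class ProjectionJitter {

    // Returns {dx, dy}: a random offset of at most half a pixel in
    // each axis, expressed in normalised device coordinates
    // (NDC spans 2 units across the viewport).
    static double[] subPixelJitterNdc(int widthPx, int heightPx, Random rng) {
        double dx = (rng.nextDouble() - 0.5) * 2.0 / widthPx;
        double dy = (rng.nextDouble() - 0.5) * 2.0 / heightPx;
        return new double[] { dx, dy };
    }

    public static void main(String[] args) {
        double[] j = subPixelJitterNdc(1920, 1080, new Random());
        System.out.println("jitter NDC: " + j[0] + ", " + j[1]);
        // This offset would be multiplied into the projection matrix
        // as a translation before rendering each frame.
    }
}
```

Whether this is close enough to true 'noisy rasterisation' (randomising per-pixel coverage rather than the whole frame's sample grid) is exactly the kind of thing I'd like input on.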
Any other thoughts?