I have been working on a simple 2D side-scroller in Java/LWJGL and decided early on to avoid the deprecated legacy functions; instead I have set up my own model, view, and projection matrices in a "Matrices" class, where they can be initialized and manipulated independently of one another. While building the side-scroller engine I quickly realized that I need a method that takes mouse "screen" coordinates and converts them to "world" coordinates (forgive me if this terminology is incorrect).

I created a method in my Matrices class for this task:

    public Vector2f getScreentoWorldCoordinates(Vector2f screenCoordinates) {
        viewProjectionMatrix = new Matrix4f();
        Matrix4f.mul(viewMatrix, projectionMatrix, viewProjectionMatrix);
        viewProjectionMatrix.invert();

        float xIn = (screenCoordinates.x / screenWidth) * 2.0f - 1.0f;
        float yIn = (screenCoordinates.y / screenHeight) * 2.0f - 1.0f;
        float zIn = 0.5f;
        float wIn = 1.0f;

        Vector4f in = new Vector4f(xIn, yIn, zIn, wIn);
        Vector4f out = new Vector4f();
        Matrix4f.transform(viewProjectionMatrix, in, out);

        return new Vector2f(out.x, out.y);
    }

It returns a Vector2f that does this perfectly when the view matrix is at (0, 0), but when the view moves away from zero the returned position becomes distorted. For example, if I leave the pointer in the center of the screen and track the view matrix coordinates, when the camera is at (10, 10) my world coordinates say they are at (10.26, 5.77). The calculation still returns good results relative to the distorted position (meaning that an object 0.5 units wide will measure exactly that if points are taken with the mouse cursor's world coordinates from the method above), so I know I didn't get it entirely wrong. I have tried to figure out where I went wrong for a few hours now, and I'm stumped.

Could any of you inform me as to my folly?

I realize that I could just calculate the difference between the distorted point and the actual view location and apply offsets to out.x and out.y to get good world coordinates, but I'd much rather figure out why my method didn't work and learn from the error.
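To make this easier to reproduce outside my engine, here is a stripped-down version of the math in plain Java (the float[16] helpers below are my own, not LWJGL calls, and the orthographic scale values are made up for illustration). It compares my multiplication order, inverse(view * projection), against the inverse(projection * view) order I've seen in unproject references, unprojecting the screen center with the camera at (10, 10):

```java
// Stripped-down check of the unprojection order, using plain row-major
// float[16] matrices instead of LWJGL's Matrix4f. The mul/transform/identity
// helpers below are mine, not LWJGL calls; the projection scales are
// arbitrary values chosen for illustration.
public class UnprojectOrderCheck {

    // 4x4 row-major matrix product: returns a * b
    static float[] mul(float[] a, float[] b) {
        float[] d = new float[16];
        for (int r = 0; r < 4; r++)
            for (int c = 0; c < 4; c++)
                for (int k = 0; k < 4; k++)
                    d[r * 4 + c] += a[r * 4 + k] * b[k * 4 + c];
        return d;
    }

    // m * (x, y, z, w) with v treated as a column vector
    static float[] transform(float[] m, float[] v) {
        float[] d = new float[4];
        for (int r = 0; r < 4; r++)
            for (int k = 0; k < 4; k++)
                d[r] += m[r * 4 + k] * v[k];
        return d;
    }

    static float[] identity() {
        return new float[] {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1};
    }

    public static void main(String[] args) {
        // Orthographic-style projection: scale x by 0.1 and y by 0.2,
        // so its inverse scales by 10 and 5.
        float[] projInv = identity();
        projInv[0] = 10f;   // 1 / 0.1
        projInv[5] = 5f;    // 1 / 0.2

        // View matrix for a camera at (10, 10) translates the world by
        // (-10, -10), so its inverse translates by (+10, +10).
        float[] viewInv = identity();
        viewInv[3] = 10f;
        viewInv[7] = 10f;

        // Screen center in normalized device coordinates
        float[] ndcCentre = {0f, 0f, 0f, 1f};

        // Reference order: inverse(projection * view) = viewInv * projInv
        float[] good = transform(mul(viewInv, projInv), ndcCentre);
        // My method's order: inverse(view * projection) = projInv * viewInv
        float[] bad = transform(mul(projInv, viewInv), ndcCentre);

        System.out.printf("reference order: (%.1f, %.1f)%n", good[0], good[1]);
        // -> reference order: (10.0, 10.0)
        System.out.printf("my order:        (%.1f, %.1f)%n", bad[0], bad[1]);
        // -> my order:        (100.0, 50.0)
    }
}
```

With the inverse(projection * view) order, the screen center unprojects exactly to the camera position (10, 10); with my order, the camera offset gets pushed through the inverse projection scale, which looks a lot like the distortion I'm seeing, so maybe the argument order in Matrix4f.mul is where I should be looking.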

Edit:

I just "solved" the problem by multiplying out.x by the xScale value at position (0, 0) of the projection matrix, and likewise out.y by the yScale value. I still feel like I am doing this wrong and shouldn't have needed this additional step.

Edit 2:

Never mind; this fixed the distorted point but broke the x, y values when the cursor moves away from the center. It inverted my problem.