I am using OpenGL for 2D rendering and would like to use actual pixel coordinates. By this, I mean that I would like (0,0) to be in the top left of the window, and (width,height) to be in the bottom right of the window (where width and height are the window's dimensions in pixels). To do this, I use a projection matrix which is generated with glOrtho, and then passed to a vertex shader:

```java
GL11.glViewport(0, 0, width, height);
GL11.glDisable(GL11.GL_DEPTH_TEST);
GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glOrtho(0f, width, height, 0f, -1f, 1f);
GL11.glGetFloat(GL11.GL_PROJECTION_MATRIX, projectionBuffer);
GL11.glLoadIdentity();
```

I am using LWJGL, which does not have bindings for glm, so I obtain a 2D orthographic matrix using the OpenGL calls above. I then reset the projection matrix with glLoadIdentity so it does not affect my later draw calls. After this, projectionBuffer (a FloatBuffer) holds the projection matrix generated by glOrtho.

The projection matrix produced looks like this (I don't know if this is helpful):

```
0.0015625   0.0            0.0   1.0
0.0        -0.0027777778   0.0  -1.0
0.0         0.0           -1.0   0.0
0.0         0.0            0.0   1.0
```
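Since LWJGL has no glm bindings, the same matrix can also be built directly on the CPU without touching GL state. A minimal sketch (the class name `Ortho2D` is mine, not part of the original code) of what glOrtho(0, width, height, 0, -1, 1) produces, in the column-major order a FloatBuffer would hold; the values printed above are consistent with a 1280x720 window:

```java
public class Ortho2D {
    // Returns a 16-element column-major matrix, equivalent to what
    // glOrtho(left, right, bottom, top, near, far) multiplies onto the stack.
    public static float[] ortho(float left, float right, float bottom, float top,
                                float near, float far) {
        float[] m = new float[16];
        m[0]  =  2f / (right - left);             // x scale
        m[5]  =  2f / (top - bottom);             // y scale (negative for a flipped y axis)
        m[10] = -2f / (far - near);               // z scale
        m[12] = -(right + left) / (right - left); // x translation
        m[13] = -(top + bottom) / (top - bottom); // y translation
        m[14] = -(far + near)   / (far - near);   // z translation
        m[15] =  1f;
        return m;
    }

    public static void main(String[] args) {
        float[] m = ortho(0f, 1280f, 720f, 0f, -1f, 1f);
        System.out.println(m[0]);   // 0.0015625  (= 2 / 1280)
        System.out.println(m[5]);   // -0.0027777778  (= 2 / -720)
        System.out.println(m[13]);  // 1.0
    }
}
```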

My vertex shader looks like this:

```glsl
#version 330 core
layout (location = 0) in vec4 vertex;

out vec2 TexCoords;

uniform mat4 model;
uniform mat4 projection = mat4(1.0);

void main()
{
    TexCoords = vertex.zw;
    gl_Position = projection * model * vec4(vertex.x, vertex.y, 0.0, 1.0);
}
```

When I initialise the shader, I use glUniformMatrix4 to set the projection matrix's value. I'm certain this succeeds because when I call glGetUniform afterwards, it returns the same projection matrix.
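The uploaded matrix can also be sanity-checked off-GPU by multiplying it against known pixel coordinates and confirming the expected NDC results. A sketch, assuming a 1280x720 window and the column-major layout that glGetFloat returns (the class name is illustrative):

```java
public class ProjectionCheck {
    // Multiply a column-major 4x4 matrix by the column vector (x, y, z, w).
    public static float[] transform(float[] m, float x, float y, float z, float w) {
        return new float[] {
            m[0]*x + m[4]*y + m[8]*z  + m[12]*w,
            m[1]*x + m[5]*y + m[9]*z  + m[13]*w,
            m[2]*x + m[6]*y + m[10]*z + m[14]*w,
            m[3]*x + m[7]*y + m[11]*z + m[15]*w
        };
    }

    public static void main(String[] args) {
        // Column-major ortho matrix for glOrtho(0, 1280, 720, 0, -1, 1).
        float[] proj = {
            2f/1280f,  0,        0,  0,
            0,        -2f/720f,  0,  0,
            0,         0,       -1,  0,
           -1,         1,        0,  1
        };
        // The top-left pixel (0,0) should land at NDC (-1, 1),
        // and the bottom-right pixel (1280,720) near NDC (1, -1).
        float[] tl = transform(proj, 0, 0, 0, 1);
        float[] br = transform(proj, 1280, 720, 0, 1);
        System.out.println(tl[0] + ", " + tl[1]);  // -1.0, 1.0
        System.out.println(br[0] + ", " + br[1]);  // approximately 1.0, -1.0
    }
}
```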

A model matrix is produced for every textured quad that is drawn. Each quad shares the same vertices and UVs, which are all stored in a single VBO within the same VAO. The vertex data is shared between the quads, with a different model matrix applied to each. The model matrix is calculated correctly to produce real pixel coordinates. For example, a square with its top-left corner at (0,0) and a width/height of 128 would produce the following model matrix:

```
128.0   0.0   0.0   0.0
  0.0 128.0   0.0   0.0
  0.0   0.0   1.0   0.0
  0.0   0.0   0.0   1.0
```
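For reference, a model matrix of this form amounts to translate(x, y) * scale(w, h) applied to a unit quad. A sketch of building it on the CPU (the class and method names are mine, not the asker's actual code):

```java
public class Model2D {
    // Column-major model matrix for a quad at (x, y) with size (w, h):
    // equivalent to translate(x, y) * scale(w, h, 1).
    public static float[] model(float x, float y, float w, float h) {
        float[] m = new float[16];
        m[0]  = w;   // x scale
        m[5]  = h;   // y scale
        m[10] = 1f;
        m[12] = x;   // x translation
        m[13] = y;   // y translation
        m[15] = 1f;
        return m;
    }

    public static void main(String[] args) {
        // A quad at (0,0) with width/height 128 reproduces the matrix above.
        float[] m = model(0f, 0f, 128f, 128f);
        System.out.println(m[0] + " " + m[5] + " " + m[12]);  // 128.0 128.0 0.0
    }
}
```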

This model matrix is passed to the shader successfully using glUniformMatrix4 and I have checked this.

To initialise the shared VAO with the quad's vertex data, I use the following code:

```java
vao = GL30.glGenVertexArrays();
int vbo = GL15.glGenBuffers();
GL30.glBindVertexArray(vao);
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vbo);
GL15.glBufferData(GL15.GL_ARRAY_BUFFER, QUAD_BUFFER, GL15.GL_STATIC_DRAW);
GL20.glEnableVertexAttribArray(0);
GL20.glVertexAttribPointer(0, 4, GL11.GL_FLOAT, false, 16, 0);
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
GL30.glBindVertexArray(0);
```

QUAD_BUFFER refers to a float[] containing the vertex and texture coordinate data.
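The exact contents of QUAD_BUFFER are not shown; for a unit quad with interleaved position/UV data matching a 4-component attribute and 16-byte stride, it would plausibly look like this (a hypothetical sketch, not the asker's actual data):

```java
public class QuadData {
    // Two triangles covering the unit quad, interleaved as x, y, u, v.
    // The shader reads positions from vertex.xy and UVs from vertex.zw,
    // so each vertex is 4 floats = 16 bytes, matching the attribute stride.
    public static final float[] QUAD_BUFFER = {
        // x   y   u   v
        0f, 1f, 0f, 1f,
        1f, 0f, 1f, 0f,
        0f, 0f, 0f, 0f,

        0f, 1f, 0f, 1f,
        1f, 1f, 1f, 1f,
        1f, 0f, 1f, 0f
    };

    public static void main(String[] args) {
        // 6 vertices for glDrawArrays(GL_TRIANGLES, 0, 6).
        System.out.println(QUAD_BUFFER.length / 4);  // 6
    }
}
```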

Finally, to draw a textured quad, I use the following:

```java
shader.setMatrix4f("model", model);
GL13.glActiveTexture(GL13.GL_TEXTURE0);
texture.bind();
GL30.glBindVertexArray(vao);
GL11.glDrawArrays(GL11.GL_TRIANGLES, 0, 6);
```

The problem is that when I run the application, nothing is drawn; the window remains completely black. When not using the shader, I can draw shapes using the old immediate-mode method (glBegin etc.), so the window itself works. I cannot figure out what I'm doing wrong. I suspect it's something to do with the projection matrix placing the vertices outside the window.
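That suspicion can be tested off-GPU by running a quad corner through projection * model by hand and checking whether the result stays inside the [-1, 1] clip volume. A sketch, assuming a 1280x720 window, column-major matrices, and the unit-quad corner (1, 1) of a 128x128 quad at the top-left:

```java
public class ClipCheck {
    // Column-major mat4 * vec4.
    public static float[] mul(float[] m, float[] v) {
        float[] r = new float[4];
        for (int i = 0; i < 4; i++)
            r[i] = m[i]*v[0] + m[4+i]*v[1] + m[8+i]*v[2] + m[12+i]*v[3];
        return r;
    }

    public static void main(String[] args) {
        // projection: glOrtho(0, 1280, 720, 0, -1, 1), column-major.
        float[] proj  = { 2f/1280f,0,0,0,  0,-2f/720f,0,0,  0,0,-1,0,  -1,1,0,1 };
        // model: 128x128 quad at the top-left, as printed earlier.
        float[] model = { 128,0,0,0,  0,128,0,0,  0,0,1,0,  0,0,0,1 };

        // gl_Position for the quad corner (1, 1) of a unit quad.
        float[] world = mul(model, new float[]{1, 1, 0, 1});
        float[] clip  = mul(proj, world);

        // If either coordinate falls outside [-1, 1], that vertex is clipped.
        boolean onScreen = Math.abs(clip[0]) <= 1f && Math.abs(clip[1]) <= 1f;
        System.out.println(clip[0] + ", " + clip[1] + " onScreen=" + onScreen);
        // Expected roughly -0.8, 0.644: well inside the clip volume,
        // so with these matrices the quad should be visible.
    }
}
```

If the CPU-side result is on-screen but the GPU draws nothing, the mismatch is somewhere between this math and what the shader actually receives (for example, the transpose flag passed to glUniformMatrix4).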