[LibGDX] Deferred Lighting
Offline Cyraxx
« Posted 2015-08-28 08:55:58 »

Hello JGO!

I'm trying to implement deferred lighting for the first time following this tutorial: http://learnopengl.com/#!Advanced-Lighting/Deferred-Shading.
I was wondering if there is an integrated way in LibGDX to use multiple render targets. I saw that the FrameBuffer class only contains one ColorAttachment and you can't add more via in-built methods.
Should I use a SpriteBatch to render to the gBuffer or should I use meshes?
Should I just use the LWJGL classes directly instead of the built-in LibGDX ones, or how should I go about it?

Basically my question is how to implement Deferred Lighting in LibGDX.

Thanks in advance for any help!
Offline CoDi^R
« Reply #1 - Posted 2015-08-28 11:57:59 »

I was wondering if there is an integrated way in LibGDX to use multiple render targets. I saw that the FrameBuffer class only contains one ColorAttachment and you can't add more via in-built methods.

There isn't, I'm afraid. For our project I've copied and modified FrameBuffer to allow for multiple color attachments, alternate buffer formats, and an optional stencil buffer.
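
To give you an idea, here is a rough sketch of what such a class has to do under the hood, using only the GL calls libGDX already exposes. This is not our actual code, just an illustration; the class name and the RGBA16F/nearest-filter choices are placeholders:

import java.nio.IntBuffer;

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.GL30;
import com.badlogic.gdx.utils.BufferUtils;

public class MultiTargetFrameBuffer {
    public final int fboHandle;
    public final int[] colorTextures;
    public final int depthTexture;

    public MultiTargetFrameBuffer(int width, int height, int colorAttachmentCount) {
        fboHandle = Gdx.gl.glGenFramebuffer();
        Gdx.gl.glBindFramebuffer(GL20.GL_FRAMEBUFFER, fboHandle);

        // One RGBA16F texture per color attachment.
        colorTextures = new int[colorAttachmentCount];
        IntBuffer drawBuffers = BufferUtils.newIntBuffer(colorAttachmentCount);
        for (int i = 0; i < colorAttachmentCount; i++) {
            colorTextures[i] = Gdx.gl.glGenTexture();
            Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, colorTextures[i]);
            Gdx.gl.glTexParameteri(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MIN_FILTER, GL20.GL_NEAREST);
            Gdx.gl.glTexParameteri(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MAG_FILTER, GL20.GL_NEAREST);
            Gdx.gl.glTexImage2D(GL20.GL_TEXTURE_2D, 0, GL30.GL_RGBA16F, width, height, 0,
                    GL20.GL_RGBA, GL30.GL_HALF_FLOAT, null);
            Gdx.gl.glFramebufferTexture2D(GL20.GL_FRAMEBUFFER, GL30.GL_COLOR_ATTACHMENT0 + i,
                    GL20.GL_TEXTURE_2D, colorTextures[i], 0);
            drawBuffers.put(GL30.GL_COLOR_ATTACHMENT0 + i);
        }
        drawBuffers.flip();
        // This is the actual MRT part: route the fragment shader outputs to all attachments.
        Gdx.gl30.glDrawBuffers(colorAttachmentCount, drawBuffers);

        // A depth *texture* (not a renderbuffer) so the lighting pass can sample it later.
        depthTexture = Gdx.gl.glGenTexture();
        Gdx.gl.glBindTexture(GL20.GL_TEXTURE_2D, depthTexture);
        Gdx.gl.glTexParameteri(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MIN_FILTER, GL20.GL_NEAREST);
        Gdx.gl.glTexParameteri(GL20.GL_TEXTURE_2D, GL20.GL_TEXTURE_MAG_FILTER, GL20.GL_NEAREST);
        Gdx.gl.glTexImage2D(GL20.GL_TEXTURE_2D, 0, GL30.GL_DEPTH_COMPONENT24, width, height, 0,
                GL20.GL_DEPTH_COMPONENT, GL20.GL_UNSIGNED_INT, null);
        Gdx.gl.glFramebufferTexture2D(GL20.GL_FRAMEBUFFER, GL20.GL_DEPTH_ATTACHMENT,
                GL20.GL_TEXTURE_2D, depthTexture, 0);

        if (Gdx.gl.glCheckFramebufferStatus(GL20.GL_FRAMEBUFFER) != GL20.GL_FRAMEBUFFER_COMPLETE) {
            throw new IllegalStateException("Multi-target FBO is not complete");
        }
        Gdx.gl.glBindFramebuffer(GL20.GL_FRAMEBUFFER, 0);
    }
}

You then bind colorTextures[i] and depthTexture like ordinary textures in the lighting pass.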

Should I use a SpriteBatch to render to the gBuffer or should I use meshes?

Doesn't matter, both work. You just need to be prepared to write and use custom shaders.
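
For the SpriteBatch route, it mostly comes down to swapping in your own shader. A minimal sketch (the shader file names are placeholders, not something that ships with libGDX):

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.graphics.glutils.ShaderProgram;

public class GBufferPass {
    private final SpriteBatch batch = new SpriteBatch();
    private final ShaderProgram gBufferShader;

    public GBufferPass() {
        ShaderProgram.pedantic = false; // don't fail on unused uniforms/attributes
        gBufferShader = new ShaderProgram(
                Gdx.files.internal("shaders/gbuffer.vert"),
                Gdx.files.internal("shaders/gbuffer.frag"));
        if (!gBufferShader.isCompiled()) {
            throw new IllegalStateException(gBufferShader.getLog());
        }
    }

    public void render() {
        batch.setShader(gBufferShader); // every draw call now goes through the custom shader
        batch.begin();
        // ... bind normal/specular maps to extra texture units and draw the diffuse sprites ...
        batch.end();
        batch.setShader(null);          // restore the default SpriteBatch shader
    }
}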

Should I just use the LWJGL classes directly instead of the built-in LibGDX ones, or how should I go about it?

The libGDX classes provide (almost) everything you need. There are only a few exceptions, e.g. OpenGL functions and constants not exposed in the GL20/GL30 interfaces because they are not supported by GLES.
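
Just as an illustration (nothing deferred-shading-specific): glPolygonMode is desktop-only, so for something like wireframe debugging you call LWJGL directly:

import org.lwjgl.opengl.GL11;

// glPolygonMode doesn't exist in GLES, so it isn't in libGDX's GL20/GL30 interfaces.
// On the desktop backend you can call the LWJGL class directly:
GL11.glPolygonMode(GL11.GL_FRONT_AND_BACK, GL11.GL_LINE); // wireframe for debugging
// ... render debug geometry ...
GL11.glPolygonMode(GL11.GL_FRONT_AND_BACK, GL11.GL_FILL); // back to filled polygons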

Basically my question is how to implement Deferred Lighting in LibGDX.

It's just a matter of setting up multiple render targets and shaders. Oh, and GL states. I found it relatively straightforward if you target desktop. Despite GL states. They suck.

The nice thing about libGDX is that it moves out of your way if you say so, and lets you work with GL functions directly.

Please note that the useGL30 path has been updated with the latest release. It wasn't really usable before that.

Offline theagentd
« Reply #2 - Posted 2015-08-28 23:04:39 »

That article looks extremely outdated. No one has ever stored the position in a texture. You always reconstruct the view space position from the depth buffer.

Offline Longarmx
« Reply #3 - Posted 2015-08-28 23:11:41 »

That article looks extremely outdated. No one has ever stored the position in a texture. You always reconstruct the view space position from the depth buffer.

Could you recommend a better tutorial? I have been trying to get deferred rendering working (with LWJGL) for weeks, using this tutorial as my main guide.

Offline theagentd
« Reply #4 - Posted 2015-08-29 02:51:55 »

I didn't use a tutorial when I implemented it. Just reading about the concept was enough for me. What exactly are you having trouble with?

Offline Cyraxx
« Reply #5 - Posted 2015-08-29 08:37:23 »

I didn't use a tutorial when I implemented it. Just reading about the concept was enough for me. What exactly are you having trouble with?

Well, how do you reconstruct the position from depth if you are using an orthographic projection?
Offline theagentd
« Reply #6 - Posted 2015-08-29 14:56:38 »

It doesn't matter what kind of projection you're using; you can easily reconstruct the view space position anyway. The idea is to upload the inverse of the projection matrix to the shader, reconstruct the NDC (normalized device coordinates) of the pixel, and "unproject" it with that inverse matrix, so it works with any kind of projection matrix.

NDC coordinates are coordinates that go from -1 to +1 in all 3 axes. When you multiply the view space position by the projection matrix in the vertex shader while filling the G-buffer, you get clip-space coordinates; the GPU then divides by W to produce NDC and maps XY to the viewport and Z to the depth buffer. We can undo that projection, but first we need to gather all the data to do it.

First of all, you need the XY coordinates. These are easy to calculate: they go from (-1, -1) in the bottom left corner to (+1, +1) in the top right corner. The easiest way is to derive them from gl_FragCoord.xy, which gives you the pixel's position in pixels. Divide by the size of the screen and you have coordinates going from (0, 0) to (+1, +1); remapping that to the (-1, -1) to (+1, +1) range is easy. The Z coordinate is the depth buffer value of that pixel, but the depth value also goes from 0 to +1 and needs the same remapping. With this, we have the NDC coordinates of the pixel. Now it's just a matter of multiplying the NDC coordinates by the inverse projection matrix and dividing by the resulting W coordinate.

uniform sampler2D depthBuffer;
uniform vec2 inverseScreenResolution; // Fill with (1.0 / screen_resolution) from Java.
uniform mat4 inverseProjectionMatrix;

...

vec2 texCoords = gl_FragCoord.xy * inverseScreenResolution; // Goes from 0 to 1
float depthValue = texture(depthBuffer, texCoords).r;       // Goes from 0 to 1

vec3 ndc = vec3(texCoords, depthValue) * 2.0 - 1.0; // Remapped to -1 to +1

vec4 unprojectResult = inverseProjectionMatrix * vec4(ndc, 1.0);

vec3 viewSpacePosition = unprojectResult.xyz / unprojectResult.w;

// Use viewSpacePosition for lighting


An example G-buffer layout for deferred shading is:

COLOR_ATTACHMENT0: GL_RGBA16F: (diffuse.r, diffuse.g, diffuse.b, <unused>)
COLOR_ATTACHMENT1: GL_RGBA16F: (packedNormal.x, packedNormal.y, specularIntensity, specularExponent)
DEPTH_ATTACHMENT: GL_DEPTH_COMPONENT24: (depth)
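
On the Java/libGDX side, the uniforms in the shader snippet above might be filled roughly like this (just a sketch; lightShader, camera and the texture unit binding are placeholder names, the rest is the normal ShaderProgram/Matrix4 API):

// Feed the unprojection uniforms from Java (names match the GLSL above).
Matrix4 inverseProjection = new Matrix4(camera.projection).inv(); // inv() inverts in place

lightShader.begin();
lightShader.setUniformMatrix("inverseProjectionMatrix", inverseProjection);
lightShader.setUniformf("inverseScreenResolution",
        1f / Gdx.graphics.getWidth(), 1f / Gdx.graphics.getHeight());
lightShader.setUniformi("depthBuffer", 0); // G-buffer depth texture bound to unit 0
// ... render the light geometry here ...
lightShader.end();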


EDIT: Actually, if you're only using an orthographic projection, you don't need the W-divide, since the bottom row of an orthographic projection matrix is (0, 0, 0, 1) and W stays 1 (but it doesn't hurt to keep the divide there).
EDIT2: Also, there are lots of optimizations you can do to this. I opted to just give you the basics before diving into those. I can answer whatever questions you have about deferred shading.

Offline Cyraxx
« Reply #7 - Posted 2015-08-31 12:30:56 »

COLOR_ATTACHMENT0: GL_RGBA16F: (diffuse.r, diffuse.g, diffuse.b, <unused>)
COLOR_ATTACHMENT1: GL_RGBA16F: (packedNormal.x, packedNormal.y, specularIntensity, specularExponent)
DEPTH_ATTACHMENT: GL_DEPTH_COMPONENT24: (depth)


So if I understand it correctly, this is how I should do it:

- Bind the frame buffer.
- Bind the currently drawn object's normal map and specular texture.
- Start drawing the diffuse textures using a shader which also takes normal and specular sampler2Ds.
- Fill COLOR_ATTACHMENT0 with texture(diffuse, texCoord), where texCoord is the texture coordinate of the diffuse texture.
- Fill COLOR_ATTACHMENT1's .rgb with texture(normal, texCoord).rgb and its .a with the specular value from texture(specular, texCoord).r.
- End drawing.
- Draw the FBO's textures with a specific shader that includes all the lights.

This is how I'd do it, but as you can see I'm unsure how to draw the lights. Should I pass all the lights' positions as a uniform array, or should I just render each light one by one onto the FBO?
Also, where exactly would I use the shader code you provided (position from depth)?

I'd love it if you could elaborate on this a little.
Offline theagentd
« Reply #8 - Posted 2015-08-31 19:15:19 »

First of all, you generally have both a specular intensity (how much specular light is reflected by the surface) and a specular exponent/roughness/glossiness (how mirror-like the surface is), which affects the shape of the specular highlight. You often need to store both of them. In your case, you can store the specular exponent in the first texture's alpha component, since you only need RGB for the diffuse texture.
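
For example, with your proposed layout (an unpacked 3-component normal in the second attachment), that could look like:

COLOR_ATTACHMENT0: GL_RGBA16F: (diffuse.r, diffuse.g, diffuse.b, specularExponent)
COLOR_ATTACHMENT1: GL_RGBA16F: (normal.x, normal.y, normal.z, specularIntensity)
DEPTH_ATTACHMENT: GL_DEPTH_COMPONENT24: (depth)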

The lighting process is pretty simple but you're mistaken on a few points. You do not want to process all lights on the screen in a single fullscreen pass. The simplest and fastest way of rendering lights is to generate light geometry. For a point light, that'd be a sphere. Spot/cone lights are a cone/pyramid thing. Directional lights (sun/moon) are indeed a fullscreen pass. The idea is to only process the pixels which are inside the light volume. In the lighting pass, you write to a single GL_RGBA16F render target with additive blending to add up all the lighting results.
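
A sketch of what that light accumulation pass can look like with libGDX (lightBuffer, sphereMesh, pointLightShader, pointLights and the PointLight class are placeholders for your own objects, not libGDX API):

lightBuffer.begin();                          // your single GL_RGBA16F accumulation target
Gdx.gl.glClearColor(0f, 0f, 0f, 0f);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);

Gdx.gl.glEnable(GL20.GL_BLEND);
Gdx.gl.glBlendFunc(GL20.GL_ONE, GL20.GL_ONE); // additive: every light adds its contribution
Gdx.gl.glDepthMask(false);                    // read depth from the G-buffer, never write it

pointLightShader.begin();
pointLightShader.setUniformMatrix("u_projViewTrans", camera.combined);
for (PointLight light : pointLights) {
    pointLightShader.setUniformf("u_lightPosition", light.x, light.y, light.z);
    pointLightShader.setUniformf("u_lightRadius", light.radius);
    pointLightShader.setUniformf("u_lightColor", light.r, light.g, light.b);
    // One sphere per light: only the pixels inside the light volume get processed.
    sphereMesh.render(pointLightShader, GL20.GL_TRIANGLES);
}
pointLightShader.end();

Gdx.gl.glDepthMask(true);
Gdx.gl.glDisable(GL20.GL_BLEND);
lightBuffer.end();

You'd also bind the G-buffer textures (depth, normals, diffuse) to texture units before the loop so the light shader can sample them per pixel.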

Offline basil_
« Reply #9 - Posted 2015-08-31 21:17:58 »

I'm a bit confused. Cyraxx, do you need help with:

- MRT with OpenGL in general? (https://en.wikipedia.org/wiki/Multiple_Render_Targets)
- or the G-buffer layout?
- or how to apply the G-buffer data in a post-process step to achieve the desired shading? (https://en.wikipedia.org/wiki/Deferred_shading)