  Show Posts
1  Game Development / Newbie & Debugging Questions / Re: Eclipse Export Jar from command line on: 2014-12-17 22:55:00
Update: It's a lot easier than I thought to create a custom build script following that tutorial (http://ant.apache.org/manual/tutorial-HelloWorldWithAnt.html), rather than using eclipse to do it. Thanks guys.
I made a single build.xml that builds a test project, including bundling libraries and pulling in source from other projects, rather than having to create a separate build.xml for each project.

Thanks again Smiley

<project name="TestAntSimple" basedir="." default="main">

   <property environment="env"/>
   <property name="debuglevel" value="source,lines,vars"/>
   <property name="target" value="1.6"/>
   <property name="source" value="1.6"/>
   
   <property name="src.dir"     value="src"/>
   <property name="src2.dir"     value="../TestAntSimpleChildProject/src"/>
   
    <property name="build.dir"   value="build"/>
    <property name="classes.dir" value="${build.dir}/classes"/>
    <property name="jar.dir"     value="${build.dir}/jar"/>
    <property name="main-class"  value="testantsimple.Main"/>
   <property name="jar.name"    value="testantsimple.jar"/>
   <property name="lib.dir"     value="../Storm Libraries/"/>
   <property name="lib.includes"     value="**/*.jar"/>

    <path id="classpath">
        <fileset dir="${lib.dir}" includes="${lib.includes}"/>
    </path>
   
   <target name="clean">
        <delete dir="${build.dir}"/>
    </target>

    <target name="compile">
        <mkdir dir="${classes.dir}"/>
        <javac includeantruntime="false" destdir="${classes.dir}" classpathref="classpath">
         <src path="${src.dir}"/>
         <src path="${src2.dir}"/>
      </javac>
    </target>

    <target name="jar" depends="compile">
        <mkdir dir="${jar.dir}"/>
        <jar destfile="${jar.dir}/${jar.name}" basedir="${classes.dir}">
         <zipgroupfileset dir="${lib.dir}" includes="${lib.includes}" excludes=""/>
         
            <manifest>
                <attribute name="Main-Class" value="${main-class}"/>
            </manifest>
        </jar>
    </target>

    <target name="run" depends="jar">
        <java jar="${jar.dir}/${jar.name}" fork="true"/>
    </target>

    <target name="clean-build" depends="clean,jar"/>

    <target name="main" depends="clean,run"/>
   
</project>


2  Game Development / Newbie & Debugging Questions / Re: Eclipse Export Jar from command line on: 2014-12-17 21:05:37
I exported the ant buildfiles and it created a build.xml for each of my projects. I ran "ant" inside my client directory and it said build successful (with a bunch of warnings) but I can't find a jar file that was built / exported. Is this right? Exporting the jar using eclipse normally takes more than 2 seconds.

Well, look in the ant build file to see where it's putting the jars!

Well, it turns out the buildfile my version of the eclipse ant export produces doesn't even have the targets to build the jar file in there. I guess I'm just going to end up doing it from scratch following http://ant.apache.org/manual/tutorial-HelloWorldWithAnt.html

I am very interested in using JNDT. I am using Launch4j at the moment but it's really only a good Windows solution. I'm going to look into this further when I have time to test Mac and Linux properly. Smiley
You can go on using Launch4j under Windows until I implement MSI support. Please note that JNDT is under the GPL. If you have no plans to put your own code under the GPL, you'll have to use another tool (for example PackR). I mention it because I don't want you to waste your precious time on a tool that you can't use.
Oh, thanks for letting me know. Unfortunately I can't put my code under the GPL. Sad
Roland
3  Game Development / Newbie & Debugging Questions / Re: Eclipse Export Jar from command line on: 2014-12-17 10:33:12
Thanks both of you for the replies.  Smiley I will build ant files from scratch once I get the exported one working.

What do you mean by introducing dependencies?

Try exporting the ant build script and looking at it to see what I mean. It includes eclipse stuff that you probably don't need, but that eclipse adds for whatever reason.

I exported the ant buildfiles and it created a build.xml for each of my projects. I ran "ant" inside my client directory and it said build successful (with a bunch of warnings) but I can't find a jar file that was built / exported. Is this right? Exporting the jar using eclipse normally takes more than 2 seconds.

C:\Storm>ant

Buildfile: C:\Storm\build.xml

build-subprojects:

init:

... (repeats of below for each included project)

build-project:
     [echo] Storm: C:\[...]\build.xml
    [javac] C:\Storm\build.xml:222: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

build:

BUILD SUCCESSFUL
Total time: 2 seconds



Hi

Why not write your own Ant script from scratch? This is what I do and it has worked very well. I create a "runnable" JAR and wrap it in a self-contained native application bundle Smiley

Maybe this is the way to go. It just feels like a complicated task to do it from scratch and make it work with my setup (above). However, I'm willing to give it a shot Smiley Do you have any more information / good links on this?
Thanks
You can look at this section of my latest article. I suggest some tools, but it would be a bit harder in your case, as you use a library with no equivalent of "automated native library loading"; you'll have to manage the Java library path yourself:
http://gouessej.wordpress.com/2014/11/22/ardor3d-est-mort-vive-jogamps-ardor3d-continuation-ardor3d-is-dead-long-life-to-jogamps-ardor3d-continuation/#deployment

I use lots of third-party libraries and it works for me; it should work for you too. My build is entirely automated, from compiling to uploading the bundles (SFTP).

I am very interested in using JNDT. I am using Launch4j at the moment but it's really only a good Windows solution. I'm going to look into this further when I have time to test Mac and Linux properly. Smiley


4  Game Development / Newbie & Debugging Questions / Re: Eclipse Export Jar from command line on: 2014-12-15 21:23:45
Hi

Why not write your own Ant script from scratch? This is what I do and it has worked very well. I create a "runnable" JAR and wrap it in a self-contained native application bundle Smiley

Maybe this is the way to go. It just feels like a complicated task to do it from scratch and make it work with my setup (above). However, I'm willing to give it a shot Smiley Do you have any more information / good links on this?
Thanks
5  Game Development / Newbie & Debugging Questions / Re: Eclipse Export Jar from command line on: 2014-12-15 21:21:07
From Eclipse, go to File -> Export -> Ant -> Ant Build Script. Voila.

However, that introduces dependencies on eclipse, so building your own might be the way to go.

What kind of dependencies do you have? Just creating an ant build script from scratch would take less time than waiting on replies in a forum, imho...

Thanks for the reply Smiley
What do you mean by introducing dependencies?

I have a semi-complicated setup. My game engine, rendering, and network code are split into separate projects, plus there are separate projects for my game client, server, and updater. They require some external libraries (Rhino JS, poly2tri, Apache Commons, etc.), and the client/renderer requires LWJGL.

I need to build 3 jars
-client
-server
-updater

and it would be great if I could just run ant files for all three (I want to make this an automated process I can run from a build server)
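As a sketch of what I'm after (a top-level buildfile for the build server; directory names here are made up, and it assumes each project gets a build.xml with a clean-build target like the one further up the page):

<project name="build-all" basedir="." default="dist">
    <target name="dist">
        <ant dir="client"  target="clean-build"/>
        <ant dir="server"  target="clean-build"/>
        <ant dir="updater" target="clean-build"/>
    </target>
</project>

Then the build server only has to run a single "ant" in the top-level directory.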


Thanks again,
roland
6  Game Development / Newbie & Debugging Questions / Eclipse Export Jar from command line on: 2014-12-15 07:46:12
Hey guys, I'd like to be able to export a runnable jar from the command line rather than having to go through File->Export->Runnable Jar File->...->Finish. Can I do this with eclipse somehow?
Otherwise, is there a way to get the build script from my project and use ant to build it from the command line?

Thanks,
Roland
7  Game Development / Game Mechanics / Convert 2D position to 1D texture coord on: 2014-11-19 05:08:17
Hey, I have a map made out of polygons. Each polygon is made out of edges (lines), each containing two points (p0 and p1).
On the edges of these polygons I draw decoration using a placeholder tiling texture.


The Y texture coordinate is always either 0 or 1 because I only want to tile horizontally.


The issue I am having is to calculate a good tiling X coordinate for the texture.
I tried:

//centre = (0,0)
x1 = distance(centre, edge.p0);
x2 = distance(centre, edge.p1);


UV coordinates for my decoration quad will be:
x1,0
x2,0
x2,1
x1,1


The issue is that if I have two points in a polygon / edge that have the same distance from the centre, the texture coordinates will be the same, which means the decoration will be rendered wrongly (the horizontal texture scale will increase to infinity).

To partially fix this, I instead get the distance from the centre to one point on the edge, and then add the edge's length to get the other:

x1 = distance(centre, edge.p0);
x2 = x1 + distance(edge.p0, edge.p1);


This fixes a single edge, but with multiple edges the texture coords don't match up, because each edge's coordinate is based on the distance from one of its points to the centre of the map, and the distance between two edges isn't the same as the difference between their distances from the centre.



EDIT: I can loop around a single polygon with the above method, starting at a single point and just adding the distance from one point to the next (so x1 = x0 + distance(p0, p1), x2 = x1 + distance(p1, p2), etc.). The first and last point won't match up; they only would if the sum of the edge lengths happened to be a multiple of the tiling texture's width. I guess this is the best I can do, however if anyone has a better idea I'd be happy to hear it!
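A quick Java sketch of that cumulative walk (assuming a hypothetical Point type with a distance() method, and the outline stored as an ordered list):

// u[i] is the 1D texture coordinate at outline point i,
// accumulated from the lengths of the edges walked so far
float[] u = new float[points.size()];
u[0] = 0f;
for (int i = 1; i < points.size(); i++)
    u[i] = u[i - 1] + (float) points.get(i - 1).distance(points.get(i));
// dividing each u[i] by the texture's world-space tile width would control the repeat rate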

Thanks,
Roland
8  Java Game APIs & Engines / OpenGL Development / Re: [GLSL] Terrain multitexturing shader is too slow on: 2014-11-11 05:12:23
I still don't understand how to do this.  Cry
9  Java Game APIs & Engines / OpenGL Development / Re: [GLSL] Terrain multitexturing shader is too slow on: 2014-11-10 23:18:36
I meant textureLocations uniform array.
textureLocations is a vec4
10  Java Game APIs & Engines / OpenGL Development / Re: Proper GPU profiling with LWJGL on: 2014-11-09 21:16:51
Sorry for the late reply. This is a great tool, thanks theagentd!
Is there any way to modify this for a graphics card that only supports OpenGL 2.1?
11  Java Game APIs & Engines / OpenGL Development / Re: [GLSL] Terrain multitexturing shader is too slow on: 2014-11-09 00:42:50
How does performance go if you replace the uniform array with a 1D texture LUT? That way you get away from all the integer-based dynamic indexing.
Replace uniform vec4 tiles[MAX_MAP_TEXTURES]; with a 1D texture? And replace int tileLoc = int(textureLocations[i]*256.0); with just having the x, y, width, height in the 1D texture?

Or, instead of using the tile texture to store lookups into the array, do I somehow store the tiles in the tile texture itself and use the 1D texture to get at them from there?

I can see it somehow working and being faster, but it's just out of reach Smiley Could you give me some more details?
I want to stick with 4 textures maximum (colourmap, virtual texture, tilemap and mixmap) although the alpha channel of the colourmap isn't being used.
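To make the suggestion concrete, here's a rough LWJGL sketch of what I imagine (untested; GL_RGBA32F_ARB assumes the ARB_texture_float extension is available, and numTiles is just a placeholder):

// pack each tile's {x, y, width, height} into one texel of a 1D float texture
FloatBuffer data = BufferUtils.createFloatBuffer(numTiles * 4);
for (int i = 0; i < numTiles; i++)
    data.put(tiles[i]); // tiles[i] is a float[4] on the Java side
data.flip();

int lutID = GL11.glGenTextures();
GL11.glBindTexture(GL11.GL_TEXTURE_1D, lutID);
GL11.glTexParameteri(GL11.GL_TEXTURE_1D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_NEAREST);
GL11.glTexParameteri(GL11.GL_TEXTURE_1D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_NEAREST);
GL11.glTexImage1D(GL11.GL_TEXTURE_1D, 0, ARBTextureFloat.GL_RGBA32F_ARB,
      numTiles, 0, GL11.GL_RGBA, GL11.GL_FLOAT, data);

The fragment shader would then fetch the tile rectangle with something like vec4 tile = texture1D(tileLUT, (textureLocations[i]*256.0 + 0.5) / float(numTiles)); instead of indexing tiles[].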
Thanks
12  Java Game APIs & Engines / OpenGL Development / Re: [GLSL] Terrain multitexturing shader is too slow on: 2014-11-04 08:56:06
Cool! Happy to help.

I don't remember all the details and can't find the article (branching vs discard, or max sampler2Ds? idk); anyway, apparently frag shaders hate dynamic indexing.
That's ok. I've definitely learned my lesson anyway Smiley
Thanks again
13  Java Game APIs & Engines / OpenGL Development / Re: [GLSL] Terrain multitexturing shader is too slow on: 2014-11-04 08:16:20
Try unwinding your loop and see if that helps performance, I'll look for an article I read that cautions you to stay away from things like:
int(textureLocations[i]*256.0);


OMG. Thanks so much thedanisaur!

I thought it was just when the loop amount was variable that it caused a big slowdown.
Eg. for (int i = 0; i < someVariable; i++).

I removed the loop and set i to 0 like so:

int i = 0;
//for(int i=0;i<4;i++)
{
   int tileLoc = int(textureLocations[i]*256.0);
     
   vec4 tile = tiles[tileLoc];
   float minTextureSize = min(tile.p, tile.q);
   vec2 wrapped = vec2(tile.s + abs(mod(worldUV.x * minTextureSize, tile.p)), tile.t + abs(mod(worldUV.y * minTextureSize, tile.q)));
   textureColours[i] = texture2D(virtualTexture, wrapped);
     
   finalColour.rgb = mix(finalColour.rgb, textureColours[i].rgb, mixmapColour[i]);
}

This caused the code to run at ~30fps. Definitely not fast enough, since I was only doing 1/4 of the work (so no speed-up at all).

I removed the int i = 0; variable completely and replaced i with 0 in all places.

Boom 60FPS!

So now I have a big block of duplicated code but it runs at 60fps no problems! Smiley Smiley Smiley
I guess I can move some things around and put some code in a separate function so there isn't so much duplication (see the sketch after the listing).

   {
      int tileLoc = int(textureLocations[0]*256.0);
     
      vec4 tile = tiles[tileLoc];
      float minTextureSize = min(tile.p, tile.q);
      vec2 wrapped = vec2(tile.s + abs(mod(worldUV.x * minTextureSize, tile.p)), tile.t + abs(mod(worldUV.y * minTextureSize, tile.q)));
      textureColours[0] = texture2D(virtualTexture, wrapped);
     
      finalColour.rgb = mix(finalColour.rgb, textureColours[0].rgb, mixmapColour[0]);
   }
   {
      int tileLoc = int(textureLocations[1]*256.0);
     
      vec4 tile = tiles[tileLoc];
      float minTextureSize = min(tile.p, tile.q);
      vec2 wrapped = vec2(tile.s + abs(mod(worldUV.x * minTextureSize, tile.p)), tile.t + abs(mod(worldUV.y * minTextureSize, tile.q)));
      textureColours[1] = texture2D(virtualTexture, wrapped);
     
      finalColour.rgb = mix(finalColour.rgb, textureColours[1].rgb, mixmapColour[1]);
   }
   {
      int tileLoc = int(textureLocations[2]*256.0);
     
      vec4 tile = tiles[tileLoc];
      float minTextureSize = min(tile.p, tile.q);
      vec2 wrapped = vec2(tile.s + abs(mod(worldUV.x * minTextureSize, tile.p)), tile.t + abs(mod(worldUV.y * minTextureSize, tile.q)));
      textureColours[2] = texture2D(virtualTexture, wrapped);
     
      finalColour.rgb = mix(finalColour.rgb, textureColours[2].rgb, mixmapColour[2]);
   }
   {
      int tileLoc = int(textureLocations[3]*256.0);
     
      vec4 tile = tiles[tileLoc];
      float minTextureSize = min(tile.p, tile.q);
      vec2 wrapped = vec2(tile.s + abs(mod(worldUV.x * minTextureSize, tile.p)), tile.t + abs(mod(worldUV.y * minTextureSize, tile.q)));
      textureColours[3] = texture2D(virtualTexture, wrapped);
     
      finalColour.rgb = mix(finalColour.rgb, textureColours[3].rgb, mixmapColour[3]);
   }
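Something like this is what I have in mind for the helper (no idea whether every driver keeps the indices constant after inlining, so it would need profiling on the Intel card):

vec3 blendTile(const in int i, in vec3 current, in vec2 worldUV,
               in vec4 textureLocations, in vec4 mixmapColour)
{
   int tileLoc = int(textureLocations[i]*256.0);
   vec4 tile = tiles[tileLoc];
   float minTextureSize = min(tile.p, tile.q);
   vec2 wrapped = vec2(tile.s + abs(mod(worldUV.x * minTextureSize, tile.p)),
                       tile.t + abs(mod(worldUV.y * minTextureSize, tile.q)));
   return mix(current, texture2D(virtualTexture, wrapped).rgb, mixmapColour[i]);
}

// called with literal indices so the compiler can hopefully constant-fold them:
finalColour.rgb = blendTile(0, finalColour.rgb, worldUV, textureLocations, mixmapColour);
finalColour.rgb = blendTile(1, finalColour.rgb, worldUV, textureLocations, mixmapColour);
finalColour.rgb = blendTile(2, finalColour.rgb, worldUV, textureLocations, mixmapColour);
finalColour.rgb = blendTile(3, finalColour.rgb, worldUV, textureLocations, mixmapColour);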
14  Java Game APIs & Engines / OpenGL Development / [GLSL] Terrain multitexturing shader is too slow on: 2014-11-04 07:04:29
Hey guys,
I came up with a terrain multitexturing technique that works great but it doesn't run fast enough on intel / older graphics cards. I don't know if there's anything I can really do about this. My game runs at 60fps on my ATI card but will hover at around 8-10fps on my intel HD 4000. I was wondering if anyone could look at my technique / GLSL code and see if there's anything significant I could do to improve it?
My other solution is to just have an option to disable multitexturing on slower PCs, where they would instead just use a single texture with coloured polygons. This kind of sucks because the map/level will look a lot worse but maybe it's the only other way.

Because some PCs support only 4 texture units, I am instead using a 2048x2048 "virtual texture".


This allows me to have as many textures as I want (up to 32) as long as they fit inside the virtual texture (as "tiles" like in a spritesheet), with a maximum of 4 textures blended per pixel.

My technique requires 4 textures.
1. Colour map (just for painting colours on the terrain, omitted from the code)
2. Virtual texture (described above)
3. Tile map (contains 4 tile indices for each pixel), where R=0 is tile 0, R=1 is tile 1, and so on up to 32; e.g. RGBA=(0,1,2,3) selects the first 4 tiles
4. Mix map (contains opacity of the 4 tile textures for each pixel, for blending them together)

I also pass an array of tiles (x,y,width,height) so that I can calculate the location on the virtual texture for each pixel.

I send 2 sets of UV coords to the shader: worldUV is the true polygon UV coordinates, and mixmapUV is the polygon's UV coordinates relative to the map bounds (from 0 to 1).

I multiply the UV coordinates by the tile size so that smaller textures don't repeat more than larger (higher quality) textures.

If something isn't clear please let me know!  Smiley


uniform vec4 tiles[MAX_MAP_TEXTURES]; //contains the x,y,width,height of each tile in the virtual texture
uniform sampler2D tileTexture; //contains 4 texture indexes used to get the texture tile from the virtual texture
uniform sampler2D virtualTexture; //2048x2048 packed texture containing all individual terrain textures
uniform sampler2D mixmapTexture; //contains opacity to blend a maximum of 4 textures together

vec4 calculateColour(vec2 worldUV, vec2 mixmapUV)
{
   vec4 textureLocations = texture2D(tileTexture, mixmapUV);
   vec4 mixmapColour = texture2D(mixmapTexture, mixmapUV);
   
   vec4 finalColour = vec4(0.0,0.0,0.0,1.0);
   vec4 textureColours[4]; //one fetched colour per blended tile
   for(int i=0;i<4;i++)
   {
      int tileLoc = int(textureLocations[i]*256.0);
     
      vec4 tile = tiles[tileLoc];
      float minTextureSize = min(tile.p, tile.q);
      vec2 wrapped = vec2(tile.s + abs(mod(worldUV.x * minTextureSize, tile.p)), tile.t + abs(mod(worldUV.y * minTextureSize, tile.q)));
      textureColours[i] = texture2D(virtualTexture, wrapped);
     
      finalColour.rgb = mix(finalColour.rgb, textureColours[i].rgb, mixmapColour[i]);
   }
   
   return finalColour;
}
15  Game Development / Newbie & Debugging Questions / Re: Java AWT image alpha channel saving problem on: 2014-10-24 12:40:05
Hmm actually how are you checking the output data?

Cas Smiley
Opening it with gimp and deleting the alpha channel. Don't know if that works though.

Saved as a compressed byte array now (with width and height stored in the first 8 bytes) and that works fine, so I could just keep it like that; file size seems OK.
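A more reliable check than GIMP might be reading the PNG back in Java and printing the raw channels (a quick sketch; filename is the same File the image was written to):

BufferedImage check = ImageIO.read(filename);
int argb = check.getRGB(0, 0); // getRGB keeps the alpha byte in bits 24-31
System.out.printf("a=%d r=%d g=%d b=%d%n",
      argb >>> 24, (argb >> 16) & 0xFF, (argb >> 8) & 0xFF, argb & 0xFF);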
16  Game Development / Newbie & Debugging Questions / Re: Java AWT image alpha channel saving problem on: 2014-10-24 11:42:18
I think it might be a bug then. Fairly sure I had some severe headscratching over this exact issue.

Cas Smiley
Hmm, I tried another library too (pngj) and no luck there. I guess I'll just save it as a compressed byte array.
Thanks,
roland
17  Game Development / Newbie & Debugging Questions / Re: Java AWT image alpha channel saving problem on: 2014-10-24 11:14:26
AWT is saving the data as premultiplied-with-alpha I'll guess. Try coercing the data to not-premultiplied. I recall having some issue with this a few years ago... might even have been a bug in the png writer.

Cas Smiley
Thanks, I thought that might be the case, but I tried using both BufferedImage.TYPE_INT_ARGB and BufferedImage.TYPE_INT_ARGB_PRE (where I both load and save the image) and they didn't make a difference. I hope it's not a bug Sad I can try a different PNG library to save the image I guess.
18  Game Development / Newbie & Debugging Questions / Re: Java AWT image alpha channel saving problem on: 2014-10-24 11:06:17
You mean that saving color 255, 0, 0, 0 results in color 0, 0, 0, 0?
Yeah  Huh
19  Game Development / Newbie & Debugging Questions / Java AWT image alpha channel saving problem on: 2014-10-24 10:56:44
Hey guys, I'm using an RGBA image for a terrain mixmap, where each channel corresponds to the opacity of a separate terrain texture (so I can blend multiple textures).

The problem is, if I save any pixel with no alpha, for example. RGBA (255,0,0,0) or (20,60,100,0) or (0,200,30,0) it just saves a blank pixel because the alpha component is zero. How can I fix this?

BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB); //create image
//loop through and set pixels, which come from a bytebuffer which has perfectly fine data in it
for(int x = 0; x < width; x++)
{
   for(int y = 0; y < height; y++)
   {
      int i = (x + (width * y)) * 4;

      int r = buffer.get(i) & 0xFF;
      int g = buffer.get(i + 1) & 0xFF;
      int b = buffer.get(i + 2) & 0xFF;
      int a = buffer.get(i + 3) & 0xFF; //if I set this to 255 it 'partially works', but then the alpha channel values can't be used
      int c = (a << 24) | (r << 16) | (g << 8) | b;
      image.setRGB(x, y, c);
   }
}
ImageIO.write(image, "PNG", filename); //save image


I could just save/load my ByteBuffer to/from a file, but I'd rather save them as images.

Thanks,
roland
20  Java Game APIs & Engines / OpenGL Development / Re: [LWJGL] FBO stencil buffer not working on intel card on: 2014-10-11 11:16:49
Actually, the code above works fine. My problem was that I didn't set GL11.glStencilMask(1); before using the stencil buffer and GL11.glStencilMask(0); afterward. Not sure why it was only necessary in that exact situation, but I'm glad it works Smiley

From the answer here:
http://stackoverflow.com/questions/8861736/opengl-stencil-buffer-not-quite-got-it-working
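In code, the fix boils down to something like this (a sketch; the func/op values are just one example of drawing a mask and then drawing inside it):

GL11.glEnable(GL11.GL_STENCIL_TEST);
GL11.glStencilMask(1); // unlock writes to the stencil bit
GL11.glStencilFunc(GL11.GL_ALWAYS, 1, 1);
GL11.glStencilOp(GL11.GL_KEEP, GL11.GL_KEEP, GL11.GL_REPLACE);
// ... draw the mask shape ...
GL11.glStencilFunc(GL11.GL_EQUAL, 1, 1); // now draw only where the mask was set
GL11.glStencilOp(GL11.GL_KEEP, GL11.GL_KEEP, GL11.GL_KEEP);
// ... draw the clipped geometry ...
GL11.glStencilMask(0); // lock the stencil buffer again
GL11.glDisable(GL11.GL_STENCIL_TEST);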
21  Java Game APIs & Engines / OpenGL Development / [solved][lwjgl] FBO stencil buffer not working on intel card on: 2014-10-10 10:04:24
So now the shaders run fine; I just need to fix a problem with my FBOs. Normal ones work fine, but I just can't get the stencil buffer to work with them (it works fine on my ATI card but not my intel HD 4000).

I've been looking all over the internet for hours, trying different techniques to create an FBO that works on both intel and ATI, but I'm stumped. Can anyone help me?

My FBO creation code is below
frameBufferID = EXTFramebufferObject.glGenFramebuffersEXT();
EXTFramebufferObject.glBindFramebufferEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT, frameBufferID );
EXTFramebufferObject.glFramebufferTexture2DEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT, EXTFramebufferObject.GL_COLOR_ATTACHMENT0_EXT, GL11.GL_TEXTURE_2D, textureID, 0);

if (attachStencilBuffer)
{
   renderBufferID = EXTFramebufferObject.glGenRenderbuffersEXT();
   //System.out.println("Stencil buffer ID: " + stencilBufferID);
   EXTFramebufferObject.glBindRenderbufferEXT(EXTFramebufferObject.GL_RENDERBUFFER_EXT, renderBufferID);
   EXTFramebufferObject.glRenderbufferStorageEXT(EXTFramebufferObject.GL_RENDERBUFFER_EXT, EXTPackedDepthStencil.GL_DEPTH_STENCIL_EXT, width, height);

   //bind the same buffer twice: once as depth buffer, once as stencil
   EXTFramebufferObject.glFramebufferRenderbufferEXT(EXTFramebufferObject.GL_FRAMEBUFFER_EXT,EXTFramebufferObject.GL_DEPTH_ATTACHMENT_EXT,EXTFramebufferObject.GL_RENDERBUFFER_EXT, renderBufferID);
   EXTFramebufferObject.glFramebufferRenderbufferEXT(EXTFramebufferObject.GL_FRAMEBUFFER_EXT,EXTFramebufferObject.GL_STENCIL_ATTACHMENT_EXT,EXTFramebufferObject.GL_RENDERBUFFER_EXT, renderBufferID);
}

int status = EXTFramebufferObject.glCheckFramebufferStatusEXT(EXTFramebufferObject.GL_FRAMEBUFFER_EXT);
switch (status)
{
case EXTFramebufferObject.GL_FRAMEBUFFER_COMPLETE_EXT:
   valid = true; //<--- gets to here
   break;
default:
   System.err.println(getFBOErrorStatus(status));
}

EXTFramebufferObject.glBindFramebufferEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT, 0);
EXTFramebufferObject.glBindRenderbufferEXT(EXTFramebufferObject.GL_RENDERBUFFER_EXT, 0);



Pre-Rendering setup:
EXTFramebufferObject.glBindFramebufferEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT, frameBufferID);
if (attachStencilBuffer)
   EXTFramebufferObject.glBindRenderbufferEXT(EXTFramebufferObject.GL_RENDERBUFFER_EXT, renderBufferID);
GL11.glPushAttrib(GL11.GL_VIEWPORT_BIT);
GL11.glViewport( 0, 0, width, height );

GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glLoadIdentity();

GL11.glOrtho(0, width, 0, height, -1, 1);
GL11.glMatrixMode(GL11.GL_MODELVIEW);
GL11.glLoadIdentity();

if (clear)
   GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);


Post-Render:
EXTFramebufferObject.glBindFramebufferEXT( EXTFramebufferObject.GL_FRAMEBUFFER_EXT, 0);
EXTFramebufferObject.glBindRenderbufferEXT(EXTFramebufferObject.GL_RENDERBUFFER_EXT, 0);
GL11.glPopAttrib();

GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glLoadIdentity();
GL11.glOrtho(0, Display.getWidth(), Display.getHeight(), 0, -1, 1);
GL11.glMatrixMode(GL11.GL_MODELVIEW);
GL11.glLoadIdentity();

22  Java Game APIs & Engines / OpenGL Development / Re: [LWJGL] GLSL Link fails with no error on intel graphics on: 2014-10-10 09:52:04
Thanks, I wasn't using a #version directive. However, I don't want to go right up to GLSL 4.0 just so that I can loop over the sampler array  Undecided
Quote: "In GLSL 1.30 to 3.30, you can have sampler arrays, but with severe restrictions on the index. The index must be an integral constant expression. Thus, while you can declare a sampler array, you can't loop over it."

Oh well, it seems to be fine how I currently have it, apart from a bit of duplicated code. I'll get over it  Smiley

Thanks again,
roland
23  Java Game APIs & Engines / OpenGL Development / Re: [LWJGL] GLSL Link fails with no error on intel graphics on: 2014-10-10 06:23:43
sometimes shader compilers fail so hard, they don't even tell you why. depends on the code.

try using https://www.opengl.org/registry/specs/ARB/debug_output.txt for more info.

Thanks Smiley You're right about shader compilers failing so hard they don't even tell you why. I commented out everything and added things back one at a time, and this surfaced some errors. I finally got it linking, but it still crashed when I ran it.

Turns out using an array of uniform sampler2D was the issue (As I said, worked fine on ATI). So I just replaced it with 4 uniform sampler2Ds instead.
#define NUM_MAP_TEXTURES 4
uniform sampler2D mapTextures[NUM_MAP_TEXTURES]; //fails on intel
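The working replacement is then just four separate uniforms (names here are placeholders):

uniform sampler2D mapTexture0;
uniform sampler2D mapTexture1;
uniform sampler2D mapTexture2;
uniform sampler2D mapTexture3;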



Hi, I am using similar code to

What is "similar"?  One thing that I've found fails on Intel but not other cards is using integers where you mean floats (eg 1 rather than 1.0).  I think the Intel behaviour is actually correct unless you specify #version but it's the one that always seems to bite me.  persecutioncomplex
Thanks, I got some of those errors too before; however, they printed out fine at the compile stage. Thinking about why the similar shaders linked fine and this one didn't helped, though. Smiley

24  Java Game APIs & Engines / OpenGL Development / [solved] GLSL Link fails with no error on intel graphics on: 2014-10-09 11:17:33
Hi, I am using similar code to http://lwjgl.org/wiki/index.php?title=GLSL_Shaders_with_LWJGL and when I run my program on my intel HD 4000 graphics card, the link stage fails:

ARBShaderObjects.glGetObjectParameteriARB(shaderID, ARBShaderObjects.GL_OBJECT_COMPILE_STATUS_ARB)
returns GL_FALSE but when using
ARBShaderObjects.glGetInfoLogARB(obj, ARBShaderObjects.glGetObjectParameteriARB(obj, ARBShaderObjects.GL_OBJECT_INFO_LOG_LENGTH_ARB));

an empty string is returned.
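(Aside: that first call checks the compile status of a shader object; if it's really the link step failing, the query would be against the program object instead, something like the following, where programID is the handle from glCreateProgramObjectARB:)

int linked = ARBShaderObjects.glGetObjectParameteriARB(programID,
      ARBShaderObjects.GL_OBJECT_LINK_STATUS_ARB); // GL_TRUE when linking succeeded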


This only happens on my intel card, my ATI one runs fine.
Also, a couple of simple shaders did link fine on the intel card but just drew a black screen (they worked fine on the ATI card too).
Is there anything I can do? Or does the intel card just suck and can't handle the shaders?


Thanks,
roland
25  Java Game APIs & Engines / OpenGL Development / Re: VBO Multiple texture coords on: 2014-07-30 09:02:34
There won't be consequences.

glTexCoordPointer is just an arbitrary vertex attribute. You can use it for anything: colors, normals, an angle... or indeed a pair of 2d texcoords.

In fact, it's preferable to use glVertexAttribPointer, as that is the 'modern', generic way to specify your vertex attributes.

Cool thanks for clarifying that Smiley
I'll look into it.
26  Java Game APIs & Engines / OpenGL Development / Re: VBO Multiple texture coords on: 2014-07-30 05:18:19
I found 2 solutions.


One: (From http://www.gamedev.net/topic/509671-vbo-multiple-texture-coordinates-per-vertex-tex-coord-per-triangle/ "If you are using vbo's (indexed or not), for each vertex a set of properties is stored (depending on what buffers you use), for example: position, normal and texture coordinates. As far as I know, if a certain vertex is used multiple times, but each time you need other texture coordinates, this is not possible. In this case you have to store the vertex and the texture coordinates multiple times in your buffers.")


Two (my solution): gl_TexCoord[0] is actually a 4D vector, so by using gl_TexCoord[0].st for the first two coordinates and vec2(gl_TexCoord[0][2], gl_TexCoord[0][3]) for the other two, I was able to hack in a 2nd set of texture coordinates by using GL11.glTexCoordPointer(4, ...). I'm not sure what consequences this may have, but it seems to work for my 2D game and means I don't need to duplicate the vertex coords or use gl_TexCoord[1].
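In the fragment shader, the two packed sets then read roughly like this (.pq is the swizzle for components 2 and 3):

vec2 uvTiled  = gl_TexCoord[0].st; // first pair: the tiling ground-texture coords
vec2 uvMixmap = gl_TexCoord[0].pq; // second pair smuggled into the same attribute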

27  Java Game APIs & Engines / OpenGL Development / Re: VBO Multiple texture coords on: 2014-07-29 07:11:54
All you should have to do is scale the non-mixmap textures to a smaller size. Then you don't need two sets of tex coords

edit: something along the lines of this

   
vec4 texture0_color_map = texture2D(texture0, TexCoord * tex_scale);


Thanks, that looks like a possible solution and in most cases would be fine. However I need to be able to set custom texture coordinates for my vertices so I have to use 2 sets Sad Any other ideas?
 
28  Java Game APIs & Engines / OpenGL Development / Re: VBO Multiple texture coords on: 2014-07-29 03:39:59
Noticed that your two sets of texture coords are the same.

// Set1    Set2
   0,0,    0,0,
   1,0,    1,0,
   0,1,    0,1,

   1,0,    1,0,
   0,1,    0,1,
   1,1,    1,1

So, you can pass only one set of them and use them directly in the shader. (If that is what's intended.)

Thanks for the reply, I was just using it for testing.
The following still makes only the red square come up:
vertexData.put(new float[]{
   //Triangle 1
   -150,-150, 0,0,0.1f,0.1f,
   150,-150, 1,0,0.6f,0.1f,
   -150,150, 0,1,0.1f,0.6f,

   //Triangle 2
   150,-150, 1,0,0.6f,0.1f,
   -150,150, 0,1,0.1f,0.6f,
   150,150, 1,1,0.6f,0.6f});
vertexData.flip();


29  Java Game APIs & Engines / OpenGL Development / VBO Multiple texture coords on: 2014-07-29 03:09:18
Hey, I have a terrain made of 4 tiling textures(grass,dirt,metal,concrete) + one 'mixmap' as described in the top solution here: http://stackoverflow.com/questions/1110844/multiple-texture-images-blended-together-onto-3d-ground where the value of each channel(RGBA) corresponds to the alpha of the texture assigned to it (so red is the opacity of the grass texture at that pixel)

EVERYTHING works fine if I use the same texture coords for both. However, since I want my ground textures to tile and the mixmap not to tile (so the top left corner of the map is texture coords 0,0 and the bottom right is 1,1), I need to use separate texture coordinates for the map textures and the mixmap.

I am using an interleaved VBO. When I use gl_TexCoord[1].st, it doesn't seem to work. For now I'm setting the second set of texture coords to the same values as the first, but it's still not working.

Vertex Shader:
void main()
{
   gl_FrontColor = gl_Color;
   gl_Position = gl_ModelViewProjectionMatrix*gl_Vertex;
   gl_TexCoord[0]=gl_MultiTexCoord0;
   gl_TexCoord[1]=gl_MultiTexCoord1;
}


Fragment shader:
const int NUM_MAP_TEXTURES = 4;
uniform sampler2D maskTexture;
uniform sampler2D mapTextures[NUM_MAP_TEXTURES];
vec3 textureColours[NUM_MAP_TEXTURES];

void main()
{
   /*vec4 maskColour = vec4(texture2D(maskTexture, gl_TexCoord[1].st)); //gl_TexCoord[0]
   maskColour.a = 1.0 - maskColour.a;
   
   for(int i=0;i<NUM_MAP_TEXTURES;i++)
      textureColours[i] = vec3(texture2D(mapTextures[i], gl_TexCoord[0].st));
   
   vec3 finalColour = textureColours[0] * maskColour.r;
   for(int i=1;i<NUM_MAP_TEXTURES;i++)
      finalColour = mix(finalColour,  textureColours[i], maskColour[i]);
   
   gl_FragColor = vec4(finalColour, 1.0) * gl_Color;*/

   
   vec4 maskColour = vec4(texture2D(maskTexture, gl_TexCoord[1].st)); //gl_TexCoord[0].st 'works' but I need to use 1
   gl_FragColor = maskColour;
}


VBO creation:
int numTriangles = 2;
int numVertices = numTriangles * 3;
int vertexSize = (2 + 2 + 2)*4; //xy (2D) + texture coords (2 sets) only. NO COLOUR OR NORMALS. *4 for sizeof(float), so this is in bytes
FloatBuffer vertexData = BufferUtils.createFloatBuffer(vertexSize*numVertices); //NB vertexSize is in bytes, so this over-allocates by 4x (harmless here)
vertexData.put(new float[]{
      //Triangle 1
      -150,-150, 0,0,0,0, //0.2f,0.2f,
      150,-150, 1,0,1,0, //0.6f,0.2f,
      -150,150, 0,1,0,1, //0.2f,0.6f,

      //Triangle 2
      150,-150, 1,0,1,0, //0.6f,0.2f,
      -150,150, 0,1,0,1, //0.2f,0.6f,
      150,150, 1,1,1,1}); //0.6f,0.6f
vertexData.flip();
vertexBufferData(vertexBufferID, vertexData);


Rendering:

//Bind shader [omitted]

GL11.glEnableClientState(GL11.GL_VERTEX_ARRAY);
GL11.glEnableClientState(GL11.GL_TEXTURE_COORD_ARRAY);
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vertexBufferID);
         
int vertexSize = (2 + 2+ 2)*4; //xy + uv + uv2
int numVertices = 6;
         
GL11.glVertexPointer(2, GL11.GL_FLOAT, vertexSize/*stride*/, 0/*offset*/);


GL13.glActiveTexture(GL13.GL_TEXTURE0);
GL11.glTexCoordPointer(2, GL11.GL_FLOAT, vertexSize/*stride*/, 2*4/*offset*/);

int sampler=ARBShaderObjects.glGetUniformLocationARB(shader.GetShaderID(), "mapTextures[0]");
GL11.glBindTexture(GL11.GL_TEXTURE_2D, textures[0].GetTextureID());
ARBShaderObjects.glUniform1iARB(sampler, 0);



GL13.glActiveTexture(GL13.GL_TEXTURE1);
GL11.glTexCoordPointer(2, GL11.GL_FLOAT, vertexSize/*stride*/, (2+2)*4/*offset*/);
sampler=ARBShaderObjects.glGetUniformLocationARB(shader.GetShaderID(), "maskTexture");
GL11.glBindTexture(GL11.GL_TEXTURE_2D, mixMapTexture.GetTextureID());
ARBShaderObjects.glUniform1iARB(sampler, 1);

GL13.glActiveTexture(GL13.GL_TEXTURE0);

GL11.glDrawArrays(GL11.GL_TRIANGLES, 0, numVertices);

GL11.glDisableClientState(GL11.GL_VERTEX_ARRAY);
GL11.glDisableClientState(GL11.GL_TEXTURE_COORD_ARRAY);


//Unbind shader [omitted]



Output using vec4 maskColour = vec4(texture2D(maskTexture, gl_TexCoord[0].st));
NB my mixmap texture has 4 bars of colour: 1,0,0,1 0,1,0,1 0,0,1,1 0,0,0,0



Output using vec4 maskColour = vec4(texture2D(maskTexture, gl_TexCoord[1].st));

However if I change the mixmap image colour to full green, it will show a full green rectangle. I'm thinking that the 2nd set of texture coords just doesn't work at all and they are all set to (0,0) or something.
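One thing I should probably check (just a guess, not a confirmed fix): glTexCoordPointer reads the unit selected with glClientActiveTexture, not glActiveTexture, so the second call above may simply be re-specifying unit 0's pointer. Something like:

GL13.glClientActiveTexture(GL13.GL_TEXTURE0); // client-side state for unit 0
GL11.glTexCoordPointer(2, GL11.GL_FLOAT, vertexSize, 2 * 4);
GL13.glClientActiveTexture(GL13.GL_TEXTURE1); // client-side state for unit 1
GL11.glEnableClientState(GL11.GL_TEXTURE_COORD_ARRAY); // the enable is per unit too
GL11.glTexCoordPointer(2, GL11.GL_FLOAT, vertexSize, (2 + 2) * 4);
GL13.glClientActiveTexture(GL13.GL_TEXTURE0);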

Can anyone help me figure this out?
Thanks,
roland






30  Java Game APIs & Engines / OpenGL Development / 2D Dynamic Lighting and shadows for top down game on: 2014-07-27 08:05:13
Hey guys I'm trying to implement lighting and shadows for a 2d game that is made out of polygons. Although my game is a side scroller I am using a top down approach for the lighting / shadows. I'm trying to combine two techniques:

Technique 1. 3 pass rendering for lighting (SEE http://www.java-gaming.org/topics/realistic-lighting-in-a-2d-game/20161/msg/161973/view.html#msg161973)
---Pass 1: Render map and polygons to FBO #1
---Pass 2: Render lights to lightmap FBO #2
---Pass 3: Combine both offscreen textures with fragment shader.

Technique 2. Shadows: (SEE http://lwjgl.org/forum/index.php/topic,5203.0.html)
Except I have some differences:
---1. Background won't be fully black
---2. Shadows will be semi - transparent rather than fully black, and shadows will darken other shadows if they overlap
---3. I don't want shadows to exist on top / underneath lights (SEE http://prntscr.com/46jgw8)

My goals:
-if no lights exist, the area should be mostly in shadow. I want lights to add to the brightness of the game, and the shadows they create should not darken the map any more than if the light didn't exist.

My problem:
I'm unsure how to combine the shadows with the lightmap in a way that makes sense, so that shadows don't exist where lights do. (NB this only occurs with multiple lights)

My proposed solution:
---Pass 1: Render map and polygons to FBO #1
-----------------------------------------------
Section I'm unsure about:
---Lightmap starts off empty (all pixels set to "dark colour" RGBA (0.1,0.1,0.1,1))
---Somehow render the shadows and multiple lights to the lightmap (FBO #2). Shadows can't render where any of the lights are (or need to combine somehow) and lights can't block out shadows completely?
-----------------------------------------------
---Pass 3: Combine FBO #1 with FBO #2 with a fragment shader (a sketch of this combine pass follows below).
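For that combine pass, the shader I have in mind is something like this (sampler names and the ambient value are placeholders):

uniform sampler2D sceneTexture;    // FBO #1
uniform sampler2D lightmapTexture; // FBO #2: ambient + lights, minus shadows

void main()
{
   vec4 scene = texture2D(sceneTexture, gl_TexCoord[0].st);
   vec3 light = texture2D(lightmapTexture, gl_TexCoord[0].st).rgb;
   vec3 ambient = vec3(0.1);
   // clamping to the ambient floor means shadows never darken an area
   // below what it would look like with no light at all
   gl_FragColor = vec4(scene.rgb * max(light, ambient), scene.a);
}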

Can anyone help me with this?
Thanks,
roland


