  Show Posts
1  Java Game APIs & Engines / OpenGL Development / Re: Possible to manually "start X" instead of -XstartOnFirstThread? on: 2016-08-23 12:05:02
Yeah, I realized that from Spasi's explanation. It just forces the entire VM to start on the first thread of the process, which, due to Mac's stupid limitations, is the only thread that can do UI stuff. Usually the JVM dedicates the first thread to AWT/Swing for exactly this purpose. It's just stupid.
2  Java Game APIs & Engines / OpenGL Development / Re: Possible to manually "start X" instead of -XstartOnFirstThread? on: 2016-08-23 01:22:58
Why not move the GLFW stuff to the main thread and run the game on a separate thread? (That may take a ludicrous amount of time, depending on the game.)
That's exactly what I'm doing right now, but it's ugly. Forcing the user to move the entire game to a new thread and make some seemingly arbitrary, unintuitive calls is what I'm trying to avoid. The cleanest solution of this kind is to abstract the entire thing away and have the user hand a "game interface" to an engine, which handles the thread hand-over. I don't want to force the user to do this shit, and I don't want to surprise them 6 months into development with having to do hacky shit like this to make it work on Apple.

I had a long chat session with Spasi over Skype, and got some more info on why this is required on Mac and how to work around it. The "cleanest" solution for the user is to create a small native C program which fires up a JVM using the Invocation API. It'd create a new thread for the application and call the application's main() method, then start the GLFW event handling loop on the main thread. The Java code then just needs to realize that the event handling thread will be created externally.

I'd prefer this solution in the long run since I'd rather require a hack for getting the engine to run on Mac than have the user manually work around limitations of Mac to get anything at all to run.
3  Java Game APIs & Engines / OpenGL Development / Possible to manually "start X" instead of -XstartOnFirstThread? on: 2016-08-22 22:37:37
Hello.

I've made this fancy-schmancy GLFW event handling system with LWJGL 3 that dedicates a thread to input processing while respecting all the threading limitations of GLFW (trust me, that was hard). I'd fire up a new thread and dedicate it to GLFW commands so that rendering/updating doesn't affect the responsiveness of the window or the precision of input event recording. However, when running this code on Mac it breaks, since I need to use the -XstartOnFirstThread VM argument... which breaks everything, since I want to "start X" (whatever that means) on a different thread than the main thread: the dedicated GLFW thread. This leads to the extremely awkward situation where I need to do everything backwards and dedicate the main thread to GLFW, transferring control of the entire game to a new thread instead. This is intrusive as f**k, reduces readability and has already caused hard-to-find bugs for us.

Is there any way of manually doing the equivalent of -XstartOnFirstThread but on a thread of our choosing?
4  Game Development / Newbie & Debugging Questions / Re: Converting RGB images to a master palette on: 2016-08-20 21:01:00
I guess a reverse lookup table would kinda work. If you have 256*256*256 different colors possible, you can simply create a table for all those combinations (16MB). I guess doing some kind of approximation to reduce the size of the table would be a good idea though. xd
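
For reference, here's a rough sketch of what I mean (all names are made up). It quantizes each channel to 5 bits so the table is only 32KB instead of 16MB, at the cost of some approximation:

static byte[] buildReverseLUT(int[] paletteRGB) {
   // One palette index per quantized color; 5 bits per channel = 32*32*32 entries (32KB).
   byte[] lut = new byte[32 * 32 * 32];
   for (int r = 0; r < 32; r++) {
      for (int g = 0; g < 32; g++) {
         for (int b = 0; b < 32; b++) {
            // Expand the 5-bit channels back to 8 bits before measuring distance.
            lut[(r << 10) | (g << 5) | b] = (byte) nearestIndex(paletteRGB, r << 3, g << 3, b << 3);
         }
      }
   }
   return lut;
}

static int nearestIndex(int[] palette, int r, int g, int b) {
   int best = 0, bestDist = Integer.MAX_VALUE;
   for (int i = 0; i < palette.length; i++) {
      int pr = (palette[i] >> 16) & 0xFF, pg = (palette[i] >> 8) & 0xFF, pb = palette[i] & 0xFF;
      int d = (pr - r) * (pr - r) + (pg - g) * (pg - g) + (pb - b) * (pb - b);
      if (d < bestDist) { bestDist = d; best = i; }
   }
   return best;
}

Looking up the index for a packed 0xRRGGBB color is then just lut[((rgb >> 19) & 0x1F) << 10 | ((rgb >> 11) & 0x1F) << 5 | ((rgb >> 3) & 0x1F)].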
5  Discussions / General Discussions / Re: Programmer jokes on: 2016-08-18 18:13:23
Why did the C-programmer never show up at university?

Because he had no classes.
6  Java Game APIs & Engines / OpenGL Development / Re: Problem: Delete texture doesn't affect RAM on: 2016-08-18 03:14:04
In general, allocating a texture should only allocate VRAM. No RAM is used. The exception is if you have an integrated Intel GPU, in which case the GPU has no VRAM and uses a portion of system RAM instead. However, I'm pretty sure this portion is reserved, meaning that the RAM that the GPU has reserved can't be used by the CPU even if you free it (although I may be wrong). TL;DR: Don't worry about it.
7  Discussions / Miscellaneous Topics / Re: HTML/CSS Is Gross on: 2016-08-17 03:58:56
*looks at URL*
Nope, this IS JGO, not Hydroque's blog.
8  Discussions / Miscellaneous Topics / Re: What I did today on: 2016-08-16 13:34:14
With flexible graphics plugins, there is always the hope that @theagentd will be a huge fan and will upload a free fan-made plugin with gorgeous 3D models, shaders and stuff.
Heh, please don't count on that, as I'm very low on free time nowadays. xd Still, I might take a look at some point. =P
9  Java Game APIs & Engines / OpenGL Development / Re: Problem: Draw Framebuffer on Framebuffer on: 2016-08-12 11:27:45
If your framebuffer setup is correct, then it's probably your texture coordinates. I'm guessing that the texture you're rendering to is still full resolution.

This is my new guess:

1. You have a window-sized texture.
2. You set the viewport to only cover a small portion of the entire texture you draw to and render to it.
3. When reading the FBO, you read the entire texture instead of just the small portion you drew to.
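
If that's what's happening, here's a rough sketch of the fix in LWJGL-style calls (fbo, windowWidth/windowHeight and the shader line are of course made up):

// Render into only the lower-left quarter of the window-sized texture attached to the FBO.
GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, fbo);
GL11.glViewport(0, 0, windowWidth / 2, windowHeight / 2);
// ...draw...

// When sampling that texture later, only read the region that was actually drawn to,
// e.g. in the shader: vec2 coords = texCoords * 0.5;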
10  Java Game APIs & Engines / OpenGL Development / Re: Problem: Draw Framebuffer on Framebuffer on: 2016-08-11 14:28:04
Did you forget to call glViewport() to update to the new resolution?
11  Discussions / Miscellaneous Topics / Re: What I did today on: 2016-08-07 10:55:10
I ran some small tests on Vulkan pipeline creation. Vulkan has two ways of improving the performance of creating pipeline objects: derivatives and pipeline caches. Derivatives let you hint to Vulkan that there are big similarities between two pipelines you want to create, so that the driver can do its best to reuse data from one pipeline when creating the second one. A pipeline cache is an object you pass in when creating pipelines, and the driver is free to store and read data from the cache when creating pipelines. In addition, this cache object can be serialized to disk and reloaded later, allowing you to persistently save the cache between runs of your game, improving load times after the first run.

I simply ran a small test creating 1000 pipelines 100 times and measuring the average time it took:
Nothing:     86.599ms
Cache:       80.048ms
Derivative:  86.826ms
Cache+Deriv: 79.531ms


In other words, on Nvidia drivers derivatives have no impact on load times, but the cache does shave off almost 10% of the load time. In addition, the cache is thread-safe, allowing you to use the same cache from multiple threads. Looks like the simplest and fastest solution is to ignore derivatives and just multithread the whole thing while using a pipeline cache, at least on Nvidia. If someone has an AMD GPU, it'd be interesting to see how their driver deals with this.

EDIT: The above tests created all pipelines using a single vkCreateGraphicsPipelines() call to create all 1000 pipelines. If I create 1 pipeline at a time using one vkCreateGraphicsPipelines() call each, it takes around 92ms even with a cache.
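
For anyone wanting to try this themselves, here's roughly what the cache setup looks like with LWJGL's Vulkan bindings (just a sketch; device, pipelineInfos and pPipelines are assumed to already exist, and imports from org.lwjgl.vulkan/org.lwjgl.system are omitted):

try (MemoryStack stack = MemoryStack.stackPush()) {
   // Create the pipeline cache. Previously saved cache bytes can be passed via pInitialData().
   VkPipelineCacheCreateInfo cacheInfo = VkPipelineCacheCreateInfo.callocStack(stack)
         .sType(VK10.VK_STRUCTURE_TYPE_PIPELINE_CACHE_CREATE_INFO);
   LongBuffer pCache = stack.mallocLong(1);
   VK10.vkCreatePipelineCache(device, cacheInfo, null, pCache);
   long pipelineCache = pCache.get(0);

   // Pass the cache handle instead of VK_NULL_HANDLE when creating pipelines.
   VK10.vkCreateGraphicsPipelines(device, pipelineCache, pipelineInfos, null, pPipelines);

   // Serialize the cache so the next run can start from it.
   PointerBuffer pSize = stack.mallocPointer(1);
   VK10.vkGetPipelineCacheData(device, pipelineCache, pSize, null);
   ByteBuffer cacheData = MemoryUtil.memAlloc((int) pSize.get(0));
   VK10.vkGetPipelineCacheData(device, pipelineCache, pSize, cacheData);
   // ...write cacheData to disk...
}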
12  Java Game APIs & Engines / OpenGL Development / Re: Global Illumination via Voxel Cone Tracing in LWJGL on: 2016-07-18 09:29:56
You could try doing something like "tricubic" filtering manually, but other than that there's not much you can do. You can do tricubic filtering with "only" 8 samples (instead of 64).
13  Game Development / Newbie & Debugging Questions / Re: [LWJGL] How to fix texture bleeding? on: 2016-07-13 02:14:48
Oh.

People will talk down on the game engine. If you aren't solving a problem, people here don't care. Most popular response would be, 'just use libgdx' or a more seasoned library with everything you need.
Well, making game engines isn't going to produce any new games, and it most likely won't result in a better engine than the existing ones either. Nothing wrong with making an engine for learning purposes though, as long as you've understood that.
14  Game Development / Shared Code / Re: Fast sRGB conversion GLSL snippet on: 2016-07-11 14:31:50
I found that a simple ternary operator had very noticeably higher performance than mix+step, so I have updated the code above. Note that each vector component had to be calculated individually as GLSL doesn't allow (vec3 < vec3) comparisons, only (float < float).
15  Game Development / Shared Code / Fast sRGB conversion GLSL snippet on: 2016-07-11 01:28:06
EDIT: I've updated the code below with a new version using ternary operators that's ~25% faster. The ternary operators compile to conditional assignment instructions that are faster than step+mix. The old slower version can be found here for reference: http://www.java-gaming.org/?action=pastebin&id=1467

Hey, this goes kind of hand-in-hand with my gamma correction post. Here's a small snippet to manually calculate sRGB colors, which is useful in some cases if you want to do the sRGB conversion manually.

vec3 srgbEncode(vec3 color){
   float r = color.r < 0.0031308 ? 12.92 * color.r : 1.055 * pow(color.r, 1.0/2.4) - 0.055;
   float g = color.g < 0.0031308 ? 12.92 * color.g : 1.055 * pow(color.g, 1.0/2.4) - 0.055;
   float b = color.b < 0.0031308 ? 12.92 * color.b : 1.055 * pow(color.b, 1.0/2.4) - 0.055;
   return vec3(r, g, b);
}

vec3 srgbDecode(vec3 color){
   float r = color.r < 0.04045 ? (1.0 / 12.92) * color.r : pow((color.r + 0.055) * (1.0 / 1.055), 2.4);
   float g = color.g < 0.04045 ? (1.0 / 12.92) * color.g : pow((color.g + 0.055) * (1.0 / 1.055), 2.4);
   float b = color.b < 0.04045 ? (1.0 / 12.92) * color.b : pow((color.b + 0.055) * (1.0 / 1.055), 2.4);
   return vec3(r, g, b);
}


There are cases where you would want to do the conversion yourself. For example, if you want to do good dithering, you need to do it in sRGB space, or the noise you add will get distorted by the sRGB encoding. Ideally, dithering shouldn't add any noise at all to a completely black screen, but if you dither in linear space and then convert to sRGB you'll get values as high as 5-6 randomly popping up. So, the correct approach is to convert to sRGB space, THEN dither (with GL_FRAMEBUFFER_SRGB disabled since we did that part ourselves).
   color = srgbEncode(color);
   color = clamp(color + (rand(texCoords) - 0.5) * (1 / 255.0), 0.0, 1.0);


Now, let's say we're rendering to a GL_SRGB8 texture with GL_FRAMEBUFFER_SRGB on so that we get correct blending in linear space, and we realize that we need dithering to reduce banding (even with gamma correction/sRGB you may get subtle banding). In that case, we can't encode to sRGB in the shader and disable GL_FRAMEBUFFER_SRGB since we want the gamma correct blending from GL_FRAMEBUFFER_SRGB, so we have to do something pretty ridiculous to get the correct result. We need to dither in sRGB space, but output linear colors for blending, so...
   color = srgbEncode(color);
   color = clamp(color + (rand(texCoords) - 0.5) * (1 / 255.0), 0.0, 1.0);
   color = srgbDecode(color);


Yup, we convert to sRGB, dither, then convert back to linear space again. And that's an example of why these two functions are useful! The performance of these two functions is very good, as ternary operators are compiled to conditional assignment instructions without any branches required.
16  Game Development / Articles & tutorials / Re: Basics of gamma correction on: 2016-07-10 04:25:05
Aaaaaand I can't modify that comment either. It just says "Loading..." forever...
17  Game Development / Articles & tutorials / Re: Basics of gamma correction on: 2016-07-10 04:24:28
I can't seem to modify the article to fix the errors I found in it. >___>
18  Game Development / Newbie & Debugging Questions / Re: Temporary ObjectOutputStream wrapping InputStream on: 2016-07-10 01:38:39
Serializing "shader objects": a byte[] for the code, a number of descriptor set definitions, and other metadata about the shader. It's just tedious to do it by hand for each class I have.
19  Game Development / Articles & tutorials / Basics of gamma correction on: 2016-07-10 00:21:10
Greetings, everyone. This guide is here to teach you all about gamma correction and why you need to take it into consideration. Let's get started!



Background

So, what is gamma correction/encoding/decoding? The idea of gamma encoding came from trying to optimize the memory usage of images. The key observation was that a simple linear representation, where 0.0 was black, 1.0 was white and 0.5 was the exact middle between black and white, was not enough to store an image with 8 bits per channel due to banding at low values, while values close to 1.0 suffered no visible banding at all, since human eyes are much less sensitive to changes in light intensity at high values. In other words, there was an excess of precision close to 1.0 and insufficient precision around 0.0. To solve this, you can apply a function to the intensities to redistribute the precision. If we, for example, apply the square root function to each value, then 0.5 gets increased to 0.707. Using 8 bits for each color (0 to 255), that means transforming 127 all the way up to 180. In other words, for the darker colors we have 180 unique values, while for the bright values we only have 75. Hence, the general approach of gamma encoding is to take the 0.0 to 1.0 color values and run them through a power function x^g, where g is the gamma value (0.5 in the previous example). To gamma decode, we run the result through x^(1/g) to get the original value back again, since (x^g)^(1/g) = x^(g*1/g) = x^1 = x.
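
As a quick sanity check of the numbers above (g = 0.5, so encoding is a square root):

double linear  = 0.5;
double encoded = Math.pow(linear, 0.5);                // 0.707...
int    stored  = (int) Math.round(encoded * 255.0);    // 180 out of 255
double decoded = Math.pow(stored / 255.0, 1.0 / 0.5);  // x^(1/g), back to ~0.5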

By sheer coincidence, it turns out that old CRT monitors didn't have a linear response to the voltage applied to the electron gun in them. For most electron guns, the color intensity produced from the applied voltage x follows a non-linear curve, somewhere between x^1.8 and x^2.2. Hence, if we encode our images with gamma 1/2.2, we end up cancelling out the non-linear response of the electron gun and get a linear result like we want, while getting all the precision redistribution benefits of gamma encoding. For gamma 1/2.2, 127 is brought up to 186, giving us 186 values between 0.0 and 0.5, and 69 values between 0.5 and 1.0. Perfect! Now, we don't have CRT monitors anymore, but today's monitors and TVs still follow a gamma of around 2.2. They often use hardware look-up tables that convert an 8-bit color intensity to the voltage to apply to each pixel to achieve this.



Why should I care? I've never seen anything look wrong?

Indeed, if you load an image file from your computer and just display it, you will get a 100% gamma correct result. Why? Because all major image formats (JPG, PNG, BMP, even XMI) are stored in the sRGB color space, which is very close to gamma 1/2.2 to 1/2.4. If you simply take the image and load it into Java2D/OpenGL/Vulkan/whatever, you are taking a gamma 1/2.2 image and displaying it on a gamma 2.2 monitor, which cancels out, so why do you need to care? A major problem occurs when we try to manipulate the colors in the image. As we already know, a color intensity of 0.5 corresponds to 186 in gamma 1/2.2. Hence, half of 255 is 186, not 127. As soon as you start multiplying, scaling or adding together colors, you need to take the gamma into account!
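
For example, averaging pure white and pure black naively versus in (roughly) linear space, assuming a plain 2.2 gamma curve:

int a = 255, b = 0;
int naive = (a + b) / 2;    // 127 -- looks far too dark on a gamma 2.2 monitor
double linA = Math.pow(a / 255.0, 2.2);
double linB = Math.pow(b / 255.0, 2.2);
int correct = (int) Math.round(Math.pow((linA + linB) * 0.5, 1.0 / 2.2) * 255.0);    // ~186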

Here's a simple proof that gamma correction is required for correctness. Make sure that you view these pictures at their exact resolutions without any scaling (i.e. not on a phone). Take a step back so that the lines on the side of the pictures look like a single solid color. Here's the version WITHOUT gamma correction.

Note that the actual solid-color block in the middle is 127, but it looks much darker than the "physical" true gray you see by looking at the alternating black/white lines on the side from a distance. However, if you gamma correct the center...

Now, that's much closer to what you would expect! That's also 188! It probably doesn't look exactly like the side bars, depending on your monitor and its settings, but it gives you a good idea of why we need gamma correction to get correct results!



OK, so my game's colors are incorrect, but they still look good?

True, not doing gamma correction will give you significantly darker gray values and a deeper look, but it's not physically accurate. If you're aiming for realism, you will not be able to get realistic results (or even results that make sense at all) without taking gamma into account. For example, a linear gradient will look more pleasant without gamma correction, as the color intensity basically gets squared. With gamma correction, most of the gradient will look white, as our eyes aren't as sensitive at high light intensities. It's important not to mix up accuracy and how good something looks.



Fair enough, how do I make my graphics gamma correct?

To be able to scale a value or blend together two values correctly, we must convert the sRGB values to a linear color space, apply the changes we want, then back again to sRGB so that they are displayed correctly. Graphics cards actually have hardware for doing this for us.

The first hardware feature is related to how textures are handled. If we were to decode sRGB (pretty close to applying x^2.2 to each channel) when loading a texture from file, we would absolutely destroy the precision of the values if we attempted to store the result in a simple 8-bit texture. For example, all values in the image darker than 14 would map to 0, completely devastating the precision of the blacks. Then, when we encode back to sRGB before displaying the result on the screen, we end up undoing this transformation again, but the damage is already done. We've lost the additional precision that sRGB gives where it matters and given it to the whites instead! Luckily, there is hardware to handle this! OpenGL 2.1 hardware has support for loading the raw sRGB data into a texture and having the GPU do the conversion for us! What does this mean? It means that when our shader samples 186 from the sRGB texture, texture() will instead return 0.5! Similarly, if it samples a 1, it will be converted to around 0.0003, a VERY small number. Even better, it will also perform texture filtering correctly in linear space. In other words, we have massively improved precision for blacks where we need it! To get the same precision for blacks with a linear color space texture, we'd need at least a 16-bit texture!
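
With LWJGL, telling the GPU that a texture contains sRGB data is just a matter of picking the right internal format (a sketch; width, height and pixelData are whatever your image loader gives you):

int tex = GL11.glGenTextures();
GL11.glBindTexture(GL11.GL_TEXTURE_2D, tex);
// GL_SRGB8_ALPHA8 makes the GPU decode sRGB to linear whenever the shader samples the texture.
GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL21.GL_SRGB8_ALPHA8, width, height, 0,
      GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, pixelData);
GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR_MIPMAP_LINEAR);
GL30.glGenerateMipmap(GL11.GL_TEXTURE_2D);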

Great! So we've managed to get high precision linear color values into our shader! If your game/graphics engine uses HDR render targets (16-bit float textures), you're free to use linear colors (read: what you're already using) in your entire engine from now on and not have to worry about gamma at all. At the end when you copy the result to the framebuffer, just apply an sRGB/gamma transform to convert it back to sRGB/gamma color space to compensate for the monitor's gamma curve and you're done!

However, not all games want to use high-precision render targets, for memory usage or performance reasons. There may simply not be a need for them. In this case, we get the same problem as above when we write our linear color values to a simple 8-bit texture, destroying the precision. When we later convert our values back to sRGB for displaying, we amplify the precision loss again and get horrifying banding. Luckily, there is a second hardware feature you can take advantage of, this time in OpenGL 3.0 hardware. This feature allows you to seamlessly use sRGB textures as render targets without having to worry about any conversions yourself. By enabling GL_FRAMEBUFFER_SRGB, the hardware will make sure to do all the conversions for you. When you have an sRGB texture attached to the framebuffer object you're rendering to and GL_FRAMEBUFFER_SRGB enabled, the hardware will (see the small sketch after the list):

 - Convert the linear color values you write from your shader to sRGB space before writing them to the texture.
 - Do blending with linear colors by reading the sRGB destination values, converting them to linear, blending them with the linear values from the shader, then convert the result to sRGB for storage again.
 - Even do correct color space conversions when blitting between framebuffers, for example between a 16-bit floating point texture and an SRGB texture.
 - Never give you up.
 - Never let you down.
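
As promised, a minimal sketch of the setup, assuming sceneFbo has a GL_SRGB8_ALPHA8 texture as its color attachment:

GL11.glEnable(GL30.GL_FRAMEBUFFER_SRGB);
GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, sceneFbo);
// ...render and blend with linear colors; the hardware encodes to sRGB on every write...
GL11.glDisable(GL30.GL_FRAMEBUFFER_SRGB);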

It's actually possible to request an sRGB default framebuffer when creating the OpenGL context so that you can use it with GL_FRAMEBUFFER_SRGB. Weirdly enough, Nvidia ALWAYS gives you an sRGB-capable default framebuffer even if you ask it not to, but when you query whether it's sRGB it says no. A workaround is to render a 0.5-color pixel with GL_FRAMEBUFFER_SRGB enabled and read it back. If it's 127, it's a linear framebuffer. If it's ~186ish, it's sRGB.


Color gradient comparison

A simple linear color gradient rendered to a GL_RGB8 texture, then blitted to the screen with GL_FRAMEBUFFER_SRGB doing an automatic sRGB conversion, ruining the precision:


The same procedure, but with a GL_SRGB8 texture instead (meaning the sRGB conversion is done when writing the 32-bit float pixel color values to the first texture, not when blitting 8-bit values):


20  Discussions / General Discussions / Re: Virtual Reality on: 2016-07-07 15:18:29
Wow, that's... really kinda useless. Spot on summary.
21  Game Development / Newbie & Debugging Questions / Re: Temporary ObjectOutputStream wrapping InputStream on: 2016-07-07 14:22:44
Hmm... So I guess the simplest solution would be to simply change the Input/OutputStream parameters to ObjectInput/ObjectOutputStream and let the user create the stream instead. That'll be good enough for my usage right now...

Thanks for all the responses! Glad I could catch this error early. This really seems like a pretty major flaw in the stream design though...
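
In other words, something along these lines (just a sketch of the changed signature):

public void serializeTo(ObjectOutputStream oos) throws IOException {
   oos.writeObject(this);
   // The caller created the stream, so the caller decides when to flush/close it.
}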
22  Game Development / Newbie & Debugging Questions / Re: [LWJGL] How to fix texture bleeding? on: 2016-07-06 23:25:39
Use a 2D texture array instead (GL_TEXTURE_2D_ARRAY). In many ways it's just a 3D texture, but the advantage here is how mipmaps and filtering work.

3D texture mipmaps: 0: 4x4x4, 1: 2x2x2, 2: 1x1x1.
2D texture array mipmaps: 0: 4x4x4, 1: 2x2x4, 2: 1x1x4.

As you can see, the layers in the 2D texture array don't get blended together. In addition, there is no possible way that you can get texture bleeding between layers as they are separated into individual layers, meaning that GL_CLAMP_TO_EDGE will solve all cases of bleeding that you can possibly get.
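
Setting one up is almost identical to a 3D texture (a sketch with LWJGL; tileSize and numLayers are made up):

int tex = GL11.glGenTextures();
GL11.glBindTexture(GL30.GL_TEXTURE_2D_ARRAY, tex);
GL12.glTexImage3D(GL30.GL_TEXTURE_2D_ARRAY, 0, GL11.GL_RGBA8, tileSize, tileSize, numLayers,
      0, GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, (java.nio.ByteBuffer) null);
GL11.glTexParameteri(GL30.GL_TEXTURE_2D_ARRAY, GL11.GL_TEXTURE_WRAP_S, GL12.GL_CLAMP_TO_EDGE);
GL11.glTexParameteri(GL30.GL_TEXTURE_2D_ARRAY, GL11.GL_TEXTURE_WRAP_T, GL12.GL_CLAMP_TO_EDGE);
// Upload each tile into its own layer with glTexSubImage3D(); sample with a vec3 in the shader.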
23  Game Development / Newbie & Debugging Questions / Temporary ObjectOutputStream wrapping InputStream on: 2016-07-06 21:48:41
So I've been trying to figure out how to do this 100% correctly but I can't find any good info.

Basically, I have a serialization function for a certain object, and that function takes in an OutputStream. This OutputStream may already have stuff written to it, and other things may be written to it after the serialization. Here's the function:
   public void serializeTo(OutputStream output) throws IOException {
      ObjectOutputStream oos = new ObjectOutputStream(output);
      oos.writeObject(this);
      //WHAT DO I DO WITH oos NOW?!
      oos.flush();
   }

My question is: After creating the temporary ObjectOutputStream, I can't close it once I'm done writing my stuff because that would close the underlying OutputStream. Is the correct thing to do here to only flush() it and then just leave it hanging?


Similarly, what do I do when I want to deserialize?
   public static GLSLFile deserializeFrom(InputStream input) throws IOException, ClassNotFoundException {
      ObjectInputStream ois = new ObjectInputStream(input);
      return (GLSLFile) ois.readObject();
      //I feel so dirty for not closing ois...
   }

Is this correct? Can this explode somehow? What am I supposed to do here?!
24  Game Development / Newbie & Debugging Questions / Re: Fast color manipulation on: 2016-07-05 22:21:21
use a pre-computed lookup table with 256 elements, or if you really need them, 3 lookup tables.
Bonus: this gives you arbitrary color remapping.
LUTs are surprisingly overlooked in many valid cases.
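
A minimal sketch of the per-channel LUT idea (the pixels array and the remapping itself are just placeholders):

int[] lut = new int[256];
for (int i = 0; i < 256; i++) {
   lut[i] = Math.min(255, (int) (i * 1.2f));    // computed once; any remap you like
}

// Per pixel, packed ARGB:
int argb = pixels[p];
pixels[p] = (argb & 0xFF000000)
      | (lut[(argb >> 16) & 0xFF] << 16)
      | (lut[(argb >>  8) & 0xFF] <<  8)
      |  lut[ argb        & 0xFF];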
25  Game Development / Shared Code / Re: XM File Formats on: 2016-07-03 02:19:59
You guys have to chill. He's working on a cool project that he obviously enjoys. No point in telling him nobody will ever use it, he's enjoying coding something.
The beef this entire community has with Hydroque is that he's saying the exact opposite of that. He's saying he puts the benefit of the masses as his top priority.
26  Game Development / Newbie & Debugging Questions / Re: [LWJGL 3] GLFW Cursor Input on: 2016-06-30 01:13:33
GLFW glfwSetCursorPos() docs:
Quote
Do not use this function to implement things like camera controls. GLFW already provides the GLFW_CURSOR_DISABLED cursor mode that hides the cursor, transparently re-centers it and provides unconstrained cursor motion. See glfwSetInputMode for more information.
http://www.glfw.org/docs/latest/group__input.html#ga04b03af936d906ca123c8f4ee08b39e7
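
In LWJGL 3 that boils down to a single call (sketch):

GLFW.glfwSetInputMode(window, GLFW.GLFW_CURSOR, GLFW.GLFW_CURSOR_DISABLED);
// The cursor is now hidden and unconstrained; track movement via glfwSetCursorPosCallback() deltas.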
27  Discussions / General Discussions / Re: Thanks to JGO - Exiled Kingdoms wouldn't have been possible without this website on: 2016-06-29 15:56:04
It seems to be possible to "try" the code on the PC Play Store instead of in the app. In that case, it either says the code has already been used or tells you to put it in the Play app instead without redeeming it.
28  Discussions / General Discussions / Re: Thanks to JGO - Exiled Kingdoms wouldn't have been possible without this website on: 2016-06-29 14:42:53
Yay, I got one! =D
29  Discussions / Miscellaneous Topics / Re: What I did today on: 2016-06-27 12:46:48
If you were shrewd you'd basically implement Unity's shader language... thereby having access to thousands of free and awesome shaders and a huge amount of amassed expertise...

Cas Smiley
Well, that's why I started that discussion thread about this BEFORE I started coding it... That being said, I'm not entirely sure it'd work considering I'm aiming for emulating Vulkan's descriptor sets in OpenGL.

@theagentd You should definitely consider emitting the #line directives which allow you to get correct line numbers while debugging shaders.
That's a good idea. Hmm, it's a bit difficult though considering that $importset could be multiple lines in a different FILE... Hmm, actually the set-file structs and uniform buffers are verified completely when they're parsed into memory before compiling shaders, so the chance that there's a problem in there which passes into a shader when imported is fairly low. It might just be good enough to have it show the $importset line as the problem in those rare cases.
30  Discussions / Miscellaneous Topics / Re: What I did today on: 2016-06-27 04:01:38
So I finally got time to sit down and play with my GLSL preprocessor.

When you're done, PM me a github link and I can port it to some standard text preprocessor for you, then put it in the shared code! This looks really useful, Ive always needed a better way of doing shader/client interoperability like checking for primitives and all.
Sure, that'd be lovely! I plan on integrating the whole thing with a couple of tools to write a binary shader file (char[] of GLSL source code + info about the shader for GL, byte[] SPIR-V code + descriptor set layouts for VK), so your port probably has to be written in Java. xd
 
roseslayer (374 views)
2016-08-06 11:43:29

roseslayer (342 views)
2016-08-06 09:43:11

xTheGamerCodes (418 views)
2016-08-04 15:40:59

xTheGamerCodes (409 views)
2016-08-04 15:40:24

orrenravid (761 views)
2016-07-16 03:57:23

theagentd (836 views)
2016-07-11 14:28:54

Hydroque (927 views)
2016-07-06 05:56:57

Hydroque (972 views)
2016-07-03 08:52:54

GrandCastle (783 views)
2016-07-01 09:13:47

GrandCastle (634 views)
2016-07-01 09:09:45
Rendering resources
by Roquen
2016-08-08 05:55:21

Rendering resources
by Roquen
2016-08-08 05:52:42

Rendering resources
by Roquen
2016-08-08 05:50:38

Rendering resources
by Roquen
2016-08-08 05:49:53

Rendering resources
by Roquen
2016-08-08 05:32:39

Making a Dynamic Plugin System
by Hydroque
2016-06-25 00:13:25

Java Data structures
by BinaryMonkL
2016-06-13 21:22:09

Java Data structures
by BinaryMonkL
2016-06-13 21:20:42
java-gaming.org is not responsible for the content posted by its members, including references to external websites, and other references that may or may not have a relation with our primarily gaming and game production oriented community. inquiries and complaints can be sent via email to the info‑account of the company managing the website of java‑gaming.org
Powered by MySQL Powered by PHP Powered by SMF 1.1.18 | SMF © 2013, Simple Machines | Managed by Enhanced Four Valid XHTML 1.0! Valid CSS!