  Show Posts
Pages: [1] 2 3 ... 9
1  Java Game APIs & Engines / OpenGL Development / Re: Modern OpenGL ES 3.0 / 3.1 compute shaders for Java with Android 5.0 on: 2014-12-15 04:16:30
Ahh Julien Julien Julien..  Clueless   Kiss   persecutioncomplex  It's not so much a framework as a compact utility library that replaces the very-long-in-the-tooth mechanisms that have been around since Android 1.0. The library is for simple demos, though in time I'll provide all the utility code that makes it possible to port the NVidia Gameworks samples to Java for Android. I'm trying to strip things down to the bare minimum of what is necessary for modern OpenGL and keep the code clean for small demos. It's good for test cases too, for testing GPUs from different mobile manufacturers. I already found some issues on the Snapdragon 805 / Adreno 420 that don't exist on the 320 / 330 or Tegra K1, and have reached the proper folks at Motorola to have them take a look. One of the demos briefly shows the problem; only the Nexus 6 / Adreno 420 with its current weak drivers exhibits it. The hardware is supposed to support OpenGL ES 3.1, but a gimped driver was shipped (just GLES 3.0 support). So having samples that run across platforms really shows the glaring issues that manufacturers need to fix, as developers notice!

I actually stripped a lot of the utility code out of TyphonRT itself. What I released is definitely not the meat of TyphonRT from an architecture standpoint. That is why FBOData has public fields and accessor methods: I just got rid of the interface as it was one more class to track. In a component architecture, where there is often a lookup to retrieve an object, one makes the fields of data objects public.

Not trying to take over any binding anytime soon (the utility code posted uses the Android SDK bindings, but makes them pleasant to use). I actually welcome LWJGL3 or even JOGL to GLES / EGL especially whoever will potentially take on making a full profile Java / OpenGL Binding for the Tegra K1 including all the cool NV extensions (NV_command_list), etc.
2  Discussions / Miscellaneous Topics / Re: What I did today on: 2014-12-14 23:06:56
Should be titled what I did yesterday, all last night, this morning, and continuing today...

Finally got some really nice clean and modern OpenGL code up for Android which supports compute shaders and such.

Just got featured on the front page of XDA Developers... Cheesy

Woot woot...  Grin
3  Java Game APIs & Engines / OpenGL Development / Modern OpenGL ES 3.0 / 3.1 compute shaders for Java with Android 5.0 on: 2014-12-14 20:03:27
Hi folks,

Check out these Github repos I just posted this morning for modern OpenGL ES 3.0 / 3.1 demos, and a lightweight framework that makes using this tech much easier from Java / Android.

The main demo repo is here, but check out the install guide:

Glad I could get my first open source (Apache2) effort up after all these years. There are few if any modern OpenGL ES 3.0 / 3.1 demos for Java with Android and this is important tech. I spent a good chunk of the last week pulling out code from my commercial middleware (TyphonRT) to make a lightweight and concise GL framework that makes working with modern GL much easier. I also provide a few demos with more forthcoming though this is a spare time effort.

I ported over Kai's LWJGL / basic ray tracing in compute shaders demo code to OpenGL ES 3.1 and it works like a charm on the Tegra K1. Can't wait for his follow up post with the full code for something much prettier.

The nice thing is that the demo code is separated from the framework, so the framework can be used on its own. Eventually I'll be porting the NVidia Gameworks demos to Java / Android. You do need a Tegra K1 based device to take advantage of OpenGL ES 3.1 as well, since Motorola and / or Qualcomm shipped deficient drivers that only support 3.0 for the Snapdragon 805 / Adreno 420 at the Nexus 6 launch; talk about a disappointment there.  This framework and repo are updated for the absolute latest Android Studio 1.0.1 and build tools "21.1.2". There is a reasonable guide on how to check the code out, but a prebuilt apk is also available. Lots more to say, but I'd be glad to answer any questions about modern OpenGL on Android. I'll be expanding the wiki on the repos linked here in the next week.

If you think the code above is useful or cool, it's the basis for where I'm headed with building a next-gen video engine for Android. I have ~8 hours left on a crowdfunding campaign that will help me get it out sooner rather than later. I've bootstrapped 95% of the way there, spending 2.5k hours over the last year building it and covering all costs out of pocket ($80k), and I'm not close to raising the funding goal ($20k). But I'm most interested in finding early users who are excited about next-gen video and having fun / cool toys before iMeh; if you back the project, even though it likely won't fund in time, you'll get an early invite regardless! I'll keep it short as there is a video available showing it in action.

KS here: At this point I'd really just like to connect with interested early users as the funding goal is well... yeah... Thanks folks!
4  Discussions / Miscellaneous Topics / Re: What I did today on: 2014-12-10 23:16:42
I was planning on coding away today, but when I woke up I had 20 new backers for my Kickstarter which has been the biggest influx yet...

The very nice folks over at Phandroid did a nice little write up and article on the TyphonRT Video Suite:

So I have spent the last 4 hours trying to get some more momentum behind it.. There is some activity on Reddit BTW if any of you use Reddit... "TyphonRT" may be a good search term..  Roll Eyes

About coding though.. I am close to getting a nice little repo up on Github with some very lightweight utility code for doing modern OpenGL dev on Android.

The repo is located here and I hope to get things up in the coming days:

So hopefully back to hacking on the modern gl demos.. I'll be porting over the most excellent ray tracing demos posted on the LWJGL blog. Check it out... Good work Kai:
5  Discussions / General Discussions / Re: am thinking of getting this PC, for both gaming and developing on: 2014-12-10 09:05:18
I chose that motherboard over a full size one, as it was the cheapest motherboard that supports overclocking.

Most definitely. I mentioned ATX, because if the case I have to give away (3U rack-mount) is used for the build out it would look "interesting" inside with all that extra space.

We'll get @philfrei jamming soon enough with some new gear!  Grin

Link for the Kickstarter is in my sig... hint hint.. all of you..  Wink 4 days left.. I can use all the help possible on getting the word out. I just couldn't get it into the tech press which was what was needed...
6  Discussions / General Discussions / Re: am thinking of getting this PC, for both gaming and developing on: 2014-12-09 23:10:47
Suppose we start with the purchase of the computer I listed earlier, and add another 4GB RAM for 8GB total (16GB possible):

Typically you want to have matched RAM. These days 8GB min; preferred 16GB.

How do I find out if it is possible to upgrade the CPU to the recommended  FX 8320 CPU (3.2ghz 8 core)? (I know much less about how interchangeable CPU's might be.)

The CPU will have a socket type; the FX-6300 and 8320 are both AM3+.

The PCPartPicker site is nice because it lists all the details you need to mix and match parts.

I'm tempted to try and build, but I think my "angel" may be more comfortable with going with the existing pre-built unit.

I'd recommend building for similar reasons folks above have mentioned: better quality parts, easy exchange / upgrade of parts thanks to a case with space, plus the satisfaction of knowing you had a hand in building what you use daily and knowing exactly how to upgrade a part.  Once you build one you'll never go back to pre-built. It'll take ~30 min to put it together.
7  Discussions / General Discussions / Re: am thinking of getting this PC, for both gaming and developing on: 2014-12-09 20:08:13

I've got a spare rack-mount case I haven't used in ~7 years if you'd like to grab it.  From my quick reading of this thread I can also help you put together a box too as it seems like you might not have done this before. Besides we chatted previously about hanging out at some point.  Not too many JGOers have historically been near SF Bay Area as far as I'm aware.

It looks like what @Phased posted is a good build out at that price range. I'd switch to an ATX motherboard, as that is what the case takes, unless you wanted to buy a microATX case. I guess you can put a microATX board in an ATX case; it'd just look empty with the case I have...

I live close by to Central Computers in SF, so if there by chance is a missing cable it's easy to pick it up. If you decide to pull the trigger on ordering those parts PM me..
8  Java Game APIs & Engines / OpenGL Development / Re: [framebuffers] Rendering after another framebuffer was bound on: 2014-12-08 04:43:57
I don't have any way to test this; the call to glClear is done where the framebuffers are bound, so it's fine. I just wanted to know: if you stopped rendering to a framebuffer, then re-bound it and drew another object, would both of their contents be there?

Yes... Keep in mind though the necessity of correctly setting glViewport when dealing with different sized framebuffers.  An easy way to test is to simply bind an FBO, set the glViewport to the backing texture size, draw something, unbind, then rebind, set glViewport to something smaller, and redraw. The 2nd drawing will appear at the smaller viewport size on top of the original drawing.

So yeah... not knowing what the rest of your rendering code does one does have to keep in mind calls to glViewport when binding / unbinding FBOs.
9  Game Development / Newbie & Debugging Questions / Re: Bug with touch movement in a game on: 2014-12-07 20:41:27
Consider handling ACTION_CANCEL and ACTION_OUTSIDE as those are additional states that are likely being triggered when dragging outside of a target View.  Use a switch statement.
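A minimal sketch of that switch, using plain int constants that mirror android.view.MotionEvent's documented values so it stands alone; in a real View you'd switch on event.getActionMasked() with the real constants, and the "start / drag / end" labels here are invented for illustration:

```java
// Sketch: route touch actions through one switch; constants mirror
// android.view.MotionEvent's documented ACTION_* values (assumption noted above).
public class TouchActionRouter
{
   public static final int ACTION_DOWN = 0;
   public static final int ACTION_UP = 1;
   public static final int ACTION_MOVE = 2;
   public static final int ACTION_CANCEL = 3;   // gesture taken over, e.g. by a scrolling parent
   public static final int ACTION_OUTSIDE = 4;  // touch landed outside the target View

   /** Returns a label for the handled action; "end" covers all gesture-ending states. */
   public static String handle(int action)
   {
      switch (action)
      {
         case ACTION_DOWN:
            return "start";
         case ACTION_MOVE:
            return "drag";
         case ACTION_UP:      // normal lift
         case ACTION_CANCEL:  // without this case a drag outside the View leaves state stuck
         case ACTION_OUTSIDE:
            return "end";
         default:
            return "ignored";
      }
   }

   public static void main(String[] args)
   {
      System.out.println(handle(ACTION_CANCEL)); // prints "end"
   }
}
```

The point is that ACTION_UP alone is not enough: grouping the three end states in one fall-through ensures the gesture is always terminated cleanly.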
10  Discussions / Miscellaneous Topics / Re: What I did today on: 2014-12-04 16:44:48
Well, I've chucked a few bucks your way Smiley Does look kinda fun, although I can't quite honestly see the application beyond monkeying about!

Thanks Cas, you know I got your back on your next one!  Grin  Yeah.. In the impromptu demo with Scoble (he literally pulled out his phone and hit record within 30 seconds of my engaging him, which even caught me off guard a little) and the last one with Android Central, what is mainly shown is scrolling between presets, and most of the ones I have loaded up are rather gratuitous in style so it's obvious what is happening. There are plenty of opportunities for subtle image enhancement. One fun one is adding a slight bloom around edges (seen in the sizzle reel video in the quick shot of the silhouetted girl & the drapes). What can't easily be shown, at least in a quick demo, is the preset editor, where one can spend minutes or hours experimenting with different combinations to tweak things and come up with original effects compositions. For many folks this will be their first time ever doing something like that. I plan to offer a tutorial series covering basic to advanced effects composition and also discussing the GLSL code that makes it happen. I definitely want to enable sharing of effects presets between users.

For game devs it's certainly possible to prototype an effects composition chain and understand what it'll look like before implementing it in one's own game / engine.

I think the biggest professional use case is adding lower thirds in the field and immediately uploading completed content to media sites without further handling from offline editors.
11  Discussions / Miscellaneous Topics / Re: What I did today on: 2014-12-04 10:53:07
I happened via serendipity (longer story!) to run into Robert Scoble today. He quickly whipped out his iPhone and....

"This Android app makes me jealous. Video effects. Feels like a new age in video." -- Robert Scoble...   

I need all the help I can get to tip the crowdfunding campaign... Even with Scoble posting to his FB, that only brought 5 backers, from 30 to 35.... Not going to make it very likely, but anyone on JGO who does a crowdfunding campaign has my $5; RPC being my latest funding... Tomb of Tyrants before that, and of course Spine....  So yeah... I can use all the help I can get at this point with 10 days left. Still haven't gotten into the press per se...


12  Games Center / Contests / Re: End of 4K. Thanks all! on: 2014-11-30 09:51:00
Yah.. Bummed I never had the free time to enter...

I understand there is a 4k / Android category with this competition:

At least there was in 2014; I didn't attend, but I might in '15... Granted it's likely all GLSL demo scene / eye candy and not a Java game competition.
13  Discussions / Miscellaneous Topics / Re: I don't have the lollipop update yet? on: 2014-11-27 08:30:48
Or you can do it yourself especially since you have a Nexus 5:

14  Game Development / Newbie & Debugging Questions / Re: I want to scale ParticleEffect - Libgdx on: 2014-11-23 22:58:11
What is the value of "Values.Scalar_Width"? You likely need to take screen density into account. The S4 isn't the highest density screen either; I just got the Nexus 6 for instance. These are Android system values from DisplayMetrics:

Nexus 6:
density: 3.5
densityDpi: 560
scaledDensity: 3.5
xdpi: 494.27
ydpi: 492.606
widthPixels: 2413
heightPixels: 1440

Nexus 5:
density: 3.0
densityDpi: 480
scaledDensity: 3.0
xdpi: 442.451
ydpi: 443.345
widthPixels: 1794
heightPixels: 1080

You get them by getting a DisplayMetrics instance for the desired screen:

DisplayMetrics displayMetrics = new DisplayMetrics();
WindowManager windowManager = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
windowManager.getDefaultDisplay().getMetrics(displayMetrics);

You can use any Context (Activity / Application) to call getSystemService.

Granted this is the solution for Android. You'll want to derive a value from some of the factors returned by DisplayMetrics and set "scaleEffect" accordingly.
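As a sketch of the idea: derive the scale directly from DisplayMetrics.density, assuming mdpi (density 1.0) is the baseline the effect was authored for; "scaleFor" is an invented helper name, and you'd feed its result into whatever scaleEffect call your particle library provides:

```java
// Sketch: derive a particle-effect scale from screen density.
// Assumption: the effect was authored for an mdpi (density 1.0) baseline.
public class EffectScale
{
   /** density is DisplayMetrics.density (e.g. 3.0 on a Nexus 5, 3.5 on a Nexus 6). */
   public static float scaleFor(float density)
   {
      final float BASELINE_DENSITY = 1.0f; // mdpi baseline the effect was designed against
      return density / BASELINE_DENSITY;
   }

   public static void main(String[] args)
   {
      System.out.println(scaleFor(3.0f)); // prints 3.0 (Nexus 5)
   }
}
```

On the S4 vs. Nexus 6 the scale then differs automatically, which is exactly why a hard-coded "Values.Scalar_Width" misbehaves across devices.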
15  Discussions / Miscellaneous Topics / Re: What I did today on: 2014-11-19 02:30:14
Yeah, the Galaxy Nexus was released in the fall of '11, though there wasn't that much of a delay if I recall. I happened to attend Google I/O in '12 and got one plus the original Nexus 7. It replaced and was slightly better than the G2X / Tegra 2. I think that was the last year of good device giveaways for Android developers. I skipped the Nexus 4 (not that great of a GPU) and went straight to the Nexus 5 / Snapdragon 800. I started building my video engine on the Galaxy Nexus; har har har... About 1 filter (only some of them; forget blur!) running at ~24FPS while encoding. One couldn't instantiate a MediaCodec encoder / decoder at the same time, or even one after another, due to native resources not being released / out-of-memory errors. The Snapdragon 800 / 801 and the Adreno 330 GPU blew the door wide open for doing what I'm doing today: up to 8 effects plus plenty of window-based ones (blur) at 30FPS (OK, multiple blurs / kuwahara turned up can kill things). It's what I call the "FBO bonanza", as the Adreno 330 was the first phone form factor GPU that wasn't heavily penalized for multiple FBO renders in a single pass.

The Snapdragon 805 / 810 is like the tock of a tick-tock; ~30% and ~80% faster respectively than the Snapdragon 800.
16  Discussions / Miscellaneous Topics / Re: What I did today on: 2014-11-18 23:11:58
Goodness things have been busy... Last week I attended the Samsung Developers Convention and ran into Phil Nickinson of Android Central... On demoing the video capture / effects composition app he pulled out his video camera and we recorded an impromptu demo that was posted to Android Central (first outside press):

I'm definitely looking forward to AnDevCon the rest of the week!  For anyone attending, I'm giving a lightning talk on Wednesday evening about modern OpenGL on Android. I also have the opportunity to run a demo booth at the Intel-sponsored after party Thursday evening, 7:30 to 10pm. Stop by and check out the TyphonRT Video Suite. I'll have an LCD TV on hand and will let folks get hands on with the video capture / effects composition app and have some fun.  Wish I had time to put together a green screen backdrop for photo / video fun, but I'm working hard on the demo code for my presentation.

My presentation on OpenGL ES 3.1 / compute shaders is on Friday at 10am. I am finishing up the source code takeaways today using the Nexus 9. What I'm demoing and providing utility code for is using OpenGL ES 3.1 from Java / Android Studio. There presently is no information out there about how to do this, and so far I've found several non-obvious tricks that make it possible, like integrating compute shaders with standard Android mechanisms such as loading textures from resources for use in compute shaders. I'm working real hard today on converting the advanced ComputeParticles demo from the NVidia Gameworks samples to Java... Don't know if I'll make it as it's complex, but I'm giving it a try. Source code and slides will be available on GitHub soon.

If you are planning to make it to AnDevCon give me a holler and let's hang out and chat all things Android + modern graphics / video!

Also, some darn cool updates. I just installed Lollipop on the NVidia Shield tablet and yeah yeah yeah yeah YEAH MediaCodec works fine and my video engine is stable on the K1 / Lollipop. NVidia shipped the Shield w/ KitKat and a broken implementation of MediaCodec. So moving to Lollipop as the base OS for the video engine is really going to help in stability and hopefully I won't have to build as huge of a device testing lab.

And OMG Nexus 6 in the mail.. I'm trying to reroute it to a UPS station, so I can pick it up tomorrow morning!!! I still am amazed that a year ago I was using the Galaxy Nexus and now the Nexus 6 which is light years apart from the GN. 

17  Games Center / WIP games, tools & toy projects / Re: TyphonRT Video Suite (Kickstarter) on: 2014-11-13 23:33:22
Didn't mean to insult Smiley If the audio doesn't fit the video, maybe that colored my perception of the audio, ending up a tad harsh Smiley

No worries.. That is one of the few tracks of mine that would be at all appropriate, or close to appropriate, for a general audience. All the rest are outsider / eccentric / experimental music.

Anyway, if anything, that 8 second 'intermission' is hopefully easy/cheap to rip out. That will surely help retention rates.

That I can take a look at, and I'm plenty skilled at editing. Maybe there could be a way to mash up the monologue and intro sizzle reel. I think refining the textual content is much more pertinent at this point.

As things go it's practically impossible for me to get anyone to be in a film / reshoot scenes without involving other creatives on the clock. I've essentially locked myself in my studio coding for the past year+ and most old acquaintances won't be in a video or even back the KS, so yeah. So everything has a dollar sign attached to it.
18  Games Center / WIP games, tools & toy projects / Re: TyphonRT Video Suite (Kickstarter) on: 2014-11-13 23:02:03
Definitely a live and learn experience I suppose..  As things go, the current video cost a lot to produce. I don't have the resources / equipment to shoot a better one myself, and was engaged full time as a contractor when things were being filmed (for half of it at least; then that client ran out of funds and ahhh...). I'm taking on a short-term contract Dec / January to survive at this point, and thus also pushed out the release date a month for the KS.

A comparison KS is Looksery's:

Sure the video is a bit... Hmm.. I guess I'll call it "consumer friendly". I dunno...

then 8 seconds of fade out (bye bye!) followed by a monologue without background audio.

Dang, I thought it would seem over produced with background music during the monologue.

that background music is so generic it would be better to switch to audio that

Hey man.. I produced that track.. Albeit back in 2002... ;P
19  Games Center / WIP games, tools & toy projects / Re: TyphonRT Video Suite (Kickstarter) on: 2014-11-13 22:34:30
Good points.. I do appreciate the feedback because I'm not getting much. I'll try and figure out verbiage changes that make sense.

I see all sorts of projects being transparent in where the funding goes. I guess there are some that don't list it either and lean toward the direction you bring up. A big problem is just getting out of the friends and family category of backers. During the first campaign I was counting on conference attendance to pass out flyers and demo in person to Android enthusiasts. This turned out to be not so good.

I'm at the Samsung Developers Conference right now though, and got very lucky to get featured on Android Central via Phil Nickinson, whom I've bumped into at conferences since the early days of Android:

Getting further coverage in Android blogs / any press is the only thing seemingly that will make it tip.
20  Games Center / WIP games, tools & toy projects / Re: TyphonRT Video Suite (Kickstarter) on: 2014-11-13 21:57:14
So, I rebooted the Kickstarter. I lowered the funding goal from $45k to $20k and did this without lowering quality expectations. Basically I needed to fund a significant device testing lab if I was going to release for KitKat due to MediaCodec API instability across the ecosystem. I'm bumping the min Android OS requirements to Lollipop which received more testing from Google. There are also other benefits like defaulting to the Camera2 API.

Check out the new Kickstarter:
21  Game Development / Newbie & Debugging Questions / Re: Ways of keeping track of classes... on: 2014-11-13 21:51:46
Packages are for organization and the default access (package private) modifier can be used to provide a non-public API across classes, but other than that packages don't mean anything.

Packages have a role in general regarding modularity. In particular they can be useful for meta-data describing import / export requirements of a module. This is seen in OSGi for instance.

The link below has some details then describes OSGi and how packages are used as meta-data:
22  Game Development / Newbie & Debugging Questions / Re: Perlin Noise clouds & CPU-Usage on: 2014-11-13 21:43:49
After some testing, I've proved that the Perlin noise doesn't use a lot of CPU; it's the creation of the bitmaps...

Then don't create a new Bitmap every time; recycle Bitmaps instead. Make two Bitmaps (mask and target) parameters of "generate clouds".  Handle the error case where mask and target are not the same size. Certainly the "clouds" bitmap can be recycled.  I don't know how many mask images you load or how often this method is called.
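A rough sketch of the reuse idea, with plain int[] pixel buffers standing in for android.graphics.Bitmap since the point is the allocation pattern rather than the Android API; the masking operation and method name are invented for illustration:

```java
// Sketch: pass reusable buffers in instead of allocating per call.
// int[] arrays stand in for mutable Bitmaps of matching size (assumption).
public class CloudBuffers
{
   /** Combines mask and noise into target; all three buffers are reused across calls. */
   public static void generateClouds(int[] mask, int[] target, int[] noise)
   {
      // The error case: caller handed us mismatched buffer sizes.
      if (mask.length != target.length || mask.length != noise.length)
         throw new IllegalArgumentException("mask / target / noise sizes must match");

      for (int i = 0; i < target.length; i++)
      {
         // Illustrative combine step; no allocation happens inside the loop.
         target[i] = noise[i] & mask[i];
      }
   }
}
```

Allocating the buffers once and calling this repeatedly keeps the garbage collector (and the CPU spike you measured) out of the hot path.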
23  Game Development / Newbie & Debugging Questions / Re: Ways of keeping track of classes... on: 2014-11-10 21:35:30
Consistency is king when it comes to organizing code.  I work on various engine middleware and have ~1500 classes to organize. It's long past where I can memorize every single class name.

As far as code architecture is concerned, the MVC pattern and inheritance eventually break down and really make things too rigid. Most large libraries (Java & Android SDK included) that base themselves solely in this direction head toward being a "kitchen sink" or "big ball of mud".

You are not going to face that problem yet. I'd say things start to buckle around 500 classes.

So some tips:
- I'm partial to naming interfaces with a leading capital 'I'. This makes it obvious when looking at the file that it is an interface and not a class.

- For package naming I put interfaces and other common utility code that doesn't have dependencies using a "commons" tag in the package name. Here is the package name for IEventBus (interface):

"commons" indicating it stores interfaces, java6 to indicate language level, sdk (in my middleware efforts it's either 'sdk' or 'rt' for runtime), opencore indicating this will be a freely available package ('core' is another variation for indicating for pay), then the actual specific naming portion for the code at hand.

And one implementation of IEventBus is BasicEventBus; here is its package:

When you have a bunch of code, and even when you don't, a big help in organizing is to think about module separations from the get-go. At least with IDE tooling like IntelliJ IDEA & Android Studio (based on IDEA) there is a strong module concept in the IDE. I understand Eclipse might be a bit different, but it should have a similar construct.

What I have done is aggressively modularize my middleware, and I'm now at ~750 modules / independent IDE projects. Essentially a module should expose one main core package and a few more if necessary. You don't split packages between modules! That is, don't have the same package in two different modules. The really cool thing this does is that, in the IDE and build files, you can explicitly see any other modules that a given module depends upon.

I name the modules based on the core package name exposed.

For IEventBus this is the module it is in:

For BasicEventBus this is the module it is in:

"shared" indicates that it runs on any platform where Java SE is supported; the other parts of the module name mirror the package name internally. I tail things off with "commons" so it's easier to see modules that are interfaces / utility code with no dependencies.

So, now when I look through the entire middleware code base I have ~750 modules that are named according to package and utility areas. I've ended up now with about ~50 utility areas with X amount of modules under them. Entity system, physics, opengl, opengl es, etc. are all major groupings.

Basically you just have to get meta on organizing your codebase. With 27 classes you are fine. At some point though putting off strongly organizing your code will have a negative consequence in maintenance and productivity. So for the time being just be as consistent as possible. Depending on how long the project / effort goes and how big it gets will determine if you need to consider doing the kinds of things I describe above.

The direction I'm travelling down has been summed up in the Modularity Maturity Model:

24  Game Development / Game Play & Game Design / Re: How to save/load game data with Entity/Component based Engine via JSON on: 2014-11-09 11:04:24
I'd recommend checking out existing JSON / Java serializers like Jackson or Boon. I haven't worked with either of those libraries, so I don't know how extensible they are or if you can replace how they might walk object graphs. I have no idea offhand if either of those would work out of the box and properly serialize an Entity or if you'll have to provide bespoke serialization which means you'd have to manage the process. If the latter is the case perhaps you can make tag, mask, compMap package private and have a "JsonEntityUtil" class in the same package that manages the serialization / deserialization which can directly access tag, mask, compMap.
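A sketch of the package-private fallback described above: the Entity keeps tag, mask, and compMap package-private, and a JsonEntityUtil in the same package reaches them directly. The field names follow the discussion, but the hand-rolled JSON here just stands in for whatever Jackson or Boon would produce:

```java
// Sketch: package-private entity internals + a same-package serializer utility.
// The hand-written JSON is illustrative; a real project would likely delegate
// to Jackson or Boon once bespoke graph-walking is sorted out.
import java.util.LinkedHashMap;
import java.util.Map;

class Entity
{
   String tag;                                            // package-private: hidden from other packages
   long mask;
   Map<String, Object> compMap = new LinkedHashMap<>();   // component lookup map

   Entity(String tag, long mask) { this.tag = tag; this.mask = mask; }
}

public class JsonEntityUtil
{
   /** Serializes the fields it can reach thanks to package-private access. */
   public static String toJson(Entity e)
   {
      StringBuilder sb = new StringBuilder();
      sb.append("{\"tag\":\"").append(e.tag).append("\",\"mask\":").append(e.mask).append('}');
      return sb.toString();
   }
}
```

The design point is that serialization knowledge lives next to the entity without forcing tag / mask / compMap into the public API.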
25  Game Development / Newbie & Debugging Questions / Re: Image Effects on: 2014-11-09 04:42:59
Why not use a scripting language like Lua or even Javascript for your effects? Then you could alter your effects at runtime.

There is fast (GLSL) and then there is slow: everything else...   

BTW you can recompile GLSL code on the fly. It's how a lot of the convolution / kernel based shader code is manipulated, i.e. programmatically recompiling shader code based on various settings that expand / contract the window applied.
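A sketch of that kind of programmatic shader generation: building box-blur fragment shader source whose kernel window expands or contracts with a radius setting, which in real code you would then hand to glShaderSource / glCompileShader. The uniform and varying names are invented:

```java
// Sketch: generate fragment-shader source for a box blur whose kernel window
// is unrolled at build time; changing the radius means recompiling the shader.
import java.util.Locale;

public class KernelShaderSource
{
   public static String boxBlurFragment(int radius)
   {
      int taps = 2 * radius + 1;
      float weight = 1.0f / (taps * taps); // uniform box-blur weight per tap

      StringBuilder sb = new StringBuilder();
      sb.append("uniform sampler2D uTexture;\n");
      sb.append("uniform vec2 uTexelSize;\n");
      sb.append("varying vec2 vTexCoord;\n");
      sb.append("void main() {\n");
      sb.append("  vec4 sum = vec4(0.0);\n");
      for (int y = -radius; y <= radius; y++)
         for (int x = -radius; x <= radius; x++)
            sb.append(String.format(Locale.US,
               "  sum += texture2D(uTexture, vTexCoord + uTexelSize * vec2(%d.0, %d.0)) * %f;%n",
               x, y, weight));
      sb.append("  gl_FragColor = sum;\n");
      sb.append("}\n");
      return sb.toString();
   }
}
```

Locale.US keeps the float literals dot-separated regardless of system locale, which matters because GLSL won't parse "0,111111".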

Just wondering, I am using the image kernal for a lot of effects like blur and emboss... But what about effects like sepia, instant and enhance?

Think about a "kernel" operation (not "a" BTW Cheesy) for a color matrix. I'm sure there are other ways of doing a sepia tone image like an image lookup, but here is how things are implemented in GPUImage w/ a color matrix.


As mentioned previously by @SHC if you have a proper matrix / vector library you'd multiply the color matrix by the color vector for the pixel essentially. Look at the GLSL code and translate that to Java basically.
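As a plain Java illustration of that matrix multiply: a 3x3 sepia matrix applied to each pixel's RGB vector, just as a GLSL fragment shader would do per fragment. The coefficients are the commonly cited sepia weights, not necessarily GPUImage's exact values:

```java
// Sketch: sepia as a color-matrix multiply on an RGB vector, mirroring what
// the GLSL fragment shader does per fragment. Coefficients are the commonly
// used sepia weights (an assumption; GPUImage's matrix may differ slightly).
public class SepiaMatrix
{
   static final float[][] SEPIA = {
      { 0.393f, 0.769f, 0.189f },
      { 0.349f, 0.686f, 0.168f },
      { 0.272f, 0.534f, 0.131f }
   };

   /** rgb components in [0,1]; returns the sepia-toned color, clamped to 1.0. */
   public static float[] apply(float[] rgb)
   {
      float[] out = new float[3];
      for (int row = 0; row < 3; row++)
      {
         float v = 0f;
         for (int col = 0; col < 3; col++)
            v += SEPIA[row][col] * rgb[col]; // matrix row dot color vector
         out[row] = Math.min(v, 1f);         // clamp, as GLSL output would be
      }
      return out;
   }
}
```

Per-pixel effects like "instant" or "enhance" follow the same shape: a matrix (or small function) applied to each color vector, no convolution kernel needed.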
26  Discussions / Miscellaneous Topics / Re: simplifying getter/setter classes on: 2014-11-08 01:42:52
I think this conversation is interesting and I meant to jump in sooner. The real problem that I see with traditional getters / setters is that they become awfully fragile across module boundaries in terms of code changes (3rd party library for instance). Architecturally speaking they are a horrible way to retrieve various resources from a main model or control (re: say a custom Activity in Android). Dependency Injection is a partial solution, but still pretty weak IMHO (especially constructor injection!) and has inherent maintenance concerns too.

Yes yes.. Component architectures with implicit lookup come to the rescue IMHO. This is one of the super strong points of the approach: across module boundaries lookups are implicit and don't require specialized getter / setter methods to connect everything together at runtime. It's great too because the component architecture API itself is super standardized across all modules that use it, and dependencies aren't leaked between module boundaries (exposed in traditional getter / setter method signatures).  JGO will hate me for this, but I'll get buzzwordy and say it's essentially "in-process microservices".

Going beyond implicit lookup is the EventBus pattern (not Guava / Otto, etc.). My EventBus implementation at least allows setting up ad hoc message passing between producer / consumer relationships, and one can set up two-way messaging (input / output ports). This allows passing messages across module boundaries that would otherwise be method invocations. For instance, in my video engine efforts I have a camera manager. When I want to start a video stream for the camera, I send a "start video" message to the input port, defined by a series of extensible enums forming a nested category that indicates some sort of camera manager is receiving the message. Then camera-frame-received messages start streaming over the output port.
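A bare-bones sketch of the port idea: producers and consumers connect through named ports rather than direct method calls across module boundaries. The real implementation described above uses extensible enums for ports; this stand-in uses strings, and the port names are invented:

```java
// Sketch: a minimal event bus where modules talk through named ports instead
// of getter/setter wiring. Strings stand in for the extensible-enum ports.
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

public class MiniEventBus
{
   private final Map<String, List<Consumer<Object>>> ports = new HashMap<>();

   /** Subscribe a consumer to a named port (input or output). */
   public void on(String port, Consumer<Object> consumer)
   {
      ports.computeIfAbsent(port, k -> new ArrayList<>()).add(consumer);
   }

   /** Deliver a message to every consumer on the port; a no-op if nobody listens. */
   public void send(String port, Object message)
   {
      for (Consumer<Object> c : ports.getOrDefault(port, Collections.emptyList()))
         c.accept(message);
   }
}
```

With this shape, a camera module might listen on a hypothetical "camera.input" port for control messages and publish frames on "camera.output"; neither side holds a reference to the other's concrete type.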

I could go on in this direction, but in the past JGO has been overly critical for little reason on the directions I've taken even though they solve a lot of problems with traditional OOP; especially the getter / setter debacle.

27  Game Development / Newbie & Debugging Questions / Re: Image Effects on: 2014-11-03 01:24:12
Then I will have a shader object which has the compiled shader, and all the shaders it requires...

(Lol, my Safari on my iPad froze twice while I was typing this... Srsly, how do people think iOS is amazing?)

Technically you can do whatever you want in regard to pre-processing your own shader code before linking it. You might be able to do some sort of limited chaining of what should be independent shader passes that don't require convolution kernel operations on intermediate state, but that would really make your post-processing pipeline rather rigid, with no practical speed improvement. There are bigger fish to fry for performance, basically.

By building a composable pipeline of independent image operations where each one writes to a successive FBO you can then allow the user to mix and match on the fly image operations to perform.
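In plain Java the composable pipeline idea looks something like this, with each operation standing in for a shader pass writing to its own FBO; the class and operation shapes are invented for illustration:

```java
// Sketch: a composable post-processing pipeline. Each operation reads an input
// buffer and produces an output buffer, mirroring one GL pass per FBO; users
// can mix and match operations on the fly, like swapping shader passes.
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

public class ImagePipeline
{
   private final List<UnaryOperator<float[]>> ops = new ArrayList<>();

   /** Append an operation; returns this so chains read like the pipeline itself. */
   public ImagePipeline add(UnaryOperator<float[]> op)
   {
      ops.add(op);
      return this;
   }

   /** Runs each op in order; each output feeds the next, like chained FBOs. */
   public float[] run(float[] pixels)
   {
      float[] current = pixels;
      for (UnaryOperator<float[]> op : ops)
         current = op.apply(current);
      return current;
   }
}
```

Because each stage only sees a buffer in and a buffer out, reordering, inserting, or removing effects at runtime is trivial, which is what makes the mix-and-match preset editing possible.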

One of the really cool things my post-processing pipeline allows is a built-in preset editor where users can interactively drag horizontally on the screen, and the GL component shows a transition between shaders / image operations by rendering the pipeline twice and combining. There is also interactive dragging between complete presets (renders the pipeline twice, once with each preset, and combines).

Indeed... The GLSL direction is a lot to take in for what is a simple image processing API. Take what you learned from the Java side of things, though, and consider a GLSL implementation, as it's useful in game dev and general media-based apps.
28  Game Development / Newbie & Debugging Questions / Re: Image Effects on: 2014-11-03 00:24:15
So can I have some sort of shader with something like:
#use <shader here> <args>
#use <another shader> <more args>
And then when my program reads the shaders it checks for lines like that and it will pass the image through those shaders first?

No. There is no embedding or continuation of shader code that is controlled GPU side*. You need to control the succession of shaders in a post-processing pipeline on the CPU side, in addition to whatever variables / textures are set up / shared across the various executing shaders.

* OpenGL ES 3.1 introduced indirect draw functionality that pairs nicely with compute shaders, allowing drawing to occur after computation without CPU intervention, but this is not the same as the bespoke GPU continuation you are asking about.

At this point it may be good to try some OpenGL yourself. There is a learning curve, no doubt, but it will be worth it. If you happen to have an iOS device, perhaps take a look at the GPUImage example apps and source code. For some of the examples there (modifying a static image) the simulator could work, and I believe GPUImage also works on the desktop. I've never actually run any GPUImage apps... just snarfed* all the fragment shader code and then improved it for OpenGL ES 3.0+ and my own custom pipeline. Cheesy *will attribute, of course!

29  Game Development / Newbie & Debugging Questions / Re: Image Effects on: 2014-11-03 00:10:04
Is there a max size on the matrix in OpenGL? So can I do something like 21x21 or something?

For built-in variable types the maximum matrix size is 4x4. It's handy to review the reference cards for OpenGL ES 3.0 & 3.1 for the built-in types and GLSL functions:


Also, there is a maximum array size for uniform variables, depending on the OpenGL implementation. More to the point, there is a maximum amount of uniform storage across all data types defined in a shader. See the "Implementation limits" section at this link on how to query the maximum sizes; usually 512 or greater is supported:

You can aggregate basic types in arrays and structs. For a "21x21" matrix, use an array of 441 floats, but you'll have to handle the indexing yourself, and of course the built-in matrix operations don't apply to a bespoke array. Not all OpenGL implementations support multidimensional arrays in GLSL; OpenGL 4.3+ supports them, as do implementations where the ARB_arrays_of_arrays extension is available, and OpenGL ES 3.1 includes arrays-of-arrays support as well.
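The manual indexing into a flat 441-float array looks like this (a Java sketch of the arithmetic; the class and method names are made up, but the same `row * 21 + col` expression applies inside a fragment shader):

```java
// A "21x21 matrix" stored as a flat array of 441 floats with manual row-major
// indexing, since GLSL's built-in matrix types stop at 4x4.
public class FlatKernel {
    public static final int SIZE = 21;
    private final float[] data = new float[SIZE * SIZE]; // 441 coefficients

    public float get(int row, int col) { return data[row * SIZE + col]; }
    public void set(int row, int col, float v) { data[row * SIZE + col] = v; }

    public static void main(String[] args) {
        FlatKernel k = new FlatKernel();
        k.set(10, 10, 1f); // center tap of the 21x21 kernel
        System.out.println(k.get(10, 10)); // 1.0
    }
}
```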
30  Game Development / Newbie & Debugging Questions / Re: Image Effects on: 2014-11-02 21:37:32
1. But how can the developer tell they want to use another shader before or after with just "a shader". Also I want to send data from one shader to the other.
2. I mean number arrays. So I can have an image kernal. Like so (in Java):
This will do some sort of edge detect. Again, sry if I'm using the wrong term here.

1. You have CPU-side control over the whole post-processing pipeline; you don't control the flow between shaders in shader code itself, as the succession is managed on the CPU side. You can share data directly between shaders via that CPU-side control, such as FBO textures shared between shaders, and you can share uniform data (variables in the shader code) between shaders. OpenGL ES 3.0 also enables a lot more in regard to general buffer objects as input / output from shaders (PBO is an example). In the case of uniform data there are UBOs (Uniform Buffer Objects). These are interesting because they allow setting all the uniform data of a shader efficiently in one call, and they can also share data across multiple shaders. See the following (an OpenGL ES 3.0 feature):
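On the GLSL side, the shared uniform data is declared as a uniform block with the same layout in every shader that uses it, while a single buffer is bound once at the matching binding point and feeds them all. A sketch (the block and member names here are made up):

```glsl
// Declared identically in each shader that shares the data; one UBO bound
// at a binding point on the CPU side backs all of these declarations.
layout(std140) uniform SceneParams {   // hypothetical block name
    mat4 projection;
    vec2 viewportSize;
    float time;
};
```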

2. Most definitely you can do convolution based kernels. See this code example for fragment shaders which do a Laplacian edge detection:

The line in the fragment shader "uniform mediump mat3 convolutionMatrix;" stores the 3x3 matrix and would be filled with the desired coefficients.
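A CPU reference of what that fragment shader computes per pixel can make the behavior concrete. This is a single-channel sketch with edges clamped (a common convention, though the actual shader's edge handling may differ); the class name is made up:

```java
import java.util.Arrays;

// Each output value depends on the 3x3 neighborhood of input pixels,
// which is exactly the "dependent read" pattern on the GPU.
public class Convolve3x3 {
    public static float[] apply(float[] img, int w, int h, float[] m /* 9 coefficients, row-major */) {
        float[] out = new float[w * h];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                float sum = 0f;
                for (int ky = -1; ky <= 1; ky++) {
                    for (int kx = -1; kx <= 1; kx++) {
                        int sx = Math.min(w - 1, Math.max(0, x + kx)); // clamp to edges
                        int sy = Math.min(h - 1, Math.max(0, y + ky));
                        sum += img[sy * w + sx] * m[(ky + 1) * 3 + (kx + 1)];
                    }
                }
                out[y * w + x] = sum;
            }
        }
        return out;
    }

    public static void main(String[] args) {
        float[] laplacian = {0, 1, 0, 1, -4, 1, 0, 1, 0};
        float[] flat = {1, 1, 1, 1, 1, 1, 1, 1, 1}; // 3x3 constant image
        // An edge detector responds only to intensity changes, so a flat
        // image yields all zeros (the Laplacian coefficients sum to zero).
        System.out.println(Arrays.toString(apply(flat, 3, 3, laplacian)));
    }
}
```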

It should be noted that convolution-based filters are generally expensive due to dependent texture reads; i.e., setting the color of the pixel being processed depends on reading the surrounding pixel values. You'll want to be careful in your implementation to reduce dependent texture reads in your fragment shaders. A good article on optimizing this, from Brad Larson, covers the Gaussian blur filter in GPUImage:

In short.. Yes, you can do convolution based filtering with OpenGL. There is nothing you can do in Java / CPU side that you can't do in OpenGL.