  Getting out of the stone age - baseline OpenGL functionality  (Read 6544 times)
Offline Orangy Tang

« Posted 2011-11-30 10:23:30 »

I'm getting back into some graphics programming, so I'd like to get a quick survey/discussion on what people think is a good baseline for OpenGL functionality these days. I've been stuck programming for 1.1 for aaaaages, with fancier things like FBOs as optional extras. It'd be nice to raise the base level to something more recent, say GL 2.0, because things can be so much faster and nicer to work with.

So, what do you consider a good baseline? Still in fixed-function land? Completely shader-based? GL 1.1 + VBOs + FBOs? GL 1.5 + shader model 2? Or DSA all the way?

For reference, here's the latest Valve hardware survey (with the usual caveat that it's mostly gamer hardware): http://store.steampowered.com/hwsurvey

Offline princec

« Reply #1 - Posted 2011-11-30 11:48:00 »

The Valve hardware survey is massively biased toward hardcore gamers though.

My own stats show a slightly different - though encouraging - picture. If you're mostly interested in Windows and newer Macs, you can safely go for OpenGL 2.1+; if you want to get as many Macs as possible, you'll still need to target fixed function for another year or so (at a guess). Linux is still largely irrelevant and most of the drivers leave something to be desired anyway.

VBOs are universally supported and if you use them exactly right they work well and bug-free so far as I can see.
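The basic pattern is small enough to get exactly right. A minimal upload-once, draw-many sketch, assuming LWJGL 2 bindings (untested here, and the class name is purely illustrative):
Code:
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL15.*;

import java.nio.FloatBuffer;
import org.lwjgl.BufferUtils;

public class VboTriangle {
    private int vbo;

    /** Upload a single triangle once, at load time (GL context must be current). */
    public void create() {
        FloatBuffer data = BufferUtils.createFloatBuffer(6);
        data.put(new float[] { -1f, -1f,   1f, -1f,   0f, 1f }).flip();

        vbo = glGenBuffers();
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, data, GL_STATIC_DRAW);   // static: upload once, draw many times
        glBindBuffer(GL_ARRAY_BUFFER, 0);
    }

    /** Draw with the fixed-function pipeline (works on anything GL 1.5+). */
    public void draw() {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(2, GL_FLOAT, 0, 0L);   // 2 floats per vertex, tightly packed, offset 0 into the bound VBO
        glDrawArrays(GL_TRIANGLES, 0, 3);
        glDisableClientState(GL_VERTEX_ARRAY);
        glBindBuffer(GL_ARRAY_BUFFER, 0);
    }
}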

Cas Smiley

Offline Orangy Tang

« Reply #2 - Posted 2011-11-30 13:06:25 »

Yeah, the Valve survey is pretty slanted, but it's useful nonetheless. And everyone will have their own idea of how 'casual' or hardcore they're aiming at.

I'd also be interested in what kind of reliability people have found with the more exotic features. If I remember right, FBOs were annoyingly picky at first, but they seem to have settled down nicely now from what I've seen, and you mention something similar for VBOs.
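Much of the early pickiness was incomplete-framebuffer errors from attachment combinations a driver didn't like, which is why the completeness check is worth keeping. A minimal EXT_framebuffer_object sketch, assuming LWJGL 2 bindings (untested, names illustrative):
Code:
import static org.lwjgl.opengl.EXTFramebufferObject.*;
import static org.lwjgl.opengl.GL11.*;

/** Minimal EXT_framebuffer_object render-target setup with a completeness check. */
public final class FboUtil {
    private FboUtil() {}

    public static int createColourFbo(int width, int height) {
        // Colour texture the FBO will render into.
        int tex = glGenTextures();
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, (java.nio.ByteBuffer) null);

        int fbo = glGenFramebuffersEXT();
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
        glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, tex, 0);

        // This is the part the old drivers were picky about: always check completeness.
        int status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
        if (status != GL_FRAMEBUFFER_COMPLETE_EXT) {
            throw new IllegalStateException("FBO incomplete, status 0x" + Integer.toHexString(status));
        }
        glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
        return fbo;
    }
}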


Offline delt0r

« Reply #3 - Posted 2011-11-30 13:31:33 »

If Linux is irrelevant, then so is Mac, by your own numbers, princec, as well as the Humble Bundle numbers. As for drivers, meh... I recall having the same pain on Windows. Drivers are buggy.

I have found VBOs to work fine. I had some ATI problems at first, but that was because I had things not quite right and NVIDIA is more forgiving. I still don't use FBOs, but they are pretty old now as well. With so many games using shadow buffers and deferred rendering, I guess it's something that is well supported...

But my bone to pick with all this is performance. I have a pretty new NVS 300, which is a "work"-level card. It's new and it's pretty slow compared to an old 8800 GT or even a GeForce 7600; it has quite low fill rates compared to other cards. Laptops are even worse. I have stuff that runs in the 100s of fps on a desktop but won't get 5 fps on a laptop, yet the card should only be about 2-4 times slower, not 20x!

I hate to think what Android would be like with its variable performance.

Offline theagentd
« Reply #4 - Posted 2011-11-30 14:12:28 »

100% OpenGL 3.3+. Because it's how you're meant to use OpenGL these days. All fixed functionality is deprecated. Do you use JFrame.show()? No? Then don't use glBegin()-glEnd() either!

Offline delt0r

« Reply #5 - Posted 2011-11-30 14:17:45 »

Quite a lot of cards out there don't support OpenGL 3.x but still support 2.1+; for example, one of my desktops and my laptop.

Sure, if you're doing Rage or some other bleeding-edge thing. But nothing is more irritating than a sprite game that uses some super-new card feature when it simply didn't need to.

No offense, princec, but we are not going to upgrade our cards for RoTT or some such game. Getting us to part with our money is about as much as you can expect.

Offline princec

« Reply #6 - Posted 2011-11-30 14:19:14 »

Macs make up about 1/4 of my income but they also account for only 0.1% of my support time. (So: if I had to make my life easier, I'd ditch Linux)

A 20x slower card probably means you've hit an unaccelerated software path, not that the card is 20x slower necessarily, surely? Anyway, for anything anyone on this forum is ever likely to do (in the context of game development as a hobby or indie, using Java), even a work card is going to exceed your requirements these days. Revenge was tested on a hilariously poor integrated Nvidia on a 3-year-old laptop at the time and I still got it cranking out 60 fps.
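One cheap sanity check for that, sketched against LWJGL 2 bindings (the renderer strings are illustrative and vary by driver):
Code:
import org.lwjgl.opengl.GL11;

public final class DriverSanity {
    private DriverSanity() {}

    // "GDI Generic" is Windows' built-in software GL 1.1 renderer; llvmpipe/softpipe
    // are Mesa's software rasterisers. Warn or fall back if you land on one of these.
    public static boolean looksLikeSoftwareRendering() {
        String renderer = GL11.glGetString(GL11.GL_RENDERER).toLowerCase();
        String vendor   = GL11.glGetString(GL11.GL_VENDOR).toLowerCase();
        return renderer.contains("gdi generic")
            || renderer.contains("llvmpipe")
            || renderer.contains("softpipe")
            || vendor.contains("microsoft");
    }
}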

Cas Smiley

Offline princec

« Reply #7 - Posted 2011-11-30 14:23:44 »

Quote from theagentd:
100% OpenGL 3.3+. Because it's how you're meant to use OpenGL these days. All fixed functionality is deprecated. Do you use JFrame.show()? No? Then don't use glBegin()-glEnd() either!

No, no, no. That is not how this works, at all.

As developers we are advised which functions will either be removed in forthcoming versions or discouraged because there are better alternatives. However, these hints are entirely irrelevant when faced with the actual deployed software that is out there. A fairly huge proportion of systems are using OpenGL 2.1 or less, end of. You are not meant to use the 3.3+ programmable pipeline if you want your software to actually run.

Using glBegin()/glEnd() has been effectively redundant since OpenGL 1.1, when a faster alternative (vertex arrays) was introduced. As all machines support GL 1.1 or better, you'd be daft to use it now that it's been marked as deprecated.
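To make that concrete, here's the same triangle both ways - a minimal sketch assuming LWJGL 2 bindings, untested here:
Code:
import static org.lwjgl.opengl.GL11.*;

import java.nio.FloatBuffer;
import org.lwjgl.BufferUtils;

public class VertexArrayVsImmediate {

    // Immediate mode: one call per vertex attribute -- fine for a debug quad, terrible for thousands of sprites.
    static void drawImmediate() {
        glBegin(GL_TRIANGLES);
        glVertex2f(-1f, -1f);
        glVertex2f( 1f, -1f);
        glVertex2f( 0f,  1f);
        glEnd();
    }

    // Client-side vertex array: available since GL 1.1, one draw call for the whole batch.
    static final FloatBuffer TRI = (FloatBuffer) BufferUtils.createFloatBuffer(6)
            .put(new float[] { -1f, -1f,  1f, -1f,  0f, 1f }).flip();

    static void drawVertexArray() {
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(2, 0, TRI);          // 2 floats per vertex, tightly packed, type implied by the buffer
        glDrawArrays(GL_TRIANGLES, 0, 3);
        glDisableClientState(GL_VERTEX_ARRAY);
    }
}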

Cas Smiley

Offline theagentd
« Reply #8 - Posted 2011-11-30 14:33:29 »

Quote from delt0r:
Quite a lot of cards out there don't support OpenGL 3.x but still support 2.1+; for example, one of my desktops and my laptop.

Sure, if you're doing Rage or some other bleeding-edge thing. But nothing is more irritating than a sprite game that uses some super-new card feature when it simply didn't need to.

No offense, princec, but we are not going to upgrade our cards for RoTT or some such game. Getting us to part with our money is about as much as you can expect.

- The drivers for the new API are much easier to create and maintain for the graphics card companies, so they will have better support for a longer time in the future.
- The new API has better performance, both CPU- and GPU-wise, especially on low-end computers.
- Some functionality (like FBOs) is pretty much required for anything I'd make. Guess when it was promoted to core? OpenGL 3.0.
- It's the future. People are abandoning DirectX 9 now. DirectX 9 = OpenGL 2.x.
- Fewer Intel graphics cards to support / hack around for.
- Sampler objects. Just those should be enough to make you never go back to OpenGL 2.x ever again after using them once (a quick sketch below).
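For the sampler objects point, a quick sketch of what they look like - assuming LWJGL 2's GL33 bindings, untested, names illustrative:
Code:
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL12.*;
import static org.lwjgl.opengl.GL33.*;

// GL 3.3 sampler objects: filtering/wrapping state lives in the sampler, not the texture,
// so one texture can be sampled with different settings without re-specifying parameters.
public class SamplerExample {
    public static int createClampedLinearSampler() {
        int sampler = glGenSamplers();
        glSamplerParameteri(sampler, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glSamplerParameteri(sampler, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glSamplerParameteri(sampler, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glSamplerParameteri(sampler, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        return sampler;
    }

    public static void bind(int textureUnit, int sampler) {
        glBindSampler(textureUnit, sampler);   // overrides the bound texture's own parameters on that unit
    }
}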

At least use as much OpenGL 3.0-compliant code as possible. I do agree that using glBegin-glEnd is okay if you just want to draw a fullscreen quad... >_>


Quote from delt0r:
No offense, princec, but we are not going to upgrade our cards for RoTT or some such game. Getting us to part with our money is about as much as you can expect.

http://www.amazon.com/EVGA-GeForce-Express-Graphics-512-P3-1300-LR/dp/B004BQKQ8A/ref=sr_1_1?ie=UTF8&qid=1322662893&sr=8-1
I'll leave it as an exercise for the reader to find out when this card was released, and also to buy it if you have a weaker card than that in your desktop and hang around on Java-Gaming.org.

Quote from princec:
A fairly huge proportion of systems are using OpenGL 2.1 or less, end of. You are not meant to use the 3.3+ programmable pipeline if you want your software to actually run.

Wait, people on Macs play games? I didn't know! I thought they were busy with physics simulations (AKA that bird game).

Offline Orangy Tang

« Reply #9 - Posted 2011-11-30 14:34:51 »

Quote from theagentd:
100% OpenGL 3.3+. Because it's how you're meant to use OpenGL these days. All fixed functionality is deprecated. Do you use JFrame.show()? No? Then don't use glBegin()-glEnd() either!

If Khronos want to come and give everyone a proper 3.3 compliant card and drivers, *then* they get to decide which version of their API I use. But until that happens I'll use whatever version and extensions I damn well like.

I find your post unhelpful and unproductive.  Tongue

Offline princec

« Reply #10 - Posted 2011-11-30 14:35:38 »

You can find it unhelpful and unproductive all you like, but you don't write games for a living, so your advice and arguments aren't worth shit, I'm afraid.

Cas Smiley

Offline princec

« Reply #11 - Posted 2011-11-30 15:01:27 »

Reading that back I realise that sounds a little harsh, but the argument stands if not the mean sentiment.

Cas Smiley

Offline delt0r

« Reply #12 - Posted 2011-11-30 15:12:28 »

Harsh is an understatement... I would have used the words flamebait or troll  persecutioncomplex

Offline Cero
« Reply #13 - Posted 2011-11-30 15:23:57 »

What about writing code that works with 1.1, but that uses nicer features when it detects a higher version?
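In practice that means detecting capabilities once at startup and picking a render path. A sketch assuming LWJGL 2's ContextCapabilities - the enum and path names here are purely illustrative:
Code:
import org.lwjgl.opengl.ContextCapabilities;
import org.lwjgl.opengl.GLContext;

// Pick a render path once at startup, based on what the context actually exposes.
public enum RenderPath {
    GL11_IMMEDIATE, GL15_VBO, GL20_SHADERS, GL30_FBO_CORE;

    public static RenderPath detect() {
        ContextCapabilities caps = GLContext.getCapabilities();
        if (caps.OpenGL30) return GL30_FBO_CORE;
        if (caps.OpenGL20) return GL20_SHADERS;
        if (caps.OpenGL15 || caps.GL_ARB_vertex_buffer_object) return GL15_VBO;
        return GL11_IMMEDIATE;   // lowest common denominator, still works everywhere
    }
}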

Offline delt0r

« Reply #14 - Posted 2011-11-30 15:31:32 »

It's more work if you really want more than 1.1. Seriously, 2.1 is old now and is everything you really need for sprites and nice 3D with standard shaders; GL 3.x, not so much.

Let's not forget that Quake 3 ran on TNT cards. princec is right: anything that is matched to the kind of art assets we can realistically produce should run well on pretty much anything.

Till you go mobile of course.

Offline princec

« Reply #15 - Posted 2011-11-30 15:42:21 »

Sorry, been in "not mincing my words" mode since yesterday about our twatbag civil service having a massive tantrum and walk-out because there's not enough money in the country to keep them in their Cyprus holiday homes into their retirements as they had been led to believe 20 years ago.

Cas Smiley

Offline theagentd
« Reply #16 - Posted 2011-11-30 16:34:53 »

Quote from delt0r:
It's more work if you really want more than 1.1. Seriously, 2.1 is old now and is everything you really need for sprites and nice 3D with standard shaders; GL 3.x, not so much.

Let's not forget that Quake 3 ran on TNT cards. princec is right: anything that is matched to the kind of art assets we can realistically produce should run well on pretty much anything.

Till you go mobile of course.

The thread title is "Getting out of the stone age". OpenGL 2.1 is the stone age. OpenGL 1.1 is more about dinosaurs than computers.

Offline Orangy Tang

« Reply #17 - Posted 2011-11-30 17:40:35 »

Quote from delt0r:
Harsh is an understatement... I would have used the words flamebait or troll  persecutioncomplex

Factually inaccurate too, but that's never worth getting in the way of a good rant.

Offline lhkbob

« Reply #18 - Posted 2011-11-30 18:29:22 »

I've been targeting 1.5 and implementing 2.0+ features using extensions or 3.x APIs when they are detected, or disabling the feature otherwise.
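In sketch form that pattern might look like the following - a hypothetical wrapper (not lhkbob's actual code), assuming LWJGL 2 bindings: core GL 3.0 framebuffers where available, EXT_framebuffer_object otherwise, and the feature reported as unsupported if neither is present:
Code:
import org.lwjgl.opengl.ContextCapabilities;
import org.lwjgl.opengl.EXTFramebufferObject;
import org.lwjgl.opengl.GL30;
import org.lwjgl.opengl.GLContext;

// Prefer core GL 3.0 framebuffers, fall back to EXT_framebuffer_object on 1.5/2.x drivers.
public final class Framebuffers {
    private Framebuffers() {}

    public static boolean supported() {
        ContextCapabilities caps = GLContext.getCapabilities();
        return caps.OpenGL30 || caps.GL_EXT_framebuffer_object;
    }

    public static int gen() {
        ContextCapabilities caps = GLContext.getCapabilities();
        if (caps.OpenGL30) return GL30.glGenFramebuffers();
        if (caps.GL_EXT_framebuffer_object) return EXTFramebufferObject.glGenFramebuffersEXT();
        throw new UnsupportedOperationException("No FBO support - disable the feature");
    }
}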

Offline princec

« Reply #19 - Posted 2011-11-30 19:17:04 »

Orangy - I think I just made a very odd brain mistake back there and do apologise - I could have absolutely sworn that the post I was replying to was written by theagentd.

Cas Smiley

Offline Orangy Tang

« Reply #20 - Posted 2011-11-30 20:05:40 »

Quote from princec:
Orangy - I think I just made a very odd brain mistake back there and do apologise - I could have absolutely sworn that the post I was replying to was written by theagentd.

Just blame it on Riven's new post caching or something. Wink (Sorry Riven!)

Moving on - does anyone have any experience with multiple render targets (MRTs)? They're something I'd really like to use since they could really cut down on fill rate for certain effects, but I suspect they're still too bleeding-edge to be used reliably.

Offline delt0r

« Reply #21 - Posted 2011-12-01 09:48:06 »

I have used MRTs and found that, on my older hardware and on Linux, they are reliable. I was using 4 MRTs and my performance was about the same, as in my total fill rate was about the same as with 1 RT. I did not do extensive testing on anything else.

I was doing a 4x4 GPU DCT transform, so the shader was not trivial. Most fragment shaders would do a little less, I would think, so I was probably shader fill-rate limited.

Offline Orangy Tang

« Reply #22 - Posted 2011-12-01 09:56:13 »

That's useful to know - maybe the explosion of deferred renderers has pushed up the reliability of MRTs. Do you remember what hardware it was? It sounds like the secondary render targets were almost free in terms of fill rate?

Offline princec

« Reply #23 - Posted 2011-12-01 10:03:34 »

How do these MRTs work, and what do you use them for?

Cas Smiley

Offline Orangy Tang

« Reply #24 - Posted 2011-12-01 10:10:56 »

I'm sure delt0r can correct me, but the idea is that you can render a triangle with a fragment shader once, but instead of the fragment shader outputting one colour into one destination surface, you can output two (or more) separate colours to two (or more) separate destination surfaces attached to the same FBO.

For example, in Rescue Squad I render all my sprites twice, first to get a fullbright version, then again to get a lightmap version. With MRTs I could draw all that geometry once rather than twice, and from the sounds of it drastically cut my fill rate usage.
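The setup side looks roughly like this - a sketch assuming LWJGL 2 and the GL 3.0 core framebuffer entry points (the texture arguments are placeholders, untested here):
Code:
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL20.*;
import static org.lwjgl.opengl.GL30.*;

import java.nio.IntBuffer;
import org.lwjgl.BufferUtils;

// One FBO, two colour attachments: the fragment shader writes gl_FragData[0] (e.g. fullbright)
// and gl_FragData[1] (e.g. lightmap) in a single pass over the geometry.
public class MrtSetup {
    public static int createTwoTargetFbo(int colourTexA, int colourTexB) {
        int fbo = glGenFramebuffers();
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, colourTexA, 0);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1, GL_TEXTURE_2D, colourTexB, 0);

        // Tell GL which attachments the shader outputs map to.
        IntBuffer drawBuffers = BufferUtils.createIntBuffer(2);
        drawBuffers.put(GL_COLOR_ATTACHMENT0).put(GL_COLOR_ATTACHMENT1).flip();
        glDrawBuffers(drawBuffers);

        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
            throw new IllegalStateException("MRT FBO incomplete");
        }
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        return fbo;
    }
}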

Offline delt0r

« Reply #25 - Posted 2011-12-01 10:21:25 »

Yes. You just attach more than one texture to the FBO as destinations. In the fragment shader you say what gets written to which attachment. The idea, at least for me, was to reduce passes, and hence texture fetches.

However, when I say the fill rate was the same, I mean the total was the same, i.e. I used four MRTs and hence my total fill rate is the number of pixels per second x4. So my fill rate was no different from 4 passes, in theory - except perhaps that if I tested that properly I would lose out to texture-fetch bandwidth limits.

The old hardware was GeForce 8800 and 7600 sort of things. The new stuff I don't remember, but it was pretty bleeding edge for this time last year, since we got the cards at work for CUDA stuff. Again, all Linux drivers - we just don't have any Windows machines at work.

Offline theagentd
« Reply #26 - Posted 2011-12-01 10:28:23 »

Quote from princec:
Orangy - I think I just made a very odd brain mistake back there and do apologise - I could have absolutely sworn that the post I was replying to was written by theagentd.

Quote from princec:
You can find it unhelpful and unproductive all you like, but you don't write games for a living, so your advice and arguments aren't worth shit, I'm afraid.

So that was directed at me, huh? Sad

(I realize that people are flaming everyone by "mistake" here, so I'm not trying to douse you all in gasoline by answering.)

Keep in mind that we have different interests in programming. I'd love to see you implement shadow map ray-traced volumetric lighting using OpenGL 1.1. Or deferred shading. My interest in game making is graphics, and I also think that people should use the most powerful tools they have available. I've stated my arguments for OpenGL 3.3. I'm also only targeting hardware that can actually run my stuff at 10+ FPS. I'm also ignoring Intel.

So I'm sorry, Intel card owners and Mac owners. No ray-traced volumetric lighting effects for you. Or GPU accelerated particle effects. Or deferred shading with MSAA. I'm getting sick of being told "YOU CAN'T MAKE GOOD GAMES SO USE OPENGL 1.1 AND JAVA 1.0". I want to get into the gaming industry after university (5.5 years in the future), but I want to do game graphics most of all. I want to show my OpenGL 3.3 compliant demo with tessellation and deferred shading when I apply for a job.

I now realize that I actually AM dousing you all in gasoline, but whatever.

Quote from Orangy Tang:
That's useful to know - maybe the explosion of deferred renderers has pushed up the reliability of MRTs. Do you remember what hardware it was? It sounds like the secondary render targets were almost free in terms of fill rate?

Uhm, I'm pretty sure StarCraft 2 uses deferred shading (and obviously MRTs) on anything above the absolute lowest setting, and have you heard of any problems related to that? Driver bugs in up-to-date OpenGL 3.3 drivers are pretty much a myth, IMHO.

First of all, MRTs cost no fill rate at all. It's the same pixel, so no additional coverage checks (= filling pixels) are done, which is the whole point of MRTs. Bandwidth, however, is increased linearly by MRTs. HOWEVER, this doesn't matter, because...

To spew some technical reasoning about the fill rate: how many new commercial games do not use deferred shading nowadays? Don't you think the graphics card makers have adapted? You can show this to yourself by enabling 8xMSAA in a forward-rendering game: your FPS will most likely not even drop to 3/4 of what you get with no MSAA (assuming a realistic test). Why? Because your graphics card has much more bandwidth and fill rate than it needs for basic forward rendering. HDR rendering plus deferred shading means three or four 16-bit floating-point RGBA render targets. Add antialiasing, and you multiply both the fill rate (subsamples) and the bandwidth needed, and you STILL don't get a linear drop in FPS. I'll even dare say that ALL graphics cards are unable to use their full hardware potential without antialiasing and/or deferred shading.
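To put back-of-envelope numbers on that (assuming a 1920x1080 target, purely illustrative):

  4 RGBA16F render targets = 4 x 8 bytes  = 32 bytes per pixel
  1920 x 1080              ~ 2.07 Mpixels
  G-buffer write per frame ~ 2.07M x 32   ~ 66 MB
  at 60 fps                ~ 4 GB/s

which is a small fraction of the 50-100+ GB/s of memory bandwidth even mid-range cards of this era have.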

Anyway, if you need help with setting up MRTs, I'm your man. xD

Offline princec

« Reply #27 - Posted 2011-12-01 10:54:22 »

Yes, it was, but if you start swapping names around in the thread you can see the whole thing is a conflation of errors (my humblest apologies). As you never said what I thought you said, and Orangy never said what I thought he said, my response was completely random...

I do realise you've got a completely different objective, and a worthy one at that: it is a very good idea to put together a portfolio of awesome graphics programming if you really want a job in the industry doing that. Orangy has a slightly different objective here though (not least coz he's already in the industry Wink). Orangy's targets are generally about 2-3 years behind the Steam demographic target (and you should be aware that the Steam demographic is heavily skewed, and not only that, but has a sharp divide along the Mac OS line last time I looked). Just because an API is new and an old version is deprecated does not mean the old one is no longer in use; in fact it's probably the older API that you'd want to target if you want to actually release something. There is a reason why World of Warcraft remains at #1, and it's because it runs on just about anything, as it relies on very low specifications. So that is something to bear in mind - it's probably useful for you to know that fancy effects are only half of getting a job in the industry: you'll need to know how to fall back without them at some point.

Cas Smiley

Offline Orangy Tang

« Reply #28 - Posted 2011-12-01 10:55:09 »

Quote from theagentd:
So I'm sorry, Intel card owners and Mac owners. No ray-traced volumetric lighting effects for you. Or GPU accelerated particle effects. Or deferred shading with MSAA.

Which is where we differ, I'm afraid. You seem to be aiming only at the high-end hardcore gamer spec, whereas Cas and I both value being able to run on Macs and on Intel chips. Across the entire forum, you and Cas are probably at the extreme ends of the spectrum. Smiley The whole point of this thread is that I want to move to something more in between and find the sweet spot of functionality and compatibility. Lurching to either extreme isn't particularly practical, I feel.

Edit: Wot Cas said.

Offline theagentd
« Reply #29 - Posted 2011-12-01 11:00:50 »

Quote from Orangy Tang:
You seem to be aiming only at the high-end hardcore gamer spec, whereas Cas and I both value being able to run on Macs and on Intel chips.

Dammit. Please stop calling anything at or above the level of a GeForce 8000-series card high-end already. My friend's $600 laptop has DX11 support, for god's sake. You're targeting the lowest of low-end, but I am NOT targeting high-end only. There's a middle ground... -_-'
