Game Programming
--
Questions
--
Followers
Top Experts
as far as blocking is concerned. On ATI cards, the SwapBuffers command appears to be a blocking call, which caused the original problem. OpenGL applications
using less than 5% CPU on a machine with an NVIDIA card were using 100% on a similar machine with an ATI card.
I introduced an active wait in my render loop, measuring the render time and trying to sleep the process till just before the vertical retrace, or, if that is disabled, till the requested
framerate. This works sometimes, though there are some quirks. For starters, the Windows thread-switching granularity and the precision of the timers are very important,
but one can work around that, and so I did. The result was that the same reference app was now using about 20 to 25% CPU on systems with ATI cards.
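A minimal sketch of such a limiter in portable C++ (my own helper names; std::chrono stands in for the Windows timers mentioned above): sleep away most of the interval coarsely so other threads get the CPU, then spin only the last couple of milliseconds to beat the scheduler granularity.

```cpp
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

// Wait until 'deadline': coarse sleeps first, then a short busy-wait
// so the ~15 ms Windows scheduler granularity doesn't overshoot it.
void wait_until(Clock::time_point deadline)
{
    const auto spin_margin = std::chrono::milliseconds(2);
    // Coarse phase: give the CPU back to other threads.
    while (Clock::now() + spin_margin < deadline)
        std::this_thread::sleep_for(std::chrono::milliseconds(1));
    // Fine phase: brief spin for sub-millisecond accuracy.
    while (Clock::now() < deadline) { /* spin */ }
}

// One iteration of a render loop capped at 'target' frame time.
// Returns the start time of the next frame.
Clock::time_point limit_frame(Clock::time_point frame_start,
                              std::chrono::microseconds target)
{
    wait_until(frame_start + target);
    return Clock::now();
}
```

In the real loop you would call limit_frame after SwapBuffers with roughly 16667 us for a 60 Hz cap; the 2 ms spin margin is a guess that would need tuning per machine.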
But switching from the reference app to something more intensive demonstrated yet another problem. It appears that many OpenGL commands can block the CPU on
an ATI-equipped machine. On NVIDIA, the commands return immediately, whereas on ATI the CPU blocks on unpredictable OpenGL commands. NVIDIA has the NV_fence
extension that allows me to do fine-grained synchronisation, so I have no problems there .. but there's no equivalent on ATI, so I'm all out of ideas as to how to get
reasonable CPU usage on ATI-equipped systems.
I was also wondering how triple buffering fits into this picture.
However, it is possible that if you have VSYNC turned on, SwapBuffers becomes a sync point for ATI (and for some reason not with NV). I can check with the ATI engineering folks if this is a problem. I assume you are using a modern (Radeon) ATI card, and the latest drivers? I also assume you are running double-buffered applications. Windowed or fullscreen might make a difference. Any other 'special' extensions being used might make a difference.
You should never wait-state yourself. The drivers should take care of VSYNC if you in fact do want it, and the swap should happen IN HARDWARE, in fullscreen at least. Possible that windowed-mode apps work differently with VSYNC enabled.
d
I am using OpenGL 1.4 with many extensions, but the problems occur even in very simple setups. Drawing a few primitives and calling SwapBuffers will yield 100% CPU
usage on ATI cards, 0% on NVIDIA .. I could send you samples, both source and binaries, but it's really too trivial ..
I've been snooping around on Google; there's an article somewhere about the Radeon Linux drivers that says they do an active wait till vsync because of timing issues ..
I couldn't find anything about the situation on Windows though ..
I'm running my apps full screen, but there's no way of telling that to OpenGL, so unless the driver concludes it from the DC size, I guess the situation is the same windowed
or full screen .. Maybe ATI always has triple buffering enabled, causing my app to generate too many frames, eventually resulting in a blocking call (if the command buffers
are full) ..
This problem has been puzzling me for a couple of weeks now; I just want my simple OpenGL applications not to clog up my CPU ..
Both vendors have some sort of frame-ahead buffering of the rendering queue. That's so when they start going really fast, they can keep processing things optimally. It may be that you are rendering something so small, so rapidly, that the ATI driver is doing a 'hard wait', while the NV driver is doing a 'soft wait' -- i.e., the ATI driver does a spinloop of sorts, while the NV driver still has control of the process tree but is giving up cycles to other threads. Or you have a call that is instigating the issue.
I'd be more than happy to look at a small sample -- and if it's really doing something that seems incorrect, I can forward it to my friends at ATI and get a 'proper answer' for why you are getting these particular results.
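The hard-wait/soft-wait distinction, sketched in portable C++ (a simplified model that assumes the driver polls some completion flag; the function names are mine, not driver internals):

```cpp
#include <atomic>
#include <chrono>
#include <thread>

// "Hard wait": spins until the flag flips, burning a full core
// the entire time -- the 100% CPU symptom described above.
void hard_wait(const std::atomic<bool>& done)
{
    while (!done.load()) { /* spin */ }
}

// "Soft wait": polls the same flag but sleeps between polls, so the
// scheduler can hand the CPU to other threads in the meantime.
void soft_wait(const std::atomic<bool>& done)
{
    while (!done.load())
        std::this_thread::sleep_for(std::chrono::milliseconds(1));
}
```

Both return at (roughly) the same moment; the only difference is what the CPU does while waiting, which is exactly why one driver shows 0% and the other 100% for the same frame rate.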
d






be without the _nospamn_ of course .. That way I have your email, as I can't find it here anywhere ..
The GPU isn't doing any work. All you do every frame is clear the buffer, which is an async flush in the hardware. The NV driver probably does some sort of sleep internally, sleeping your process, if you get too far ahead of it. The ATI driver is apparently hard-waiting when you get too far ahead.
The moral of the story is that you are running at like 100,000 FPS or something extremely high, and at that point the driver is actually getting bottlenecked on the pure speed of requests, and does a blocking wait at some point. When you start to add in real rendering code (and yes, triple buffering yourself), you shouldn't see any such bottleneck -- you'll always be doing enough work, either in GPU rendering or app-side logic, that you'll be below, say, 300 fps (or, more realistically, on the average machine, below 60 fps).
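A toy sketch of that frame-ahead queue (class and capacity are my own invention, not either vendor's actual driver): the app thread submits frames, and once the queue is full the submit call blocks until the simulated GPU retires one -- the same kind of stall you'd hit inside an arbitrary GL call at 100,000 fps.

```cpp
#include <condition_variable>
#include <cstddef>
#include <deque>
#include <mutex>

class FrameQueue {
public:
    explicit FrameQueue(std::size_t capacity) : cap_(capacity) {}

    // App side: blocks when 'capacity' frames are already queued.
    void submit(int frame_id)
    {
        std::unique_lock<std::mutex> lk(m_);
        not_full_.wait(lk, [&] { return q_.size() < cap_; });
        q_.push_back(frame_id);
    }

    // GPU side: retires the oldest frame (assumes the queue is
    // non-empty), waking any submit blocked on a full queue.
    int retire()
    {
        std::unique_lock<std::mutex> lk(m_);
        int id = q_.front();
        q_.pop_front();
        not_full_.notify_one();
        return id;
    }

    std::size_t pending() const
    {
        std::lock_guard<std::mutex> lk(m_);
        return q_.size();
    }

private:
    mutable std::mutex m_;
    std::condition_variable not_full_;
    std::deque<int> q_;
    std::size_t cap_;
};
```

Whether that blocked submit spins or sleeps is the driver's choice, which is the hard-wait versus soft-wait difference discussed above.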
Try adding some basic amount of rendering to the Draw function. Grab another few NeHe samples and see what it looks like.
If with 'real world' rendering submissions you continue to see an issue, you've got my email: ping me, and I'll DEFINITELY make sure that ATI takes a look. Include a DXDIAG dump, as they'll want to see that.
Hope that helps!
-d
www.chait.net
Furthermore, this app only runs free if you have vsync disabled. If VSync is enabled, it should run at exactly
the refresh rate, which it does, BUT .. it uses 100% CPU while waiting for the sync. So you call SwapBuffers,
and it not only locks the rendering thread (I'm fine with that), but it does not allow _ANY_ thread to execute,
as the driver is performing an active wait (resulting in 100% CPU usage .. no matter how little or much you are
drawing ..).
But I'll send you a somewhat more extensive example that loads some models and does some rendering ... I
really want to get to the bottom of this ...
grtz ..

Make sure to send me a DXDIAG too, just in case they want that.
-d
So you can expect that in your mailbox one of these days..
enki