What's the difference between a graphics fps and logic fps?
Posted on 2004-08-26
Recently I developed a simple vertical shooting game using DirectX. The game itself is written in C++. I've set the per-frame delay to 20 ms, which means it runs at 50 frames per second.
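For reference, here is roughly the kind of loop I mean. This is a simplified sketch rather than my actual code, and UpdateLogic / RenderFrame just stand in for my real functions:

#include <windows.h>

void UpdateLogic();   // move the ship, bullets, enemies, check collisions
void RenderFrame();   // blit everything to the back buffer and flip

const DWORD FRAME_DELAY_MS = 20;  // 20 ms per frame = 50 fps

void RunGame()
{
    for (;;)
    {
        DWORD frameStart = GetTickCount();

        UpdateLogic();
        RenderFrame();

        // sleep away whatever is left of the 20 ms slot
        DWORD elapsed = GetTickCount() - frameStart;
        if (elapsed < FRAME_DELAY_MS)
            Sleep(FRAME_DELAY_MS - elapsed);
    }
}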
However, I've compared how my game's graphics fps feels against how Diablo 2's feels, and found that at 50 fps mine felt the same. P.S. Diablo 2 runs at 25 fps.
My game runs at 800x600 in 16-bit colour, while Diablo 2 should be 800x600 in 8-bit colour. (Correct me if I'm wrong, but please only do so if you're absolutely sure; from my experience I don't think D2 is 16-bit.)
Note that I'm absolutely sure my game isn't blitting each frame twice. You might suggest that running in 16-bit colour is what slows it down, but I've set my graphics blitting to 100 fps and it runs perfectly, even on a 1.7 GHz Celeron... though I've yet to try it on a slower machine.
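In case the distinction in my title is unclear, this is roughly what I mean by pushing the blit rate up independently of the logic rate. Again a sketch only; the names and structure are simplified and my real code may not separate the two this cleanly:

#include <windows.h>

void UpdateLogic();   // game state updates
void BlitFrame();     // draw the current state to the screen

const DWORD LOGIC_INTERVAL_MS = 20;  // 50 logic updates per second
const DWORD BLIT_INTERVAL_MS  = 10;  // 100 blits per second

void RunGame()
{
    DWORD lastLogic = GetTickCount();
    DWORD lastBlit  = GetTickCount();

    for (;;)
    {
        DWORD now = GetTickCount();

        if (now - lastLogic >= LOGIC_INTERVAL_MS)
        {
            UpdateLogic();
            lastLogic = now;
        }

        if (now - lastBlit >= BLIT_INTERVAL_MS)
        {
            BlitFrame();
            lastBlit = now;
        }

        Sleep(1);  // yield a little CPU between checks
    }
}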
The question is, can anybody clearly explain why this is happening? If you watch carefully, D2's movement is somewhat jerky if you're strict about it, but it's acceptable at 25 fps, since VCDs run at around 30 fps...
If 25 fps is only slightly jerky, 50 fps should be significantly smoother, yet in my game I see the same jerkiness as in Diablo 2.
I would really appreciate some enlightenment on this matter.