I'm thinking of making a low-resolution scrolling 2D game, and the only way to have perfectly smooth scrolling is to use a fixed frame rate synced to the vertical refresh (or an integer multiple thereof). The problem is choosing the best frame rate.
60Hz would work on every monitor, but it is disgustingly flickery on CRTs. Running 60Hz animation on a 120Hz refresh CRT would also scroll smoothly, but at the cost of higher CPU use. I don't anticipate CPU speed being a problem, but I don't think many people have monitors that can refresh that fast.
72Hz is fast enough to avoid flicker on CRTs, but now we have problems with LCDs. Not all LCDs can refresh at 72Hz, and some that claim to support it only fake it, throwing out frames to achieve a real 60Hz refresh, which would defeat the whole point of having a fixed frame rate.
I could implement a fallback variable frame rate mode, but really I'd like smooth, flicker-free scrolling on as many systems as possible. The decision of which frame rate to use therefore depends on which is more common: CRTs that can do 120Hz at 640x480, or LCDs that can do 72Hz (or 70Hz, which is close enough that the slowdown would be tolerable). Which do you think would best achieve the goal?
First, syncing to the frame update is tricky on modern OSes. You might not be able to do it.
Second, depending on the type of game, you might not need to fix the frame rate at all. You can keep all positions as floating point values, and measure the length of the previous frame to figure out how far things should move during one frame. This is easy for certain game types (those with lots of flying smooth motions) and much trickier for others (platformers and such). Then, just use whatever frame rate is appropriate on your current system. Or just go as fast as you can. You might get some tearing, but in practice it won't be very bad.
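A minimal sketch of that, in C; clock_gettime() is POSIX, and draw_frame() is a placeholder for whatever renderer you use:

    #include <time.h>

    void draw_frame(float x);            /* hypothetical renderer */

    static double now_seconds(void)      /* POSIX monotonic clock */
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec + ts.tv_nsec / 1e9;
    }

    float player_x = 0.0f;               /* positions stay floating point */
    const float speed = 120.0f;          /* pixels per second, not per frame */

    void run(void)
    {
        double prev = now_seconds();
        for (;;) {
            double now = now_seconds();
            float dt = (float)(now - prev);   /* length of the last frame */
            prev = now;

            player_x += speed * dt;      /* same apparent speed at any rate */
            draw_frame(player_x);
        }
    }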
Syncing to the frame update is possible on Windows, and with most graphics drivers on Linux.
Variable frame rate will never look as good as fixed frame rate for scrolling 2D games until we have very high minimum refresh rates (>120Hz) and enough CPU power to do high quality temporal anti-aliasing (motion blur). Designing a game for a fixed frame rate allows you to set movement rates to a consistent integer number of pixels per frame, which is the temporal equivalent of hinting in fonts. The only question is whether the base frame rate should be 60Hz or 72Hz, which depends only on whether 72Hz-capable LCDs are more or less common than 120Hz-capable CRTs.
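To be concrete, this is all the inner loop needs; draw_background() and swap_buffers_vsynced() are placeholders for whatever renderer and blocking page flip you have:

    void draw_background(int scroll);    /* hypothetical renderer */
    void swap_buffers_vsynced(void);     /* hypothetical: blocks until retrace */

    int scroll_x = 0;

    /* Called once per 60Hz frame: the scroll advances a whole number of
       pixels, so every frame lands exactly on a pixel boundary. */
    void frame(void)
    {
        scroll_x += 2;                   /* exactly 2 pixels per frame */
        draw_background(scroll_x);
        swap_buffers_vsynced();
    }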
I've written games that use variable framerate scrolling in 640x480, and as long as you run at a high frame rate, it looks as smooth as anything. Of course, this is for variable-speed smooth motions, not for things moving at constant rates.
Also: You really shouldn't be designing based on CRT specs in this day and age. CRTs are most definitely on the way out, and LCDs are gaining very strongly.
...which is disappointing, because other than in size, sharpness, geometry distortion and power consumption, LCDs are inferior in every way. But if designing for the future, then I think 72Hz is reasonable as a common LCD refresh rate. TV manufacturers will want support for integer multiples of 24, and 48Hz is too slow for computer use, so 72Hz is the lowest refresh rate good for dual-purpose use.
I love my 200Hz vertical refresh CRT, even with its crappy geometry.
I love using my LCD in portrait mode. It's awesome. Can't really do that with CRTs.
I love that my LCD fits on my desk, as opposed to pretty much every CRT since 2000 or so.
Anyway, since we're talking LCDs, that also means you don't really want your game to run at a non-native resolution. Scaling to a fixed resolution is iffy, and you might be better off running in a window and using 2x2 or 3x3 pixels. Which means you get problems syncing to the refresh rate, again.
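The blow-up itself is cheap to do in software, something like this over raw 32-bit pixel buffers (buffer layout assumed, nothing library-specific):

    #include <stdint.h>

    /* Nearest-neighbour 2x2 upscale: each source pixel becomes a 2x2
       block in dst, which must be (2*w) x (2*h) pixels. */
    void scale2x(const uint32_t *src, uint32_t *dst, int w, int h)
    {
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                uint32_t p = src[y * w + x];
                uint32_t *d = dst + (2 * y) * (2 * w) + 2 * x;
                d[0] = p;                /* top-left     */
                d[1] = p;                /* top-right    */
                d[2 * w] = p;            /* bottom-left  */
                d[2 * w + 1] = p;        /* bottom-right */
            }
        }
    }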
I've been using LCDs for the past six years. No, you're not getting me to use CRTs ever again. I haven't had eyestrain since I ditched the horrid things.
Anyway,
> Running 60Hz animation on a 120Hz refresh CRT would also scroll smoothly, but at the cost of higher CPU use.
How? If you've filled your buffers, execute a blocking vsync() and watch the CPU use drop.
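With SDL2 and OpenGL as one concrete option (a sketch: whether the swap really blocks is up to the driver, and `window`, `running` and the game logic are assumed to exist):

    SDL_GL_SetSwapInterval(1);           /* request vsync */
    while (running) {
        update_and_draw();               /* hypothetical game logic */
        SDL_GL_SwapWindow(window);       /* typically blocks in the driver
                                            until retrace, so the CPU sleeps */
    }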
> Which do you think would best achieve the goal?
I think you're worrying too much. This isn't some FPS, so I highly doubt a person will notice 1/120, 1/72, or even 1/50th of a second difference in a side-scroller. Particularly a low-res one.
Games nowadays don't tie their logic to timed loops or the frame rate. You shouldn't either.
>>8
Now that is a problem, as LCDs typically have an awkward 5:4 native aspect ratio. On Linux arbitrary resolutions are no problem, but is it possible for a program to change to a 640x512 resolution on Windows? The game could then be run letterboxed at the correct aspect ratio, with scaling to an integer multiple of the most common native resolution, which shouldn't look bad. If not, then I'd have to depend on fast enough hardware scaling to run at full resolution, which might not be a problem (but this needs some testing).
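If I'm reading the Win32 docs right, ChangeDisplaySettings() with CDS_TEST would at least let the game probe for the mode without switching (untested sketch; the driver only accepts modes it actually lists, so 640x512 may simply be refused):

    #include <windows.h>

    /* Returns nonzero if the driver claims to support w x h fullscreen. */
    BOOL mode_available(int w, int h)
    {
        DEVMODE dm;
        ZeroMemory(&dm, sizeof dm);
        dm.dmSize       = sizeof dm;
        dm.dmPelsWidth  = w;
        dm.dmPelsHeight = h;
        dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;
        return ChangeDisplaySettings(&dm, CDS_TEST) == DISP_CHANGE_SUCCESSFUL;
    }

    /* mode_available(640, 512) -- expect failure on most setups */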
> How? If you've filled your buffers, execute a blocking vsync() and watch the CPU use drop.
At 60Hz vsynced, you've got 1/60th of a second to draw each frame; at 120Hz, only 1/120th, so twice as much CPU is required, unless you use triple buffering, which is unacceptable because it adds an extra frame of latency.
> Games nowadays don't tie their logic to timed loops or the frame rate. You shouldn't either.
Unfortunately, there is no other way to get smooth scrolling graphics. The judder is actually much less noticeable in an FPS, but it is very noticeable in 2D scrolling games.
You said, "Running 60Hz animation on a 120Hz refresh CRT would also scroll smoothly, but at the cost of higher CPU use."
Was that a misprint? I can't discuss what I'm not clear about.
> unless you use triple buffering which is unacceptable because it adds an extra frame of latency.
At 60Hz that's 1/60th of a second. Either you're superhuman, or you're overestimating what humans are capable of. I highly doubt anybody could notice that on a side-scroller, least of all one with poor resolution. How fast is the scene moving?
> Unfortunately, there is no other way to get smooth scrolling graphics.
You could have fooled me. I used to write side-scrollers on a 286 and a 486, and I never noticed this. Admittedly, I tended towards dirty-buffering over triple- or even double-buffering, but having played games with triple-buffering I just can't buy this.
On the other hand, lolocaust has proven I suck as a twitch gamer, so maybe something is wrong on my end.
The fastest possible human reaction time, with perfect mental state, is about 100ms (Olympic sprinters can occasionally approach this when reacting to the gun). One frame at 60Hz is 16ms, which is a significant fraction of the fastest possible time. 160ms is more plausible for a good twitch gamer (since it's mostly reacting to sight rather than sound, and visual reactions are slightly slower), and I can't think of any such person who would voluntarily slow their reactions by 10%.
This isn't really an issue, as I don't think triple buffering will be necessary anyway.
I have to agree, you're exaggerating wildly now. 10% of your fastest reaction time is "significant"? That's just crazy.
And I don't see how running at 120 Hz would make any difference CPU-wise in the first place. Why would you have to finish your frame in half the time all of a sudden? You can just sync to every other frame instead of every frame.
Not at all. First assume that there are two players of equal skill (tactical ability, mind games, etc.) and equal reaction speed competing in a 1v1 game that requires fast reactions, e.g. a first-person shooter. Reaction time will be variable, and it will follow a probability distribution. This won't be normally distributed because there's a fixed lower bound (call it 150ms); maybe something like a log-normal distribution, i.e. something with fairly heavy skew, a finite lower bound, and an infinite upper bound.
Because the gamers are concentrating hard, they will mostly be operating near peak reaction time, so I don't expect much variance. Now if we offset the distribution curve of one of the gamers by 16ms, and look at the difference/intersection of the area under the curves:
(p)
|     __ __
|    /  X  \
|   |  |\__\__
|   | /%%%%\__\____
|___|%%%%%%%%%%\_ \___ (time)
The unshaded area is a guaranteed win for the unlagged player in a mutual surprise situation. This matches what happens in FPSes with server-side hit detection, where the "LPBs" almost always win when two players of equal skill surprise each other. 16ms might not be much compared to 160ms, but it is compared to the variance of reaction time, which is the important thing. These small differences in lag do have a significant gameplay effect, which is why almost every modern FPS now uses client-side (but usually checked on the server for anti-cheating) hit detection.
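If you don't buy the picture, it's easy to simulate. A quick Monte Carlo in C, where the 150ms floor and the shape of the tail are illustrative guesses and 16ms is one frame at 60Hz:

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    /* One reaction time in seconds: a 150ms floor plus a log-normal
       tail.  The 0.040 scale and 0.5 sigma are made-up parameters. */
    static double reaction(void)
    {
        double u1 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
        double u2 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
        double z  = sqrt(-2.0 * log(u1)) * cos(6.2831853 * u2); /* Box-Muller */
        return 0.150 + 0.040 * exp(0.5 * z);
    }

    int main(void)
    {
        int wins = 0, trials = 1000000;
        for (int i = 0; i < trials; i++)
            if (reaction() < reaction() + 0.016)  /* opponent lagged 16ms */
                wins++;
        printf("unlagged player wins %.1f%% of mutual surprises\n",
               100.0 * wins / trials);
        return 0;
    }

How far above 50% the unlagged player lands depends entirely on the variance you assume, which is exactly the point.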
Also, I don't know any reliable cross-platform method of syncing to every other refresh.
> The fastest possible human reaction time, with perfect mental state, is about 100ms
It seems a bit closer to a mean of 150ms for professional athletes, who incidentally aren't performing a complex action. The gun simply means "go", so there's nothing to decide.
Human beings have horridly slow reactions the moment you start involving the cerebrum. Even something simple takes at least ~400ms, even with several repetitions. And computer gaming most certainly involves the higher brain. Should I jump? Fire? Do both? Throw in a complex scene with many moving objects and reaction time gets even longer.
I've actually been a subject in the classic "dropping a ruler" test. My results were both surprisingly dismal and quite consistent with the other subjects'. We're just not rigged to go fast.
BTW, what does >>14 have to do with a scroller? I agree with >>13, and that's just for 60Hz. Perhaps you need to try a double-blind test with a simple scroller first before deciding...
> But you're not making an FPS, are you? You're making a low-res 2D game, and those have never had that kind of obsessive attention to delays.
But they never needed that kind of attention to delays, because back then everyone synced to the vertical retrace and nobody had enough RAM for triple buffering. An FPS is merely a worst-case example, where even very short lag is most damaging; it is still bad in other genres.
> Sync, read timer, draw frame, check timer, sync once or twice depending.
...but I don't have a reliable cross-platform way to just sync, only to sync and then swap hardware buffers. The CPU cost is completely irrelevant anyway, as the kind of game I'm thinking of would run perfectly on a 386; modern hardware is overkill.
>>17
The complex action is running in a different "thread" in the brain. The reactive action is then anticipated, and triggered without complex thought. The perfect example is running round a corner in an FPS and into a surprise enemy. You don't think, you just shoot, and whoever reacts fastest wins.
You are right about the need for testing though. I'll hack together a minimal 60Hz shmup game engine and ABX test it with 0 and 1 frames of added control lag. I am almost 100% certain that I will be able to tell the difference.
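Adding the test lag is just a one-element input queue; input_t, read_input() and update_game() stand in for the engine's own input handling:

    typedef unsigned int input_t;        /* e.g. a button bitmask */
    input_t read_input(void);            /* hypothetical */
    void update_game(input_t in);        /* hypothetical */

    static input_t delayed;

    /* When lagged is nonzero, the game consumes input one frame late. */
    void frame(int lagged)
    {
        input_t fresh = read_input();
        update_game(lagged ? delayed : fresh);
        delayed = fresh;
    }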
> ...but I don't have a reliable cross platform way to just sync, only to sync and then swap hardware buffers.
Then replace "sync twice" with "wait until one frame has passed, then sync".
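In code, something like this; the clock, the sleep, the blocking swap and the game logic are all hypothetical primitives:

    double now_seconds(void);
    void sleep_seconds(double s);
    void swap_buffers_vsynced(void);     /* blocks until retrace, then flips */
    void update_and_draw(void);

    /* Run 60Hz logic on a 120Hz display: sleep past the first retrace
       so the blocking swap lands on the second one. */
    void frame_at_half_rate(void)
    {
        const double refresh_period = 1.0 / 120.0;
        double start = now_seconds();    /* ~= time of the previous retrace */
        update_and_draw();
        double elapsed = now_seconds() - start;
        if (elapsed < refresh_period)
            sleep_seconds(refresh_period - elapsed);
        swap_buffers_vsynced();
    }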
> The complex action is running in a different "thread" in the brain.
Much like physically walking, I suppose? That's quite possible, particularly if a person is well practiced at a game.
I still doubt you'll get anywhere close to 150ms though. This isn't like touching a hot surface, where the signal goes to the spine and right back. No matter what you do there'll be some cognitive processing, even if anticipating the likelihood of an opponent around the corner reduces the reaction time.
On the other hand, in a capture-the-flag tournament you're sort of stuffed. Just because it moves doesn't mean it's the enemy. So twitch isn't always helpful in FPS either.
Completely tangential, but it's an interesting topic.