Q: frame rate and TV out
I have a question on game frame rate and TV output.
Just wondering: if a game does not produce consistent frame rates, say anywhere between 40 and 70 fps (proper frames, not fields), will the scan conversion (TV out) flicker because it's not rock solid at 25 (PAL) or 30 (NTSC) fps?
1. What is the desired frame rate when developing a game to be displayed on TV?
2. Are there guidelines/rules when developing for TV?
3. Should the frame rate be twice the TV frame rate: 50 for PAL and 60 for NTSC?
thanks in advance,
04-27-2004, 02:01 PM
I think some rendering hardware is frame-limited to 60 or 50 FPS (for NTSC and PAL respectively) when you enable the TV-out option. This is due to the refresh rate of the even and odd fields on the TV. So on an NTSC system, fields are displayed at 60 Hz, which makes the actual frame rate 30 Hz (or FPS).
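The field-to-frame relationship above can be sketched in a few lines of Python (a minimal illustration, not from the thread; it assumes the whole-number field rates quoted here, 60 for NTSC and 50 for PAL, whereas real NTSC fields actually run at 59.94 Hz):

```python
def frame_rate_from_fields(field_rate_hz):
    """Two interlaced fields (one of even lines, one of odd lines)
    combine to make a single full frame, so the full-frame rate is
    half the field rate."""
    return field_rate_hz / 2

ntsc_frames = frame_rate_from_fields(60)  # 30 full frames per second
pal_frames = frame_rate_from_fields(50)   # 25 full frames per second
```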
There is no general rule (at least I don't use one), just enable frame syncing in your card's driver and pump out the scene at max FPS. Frame syncing will prevent image 'tearing' during fast movement.
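To make the frame-syncing point concrete, here is a rough Python sketch (an illustration of the timing, not anything from a real driver API): with syncing enabled, a frame that finishes rendering at an arbitrary time is held until the next vertical-refresh boundary, so the display only ever flips on refresh ticks and tearing is avoided.

```python
import math

def next_vsync(finish_time_s, refresh_hz):
    """Time of the first vertical-refresh boundary at or after the
    moment the frame finished rendering."""
    period = 1.0 / refresh_hz
    return math.ceil(finish_time_s / period) * period

# A frame finished at t = 0.021 s on a 60 Hz display is held until
# the second refresh tick at 2/60 s (about 0.0333 s).
presented_at = next_vsync(0.021, 60)
```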
Tachicoma, thanks for that. It makes perfect sense.
07-17-2004, 11:21 PM
Your final rendering will be very dependent on the choice of output medium. Not all TVs are created equal. Some of the differences are:
Aspect ratio: Not only the horizontal and vertical dimensions of the screen but also the shape of the pixels. The dots aren't square, and they are not all the same size.
Frame rate: How fast the images are redrawn on the screen. US broadcast video is one speed; European PAL is another; HDTV is yet another. It is good to render at a higher frame rate, ideally one that is a multiple of the various rates you might need to publish to, so that the various outputs can be produced without asking the computer to "interpolate" (synthesize) any data. (Any time a computer synthesizes anything there's a lot of noise; it looks bad.)
Interlacing: On broadcast TV, the even-numbered lines are drawn, then the odd-numbered lines. On computer screens there is no interlace. But if you simply "add" interlacing after the fact, you might get horrid jaggies on an interlaced screen! (And of course, interlaced output on a non-interlaced screen is essentially unviewable.)
Colors: Computer monitors have considerably more color resolution than TVs do, so subtle color distinctions may be lost. Variations in brightness and contrast can also be enormous; just drop by the television section of any store and compare the images on sets showing the same broadcast.
Laptops/LCDs: A separate color problem: LCD (laptop) screens can show nasty color shifts when the viewer's eyes are not at 90 degrees to the screen.
"Safe" areas: Depending on the particular set, not all of the image may be visible; the margins might be cut off, a little or quite a lot. Also, a render that was done for the HDTV 16:9 aspect ratio might need to be re-cropped for NTSC or PAL because there is no time or budget to re-render it.
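The aspect-ratio and safe-area points above lend themselves to a quick calculation. The sketch below (Python, my own illustration; the 90% figure is a common convention for the "action safe" margin, not a hard standard) derives the non-square pixel shape for a 720x480 raster filling a 4:3 screen, and a centered safe rectangle:

```python
def pixel_aspect_ratio(display_aspect, width_px, height_px):
    """How wide each pixel is relative to its height, given the
    display aspect ratio the stored raster is meant to fill.
    A value below 1.0 means the pixels are taller than they are wide."""
    return display_aspect / (width_px / height_px)

def safe_area(width, height, fraction=0.9):
    """Centered rectangle (x, y, w, h) covering `fraction` of each
    dimension; smaller fractions are often used for titles."""
    w, h = width * fraction, height * fraction
    return (width - w) / 2, (height - h) / 2, w, h

# 720x480 shown on a 4:3 screen: each pixel is 8/9 as wide as it is tall.
par = pixel_aspect_ratio(4 / 3, 720, 480)
```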
As a designer, you have to plan your production to accommodate these realities. You have to work within these limits. Many of these issues can only be economically addressed before the work begins.
01-18-2006, 04:00 AM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.