interlacing?
12 December 2003, 12:30 PM
Hi there, I've recently finished uni and am starting to get animation work on a freelance basis, some of it being TV work.
I've not had much experience rendering for TV, and I've heard that interlacing my renders will improve the broadcast quality.
If I don't use interlacing in my render, will the animation suffer?
If I do use it, which field do I render first, odd or even? Does that even make a difference?
Any help welcome!!
12 December 2003, 12:57 PM
First, interlacing is not compulsory.
Second, I don't recommend interlacing in LightWave. Image filters can screw up fielded renders, among other problems.
I'd recommend you don't interlace for the time being. Feature films you see on TV weren't shot interlaced. Did their quality ever suffer?
Interlacing can give a nice "video" look to things, but I'd wait until you understand it a bit better before you do it. Even then, in most cases it's better to do it in a compositing application rather than in LightWave.
12 December 2003, 06:45 PM
I usually render 24fps progressive, then go to 30fps interlaced via 3:2 pulldown in After Effects or DFX+.
No noticeable image-quality loss, and you render 20% fewer frames.
12 December 2003, 07:24 PM
Some good basic questions there. Interlacing can be an essential "technical" side of presenting your work depending on a few things.
First things first: never render your original elements/animation with fields/interlacing. This is best left for the compositing output stage. But when do you use interlacing? Well, that can depend on a few things. You have to decide what will make your animation best integrate into the program you are taking part of.
For example: if you are making graphics for a show that is cutting the graphics in with video work, then yes, you will most likely want interlacing. Camcorder/video is shot at 30 (29.97 if you want to be technical) frames per second (fps) in NTSC and 25 fps in PAL, the two most common broadcast standards. You should also animate at the same frame rate as the program you are taking part in. And since video has fields, you will also want interlacing.
Animating for film:
Film, however, is normally shot at 24fps, so you should also animate at 24fps. Not only does it look more "film like", but you have to render and animate far fewer frames. However, what happens if you want to show your 24fps animation on a 30fps NTSC monitor for TV? This introduces a problem that every movie, and many TV shows shot on film, deal with when being played on TV or put on VHS/DVD. You have to add a "3:2 pulldown", which basically adds 6 blended frames to your 24 existing frames to make 30 fps. It blends the frames by using fields to split every other line of a frame, and stacking odd and even fields from neighbouring frames to make new "in-between" frames. So, that being said, every feature film you see on TV does have field work on it with interlacing, otherwise the quality would suffer. Unless it's being broadcast on HDTV, which can use a native fps of 24.
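The 3:2 pulldown cadence described above can be sketched in a few lines of Python. This is purely illustrative (the helper name and the simplified field pattern are my own; real pulldown tools also handle field dominance and cadence phase): four film frames become five interlaced video frames, two of which mix fields from neighbouring film frames.

```python
# A minimal sketch of 3:2 pulldown: four film frames (A, B, C, D) become
# five video frames of (top_field, bottom_field) pairs. Two of the five are
# "dirty" frames that mix fields from two different film frames.

def pulldown_32(film_frames):
    """Map film frames to (top, bottom) field pairs, 4 in -> 5 out."""
    video = []
    for i in range(0, len(film_frames), 4):
        a, b, c, d = film_frames[i:i + 4]
        video += [
            (a, a),  # both fields from A
            (b, b),  # both fields from B
            (b, c),  # mixed fields from B and C ("dirty" frame)
            (c, d),  # mixed fields from C and D ("dirty" frame)
            (d, d),  # both fields from D
        ]
    return video

# 24 fps * 5/4 = 30 fps: every 4 film frames yield 5 video frames.
print(pulldown_32(["A", "B", "C", "D"]))
```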
Since we're talking about playing on different mediums (computers/TV, etc.), something you should also look into is pixel aspect ratios. Essentially, pixels on a computer monitor have an aspect of 1:1, but NTSC TVs have an aspect of roughly .9. So what is round on a computer monitor will look distorted on TV, and vice versa.
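The distortion is easy to see with a toy calculation (this uses the approximate 0.9 figure quoted above; the exact D1/DV NTSC value differs slightly, and the helper function is hypothetical):

```python
# Rough illustration of pixel aspect ratio distortion: NTSC pixels are
# roughly 0.9 times as wide as they are tall, so a circle authored with
# square pixels looks squeezed on a TV unless you compensate for it.

PAR_NTSC = 0.9  # approximate pixel aspect ratio (width / height)

def display_width(pixel_width, par):
    """Apparent width of a raster once its pixels are shown at ratio par."""
    return pixel_width * par

# A 100-pixel-wide circle authored on a square-pixel computer monitor:
print(display_width(100, 1.0))       # full width on the computer screen
print(display_width(100, PAR_NTSC))  # visibly narrower on an NTSC monitor
```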
So this was just a bunch of off-the-top-of-the-head explanations written between doing other work, so hopefully it more or less made sense. Take everything I said with a grain of salt and not as an absolute definition. It is a topic that you should look into and fully understand, as it is important fundamental information that everyone should know. Things lightly touched on here will come up in your career if you are serious about working in TV/film.
Why is it important stuff can also be so boring? *sigh*
12 December 2003, 08:44 PM
I'm still struggling with this issue because the result changes at each step of the process. I am now rendering out of Lightwave in Enhanced High anti-aliasing with fields off so I can use dithered motion blur, which looks good in Mirage.
But if I render out of Mirage to VT3 with fields on, I get jagged steps in the edges as if I'd never used anti-aliasing at all. So I tried rendering out to VT3 in progressive mode, and the image looks good again when played from the timeline.
If I encode the VT3 project in TMPGEnc for progressive scan and watch the result on the computer, it looks great. But when I encode in TMPGEnc for NTSC and burn a DVD in NTSC format, the jagged edges return. :cry:
I know there is a process that will eventually work, but it's been a real adventure to find it. My main concern is the eventual DVCAM tape that will be broadcast on local cable access. I have no idea what the broadcast image will look like until I try it. I think I'll schedule that first broadcast at 2 AM. :wise:
12 December 2003, 09:22 PM
I'll see if I can answer some of your questions but I'm not familiar with the hardware/software you are using.
1. What is "Mirage"?
2. How are you getting your master onto DVCam?
3. If you want this for local cable broadcast, why do you want progressive scan?
4. What is the DVD-burned copy for if you're delivering a DVCam master? Just personal use?
5. What resolution and frame rate are your animation renders from LW? What resolution/frame rate are you outputting?
I'll see if I can help you out, as there are a few things you mentioned that may be "questionable." Normally, though, I just submit the work to the input/output person and they put it to tape, so in the past few years I've rarely dealt with that end. But I should be able to tell you what your files should look like when laying to tape.
I think ;)
12 December 2003, 09:42 PM
1. Mirage is a spinoff(?) of Aura. Basically the same program but with new features and a new management team.
2. I have a Sony DSR11 DVCAM tape deck that is plugged in to the Y/C cables direct from the VT card - no BOB. I have firewire Deck Control. So far, I've been playing the VT3 project in real time from the timeline and just placing the DSR11 in record mode.
3. I've tried both fields and progressive scan at each step of the way. Every time I use fields, I get ragged edges in VT3, whether odd or even. Now I'm getting them at the DVD stage.
4. Yes, the DVD is just personal use for now, the DVCAM tape is the main deal. (But I will want to use DVDs in the future to market the program I am producing.) My DSR has every connection with something plugged into it, so I don't want to keep remaking those connections just to haul it around - bad for the deck. The DVD is to show stuff where there is no DVCAM deck, and looks better than VHS, if I can get this resolved.
5. Frame renders are 720x486 at 29.97fps from Lightwave. Same is used in Mirage and VT3. Output from TMPGEnc is set at 720x480 29.97 for NTSC interlaced preset.
It's the step from progressive to interlaced where the problem occurs it seems.
12 December 2003, 10:07 PM
@ Simon, for broadcast work, field rendering is very important. Television was designed at a time when a full frame of video could not be displayed on the screen at 30 frames per second. To get around this, the engineers split the frame into two parts. Every frame consists of two images, each at half the vertical resolution. These are called fields. The fields are not the same image split in half, but rather two sequential images interleaved line by line into one full frame. This interleaving is called interlacing. Therefore, NTSC video (used in America and Japan) consists of 30 separate frames per second and 60 individual fields per second. Where interlacing becomes really noticeable is when you have a lot of fast motion, as the images in each field become substantially different. If you've ever paused your VCR's playback during a scene with fast motion and seen the image "jitter" onscreen, you're seeing the effect of the interlacing.
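The interleaving described above can be sketched in a few lines (scanline labels and the helper name are illustrative; real video carries actual pixel rows, and which field lands on the even lines depends on the format's field order):

```python
# A small sketch of interlacing: two half-height fields, each a list of
# scanlines captured 1/60 s apart, are woven line by line into one frame.
# Upper-field-first is assumed here; some hardware expects the reverse.

def weave(upper_field, lower_field):
    """Interleave two fields into a full frame, upper field on even lines."""
    frame = []
    for up, low in zip(upper_field, lower_field):
        frame.append(up)   # lines 0, 2, 4, ...
        frame.append(low)  # lines 1, 3, 5, ...
    return frame

field_a = ["A0", "A2", "A4"]  # upper field (even scanlines)
field_b = ["B1", "B3", "B5"]  # lower field (odd scanlines)
print(weave(field_a, field_b))  # ['A0', 'B1', 'A2', 'B3', 'A4', 'B5']
```

With fast motion, the A and B images differ noticeably, which is exactly the "jitter" you see when pausing a VCR.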
You can render out your animations without fields and they will look fine. But if you have fast moving objects, they will not move smoothly across the screen. Instead, they'll exhibit a "stuttering" effect which is rather telltale and unprofessional.
If you're rendering directly from LightWave to a file for playback on your NLE, you should use interlacing. Make sure you find out what field order your hardware uses, as the wrong field order can result in animations that look worse than non-interlaced (aka progressive scan) renders. If you're going to be compositing your footage in After Effects or another similar application, do not use field rendering in LightWave, but be sure to set it in your compositing app.
@ Tom, in my experience, most jagged edges found on antialiased, interlaced footage are the result of improper field dominance. If you're working with interlaced imagery, it's important to make sure that every application knows the order of your fields and keeps that order throughout the process. If you output odd fields in LightWave and mistakenly set Mirage to use even fields, you'll run into trouble.
It's also important to make sure that every application de-interlaces the footage properly. I don't work with Mirage, but I know that After Effects splits the fields into separate images, runs its filters and effects on them, then reinterlaces everything for its final output.
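That split-then-reinterlace workflow can be sketched like this (the per-field filter below is a hypothetical stand-in; the point is that each field is processed on its own and the fields are only woven back together at the end):

```python
# Sketch of the deinterlace / process / reinterlace idea: separate the two
# fields of a frame, filter each field independently, then weave them back.

def split_fields(frame):
    """Return (upper, lower) fields: even-indexed and odd-indexed scanlines."""
    return frame[0::2], frame[1::2]

def reinterlace(upper, lower):
    """Weave two fields back into a full frame, upper field on even lines."""
    frame = []
    for up, low in zip(upper, lower):
        frame += [up, low]
    return frame

def process(field):
    # Stand-in for any per-field effect. A real filter must never mix
    # scanlines from the two fields, or motion will "comb" on playback.
    return [line + "*" for line in field]

frame = ["u0", "l1", "u2", "l3"]
upper, lower = split_fields(frame)
out = reinterlace(process(upper), process(lower))
print(out)  # ['u0*', 'l1*', 'u2*', 'l3*']
```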
I'd do a simple test animation of a box moving quickly across the screen, close to the camera. Turn on field rendering and make one animation with your field dominance set to lower, another set to upper, and a third with no interlacing at all. Import these directly into the Toaster and play them back. At least one of the interlaced animations should work, and it should look better than the non-interlaced imagery. If not, you should probably contact NewTek's tech support.
Hope this helps!
12 December 2003, 08:13 PM
Well, it seems like Steve has given some good tips on where to start identifying the problem. Hopefully his tips clear things up.
01 January 2006, 09:00 PM