Track data from SynthEyes not matching up in 3ds Max 2013


Hi guys, I just started learning SynthEyes and I have tracked a scene I shot. Within SynthEyes 2011 all is well: my average error is 0.4722, and going through frame by frame there is little to no floating of a test box I tossed in. The problem presents itself in Max 2013: after running the exported MAXScript, you can tell by looking carefully that the tracker points are not quite situated the same way relative to the footage as they are in SynthEyes. The footage I am using in Max and SynthEyes has already been undistorted in After Effects, and I have set the image aspect of the viewport background to match the bitmap. I am really at a loss here and I have a quickly approaching deadline to meet. I hope one of you can point me in the right direction!

Thanks in advance!!


First, always use the raw footage for tracking: when you warp the image it will mathematically make less sense to the solver. You can use SynthEyes to undistort/stabilize after tracking if you must. Check the ‘start at frame 0’ checkbox in the SE export dialog. Are your frame rates matching? (Shouldn’t matter, I guess, since both programs work on a per-frame basis.) Does it match if you render a few frames? The viewport background may be a bit off.


I’ve seen both ways in production. Some like to undistort first, others don’t. I see the point about changing the math of the scene, and about the slight blurring around the edges that undistorting footage introduces. Is it really a rule to use the footage undistorted and unstabilized?


First rule of tracking is that there are no rules :slight_smile: If it works it works!

Recently I had some octocopter footage that came heavily stabilized, and the tracker had a really hard time. After getting the raw footage and removing the shutter lag, it tracked fine and almost instantly.

You can stabilize the footage, but you’ll have to use SynthEyes’ stabilizer, which produces footage that is still mathematically solvable. A warp stabilizer just warps the image as if on a rubber sheet; the tracker will go insane on that! :slight_smile:

Lens distortion doesn’t influence the solver much. I find it best to track the footage, then export the undistorted footage using the found lens distortion parameters, and use this straightened footage to comp the 3D on (the 3D will always be rendered with a ‘perfect’ lens). Then give the final comp some distortion back again to make it look more authentic. Or you can distort the 3D after rendering to match the original footage, which will keep things crisper. The advantage of the straightened footage is that the 3D will match perfectly in your viewport (the viewport cam is a perfect lens with no distortion).
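To illustrate why that undistort/render/redistort round trip works, here’s a minimal Python sketch using a one-parameter radial model. SynthEyes’ actual lens model has more terms, and the function names and the `k` value here are made up for illustration:

```python
# Sketch of why undistort -> render -> redistort round-trips cleanly.
# One-parameter radial model on normalized image coords ((0,0) = optical
# center); real lens models add more coefficients.

def distort(x, y, k=0.1):
    """Apply radial distortion: scale the point by 1 + k*r^2."""
    r2 = x * x + y * y
    s = 1.0 + k * r2
    return x * s, y * s

def undistort(x, y, k=0.1, iters=10):
    """Invert the distortion by fixed-point iteration (no closed form)."""
    xu, yu = x, y
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        s = 1.0 + k * r2
        xu, yu = x / s, y / s
    return xu, yu

# Round trip: a distorted point, straightened, then re-distorted,
# lands back where it started (to within iteration tolerance).
xd, yd = 0.4, 0.3
xu, yu = undistort(xd, yd)
xr, yr = distort(xu, yu)
assert abs(xr - xd) < 1e-6 and abs(yr - yd) < 1e-6
```

The point being: because the distortion is an invertible warp, you can comp on the straightened plate and push the distortion back in at the very end without losing the match.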

So there are 3 bad guys here:

-shutter lag -> always fix before you do anything

-stabilization -> use a stabilizer that produces trackable footage. AE’s warp stabilizer does not do this! Or you can stabilize once the 3D is comped on… motion blur is something to watch out for. If you stabilize before adding 3D, the footage will still have some motion blur in it from the original movement, but the 3D will not. If you render the 3D through the shaky cam it is easier to match the native motion blur.

-lens distortion -> you’ll either match the footage to the 3D, or the 3D to the footage. Once comped you can distort them together as you wish.
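For intuition on the shutter-lag item above, here’s a toy Python sketch of the skew a rolling shutter introduces and the per-row shift that undoes it. It assumes constant horizontal motion; real tools like AE’s rolling shutter repair estimate the motion per frame, and all numbers here are hypothetical:

```python
# Rolling shutter: each scanline is exposed at a slightly different time,
# so horizontal camera motion skews verticals into slants. The fix is to
# shift each row back by the motion that accrued while the sensor read
# down to it. Toy model: constant velocity, linear readout.

def skew_offsets(height, velocity_px_per_frame, readout_fraction=0.5):
    """Per-row horizontal offset (pixels) introduced by the rolling shutter.

    readout_fraction is the portion of the frame time spent reading the
    sensor top to bottom (a made-up value here)."""
    return [velocity_px_per_frame * readout_fraction * (row / (height - 1))
            for row in range(height)]

def correction(offsets):
    """Per-row shifts that undo the skew."""
    return [-o for o in offsets]

rows = skew_offsets(height=5, velocity_px_per_frame=8.0)
print(rows)            # [0.0, 1.0, 2.0, 3.0, 4.0]
fixed = [o + c for o, c in zip(rows, correction(rows))]
assert all(abs(v) < 1e-12 for v in fixed)   # skew fully cancelled
```

Real repair effects also have to estimate the camera motion from the footage itself, which is why they can fail on fast or non-uniform movement.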

Anyways, tracking imperfect shots requires a lot of voodoo to make it right!


@jonadb - exactly what I’m doing right now! I have a highly distorted octocopter video here which was captured with a fisheye lens (10mm). Some guys told me to undistort it first; others, like you, said to work with the raw footage. I’ve both undistorted and stabilized it in After Effects, and SynthEyes still did a pretty good job of tracking it.

Having said that, I’ll try my luck again with the raw footage and see if it is easier/more accurate. Do you always start with autotracking, or do you place your trackers manually? And when you have trackers on different levels (say, different floors of a building without walls), do you HAVE to select them and constrain them to a plane?

Sorry to hijack your thread, yamanash. My advice to you is to check your origin point. It’s a good idea to place a tracker as the origin in the same place where the origin is in Max; that way you’re only a rotation away from a perfect match. If you can pinpoint two common points, then there’s no need for manual tweaks later.
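A toy sketch of that “only a rotation away” idea: with one tracker pinned to the origin in both apps and the up axis already agreed, a second common point is enough to recover the remaining rotation. The coordinates below are made up for illustration:

```python
import math

# One tracker sits at the origin in both SynthEyes and Max, so the two
# solves differ only by a rotation about the up axis (Z here). A second
# common landmark's heading gives us that rotation directly.

def heading(p):
    """Angle of the point's ground-plane direction, in radians."""
    return math.atan2(p[1], p[0])

def yaw_correction(p_syntheyes, p_max):
    """Rotation about Z that takes the SynthEyes point onto the Max point."""
    return heading(p_max) - heading(p_syntheyes)

def rotate_z(p, a):
    c, s = math.cos(a), math.sin(a)
    return (c * p[0] - s * p[1], s * p[0] + c * p[1], p[2])

# The same landmark, as seen in each coordinate system (hypothetical):
p_se = (2.0, 0.0, 1.0)
p_mx = (0.0, 2.0, 1.0)   # the Max scene is rotated 90 degrees
a = yaw_correction(p_se, p_mx)   # close to 90 degrees
assert all(abs(u - v) < 1e-9 for u, v in zip(rotate_z(p_se, a), p_mx))
```

With two shared points you can solve this rotation exactly instead of eyeballing it, which is the “no manual tweaks” case.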


Usually I start in the ‘feature panel’ by letting it find a bunch of tracking points by itself and ‘peel all’ of them; it will have found a few good trackers on its own. Then I’ll delete the shaky ones, and those that are obviously wrong. They’re best spotted if you turn off the footage and enable tracker trails: you can easily spot the ones that jump around or do weird things. Then manually add a bunch of trackers that span as many frames as possible; just add them where the solve is bad or where there are too few. If you have trackers that go out of frame and later come back, you can use ‘combine trackers’ to align those, which will help keep the solve consistent over the whole length of the shot.

Usually I don’t use the constraints a lot, just ‘3 point axis setup’ to set up the main coordsys dimensions, and that is it. Maybe I’m missing something, but I never found constraints to be that helpful; good trackers will solve it without them.
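In spirit, a three-point axis setup just builds an orthonormal frame from three trackers: one at the origin, one along +X, and one lying in the ground plane. A minimal Python sketch of that math, with hypothetical tracker positions (SynthEyes’ own implementation may differ in details):

```python
import math

# Build a right-handed coordinate frame from three tracker positions:
#   origin  - tracker placed at the scene origin
#   on_x    - tracker defining the +X direction
#   in_xy   - tracker lying somewhere in the ground (XY) plane

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def norm(a):
    l = math.sqrt(dot(a, a))
    return tuple(x / l for x in a)

def axes_from_trackers(origin, on_x, in_xy):
    x = norm(sub(on_x, origin))
    z = norm(cross(x, sub(in_xy, origin)))   # up axis, perpendicular to ground
    y = cross(z, x)                          # completes the right-handed frame
    return x, y, z

# Hypothetical trackers flat on the ground: up should come out as +Z.
x, y, z = axes_from_trackers((0, 0, 0), (3, 0, 0), (1, 2, 0))
assert z == (0.0, 0.0, 1.0)
```

Once this frame is fixed, every other tracker position is expressed relative to it, which is why three well-chosen points are usually all the “constraint” a clean solve needs.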


Even when you have trackers on different levels/planes (different Z heights)? I find it hard to trust the autotracking and not constrain those types of points, but thinking about it, maybe it is indeed better to let the software do the math and figure them out for us.


Does turning on “Safe Frames” help?

I once hand-tracked an object through some footage way back in the day, and I forgot to turn on Safe Frames, so it was all junked up. Longest night ever.


@x24BitVoxel You mean in SynthEyes?

@jonadb In the footage I mentioned (the one shot with a fisheye lens) you HAVE to undistort it before tracking. This is because Max’s camera is unable to reproduce (in the viewport, at least) the kind of distortion a wide-angle lens produces (the kind where buildings appear to be made of jelly).

So, if your tracking works fine inside SynthEyes (or any other tracking app) but does not match up in Max, be sure to check that Lens panel and undistort your footage before tracking/exporting to Max.

Can I ask what’s your workflow to remove/reduce the shutter lag?


Yeah, that needs straightening if you want to match up 3D in the viewport; too bad Max doesn’t have fisheye viewport capabilities. Both Vray and MR do, so you might be able to render to match the original footage, keeping both the 3D and the video as sharp as possible.

For the SL I simply use AE’s ‘rolling shutter repair’ effect, which has been included since CS6. If you don’t have CS6, you could rent The Foundry’s ‘rollingshutter’ plugin for $3 a day; that should work as well.

No lack of markers in that shot, btw! But it does look a bit tricky to track right; the glows and glares aren’t helping. Did you try adding some blur or other preprocessing to enhance tracking? (You can do that from within SE as well.)

Are you using dark/light-point trackers or pattern matching ones?


Speaking of Vray and distortion.

You can use Chaosgroup’s Lens Analysis Tool to create lens files for matching Vray’s virtual camera to its physical counterpart. I’ve tried this tool myself and it can really produce an accurate match to whatever camera setup you have, including physical “inaccuracies” that don’t exist in the mathematically correct virtual camera, such as an offset optical centre.

SynthEyes has workflows for calibrating stuff like this as well, but the main idea with Chaosgroup’s Lens Analysis Tool is to be able to feed the virtual camera the physical settings from the shot and then accurately distort the CG render accordingly.

If you don’t want to render distorted, SynthEyes has a nice workflow for applying distortion to CGI elements, as suggested previously in this thread.

I’m not sure what the most preferable approach is in a shot with very pronounced distortion, such as a fisheye shot, but usually I like to undistort in SynthEyes, render out undistorted CGI from the 3D app, distort the CGI using SynthEyes, and then comp the distorted CGI on top of the original untouched source plate.

But as stated already, there really is no right or wrong about whether you undistort the source plate in your 3D app or render out distorted from the 3D app. Whatever is the simplest workflow for you is quite possibly the best one. The best way is just to try it and find out.


@Jonathas - Hehehehe! That’s not the footage I’m working on, sorry to confuse you! I just slapped that picture in to illustrate the kind of distortion I’m facing right now. Thanks for the heads up on the shutter lag (isn’t that also called rolling shutter?).

@Swahn My workflow was just as you said: undistort in SE, export a TGA sequence of the undistorted footage and track it, load the camera of the undistorted footage in Max, render, and comp the undistorted TGA with Max’s render. If it were needed, I’d distort Max’s render to match the original inside SynthEyes.


We used SynthEyes for the camera tracking of the footage, then I hand-tracked a rotating 3D object into the scene. Since I didn’t have Safe Frames on, the footage was all wrong and I had to start over. A noob mistake on my part, and maybe it doesn’t apply to your situation.


This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.