The first rule of tracking is that there are no rules. If it works, it works!
Recently I had some octocopter footage that came in heavily stabilized, and the tracker had a really hard time with it. After getting the raw footage and removing the shutter lag, it tracked fine, almost instantly.
You can stabilize the footage, but you’ll have to use SynthEyes’ stabilizer, which produces footage that is still mathematically solvable. A warp stabilizer just warps the image as if on a rubber sheet; the tracker will go insane on that!
Lens distortion doesn’t influence the solver much. I find it best to track the footage, then export undistorted footage using the lens distortion parameters the solver found. Use this straightened footage to comp the 3D on; the 3D will always be rendered with a ‘perfect’ lens. Then give the final comp some distortion back again to make it look more authentic. Alternatively, you can distort the 3D after rendering to match the original footage, which keeps things crisper. The advantage of the straightened footage is that the 3D will match perfectly in your viewport (the viewport camera is a perfect lens with no distortion).
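The undistort-then-redistort round trip above can be sketched numerically. This is a minimal illustration of a simple radial (Brown-Conrady-style) distortion model, not SynthEyes’ actual solver: the coefficients `k1`, `k2` are hypothetical stand-ins for whatever the solve reports, `distort` mimics re-applying the lens look, and `undistort` mimics straightening the plate (inversion by fixed-point iteration, since the radial model has no closed-form inverse).

```python
import numpy as np

def distort(xy, k1, k2):
    """Apply radial distortion to normalized image coordinates."""
    r2 = np.sum(xy**2, axis=-1, keepdims=True)
    return xy * (1 + k1 * r2 + k2 * r2**2)

def undistort(xy_d, k1, k2, iters=10):
    """Invert the radial model by fixed-point iteration."""
    xy = xy_d.copy()
    for _ in range(iters):
        r2 = np.sum(xy**2, axis=-1, keepdims=True)
        xy = xy_d / (1 + k1 * r2 + k2 * r2**2)
    return xy

pts = np.array([[0.3, -0.2], [0.8, 0.5]])   # sample points on the plate
k1, k2 = -0.12, 0.03                        # hypothetical solved coefficients

straight = undistort(pts, k1, k2)           # "export undistorted footage"
round_trip = distort(straight, k1, k2)      # "give the comp its distortion back"
print(np.allclose(round_trip, pts, atol=1e-6))
```

The point of the sketch: because the distortion is a well-defined invertible mapping, straightening the plate and redistorting the finished comp lands you back on the original pixels, which is exactly why the round-trip workflow is safe.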
So there are 3 bad guys here:
-shutter lag -> always fix this before you do anything else
-stabilization -> use a stabilizer that produces trackable footage. After Effects’ Warp Stabilizer does not do this! Alternatively, you can stabilize once the 3D is comped on… but motion blur is something to watch out for. If you stabilize before adding the 3D, the footage will still have some motion blur from the original movement, while the 3D will not. If you render the 3D through the shaky cam, it is easier to match the native motion blur.
-lens distortion -> you’ll either match the footage to the 3D, or the 3D to the footage. Once comped, you can distort them together as you wish.
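The difference between solver-friendly stabilization and a warp stabilizer comes down to this: a rigid stabilizer applies one invertible global transform per frame, so the camera solve can still account for it, while a warp stabilizer pushes every pixel around independently like a rubber sheet, and no single camera model explains that. A tiny sketch with a hypothetical per-frame similarity transform (rotation, scale, translation):

```python
import numpy as np

def similarity(theta, scale, tx, ty):
    """A 2D similarity transform as a 3x3 homogeneous matrix."""
    c, s = np.cos(theta) * scale, np.sin(theta) * scale
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0,  0,  1.0]])

# hypothetical stabilization applied to one frame
M = similarity(theta=0.02, scale=1.01, tx=3.5, ty=-1.2)

# tracker points in pixels (homogeneous coordinates, one point per column)
pts = np.array([[120.0, 640.0],
                [ 80.0, 360.0],
                [  1.0,   1.0]])

stabilized = M @ pts                       # what the stabilizer outputs
recovered = np.linalg.inv(M) @ stabilized  # one matrix inverse undoes it exactly
print(np.allclose(recovered, pts))
```

A warp stabilizer has no such single matrix per frame, so there is nothing for the solver to invert; that is why the tracker “goes insane” on warped footage but copes with rigidly stabilized footage.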
Anyways, tracking imperfect shots requires a lot of voodoo to get it right!