Hi. I have a sequence of 23 shots, all filmed at the same location. I want to track every camera not only in relation to the tracking points, but also in relation to each other, so that when I solve my cameras they all end up in the correct position in 3D space. That way, when I bring them into my 3D application, I only have to build one scene and adjust the scale, position, and rotation once.
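To illustrate what I mean by "adjust once": here is a rough sketch of what I'd like to end up doing on the 3D side. Maya is just an example (it's not necessarily the app I'll use), and the group/transform values are placeholders, but the idea is to parent all the solved cameras under one node and set the scene transform a single time.

```python
# Rough sketch only -- Maya and the names below are just an example.
import maya.cmds as cmds

# Grab the transform node of every camera in the scene,
# skipping Maya's built-in viewport cameras.
cam_shapes = cmds.ls(type='camera')
cam_transforms = []
for shape in cam_shapes:
    parent = cmds.listRelatives(shape, parent=True)[0]
    if parent not in ('persp', 'top', 'front', 'side'):
        cam_transforms.append(parent)

# Parent all solved cameras under a single group...
rig = cmds.group(cam_transforms, name='trackedCameras_GRP')

# ...so scale, rotation, and position only have to be set once
# (placeholder values).
cmds.xform(rig, scale=(0.01, 0.01, 0.01), rotation=(0, 90, 0),
           translation=(0, 0, 0))
```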
I am using PF Match It, which as far as I know is capable of tracking multiple cameras, but I am not getting the desired results.
Here is my workflow: I bring every shot into the tree view, add an Auto Track node to each one individually, and track it. I then connect every Auto Track node into a single User Track node, where I set up a few common features (three, to be specific; I wish I could use more, but with 23 cameras moving around it can be hard to find many features in common). Finally, I add a Camera Solver node and connect all 23 outputs to it. However, some of my cameras don't solve at all...
Has anyone ever done this? Am I doing it correctly?
Any help would be greatly appreciated!
Thanks
