After seeing all the hacked Kinect videos and examples of the z-depth sensor, I was wondering what it would be like piped into PFlow, e.g. colour from the RGB camera and depth from the infrared sensor driving the movement of particles…
Not being a coder, I don’t know whether there’s any off-the-shelf software that captures both streams as video for use in Max, but it could look pretty cool.
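
For anyone curious what that capture step might look like, here is a minimal sketch, assuming the open-source libfreenect driver with its Python bindings (the freenect module) and OpenCV for writing video files. It grabs RGB and depth frames from the Kinect and saves each as an ordinary video, which could then be brought into Max as texture maps driving PFlow. This is just one possible approach, not an off-the-shelf tool:

```python
# Sketch: capture Kinect RGB and depth streams to two video files.
# Assumes libfreenect (with Python bindings) and OpenCV are installed
# and a Kinect v1 is plugged in.
import freenect
import cv2
import numpy as np

FPS = 30
SIZE = (640, 480)  # native Kinect v1 resolution for both streams

fourcc = cv2.VideoWriter_fourcc(*"MJPG")
rgb_out = cv2.VideoWriter("rgb.avi", fourcc, FPS, SIZE)
depth_out = cv2.VideoWriter("depth.avi", fourcc, FPS, SIZE)

try:
    while True:
        rgb, _ = freenect.sync_get_video()    # 640x480x3, RGB order
        depth, _ = freenect.sync_get_depth()  # 640x480, 11-bit values

        # OpenCV expects BGR, so flip the channel order before writing
        rgb_out.write(cv2.cvtColor(rgb, cv2.COLOR_RGB2BGR))

        # Scale the 11-bit depth (0-2047) down to 8-bit grayscale so it
        # survives as ordinary video, then expand to 3 channels
        depth8 = (depth >> 3).astype(np.uint8)
        depth_out.write(cv2.cvtColor(depth8, cv2.COLOR_GRAY2BGR))
except KeyboardInterrupt:
    pass  # stop recording on Ctrl-C
finally:
    rgb_out.release()
    depth_out.release()
    freenect.sync_stop()
```

The resulting depth video is just a grayscale clip, so in principle it could feed any map-driven operator in PFlow the same way a rendered luminance map would.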
Cheers
Steve
