Nope… you don't HAVE to go through MotionBuilder at all.
Set your target character up in LW with a reasonably correct skeleton (the FBX/MotionBuilder standard naming is probably easiest; details/tutorials on my site)… You can then bring that character straight into iPi and retarget your finished mocap straight onto it.
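For reference, this is roughly what I mean by the FBX/MB standard: the usual MotionBuilder/HumanIK-style joint naming. Quick sketch below (plain Python, just to show the hierarchy); the exact spine/toe joint counts are my assumption, so adjust to taste.

```python
# Rough sketch of the usual MotionBuilder/HumanIK-style skeleton naming
# (child -> parent). Spine/toe counts are an assumed layout, not a spec dump.
SKELETON = {
    "Hips": None,
    "Spine": "Hips", "Spine1": "Spine", "Spine2": "Spine1",
    "Neck": "Spine2", "Head": "Neck",
    "LeftShoulder": "Spine2", "LeftArm": "LeftShoulder",
    "LeftForeArm": "LeftArm", "LeftHand": "LeftForeArm",
    "RightShoulder": "Spine2", "RightArm": "RightShoulder",
    "RightForeArm": "RightArm", "RightHand": "RightForeArm",
    "LeftUpLeg": "Hips", "LeftLeg": "LeftUpLeg",
    "LeftFoot": "LeftLeg", "LeftToeBase": "LeftFoot",
    "RightUpLeg": "Hips", "RightLeg": "RightUpLeg",
    "RightFoot": "RightLeg", "RightToeBase": "RightFoot",
}

def chain(joint):
    """Walk from a joint up to the root, handy as a quick sanity check."""
    path = [joint]
    while SKELETON[path[-1]] is not None:
        path.append(SKELETON[path[-1]])
    return path

if __name__ == "__main__":
    print(" -> ".join(chain("LeftToeBase")))  # LeftToeBase -> ... -> Hips
```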
This, of course, is the important part: the retargeting. You could just pull the BVH into LW, adjust the skeleton to fit your mesh, re-rest the bones, and away you go… but it wouldn't look very good, because of the differences in proportion between the capture character and the target character.
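To see why, here's a very rough sketch of what even the most naive retarget has to do: copy the joint rotations across, but rescale the root translation to the new character's proportions. None of the names below are real LW/iPi calls, it's purely illustrative.

```python
# Naive retarget sketch: rotations copied verbatim, root (hips) translation
# rescaled by the characters' hip-height ratio. This is why a straight BVH
# paste onto differently-proportioned bones floats, sinks, or slides.
def naive_retarget(frames, source_hips_height, target_hips_height):
    """frames: list of dicts with 'hips_pos' (x, y, z) and 'rotations'
    (joint name -> euler triple). Purely illustrative data layout."""
    scale = target_hips_height / source_hips_height
    out = []
    for f in frames:
        x, y, z = f["hips_pos"]
        out.append({
            # Root motion scaled so stride length matches the new leg length...
            "hips_pos": (x * scale, y * scale, z * scale),
            # ...while joint rotations come across unchanged.
            "rotations": dict(f["rotations"]),
        })
    return out
```

Real retargeters go a lot further than this (per-limb compensation, keeping foot contacts planted), which is exactly why letting iPi or MB do it beats the straight BVH paste.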
What doing it in iPi won't give you, though, is much of anything to tweak the mocap with… for that you'd need something like MB as an intermediary retarget/tweak package… or Animeeple (while it's still around), or Ikinema's WebAnimate.
It's worth noting that depending on the type of animation you have in mind, and the quality you're after… you may very well need to do manual cleanup and tweaking. iPi will NOT just hand you dead-on motion capture, perfect and clean. You WILL get issues with small jitter, foot slip, and so on.
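If it helps, the first pass on jitter is usually nothing fancier than a light smoothing filter over the offending channels before you start hand-keying, and foot-slip cleanup starts with finding the frames where the foot should be planted. Throwaway sketch below; the window size and thresholds are made-up numbers you'd tune per shot.

```python
# First-pass jitter clean: a centred moving average over one animation
# channel (e.g. a joint's rotation curve, one float per frame). Too wide
# a window and you start eating real motion, not just noise.
def smooth_channel(samples, window=5):
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

# Foot slip is the harder fix: find frames where the foot should be planted
# (low height, low speed) so you can pin it there. Thresholds are made up,
# purely to show the idea.
def find_plant_frames(foot_positions, height_thresh=2.0, speed_thresh=0.5):
    planted = []
    for i in range(1, len(foot_positions)):
        (x0, y0, z0), (x1, y1, z1) = foot_positions[i - 1], foot_positions[i]
        speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
        planted.append(y1 < height_thresh and speed < speed_thresh)
    return planted
```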
As for a more realtime process, as you say… well, to an extent no, because iPi needs to track the motion first, which takes time, so it's not just there and done as soon as you've recorded. Otherwise, for using MotionMixer, sure… all MM requires is that the rig/controller set it's mixing is the same for each "clip"… so once you've got motions out of iPi, or whatever other in-between software you used, you can happily use them in LW's MM.
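One practical check on that "same rig for every clip" rule: before you load clips into MM, make sure they genuinely share the same joint set. A tiny sketch, assuming well-formed BVH files; the file names are placeholders.

```python
# Quick sanity check: pull the joint names out of two BVH files and compare.
def bvh_joints(path):
    names = []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if parts and parts[0] in ("ROOT", "JOINT"):
                names.append(parts[1])
    return names

def same_rig(path_a, path_b):
    return bvh_joints(path_a) == bvh_joints(path_b)

# e.g. same_rig("walk.bvh", "run.bvh") should be True before MM
# will blend the two clips happily.
```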