Selecting MPxLocatorNodes in 2016


#1

Has anyone figured out VP2 selection with locator nodes in 2016? I’m rendering a locator with an MPxDrawOverride, using addUIDrawables and MUIDrawManager. According to this, as of 2016 VP2 should automatically do hit testing against the geometry given to drawManager, but the object isn’t getting selected.

The boundingBox methods of both the MPxDrawOverride and the MPxLocatorNode are getting called and returning something reasonable. MPxLocatorNode.useClosestPointForSelection never gets called, which seems correct for VP2 selection. MPxDrawOverride.refineSelectionPath never gets called either, which is confusing, but maybe selection just isn’t making it that far. I think that’s where I’ll need to set the selection mask.

The goal is to reproduce the viewport handles that HumanIK has: drawn in xray and selectable by clicking anywhere in the handle (not just on a wireframe), so they’re easy to select. If anyone wants to take a look: testNode.py
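
Here’s a trimmed-down sketch of the setup, in case it helps (Python API 2.0; the node name, type ID and the sphere drawing are placeholders, not the actual contents of testNode.py):

```python
import maya.api.OpenMaya as om
import maya.api.OpenMayaUI as omui
import maya.api.OpenMayaRender as omr

def maya_useNewAPI():
    pass

class TestNode(omui.MPxLocatorNode):
    kTypeId = om.MTypeId(0x00012345)                  # placeholder ID
    kDrawClassification = "drawdb/geometry/testNode"  # placeholder name
    kDrawRegistrantId = "testNodePlugin"

    @staticmethod
    def creator():
        return TestNode()

    @staticmethod
    def initialize():
        pass

class TestNodeDrawOverride(omr.MPxDrawOverride):
    @staticmethod
    def creator(obj, *args):
        return TestNodeDrawOverride(obj)

    def __init__(self, obj):
        omr.MPxDrawOverride.__init__(self, obj, None)

    def supportedDrawAPIs(self):
        return (omr.MRenderer.kOpenGL | omr.MRenderer.kOpenGLCoreProfile
                | omr.MRenderer.kDirectX11)

    def hasUIDrawables(self):
        return True

    def boundingBox(self, objPath, cameraPath):
        # This gets called and returns something sane.
        return om.MBoundingBox(om.MPoint(-1, -1, -1), om.MPoint(1, 1, 1))

    def prepareForDraw(self, objPath, cameraPath, frameContext, oldData):
        return None

    def addUIDrawables(self, objPath, drawManager, frameContext, data):
        # The docs suggest this geometry is reused for VP2 hit testing in
        # 2016, but clicking the sphere doesn't select the node.
        drawManager.beginDrawable()
        drawManager.setColor(om.MColor((1.0, 0.8, 0.0)))
        drawManager.sphere(om.MPoint(0, 0, 0), 1.0, True)
        drawManager.endDrawable()

def initializePlugin(plugin):
    fn = om.MFnPlugin(plugin)
    fn.registerNode("testNode", TestNode.kTypeId, TestNode.creator,
                    TestNode.initialize, om.MPxNode.kLocatorNode,
                    TestNode.kDrawClassification)
    omr.MDrawRegistry.registerDrawOverrideCreator(
        TestNode.kDrawClassification, TestNode.kDrawRegistrantId,
        TestNodeDrawOverride.creator)

def uninitializePlugin(plugin):
    fn = om.MFnPlugin(plugin)
    omr.MDrawRegistry.deregisterDrawOverrideCreator(
        TestNode.kDrawClassification, TestNode.kDrawRegistrantId)
    fn.deregisterNode(TestNode.kTypeId)
```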


#2

Quick question: Which VP2 display mode are you running with when testing?

OpenGL?
OpenGL Core Profile?
DX11?

The only one that supports VP2 hardware selection is OpenGL Core Profile. If you’re testing in either of the other modes, the likely reason is that you haven’t implemented the MPxLocatorNode::draw() method in your locator. If you implement that (see the sketch below), it should select.

In 2015 and earlier, all VP2 selection is done through the VP1 path.
In 2016, all VP2 selection is done through the VP1 path except for OpenGL Core Profile mode.
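
Something along these lines for the VP1 path (API 1.0 sketch, since the GL function table is only exposed in the old API; the class name is just an illustration):

```python
import maya.OpenMayaMPx as ompx
import maya.OpenMayaRender as omr1

# The fixed-function GL table is only exposed through the old API.
glFT = omr1.MHardwareRenderer.theRenderer().glFunctionTable()

class TestNodeVP1(ompx.MPxLocatorNode):
    def draw(self, view, path, style, status):
        # Whatever gets emitted between beginGL/endGL is what the VP1
        # selection pass hit-tests against.
        view.beginGL()
        glFT.glBegin(omr1.MGL_LINES)
        glFT.glVertex3f(0.0, 0.0, 0.0)
        glFT.glVertex3f(0.0, 1.0, 0.0)
        glFT.glEnd()
        view.endGL()
```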


#3

That’s deeply confusing, since the docs say that the very purpose of the VP2 selection interface is to avoid requiring OpenGL when using renderers like DirectX:

As of Maya 2016, native Viewport 2 selection is available.

Before the 2016 release, selection could only be performed using existing Viewport 1 interfaces. The Viewport 1 interface drawing must use OpenGL even if the draw code for Viewport 2 is using a different drawing API (such as DirectX11).


#4

Well, that was a headache for such a simple thing: https://s3-us-west-2.amazonaws.com/temporary-random-junk/testNode.py

I think the only thing left is figuring out MSelectionMask. These nodes need to select like joints, so that they take selection priority over meshes and other geometry (just rendering in xray won’t do this). I figured that out with shape nodes (by overriding select() on the MPxSurfaceShapeUI class), but so far I haven’t found any hint of how to do this with other node types…
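
For reference, the shape-node version is roughly this pattern (API 1.0 sketch; the class name and the hit handling are illustrative, not the exact code from my file):

```python
import maya.OpenMaya as om
import maya.OpenMayaMPx as ompx

class HandleShapeUI(ompx.MPxSurfaceShapeUI):
    def select(self, selectInfo, selectionList, worldSpaceSelectPts):
        # No real hit test here; a real implementation would intersect the
        # pick ray with the handle geometry before accepting the hit.
        item = om.MSelectionList()
        item.add(selectInfo.selectPath())

        # Registering the hit with a joint mask is what gives the handle
        # joint-level priority over meshes behind it.
        mask = om.MSelectionMask(om.MSelectionMask.kSelectJoints)
        selectInfo.addSelection(item, om.MPoint(), selectionList,
                                worldSpaceSelectPts, mask, False)
        return True
```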


#5

I couldn’t find any way to set the selection mask for a locator (or for anything other than a shape), so I scrapped that approach and rewrote it as a shape node. It needs cleanup, but if anyone’s curious: https://s3-us-west-2.amazonaws.com/temporary-random-junk/zHandle.py


#6

Thanks for sharing this. I had been looking at the new footprint example in the SDK, but seeing how to do this in Python is excellent indeed.

David


#7

Next puzzle: freeze transforms doesn’t work, since there are no components for it to bake the transform into, which is pretty annoying for a node meant as a rig control. Any idea how to make that work? MPxLocatorNode applies (part of) a freeze transform to localPosition, but I can’t find any info about how it does that.

It seems like MPxSurfaceShape::transformUsing might be it, but that doesn’t get called…
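
For what it’s worth, a stub along these lines never fires (API 1.0 sketch; the arguments follow the C++ signature, so treat it as an approximation):

```python
import maya.OpenMayaMPx as ompx

class HandleShape(ompx.MPxSurfaceShape):
    def transformUsing(self, mat, componentList, cachingMode=None, pointCache=None):
        # If freeze transforms were routed through the shape, I'd expect
        # this to fire; it never does.
        print("transformUsing called")
```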

Edit: If I parent these to each other, pick-walking down with the down arrow doesn’t traverse the tree correctly. It just goes down once from the transform to the shape node and gets stuck there. I think it would only work if I made this an MPxTransform, which would be preferable (that’s closer to what joints and HIK controls are), but that would reintroduce a bunch of selection problems I’ve only managed to solve by using a shape node. Making the Maya API behave feels like a losing game of whack-a-mole…


#8

Can’t you just create a transform node for your custom shape in a post function of the shape class, and parent your custom shape under it as a child?

You shouldn’t have any selection problems, since the drawing and everything around it is handled in the shape anyway…

Or am I getting this completely wrong?

:slight_smile:

/risto


#9

I am not sure what you are trying to solve, but custom shapes for rig controls are not really something you have to go to the API for…

For example, we use NURBS curves to create custom shapes to be used as rig elements.

Are you looking for some additional functionality you cannot get using NURBS curves?

Edit: Are you referring to HumanIK in Maya or in MotionBuilder, or something else?


#10

HIK in Maya, of course. I explained the goal in the first post: viewport controls that are drawn in xray, like joints and HumanIK’s controls, so they don’t get lost in the scene and are easy to select, and that are drawn as a solid object like a mesh (unlike joints and curves, where you have to click exactly on an outline, which is very annoying). The only stock nodes I’ve found that draw in xray are joints and HIK controls. (HIK controls only work with HIK, not in isolation, and you get very few options for the look of joints.)

Other than the freeze transforms problem (which I can live with: I’ll just group the control and set the origin on the group, which is probably the right solution for translation anyway), it’s working well so far. I’ll drop it on GitHub once it’s cleaned up a bit more.


#11

Cleaned up, if anyone’s interested: https://github.com/zewt/zRigHandle

Now if only I could find a way to let people install plugins with just a .mod file, and not the ugly extra step of having to paste something into userSetup in order to get menus to load…

Another thing I haven’t figured out is how to name the transform created around the shape, so it isn’t called “transform1”. Builtin nodes seem to do this without needing a wrapper command…


#12

Check out this: http://ewertb.soundlinker.com/api/api.016.php


#13

Thanks, that worked. More of a trick to trigger a hack than an API…


#14

I’m sorry if this feels uncomfortable to you.
Should one choose a different toolset, perhaps, or something else to whine about?


#15

Those “how-tos” have really stood the test of time, haven’t they?

I particularly enjoyed reading the closing line…

Makes one consider that this whole exercise was rather pointless, doesn’t it?

Almost poetic in that context.

@gfk: Thanks for sharing the github link. That stuff is really useful for some of us still trying to catch up.

David


#16

Touché :slight_smile:

Those bits (and lumis.com, of course) were my bible back in the day :slight_smile:


#17

A rather unnecessary attitude up there…

djx: Still catching up here, too. Now I need to figure out why node editor templates won’t work, so I don’t have to use “show all attributes” on my nodes every time…


#18

The footprint examples from Autodesk are also provided in Python API 2.0 as of 2016.
Just look in the “devkit/plug-ins/scripted” folder.

The examples provide guidance on drawing the custom shape in VP2 (DX11 and OpenGL) and in the legacy viewport.


#19

The hard part wasn’t the drawing (I’ve done that before); it was getting selection, drawing, and xray to work at the same time. There were several false starts with MPxLocatorNode, VP2 selection, selection masks, etc. I still don’t know how to set a selection mask with an MPxLocatorNode (needed for an object drawn in xray to select correctly), which is why I switched to an MPxSurfaceShape. MPxSurfaceShapeUI is the only way I’ve found to do this.


#20

You’re right Gfk, and I apologize.

:expressionless: