OK…I’m just wondering if this would be useful for others…I’m guessing it will be especially useful for Wacom tablet users (where RMB and MMB are just plain unnatural).
With armatures, rather than just having the choices RMB, MMB, LMB, it would be great to have mouse direction as well…so you could have, say, LMB moving left and right change heading, and LMB (yes, LMB) moving up and down change pitch…I reckon this would be rather useful and would place less strain on that poor old thumb of mine! - Ergonomics!
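Just to illustrate the idea (this is not messiah code - every name here is invented), the mapping being requested boils down to converting a mouse drag delta into two rotation channels:

```python
# Purely illustrative sketch: horizontal drag adjusts heading,
# vertical drag adjusts pitch, scaled by a sensitivity factor.
# None of this reflects messiah's actual API.

def drag_to_rotation(dx_pixels, dy_pixels, sensitivity=0.25):
    """Convert a mouse drag delta (in pixels) into heading/pitch
    deltas (in degrees). dy is negated so dragging up pitches up."""
    d_heading = dx_pixels * sensitivity
    d_pitch = -dy_pixels * sensitivity
    return d_heading, d_pitch

# A drag 40px right and 20px up nudges both channels at once:
print(drag_to_rotation(40, -20))  # (10.0, 5.0)
```

One button, two channels - that's the whole ergonomic win being asked for.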
Oddly enough I just asked the exact same question yesterday in both this forum and the user group message board. I’ve yet to get an answer but I don’t believe it is natively possible. I’m hoping that I will be able to implement said behavior via messiah:Script or the API.
I’d have to agree 100 percent with this one! Lightwave behaves in a similar fashion. It just seems more intuitive to have one mouse button control several axes of movement. This simple control addition would be at the top of my wishlist of messiah features. My 40-year-old fingers prefer fewer separate LMR button clicks…
Thanks…It’s an interesting point about ‘fewer separate LMR button clicks’ - I completely agree. I went over to a pen about three years ago (partly due to tennis elbow!)…I haven’t looked back. Wacom tablets take a little getting used to, but when you’ve given them a week or so you discover they are SOOOO much more comfortable to handle…The only thing that reduces their ergonomics is software that involves lots of LMR clicks…Software developers should recognise the pen as a growing method of ‘data entry’ and allow for customisation.

I imagine that most pen users prefer NOT to use MMB or RMB clicks but, like me, prefer keyboard and pen combinations (i.e. LMB and keyboard). LW has always been good at this. Messiah has grown up without much pen usage (only since v5 could you actually use one properly - as soon as the pen-friendly version came out, I bought it the next day!) and so there are some aspects that could benefit from user customisation. An example here is that pen users (IMO) tend to prefer slider changes by horizontal movement rather than vertical movement.

I am hoping that someone at PMG will start to use a pen, or at least get some input from a 100% pen user, so that the software can evolve to be equally pen-friendly and mouse-friendly…which would make the application far more hands-friendly!!!
PS Have you tried a pen? If the mouse is proving to be irritating after long hours/days…give a pen a go…They are simply fantastic!
Hi. I’m not sure if that’s what you’re after, but note that you can activate the M (or R for rotations, etc.) in the middle column, right next to the channel setting of an armature action. It acts the same as if you clicked e.g. the M in the middle of an edit sphere, and lets you move an armature on two axes simultaneously.
One thing that I’d like to see for armature handles, though, is an additional option to use the ctrl, alt and shift key modifiers. This would add some more possibilities to the use of one handle.
Thanks maks…what is weird is that you can select a single channel in the first column and also “m” in the middle column at the same time. Does that just mean that the middle column takes precedence over the first column’s selection?
Also, this might help DMack out but it isn’t really what I’m trying to do. Let’s say I have a bloat effect and a melt effect applied to a mesh for some reason. I would like to drive the bloat strength with the armature’s y-axis movement and the melt strength with the armature’s x-axis movement. Or let’s say I have two poses I have saved for a mesh that are meant to work together. I would like to specify one pose to be driven by one axis and the other pose by the other axis…that way I could adjust them both at the same time. Kind of like how Z-Brush allows for a more organic feel to modeling, this is a more organic form of animation…meta-animation!
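The bloat/melt idea above can be sketched in plain terms (this is a toy illustration, not messiah code - the function and effect names are made up): one 2D armature position feeds two independent effect strengths.

```python
# Hypothetical sketch of "one armature, two effects": the armature's
# x movement drives melt strength, its y movement drives bloat
# strength, each clamped to the 0..1 range an effect expects.
# Names are invented for illustration only.

def drive_effects(armature_x, armature_y):
    """Map a 2D armature position to two effect strengths."""
    clamp01 = lambda v: max(0.0, min(1.0, v))
    return {
        "melt": clamp01(armature_x),   # left/right controls melt
        "bloat": clamp01(armature_y),  # up/down controls bloat
    }

# One drag adjusts both effects at once:
print(drive_effects(0.3, 0.9))  # {'melt': 0.3, 'bloat': 0.9}
```

Dragging the handle diagonally would then blend both effects in a single gesture - the “meta-animation” feel described above.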
The concept basically replaces sliders. Sliders really seem to behave like less-powerful armatures…or at least, armatures evolved from sliders. The documentation itself says “…even we have only scratched the surface of what they’re capable of.” And I think that’s completely true. There really is no longer any need for something as old-fashioned as sliders. As PMG says, “messiah represents something new in the world of 3D computer graphics: a system designed by animators.” If armatures can be “opened up” to control more functionality, then animators will certainly have the power to work more like artists and less like programmers. I can visualize setting up a face with a bunch of poses, or weights, or bones…then having a graphic of the face with armatures overlaying it…each one driving an effector. Left-click and drag an eyebrow armature around and the mesh follows. Middle-click and it mirrors the movement so that both eyebrows are positioned. Left-click an eye to make a wink. Middle-click to make a blink.
I personally feel the possibilities would be endless.
I know what you mean, and personally I also feel that armatures could be improved in some places. However, with some workarounds you may be able to achieve what you want by mapping e.g. your morph slider (or whatever) to a null’s channels through expressions (driven keys), and then controlling that null with an armature handle. You can also clamp the null’s movement and divide it into four ranges (negative x, positive x, negative z and positive z) through min/max/abs/etc. functions, so you could e.g. control 4 morphs (or whatever) with one armature handle (when set to M). Does this make sense?
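The clamp-and-divide trick above can be written out as plain math (a hypothetical sketch only - messiah’s actual expression syntax differs, and all names here are invented):

```python
# Sketch of maks's workaround: one armature handle (set to M) moves a
# null on the X/Z plane, and min/max/abs expressions split that 2D
# position into four non-negative morph strengths. Illustrative only;
# not real messiah expression code.

def split_handle(x, z, limit=1.0):
    """Clamp the null's position to [-limit, limit] on each axis and
    return four morph strengths: (neg-x, pos-x, neg-z, pos-z)."""
    x = max(-limit, min(limit, x))
    z = max(-limit, min(limit, z))
    return (
        abs(min(x, 0.0)),  # fires only when x < 0
        max(x, 0.0),       # fires only when x > 0
        abs(min(z, 0.0)),  # fires only when z < 0
        max(z, 0.0),       # fires only when z > 0
    )

# Dragging the handle up-left drives two of the four morphs at once:
print(split_handle(-0.5, 0.8))  # (0.5, 0.0, 0.0, 0.8)
```

Because each of the four outputs is zero outside its own quadrant direction, the four morphs never fight each other, which is what makes one handle workable for four targets.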
Makes perfect sense, maks. Actually, it’s a rudimentary version of what I was talking about…the armature controlling the null controlling the sliders; sure, it’s kind of clunky, but it is a start. Essentially we need to cut out the “middlemen” (aka nulls and sliders) and be able to apply the expressions and feed the data directly to the armatures - a very elegant solution for so many different and varied animation needs. To me, that would really be bringing armatures into their own. Imagine how clean and powerful the interface would be.
After looking some more at the API docs I’m thinking it might be possible to attach my own data to an armature which means I might be able to both constrain its movement and cause it to drive something else in the app. But I’m not sure since I still don’t know much about responding to ANs and retrieving data from entities like sliders or nulls. Still…it will be fun exploring!
It maps mouse keys to the numeric keyboard.
I tested it with Maya, and it seemed rock solid for use with a Wacom.
The only problem is it might be uncomfortable to have your fingers on the keypad all the time. I’ve played with other keyboard mappers that will remap the remapped keys to whatever else…it gets kinda messy.
I spend a lot of time in Mirai at my day job, which is about the most ergonomic software around. When I’m in Maya, my fingers start to hurt.
As a kooky sidenote, I use my left hand on the mouse during the day, and when I’m on my computer at home, I use my right hand. During the old days as a modeler at Viewpoint, we were compelled to use our left hands. Seemed petty at the time, but now I’m grateful for the ability to switch back and forth. My right hand would have been shot by now.
Wow, a fellow Mirai user…few and far between. What OS are you running it on? I’ve been trying to get it to run on XP, but so far it’s been like turning lead into gold. And yes, Mirai has the quickest, most intuitive interface I’ve ever used. Boning and displacements are a breeze, and switching between IK/FK is fast and seamless.
That sort of thing is best handled via an action. However, it would require that developers be able to get the mouse position info. This has been added to the API and will be available in the next update.
-lyle
ps: if time allows, I may write the action for this and publish the code as well. Custom actions are an almost completely untapped resource, and armatures have a lot of unexplored power. There should be a lot more armature actions out there, and hopefully this example (along with other forthcoming goodies) will spark more development.