Hi,
Could anyone suggest how I could go about creating this animated post-it effect in Maya?
I'm guessing colorAtPoint in some form?
thanks so much!
Yes, you could rely on colorAtPoint, but I think (might be wrong) that you can't pass color to instances that way. Another way to do it would probably be to use XGen or SOuP. Otherwise you can do it with colorAtPoint and normal objects, along the lines of the sketch below.
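A minimal MEL sketch of the colorAtPoint route, assuming a 2d texture named animatedFile1 (the name is just an example):

// Sample a 10x10 grid of RGB values across the texture's UV space.
// The result is 300 floats (r, g, b per sample) that can then be used
// to drive per-note attributes on the wall.
float $vals[] = `colorAtPoint -o RGB -su 10 -sv 10 animatedFile1`;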
Hi there
There are a couple of ways one might design a shot like this, but before going deeper, one should already know something about the renderer being used (if one means to do this all by oneself, I mean), since it has some effect on how things will go.
But to put this in some order, this would be (one) of the ways I'd do it.
!! Now this is the point where your renderer makes the difference.
6) Feed the read color information to the proxy, which will randomly pick a flapping note from the cache.
This last phase is usually done by adding a custom attribute to the proxy transform, which is then read in the shader used to shade the cached, full-res version of the flapping note.
Can be done at least with RMS, Arnold or MR; with other renderers I have no experience whatsoever. A sketch of the custom-attribute part follows.
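For what it's worth, a hedged MEL sketch of the custom-attribute step, assuming Arnold/MtoA conventions (MtoA exports shape attributes prefixed with mtoa_constant_ as per-object user data; all node names here are examples):

// Add a color attribute on the proxy shape for the shader to read.
addAttr -ln "mtoa_constant_noteColor" -at float3 -usedAsColor proxyNoteShape1;
addAttr -ln "noteColorR" -at "float" -parent "mtoa_constant_noteColor" proxyNoteShape1;
addAttr -ln "noteColorG" -at "float" -parent "mtoa_constant_noteColor" proxyNoteShape1;
addAttr -ln "noteColorB" -at "float" -parent "mtoa_constant_noteColor" proxyNoteShape1;
setAttr "proxyNoteShape1.mtoa_constant_noteColor" -type float3 0.9 0.2 0.1;
// At render time an aiUserDataColor node set to look up "noteColor"
// reads this value per object and can drive the note shader.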
Hope this sheds some light on the subject.
/Risto
SOuP has the copier node, which can pass color info to instances and do time offsets as well. The workflow takes a bit of time to understand and set up. Honestly, it would take me a week to get it all working if I started right now.
I am going to be the lame loser who suggests the lamest of lame ideas.
Get your max 'resolution' (post-its high x post-its wide) and make that the resolution of your image sequence. SOuP's copier can quickly turn a wall of post-its into a single mesh that can be textured using old skool UV tools without much work.
UV your wall of post-its into UV space, keeping the pixel/post-it aspect ratio in mind. It will take a bit of fussing to line everything up… 'a bit' being a day or two MAX, and then you can quickly render/tweak your animation and make it look awesome.
Just some rough ideas here…but hopefully something helpful.
~Ben
Here is a similar trick…
Create a default plane and up its subdivisions to the number of post-its you want.
Then select all the vertices and do Edit Mesh > Detach Component. Now select all faces and do Transform Component. Set the local scale X and Y on the transform component to something like 0.02; all the faces should now be small. Now do Create UVs > Planar Mapping down the Y axis. Delete history, then select all faces again and scale them back up using the transform component. If you apply any texture to the mesh now, all the faces will have a constant color from the texture, because the UVs are from when they were small.
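A rough MEL version of those steps, assuming the face transform is done with a polyMoveFacet node (which carries the local scale) and an example object name:

// Build a 50x50 wall of post-its as a single plane.
polyPlane -w 10 -h 10 -sx 50 -sy 50 -n "postItWall";
// Detach every vertex so each face becomes independent.
select -r "postItWall.vtx[*]";
polySplitVertex;
// Shrink each face toward its own center...
select -r "postItWall.f[*]";
polyMoveFacet -localScaleX 0.02 -localScaleY 0.02;
// ...planar map down Y while the faces are tiny...
polyPlanarProjection -md "y" "postItWall.f[*]";
// ...then lock the UVs in by deleting history and scale the faces back up.
delete -ch "postItWall";
select -r "postItWall.f[*]";
polyMoveFacet -localScaleX 50 -localScaleY 50;
// Each face now samples one near-constant color from any applied texture.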
You could then do Add Divisions to get more detail on each note, then make the mesh into nCloth and animate the motion by creating constraints and forces.
OOOOoooooooooooohhhhhhhhhh.
Nice tiny UV tip! Makes lining things up infinitely easier.
Have you used Maya before? 
Duncan, it would be so nice to have a colorAtPoint node.
It would open up lots of possibilities for many effects: give it a UV position and an object and have it sample the color… my dream for so long.
The issue is that one needs to know how to sample the UVs that are input to the texture. The shading engine knows how to initialize the texture UV inputs for different object types when doing a shading sample; however, non-shading-engine contexts are not so well defined.
For the various non-shading things that support 2d textures we resort to a sort of hack… the 2d textures all have added sampling methods that evaluate the texture for a passed-in array of UV positions. The entity, say the cloth node, then initializes these arrays based on its data and calls this method on the texture. In these cases we use the graph connection to the texture just to get at this method… we don't pass data through the connection, and we don't fully update the network the way it happens in a shading texture evaluation. In some cases this can result in update problems due to the non-standard DG evaluation. If we were to do it with data passing, we would need an output UV array on the object that passes to the texture, which would then output color and alpha arrays… this is more the way fields are evaluated.
That said, how would you want such a colorAtPoint node to behave? Where would you most want to use it?
If one plugged a value into the uvCoord attribute of a texture, then the value resulting from connecting the outColor would be the value for that UV. However, this is only one UV value… typically one wants to iterate over a range of UV values. (It is possible in a MEL script to loop over a range of UVs, then inside the loop do setAttr on the uvCoord followed by getAttr on the outColor, but the colorAtPoint command is more efficient. See the sketch below.)
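A minimal MEL sketch of that loop, assuming a 2d texture named checker1 with nothing connected to its uvCoord (e.g. no place2dTexture):

// Sample a 4x4 grid of UVs by setting uvCoord and reading outColor.
for ($i = 0; $i < 4; $i++) {
    for ($j = 0; $j < 4; $j++) {
        float $u = ($i + 0.5) / 4.0;
        float $v = ($j + 0.5) / 4.0;
        setAttr "checker1.uvCoord" -type float2 $u $v;
        float $c[] = `getAttr checker1.outColor`;
        print ("uv " + $u + " " + $v + " -> " + $c[0] + " " + $c[1] + " " + $c[2] + "\n");
    }
}
// colorAtPoint does the same sampling in one call:
// colorAtPoint -o RGB -su 4 -sv 4 checker1;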
SOuP allows this kind of functionality. The nodes most similar to a colorAtPoint node take a mesh input for what I assume is the UV data, plus a texture input. The result is an RGB array. Of course, this array is mostly only usable by other SOuP nodes, but the workflow of plugging in a mesh and a texture is logical to me.
Something that behaves like the nearestPointOnMesh node: give it a UV coordinate and a mesh node (or eventually a shading group) and have it sample the color around that point, just like colorAtPoint behaves. But it would be better to have it output an RGB color rather than an array.
We could use it to sample color at a certain point, drive blendShapes, pass it the color of another texture, drive the positions of objects around; the ideas are limitless. Actually, the colorAtPoint command is fine; it's just that while scripting it's much better to create connections with nodes, and it would probably be much faster.
Maya does need, at some point, something like SOuP that would extend its capacity to sample any info anywhere in space with nodes and drive any attribute with nodes. Maybe Bifrost is going to open up new possibilities; only you and a couple of others know what's coming in the near future.
Thanks for your feedback.
Actually, I never thought of that! This is a nice trick; I tested it on a noise texture and it works. The problem is that to have the texture sampled at multiple points you need the same number of corresponding textures, so in that case a node after the texture would be more efficient. Plus, it samples the texture without being based on any object, but this could be fine for many situations.
Thanks for that good trick!
edit:
Never mind what I said above; plugging the texture into an empty ramp and reading the corresponding color on the ramp actually works! This is in fact a super trick!
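For reference, a minimal MEL sketch of that ramp trick as described (node names are examples; noise1 stands in for whatever texture you want to sample):

// An 'empty' ramp used purely as a sampler for the upstream texture.
shadingNode -asTexture ramp -n "samplerRamp";
removeMultiInstance -break true samplerRamp.colorEntryList[1];
removeMultiInstance -break true samplerRamp.colorEntryList[2];
connectAttr -f noise1.outColor samplerRamp.colorEntryList[0].color;
// Sampling the ramp at a UV now also samples the upstream texture there.
setAttr "samplerRamp.uvCoord" -type float2 0.25 0.75;
float $c[] = `getAttr samplerRamp.outColor`;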
wow!! 
Sorry for my slow response; I've been busy on a job but just saw the great responses from everyone. Thanks so much!
OMG… so easy… great help!!
I'll still investigate SOuP though.
Many thanks
Or, if you use V-Ray, you can pass multiple attributes, including particle color, to instances.
It's easy to set up: just add the per-particle attribute export to the particle object, enable RGB or whatever you want, and use a particleSamplerInfo node to pass it to the shader on the object being instanced.
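A hedged MEL sketch of the Maya-native half of that hookup (V-Ray's own per-particle attribute export settings live on the particle shape and are not shown; node names are examples):

// Per-particle color arrays on the particle shape (rgbPP plus its initial state).
addAttr -ln "rgbPP0" -dt vectorArray particleShape1;
addAttr -ln "rgbPP" -dt vectorArray particleShape1;
// particleSamplerInfo reads rgbPP at render time and feeds the shader.
createNode particleSamplerInfo -n "psInfo";
connectAttr psInfo.outColor noteShader.color;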
I included a simple example