Yeah, I guess I should mention that I’m trying to stay renderer-agnostic but software-specific here (hence the thread in Maya Rendering :-)).
I took the time to clear my head and try to illustrate this better. Let’s hope the .gif works.
In the simplest terms, say there’s a shader for LEGO bricks. Now, all basic LEGO bricks have the same material properties, save for colour. Normally, textures define variation across a surface, but a LEGO brick has only one colour. A LEGO object is made of many different bricks, so one idea would be to store all the material properties on a null node, hook them up to each brick shader, and change only the RGB sliders per shader (as shown).
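Here’s a rough MEL sketch of that null-node idea (all node and attribute names are made up, and the blinn is just a stand-in):

// One locator holds the shared material settings...
spaceLocator -name "legoMaterialMaster";
addAttr -longName "sharedReflectivity" -attributeType "float" legoMaterialMaster;
addAttr -longName "sharedEccentricity" -attributeType "float" legoMaterialMaster;
setAttr "legoMaterialMaster.sharedReflectivity" 0.3;
setAttr "legoMaterialMaster.sharedEccentricity" 0.1;

// ...and every per-colour shader pulls from it, so only .color differs.
string $colours[] = {"red", "blue", "yellow"};
for ($c in $colours) {
    string $shader = `shadingNode -asShader blinn -name ("legoBrick_" + $c)`;
    connectAttr "legoMaterialMaster.sharedReflectivity" ($shader + ".reflectivity");
    connectAttr "legoMaterialMaster.sharedEccentricity" ($shader + ".eccentricity");
}
setAttr "legoBrick_red.color" -type double3 1 0 0;
setAttr "legoBrick_blue.color" -type double3 0 0 1;
setAttr "legoBrick_yellow.color" -type double3 1 1 0;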
This works fine for LEGO, since there are only, what, around 100 colour possibilities. However, if I want to cover any significant portion of RGB space, the shader-per-colour overhead becomes too big for Maya. It would make much more semantic sense, and hopefully introduce much less overhead, to have one shader define the material properties and store the RGB values on the objects themselves.
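Sketched in the same vein (the object names are hypothetical), storing the colour on the objects would look something like this; the part I can’t work out is the shader side:

// Store the colour on the objects themselves as a custom colour attribute.
string $bricks[] = {"brick1", "brick2", "brick3"};
for ($b in $bricks) {
    addAttr -longName "brickColour" -usedAsColor -attributeType "float3" $b;
    addAttr -longName "brickColourR" -attributeType "float" -parent "brickColour" $b;
    addAttr -longName "brickColourG" -attributeType "float" -parent "brickColour" $b;
    addAttr -longName "brickColourB" -attributeType "float" -parent "brickColour" $b;
}
setAttr "brick1.brickColourR" 1; setAttr "brick1.brickColourG" 0; setAttr "brick1.brickColourB" 0;
setAttr "brick2.brickColourR" 0; setAttr "brick2.brickColourG" 0; setAttr "brick2.brickColourB" 1;
// The missing piece: one shared shader whose .color follows whichever
// object's .brickColour it happens to be shading.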
Taking this method to a silly extreme, I suppose you could pipe all material information in a scene, with custom attributes, to a single standard raytrace material.
This notion of one parameter feeding many shaders, as opposed to many parameters feeding one shader, is my issue. The relationship is unambiguously defined, so it should be straightforward to explain to Maya.
@willanie What you’re saying sounds like the solution (passing by reference in expressions). Could you please provide more details?
P.S. I looked for a way to tell MEL something like (pseudocode incoming)
shader.attribute = *(whateverObjectTheShaderIsAssignedTo).customAttribute
but couldn’t find anything. Intuition tells me that’s because I don’t know how shading groups work.
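For reference, here’s roughly how I understand the plumbing so far (just a sketch; "legoBrick_red" is the hypothetical shader from above). The shader’s outColor feeds a shadingEngine node (the shading group), and the assigned objects are members of that set:

string $sgs[] = `listConnections -type shadingEngine "legoBrick_red.outColor"`;
string $sg = $sgs[0];
// The members of the shading group are the shapes/faces the shader is assigned to.
string $members[] = `sets -query $sg`;
print $members;
// But I still don't see how to make shader.attribute track each member's
// customAttribute at render time, rather than copying a value across once.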
EDIT: A new post popped up as I was writing this. I think we’re on the same wavelength, willanie, but I’m still not following you on the gamma node.