I think both models will continue to be used into the future. With a single monolithic app, you get better integration of all the parts, and there’s only one interface to learn. For a single user this is very appealing, because trying to keep multiple interfaces in your brain at once is a huge pain. Larger shops, especially ones that practice division of labor, could have individual departments supplement or replace their part of the pipeline with a dedicated app, with the monolithic app still serving as the backbone of the pipeline. Production studios that freely intermingle different apps have to build that backbone themselves: something that can store and translate data among all the apps.
Building an entire pipeline from individual apps is probably only now becoming practical. Actually, it isn’t quite there yet. You have stand-alone apps that do modeling, UV unwrapping, and texture map generation well. You also have several great stand-alone renderers to choose from, some of which are built on the Renderman specification. But in the middle, where you have layout, rigging, animation, dynamics, and lighting, there isn’t much in the way of good stand-alone applications. I haven’t used Motion Builder before, but it might serve as a stand-alone rigging and animation app. But what would you light in? The lighting app has to be tied to the renderer to some degree. Shader creation in the texturing department is also tied closely to the renderer, and particle systems tend to have to keep the renderer in mind as well.
Now that the modeling and rendering areas are covered by several good stand-alone choices, perhaps the other parts of the pipeline are next. An animation package that took in OBJs or LWOs and spit out deformed meshes snapshotted per frame would certainly be possible. Likewise, a lighting package that reads in these object sequences and uses light rigs to drive a Renderman renderer would also be possible.
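To make that concrete, here’s a minimal sketch in Python of the export loop such an animation package might run: evaluate the deformation each frame and bake the result out as a dumb OBJ sequence for the downstream apps to pick up. The sine-wave deformer and the file naming here are made up purely for illustration, standing in for a real rig evaluation.

    # Sketch of the "snapshot per frame" idea: evaluate the deformers
    # each frame and write the result as a plain OBJ sequence that a
    # separate lighting/rendering tool can read. The deformer below is
    # a hypothetical stand-in for a real rig.
    import math

    def write_obj(path, points, faces):
        """Write a minimal OBJ: vertex positions plus face indices."""
        with open(path, "w") as f:
            for x, y, z in points:
                f.write("v %f %f %f\n" % (x, y, z))
            for face in faces:
                # OBJ face indices are 1-based
                f.write("f %s\n" % " ".join(str(i + 1) for i in face))

    def deform(points, frame):
        """Stand-in for the animation package's rig/deformer evaluation."""
        return [(x, y + 0.1 * math.sin(frame * 0.2 + x), z)
                for x, y, z in points]

    # A single quad as the rest-pose mesh.
    rest = [(0, 0, 0), (1, 0, 0), (1, 0, 1), (0, 0, 1)]
    faces = [(0, 1, 2, 3)]

    # The export loop: one baked OBJ per frame.
    for frame in range(1, 25):
        write_obj("mesh.%04d.obj" % frame, deform(rest, frame), faces)

The lighting package would never need to know a rig existed; it just reads mesh.0001.obj, mesh.0002.obj, and so on.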
The key to establishing a pipeline that incorporates multiple stand-alone apps is common data formats they can all read. OBJ is a bit limited in the data it can store. LWOs are better, but their texture info is very Lightwave-specific. DXF and 3DS are not really good choices. Houdini’s BGEO format is a really good step in the right direction for a common geometry format, but Side Effects hasn’t published or documented it well enough. There aren’t any standard formats I’m aware of that store a sequence of snapshotted vertex positions (Lightwave’s MDD format is the closest I know of, but it’s undocumented). There also isn’t a good standard way to attach shaders to geometry that would work across multiple renderers (the shaders themselves have to be renderer-specific, of course).
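For what it’s worth, a vertex-snapshot format like that wouldn’t have to be complicated. Here’s a hypothetical layout, purely for illustration (it is not the actual MDD spec, which as noted is undocumented): a small header with frame and point counts, then raw XYZ positions for every point on every frame, with topology left entirely to the companion OBJ or LWO.

    # A hypothetical vertex-cache layout (NOT the real MDD format):
    # header of two big-endian ints, then num_frames * num_points
    # XYZ float triples. Topology lives in the geometry file.
    import struct

    def write_vertex_cache(path, frames):
        """frames is a list of frames; each frame is a list of (x, y, z)."""
        num_frames = len(frames)
        num_points = len(frames[0])
        with open(path, "wb") as f:
            f.write(struct.pack(">ii", num_frames, num_points))
            for frame in frames:
                assert len(frame) == num_points  # topology must not change
                for x, y, z in frame:
                    f.write(struct.pack(">fff", x, y, z))

    def read_vertex_cache(path):
        with open(path, "rb") as f:
            num_frames, num_points = struct.unpack(">ii", f.read(8))
            return [[struct.unpack(">fff", f.read(12))
                     for _ in range(num_points)]
                    for _ in range(num_frames)]

Keeping topology out of the cache is what makes it app-agnostic; the only contract is that point count and point order match the geometry file.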
Oh, one more thing. For a multi-app pipeline, it is probably a good idea not to store the geometry within the scene file. You have files that represent your geometry, like LWO or OBJ. You have separate files that define the texturing, like Renderman SL source and compiled SLOs. The geometry file simply specifies which polygons get which shaders, and the shaders can access all the painted weight maps and UV coordinates stored within the geometry file. The exporter that submits the scene to the renderer, such as a RIB generator, would have to tie these together. The scene file should access each piece of geometry via a path to the file where that geometry is stored. This way geometry can be edited independently of the scene file, and the modeling app doesn’t have to know anything about the scene specification. Lightwave’s LWS scene files work like this, though the LWS specification itself is pretty limited. Maya and other apps can work like this to some degree if you use a lot of references, though the way Maya scene files are built, such an approach would get unwieldy really quickly. RIB files aren’t designed to be read back in as an editable scene, so they wouldn’t work as a scene format, but the way RIBs use Read Archives to break geometry out into individual files separate from the main scene file is along these same lines.
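As a rough illustration, here’s what a minimal exporter along these lines could look like. The scene description is entirely made up; the output uses the real Read Archive mechanism, under the assumption that each object has already been baked out to its own RIB archive and its shader compiled for the target renderer.

    # Hedged sketch: a scene that stores only *paths* to geometry plus
    # a shader binding, tied together by a tiny RIB generator. The
    # scene tuples and shader names below are hypothetical.
    scene = [
        # (object name, path to baked geometry archive, compiled shader)
        ("chair", "geo/chair.rib", "woodgrain"),
        ("floor", "geo/floor.rib", "scuffedtile"),
    ]

    with open("shot.rib", "w") as rib:
        rib.write('Display "shot.tif" "file" "rgba"\n')
        rib.write("WorldBegin\n")
        for name, geo_path, shader in scene:
            rib.write("AttributeBegin\n")
            rib.write('  Attribute "identifier" "name" ["%s"]\n' % name)
            rib.write('  Surface "%s"\n' % shader)        # renderer-specific
            rib.write('  ReadArchive "%s"\n' % geo_path)  # geometry by path
            rib.write("AttributeEnd\n")
        rib.write("WorldEnd\n")

Because the geometry only ever appears as a path, a modeler can keep re-baking geo/chair.rib without the scene file changing at all.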
Perhaps an à-la-carte pipeline will be possible in a few years, but more standard formats and ways of passing data around need to be developed first. Should prove interesting to watch, though.
Guy Who Spends Too Much Time Thinking Of CG Pipelines