EIAS Review


#21

Ok, now I understand. If Camera (as a standalone app) were accessible to other apps, this would make sense. Maybe it would also help a bit to add .fbx output to Animator. I remember this was a much-wanted feature in the past as well. But for the rest of us, I’m very happy to hear that the Igors want to make Camera competitive with VRay, etc.

Regards
Stefan


#22

Well, basic space rental is $36 per square foot. So a 10x10 booth (bare) would cost $3600 minimum. I believe there are additional charges if you want internet connectivity. Then it needs to be furnished to some degree. That will cost some money. Then you have fees for the card readers if you want to rent those. Promotional materials…that’s gonna cost something. Plane tickets for Brad and Phil. Matt could drive up from Dana Point. Hotel costs and other expenses while in LA. Yes… LA users could probably lend a hand by pitching in some machines to prevent EITG from having to ship machines in. You’d really need a new demo reel cut, and that needs to be displayed on a nice flat-screen TV…

Yah… you’re definitely close to pushing $7k-$8k all said and done…probably more. That’s money out of development time. I think it would be worth it…but when EI comes back to Siggraph, it needs to be done right and done reasonably in style. A card table and chairs with flyers printed out on an inkjet printer certainly doesn’t raise confidence that EI has returned!
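For anyone who wants to sanity-check that figure, here is a rough back-of-the-envelope version in Python using only the numbers quoted above; everything beyond the space rental is left as a lump-sum remainder rather than guessed at item by item:

```python
# Back-of-the-envelope booth math, using only the figures quoted above.
rate_per_sqft = 36                 # basic space rental, $ per square foot
booth_area    = 10 * 10            # bare 10x10 booth, in square feet
space_cost    = rate_per_sqft * booth_area
print(space_cost)                  # 3600 -- the bare-minimum space cost

# The "all said and done" estimate is $7k-$8k, so everything else (internet,
# furnishings, card readers, promo materials, flights, hotel, demo reel,
# flat-screen TV) has to fit into the remainder:
for total in (7000, 8000):
    print(total - space_cost)      # 3400 and 4400
```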


#23

Well done article, Brian!
The approach should at least give those unfamiliar with EI the sense that
this is a professional application to be taken seriously. It may help pave
the way to wider acceptance of Tesla and Kryptonite when they arrive.

Could Kryptonite be a RIB renderer?

Jim


#24

Alas no… it is not.


#25

Thanks for the quick and clear answer, Brian.

One more if I may.
In your article you wrote:

“Taking a page from the Luxology / modo playbook, expect to see Tesla grow into a market of its own later through the year.”

Now that sounds an awful lot like Tesla will indeed become the foundation for a next-generation 3D application with rendering and animation. Can you clarify?

Thanks.

Jim Mulcahy


#26

Indeed. Tesla is intended to be a standalone product that is closely intertwined with the existing EIAS, but it will be sold separately, outside of the “Animation System,” and it will eventually have a direct link to Camera. (Which will encourage additional sales of EIAS/Camera.)

Animator will remain in place, being fully developed into v8 and beyond; however, if the reception of Tesla is exceptionally strong, I could see it taking on a life of its own in the form of additional animation capabilities as well. It’s hard to say at this point. A new application on a new framework…it’s hard to resist the idea…but ultimately it will be the users (through their purchasing patterns) that will determine the future.


#27

…is useless if nobody knows about it.

SIGGraph

Booth

Do it now.

Oh, btw, get a booth at SIGGraph.
Did I mention that there should be a booth at SIGGraph?
No seriously, get a booth at SIGGraph.
Question: did you get a booth at SIGGraph? No? Get it now.

IMHO, Macworld would have been the right place to SELL EI 7.0. People go to Macworld to BUY stuff. But that ship has sailed.

IMHO, the ONLY reason C4D is where it is today is because Paul Babb marketed the hell out of it.

Sooooooo… Get a booth at SIGGraph.


#28
I think what a lot of people don't get is that renderers like PRMan or 3Delight are not about RIB.
They are about flexibility. RIB isn't even needed. 3Delight has allowed sending data directly to the renderer via the RI (via C/C++ etc.) from the beginning, and Pixar announced at their recent user group meeting in London that they are planning to do the same thing for PRMan 14.

They also announced Python support (effectively allowing one to use Python in place of RIB, if somebody wanted). Apart from Pixar's forthcoming bindings, there have been Python bindings for RMan via CGKit for years, and many places have been using them. Gelato used Python from the beginning, and even though the renderer itself has failed to impress me so far, their API is much better than the dated RI.

From the perspective of today's pipelines, using a scripting language as the glue is a much better choice.
A language like Lua has the same parsing speed as RIB but all the advantages of a full scripting environment (which RIB isn't). Python is 20 times slower than RIB or Lua to parse, but you shouldn't really use a text-based format to store geometry for the renderer any more these days anyway ... it doesn't matter that much if it really is only used to 'glue' stuff.
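To make the "RIB isn't even needed" point concrete, here is a minimal sketch of describing a scene to a RenderMan-compatible renderer through a Python RI binding instead of hand-writing RIB. It assumes a CGKit-style ri module whose functions mirror the C RI calls; the exact import path and parameter-passing conventions are assumptions that vary between bindings:

```python
# Minimal sketch: emitting a scene through an RI binding rather than a
# hand-written RIB file. Assumes a CGKit-style "ri" module; import path and
# parameter conventions are assumptions, not any specific product's API.
from cgkit import ri

ri.RiBegin(ri.RI_NULL)              # RI_NULL: default output; pass a filename
                                    # here if you do want a RIB file on disk
ri.RiDisplay("sphere.tif", "file", "rgba")
ri.RiFormat(640, 480, 1.0)          # resolution and pixel aspect ratio
ri.RiProjection("perspective", fov=40)

ri.RiWorldBegin()
ri.RiTranslate(0, 0, 5)             # move the sphere in front of the camera
ri.RiSurface("matte")               # a stock shader; programmable shading
                                    # is where the real flexibility lives
ri.RiSphere(1, -1, 1, 360)          # radius, zmin, zmax, thetamax
ri.RiWorldEnd()
ri.RiEnd()
```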
  
Speaking of the renderer itself: it's the shading language that makes all the difference. Programmable shading gives one the flexibility needed for feature-film VFX. Do I want to use classic depth shadows or deep shadows? Ray-traced? Use an importance-sampled point cloud with illumination from a set-captured HDR? Do I get GI through final gathering, photon mapping, or a point-based technique?
If I fire a ray, will that trigger another shader, and if so, what will that shader do?
  
It is not that hard to write a physically based rendering engine that creates stunning photoreal images in very reasonable times. Even bloody fast.
  
  But if you add programmable shading to the equation, all of a sudden the user can do anything. All the assumptions about known execution paths you have buried in your code to squeeze out that last bit of speed need to be removed for the sake of generality.
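A toy illustration of that point in Python (purely hypothetical code, not anything from Camera, PRMan, or any real renderer): once the surface shader is user-supplied code that can itself fire rays, the renderer can no longer predict what a "shade this hit point" call will cost or touch.

```python
# Purely illustrative toy code (not from any real renderer): with a fixed
# shading model the renderer knows every execution path in advance; with
# programmable shading, the user's shader can fire rays that run other
# shaders, so those assumptions have to go.

def shade(hit, scene, depth=0, max_depth=4):
    """Shade a hit point by running whatever shader the surface carries."""
    if hit is None or depth > max_depth:
        return scene["background"]

    def trace(ray):
        # The user's shader may call this, which may run *another* shader.
        return shade(scene["intersect"](ray), scene, depth + 1, max_depth)

    return hit["shader"](hit, trace)

# A user-written shader: bounces a ray and tints whatever comes back.
def red_tinted_mirror(hit, trace):
    bounced = trace({"origin": hit["point"], "dir": hit["reflect_dir"]})
    return tuple(0.9 * c for c in bounced)

# Minimal mock scene: bounced rays all escape to the background colour.
scene = {
    "background": (0.2, 0.3, 0.8),
    "intersect": lambda ray: None,
}
hit = {"shader": red_tinted_mirror, "point": (0, 0, 0), "reflect_dir": (0, 0, 1)}
print(shade(hit, scene))   # roughly (0.18, 0.27, 0.72): one bounce, then background
```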

That's when it becomes a really hard task to write such a renderer. Even more so if it should use the minimum amount of memory, allow 3D motion blur & depth of field & micropolygon displacement and, above all: stability, stability, stability. Stability is so much more important than speed, it cannot be stressed enough. If the renderer I use is too slow, I can always buy more boxes for the farm. If it crashes, I'm fuc|<ed.

Bugs in a renderer are really expensive because they cost TDs time to work around. New blades for the farm are dirt cheap in comparison.

I'm a long-time PRMan & 3Delight user and have been a beta tester of almost every RMan-compliant renderer out there over the past 14 years. From all I know, I'd say it takes around 10 years to get a renderer from something that renders to something that resembles the Swiss Army knives of rendering that the big places depend on.
And this has nothing to do with how big your team of developers is. Nine women can't have a baby in one month.

Adding a RIB binding to any renderer is a matter of weeks at most.
Making a renderer a true alternative to the ones I mentioned above takes years and has nothing to do with RIB.
  
  .mm

#29

Mauritius,

You have some good observations there. Having worked at 3 large studios myself, I agree it’s true that a well-implemented, open-architecture scripting language is the key to solid integration for an application. I only suggested RIB compatibility due to EITG’s perpetual reluctance to license its renderer out to 3rd parties. So, with RIB, I’m thinking of bringing the mountain to Mohammed instead. At least those applications that could write out RIB files could then consider Camera as an alternative renderer. Python support is also a good idea.

It’s my understanding that Camera achieves its rendering speed through a number of custom optimizations that have been its double-edged sword from the beginning. It’s fast as hell, but it tends to cut corners here and there. That could feed right into your concern about stability. If Camera were to be capable of rendering RIB files, it would probably take more than a few weeks to obtain that capability given that issue alone. But if not RIB, what other file format would you suggest to give other applications access to Camera?


#30

Brian,

Thanks for your take on Tesla.
Let’s hope it attracts enough users to support full development of its potential.

JM


#31

Well, I guess what I’m trying to say is that I don’t believe opening the renderer via a scene description language of any sort will have the desired effect because it’s not what people are looking for.
3Delight can, e.g., read & render .mi files, but it doesn't have support for mental ray shaders. If you haven't heard of anyone switching from mental ray to 3Delight, it might be for just that reason. No shader support.

RIB support w/o programmable shading is as good as no RIB support at all. Besides, I doubt EIAS' renderer would come out looking too good in a shootout with one of the high-end REYES renderers anyway. These are beasts of speed, flexibility and stability. Most people have trouble understanding what the difference really is until they have used one of them in a very demanding production environment.

Cheers,

Moritz


#32

Wow, I guess EI should just pack its bags and go home. We’re not ‘advanced’ enough to get
it anyway. Only when we join one of those cult-like massive effects companies with programmers and render farms with render wranglers ; ) will we truly see the light. No thanks.

Camera is a beautiful renderer. Period. You can have whatever proprietary elitist software
you use and enjoy. Program away : ) Sounds exciting and so very creative, all that code.
An artist’s dream…


#33

You completely misunderstood me. I wasn’t commenting on the EIAS renderer per se but solely on the chances of such a renderer (with its current feature set) penetrating the high-end market.

.mm


#34

Try to remember, Mauritius…EI “was” the high-end market not that long ago. We’re working on making it the high-end market again. There are a lot of people out there who would like to get their hands on Camera; I’m just trying to think of potential avenues to make that happen without giving away the cow with the milk.


#35

How many painters out there gather up the chemicals and lab equipment and make their own paint? I’d say none or close to it.

Next question: There is a lot of music software out there. How many non-musicians can compose a symphony?

Owning a copy of Illustrator doesn’t make one a graphic artist.

Having the tools does not give one talent.


#36

All true, but Camera does make me look like a render expert… without being one.


#37

I think Mauritius was simply addressing my desire to see EIAS re-enter the high-end film community. There it’s quite necessary to “mix one’s own paint,” and that’s usually done through a flexible scripting language like Python or MEL and a diverse rendering/shader system/language.

Right now, EIAS has been pushed aside to act in a support role in those types of venues…if it’s even there at all, and quite frankly, there is no existing reason to adopt EIAS in a facility that is dependent on vast levels of intercommunication. EIAS is perfect for a self-contained environment like a small shop or an individual, but in order to bring it out into the limelight again, it’s necessary to change some thinking.

If Camera is EIAS’ primary selling point, then how do we get it to play nicely with other applications? Right now, Animator is the only access point to Camera and Animator’s competitive edge is lost against other animation products. Thankfully Camera has kept pace.

So my thoughts were simply…how do we create new access points into Camera without giving away the technology? RIB integration/support was one idea.

Tesla will form the second access point into Camera. This will be a good thing because Tesla will be a modern standalone application that can be sold into existing pipelines and integrated. Kishore’s experience at Sony should ensure that such integration takes place. Once there, hopefully Tesla can gain a foothold in new markets that were once closed to EI. Maybe then, there will be greater willingness to bring Camera into those pipelines through Tesla.


#38

:applause:


#39

I believe Mauritius is saying that, first, the interfaces of an “EICamera for Maya” or the like would have to be the ones in vogue right now for it to be appealing; and second, the flexibility required could mean Camera becomes less advantageous: less speedy, more fragile.

Realistically, how much real, measurable demand has there been for EI to “open” Camera to other apps? Say, from the ILMers at the time, Maya users or whoever. I can sort of see this as a mid- to long-term project, parallel to Tesla’s evolution, but if the demand wasn’t solidly there already, then such an “OpenCamera” would have to be a very fine product indeed, in all regards, to attract those crowds, rather than a half-hearted, merely functional conversion.

I guess such a Camera ought to be able to go beyond EIAS in some regards, such as being able to ingest .OBJs and the like, and deal with whatever a mental ray or PRMan engine accepts from Maya or Softimage as plain-ish geometry and texture-map formats.

(Actually, I think another possibility could turn out to be even more interesting: making Animator able to interface with other renderers, but I guess that’s another war entirely.)


#40

“If Camera is EIAS’ primary selling point, then how do we get it to play nicely with other applications? Right now, Animator is the only access point to Camera and Animator’s competitive edge is lost against other animation products. Thankfully Camera has kept pace.”

I’m not sure it is anymore. Maybe 6-7 years ago it would have attracted a lot of attention to have Camera opened up, but that time has passed. Yes, it renders fine, but there is an over-abundance of very nice renderers available for one’s money these days. Some things it does very well, but in other areas it’s behind what is currently available.

But you have to ask, which studio will take a big chance on EI again after such a mixed history?

I think the best thing EI could do is to create a killer app with Tesla and incorporate Camera into it…and a modern version, not a hacked-together one. Create a buzz…show some innovation and stability as a company. Gain goodwill in the industry again - keep current technically - bring in animation, and at that time re-evaluate whether there is still any need for a ported Camera.