NVIDIA announces Mango: Maya → Gelato connection


#1

SIGGRAPH—LOS ANGELES, CA—AUGUST 9, 2004—NVIDIA Corporation (Nasdaq: NVDA) today introduced a new Maya® 6 plug-in for NVIDIA’s award-winning, final-frame renderer Gelato™, bringing the first hardware-accelerated, final-frame renderer to Maya artists. The new plug-in, called “Mango™”, uses the widely adopted Maya graphical user interface (GUI) and gives users of Alias’ market-leading Maya 6 (and 5.0.1) software another choice in their menu of renderers.

In addition, NVIDIA announced the availability of the Gelato 1.0 renderer for Microsoft® Windows® XP, as well as a Gelato 1.1 beta version. Gelato provides ultra-fast, final-frame rendering in conjunction with NVIDIA Quadro FX® graphics. And with Mango, users of the Academy Award®–winning Maya software can select Gelato as their renderer, producing images of the highest quality in about half the time.

“Mango offers Maya artists an integrated path to another great rendering option – Gelato,” said Shai Hinitz, Maya product manager at Alias. “Adding Gelato into a Maya production environment gives studios additional hardware acceleration for final renders. Users will appreciate the fact that Mango provides the power and performance of Gelato while taking advantage of Maya’s familiar ‘unified rendering’ functionality. They can install Mango, select ‘Render Using Gelato’, and start rendering right away with the Maya rendering GUI and workflow they already know.”

http://www.nvidia.com/object/IO_14923.html

Now all you need is to upgrade your render farm with a Quadro FX in every box!


#2

“… users of the Academy Award®–winning Maya software can select Gelato as their renderer, producing images of the highest quality in about half the time.”

Cool news. Although Gelato is a final-frame renderer,
it sounds like the days when we see final output while we work are approaching.

But rendering in half the time doesn’t sound like a breakthrough given the power of GPUs. Another thing that comes to mind: ‘half the time’ compared to what? I expected NVIDIA to have shown a gallery of images rendered with Gelato by now, with benchmarks, comparisons, etc…


#3

You’re absolutely correct ;). Personally, I think Gelato is a huge load of bull. But we’ll see what comes up…


#4

producing images of the highest quality in about half the time,

Now all you need is to upgrade your render farm with a Quadro FX in every box!

Exactly… and for the price of the Quadros this runs on, you could easily double or treble your render farm in physical machines…

Not only that, but if you read the fine print you see that most RenderMan shaders can be used and Maya’s basic shaders are interpreted. This suggests that your renders through software alone may not look the same as the hardware output… plus you also lose the flexibility of running other software renderers.

OK if it’s free and you have the time to play, but don’t think it’s going to set the world on fire… so far there doesn’t seem to be a proper solution to rendering speed apart from faster or more boxes…

Didn’t Intergraph try this years ago?


#5

I think you’re looking at it the wrong way; this sort of thing is good for those freelancers and one-to-two-man teams who don’t have access to a render farm but have a decent video card.


#6

Freelancers usually don’t have Quadros, since those things cost a lot more than a second render box.


#7

I think it’s an interesting future product, but as it stands, no.

Other than the points here about GPU pricing, has anyone seen any output from it that actually looks better than a Half-Life 2 screenshot?


#8

Believe me, seeing a pretty complex 2K image with full GI and occlusion render on a laptop in about a minute or less was very impressive. That was maybe three months ago, and I’m sure they’ve extended the software and the hardware a long way since then.

Ultimately you have to look at where this fits into your pipeline.
Due to the high graphics-card requirement, I wouldn’t expect many render walls to be set up to render using Gelato.

However, with the turnaround speed this can theoretically offer on your workstation, you could see it used really effectively for close-to-interactive displacement testing/building, modelling, texturing, occlusion rendering, skinning tests, and previs.

While these aren’t final renders per se, they are all crucial components of getting that final render out the door, and I’m sure any speed increase would always be welcome.


#9

No OS X support. Why? Is it just because of the Quadro? I can’t see them using DirectX for this, mainly because there is a Linux (x86 flavours, I’m assuming) port. So that shouldn’t be the reason this isn’t available on Macs. Ah well, maybe in another three or four years it will be.


#10

Personally, I think Gelato is a huge load of bull.

Unless NVIDIA goes out of business suddenly, or a meteor hits Earth, Mango and Gelato are the very thin edge of a huge wedge. GPU rendering is THE way of future rendering. That’s not just my opinion after seeing Gelato and hearing Larry Gritz talk; the president of SIGGRAPH (ex-lead programmer of the Maya renderer) told me the same thing. Don’t be like those Swiss watchmakers who met to discuss digital watches in the early ’70s and unanimously dismissed them with ‘it will never catch on’. :)


#11

You misunderstood me exactly the way I thought someone would misunderstand what I said :) (so it was kind of a trap). I know that GPU rendering will be the next great thing, but I also know something else, which makes me think Gelato is, like I said, a huge load of bull.


#12

So what is it you know that makes you think Gelato is a load of bull?


#13

That gelato is ice-cream-ish? So this product will actually melt in summer?

Well, I suppose it might actually melt if you rendered for a few months non-stop.

I too would like to know what you know, Para.


#14

If I’m correct, you’ll hear about it before Christmas (that’s my pessimistic estimate). It’s kinda hard to answer without giving it away, but… well, edit: I couldn’t even say that, but I was allowed to add edit #2.

EDIT1: Changed just a few keywords, no biggie.
EDIT2: Okay, a definition is in place: “it” will most likely not replace Gelato, at least not directly.


#15

Dude, you’re making no sense to me at all. :) Anybody care to make some educated guesses as to what he’s on about?


#16

I think he doesn’t have a clue what he’s on about.

He seems to be insinuating that NVIDIA is working on a completely different renderer which will replace Gelato.


#17

I remember an interview with somebody at mental images giving a hint about the future.
Real-time rendering via GPUs… “wait 18 months,” he said. Maybe Para was referring to that.


#18

GPU rendering is THE way of future rendering

Well, I’ll stick my head above the parapet and say that although it will play a more important role, unless the architecture changes dramatically it won’t be more than a minor addition… here’s why I think so:

a) Cards and this software are manufacturer-specific, which means only a minority will have this feature.

b) Cards can’t render through just any software; they have to be part of a specific pipeline that supports them.

c) Cards in the current architecture can’t handle huge amounts of RAM, and can’t talk back to the main processor fast enough when they need more (see the back-of-envelope numbers after this list).

d) Cards are fast at low res, but fall flat on their arse when dealing with high res.

e) At the moment, we only have Windows to play with…

f) We’re only talking about high-end cards that are much more expensive than another box, which is far more flexible.
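
For a rough sense of point (c), here is a back-of-envelope calculation. The figures are my own assumptions for illustration (a 256 MB card fed over AGP 8x at roughly 2.1 GB/s, and the 1–2 GB of textures mentioned later in this thread), not anything published by NVIDIA:

$$
\frac{2\ \text{GB of scene textures}}{2.1\ \text{GB/s (AGP 8x)}} \approx 1\ \text{second of pure bus transfer per full streaming pass}
$$

On-card memory is an order of magnitude faster than that, so a scene that fits in VRAM flies, while one that spills over the bus crawls. That cliff is the substance of points (c) and (d).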

It may develop into a useful previs tool in the early stages of rendering, but even that could be inhibited by a bumpy, niggly pipeline, and that’s if it’s compatible with your software.

If the president of SIGGRAPH is saying what he says, then fine, but talk to the engineers at Adobe and ask them why they won’t adopt OS X’s new GPU-supported abilities, and they will give similar reasons to the ones above.

We’ve been here before: Intergraph tried this, RenderDrive is trying a hardware rendering solution, but bang for buck and flexibility are blown away when you can get a 3.0 GHz box for around $500.

If you’ve got the capability then fine, but do we really expect GPUs to outpace and deliver more than a CPU in the future, as well as being cross-platform and ubiquitous across the market? Methinks not… we’ll see, eh.


#19

I think a lot of people misunderstand exactly what hardware-accelerated “final-frame” rendering (as NVIDIA calls it) is.

No one’s talking about rendering a frame with the graphics card in the same way you render a frame of a video game.

It’s about accelerating certain parts of the rendering pipeline with graphics hardware, which is, after all, a processor designed to be good at one thing: fast graphics operations. So…

a) The cards and the software (Gelato) come together. mental ray already supports Cg shaders run on the graphics card (see the sketch after this list). That integration will only become tighter as more people adopt the technology.

b) That’s not a problem if you own the hardware-accelerated software. It’s conceivable that before too long, ALL rendering software will be hardware-accelerated.

c) See above explanation

d) See above explanation

e) What Windows?

f) This is true at the moment, but as things go forward and these high-end cards develop, it’ll become much more worthwhile.
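
To make point (a) concrete: a Cg shader is a small C-like program that the graphics card runs for every fragment it draws. Below is a minimal, hypothetical sketch of a Lambert diffuse shader; the parameter names are mine for illustration and are not mental ray’s or Gelato’s actual shader interface.

```cg
// Minimal Cg fragment program: evaluates a Lambert diffuse term on the GPU.
// Illustrative sketch only; lightDir and diffuseColor are assumed names,
// not part of any real mental ray or Gelato interface.
float4 main(float3 normal : TEXCOORD0,   // per-fragment surface normal
            uniform float3 lightDir,     // direction toward the light
            uniform float4 diffuseColor) : COLOR
{
    // N.L diffuse term, clamped so back-facing light contributes nothing
    float ndotl = max(dot(normalize(normal), normalize(lightDir)), 0.0);
    return diffuseColor * ndotl;
}
```

The shading math here is the ordinary stuff a software renderer does; it’s just executed by the graphics processor instead of the CPU. That’s the sense in which the renderer is ‘hardware-accelerated’ without becoming a game engine.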


#20

OK, to clarify:

a) You’re talking high-end cards… probably 1 in 1,000 if not fewer… and you’re only talking NVIDIA… what if ATI brings out a card that does the same sort of thing, if not faster… gonna swap pipelines?

b) Hardware-supported software is fine… but there’s no Mac support, no Linux support… so although mental ray may support some of it, and Maya may too, it’s hardly cross-platform, and therefore other cross-platform developers or smaller setups aren’t likely to jump on it. Remember, the shaders it uses are interpretations; you won’t get pixel-for-pixel comparisons, so every nut and bolt of a renderer will need to be interpreted or written specifically… then along comes ATI, or even someone else, changing the equation… unless there’s an industry-standard shader language, it’s not going to get a very broad reception.

c & d) Cards are still pants at high res… nothing you have said changes that.

e) There is no OS X, IRIX, UNIX, etc. support… Windows only… so what do multi-platform developers do?

f) Do you really expect this to trickle down to the bottom end and rival processor turnover?

What we’re looking at here is a tiny hardware development. Adobe has dismissed GPU rendering; perhaps others may take it up, but essentially you’re talking about 3D users taking advantage… a tiny % of graphics users… a smaller % that uses compatible software… a smaller % that has the cards… a smaller % that needs to render on one machine and doesn’t just buy another one or two for the same price.

If this is a pending revolution, it really needs to span a wider range of approaches and capabilities before it becomes the next big thing.

If in 5–10 years I can render my V-Ray scenes with a pixel-for-pixel-perfect comparison on my graphics card, I’ll eat my hat… but as I’m using 1–2 GB of textures, a few million polys, externally referenced models, and plugins on top now, I hate to think what complexity I’ll be working with in the future, and how far cards will have to come just to stand still… they’ll be mini machines… hang on, I wonder how fast machines will be then… do you really see cards catching up and even getting ahead?