View Full Version : RENDERMAN+CUDA = RenderAnts: Interactive REYES Rendering on GPUs

08 August 2009, 07:43 PM
Well, I wasn't sure where to post this, so I'm posting it here anyway. It seems a professor named Kun Zhou has made incredible progress on a technical paper for a Reyes renderer on the GPU using CUDA. He built a system using three GTX 280s and was able to render Renderman files an order of magnitude faster (they report 20x to 45x speedups). The system doesn't use raytracing or any of Renderman's other advanced features in this initial implementation, but it's still amazing speed.
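To give a rough sense of what the paper parallelizes, here is a toy sketch of the classic Reyes bound/split/dice loop. The names and data here are purely illustrative and not taken from the RenderAnts paper; a "patch" is simplified to an axis-aligned screen-space rectangle, whereas a real renderer bounds and splits parametric surfaces.

```python
# Toy Reyes bound/split/dice sketch (illustrative only). On the GPU,
# each split/dice step would run as a data-parallel kernel over all
# live patches at once, which is where the speedup comes from.

MAX_GRID = 16  # dice patches no finer than 16x16 micropolygons

def bound(patch):
    """Screen-space extent of a patch: (width, height) in pixels."""
    (x0, y0), (x1, y1) = patch
    return x1 - x0, y1 - y0

def split(patch):
    """Halve the patch along its longer axis."""
    (x0, y0), (x1, y1) = patch
    w, h = bound(patch)
    if w >= h:
        xm = 0.5 * (x0 + x1)
        return [((x0, y0), (xm, y1)), ((xm, y0), (x1, y1))]
    ym = 0.5 * (y0 + y1)
    return [((x0, y0), (x1, ym)), ((x0, ym), (x1, y1))]

def dice(patch):
    """Turn a small-enough patch into a grid of micropolygon corners."""
    (x0, y0), (x1, y1) = patch
    w, h = bound(patch)
    nu, nv = max(1, int(w)), max(1, int(h))  # ~1 micropolygon per pixel
    return [(x0 + (x1 - x0) * i / nu, y0 + (y1 - y0) * j / nv)
            for j in range(nv + 1) for i in range(nu + 1)]

def reyes(patches):
    grids = []
    work = list(patches)
    while work:  # on the GPU: one kernel launch per round, all patches in parallel
        patch = work.pop()
        w, h = bound(patch)
        if max(w, h) > MAX_GRID:
            work.extend(split(patch))
        else:
            grids.append(dice(patch))
    return grids

grids = reyes([((0.0, 0.0), (64.0, 32.0))])
print(len(grids))  # -> 8: the 64x32 patch splits down to eight 16x16 grids
```

Each grid of micropolygons would then be shaded and sampled, which are also embarrassingly parallel stages.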

08 August 2009, 03:14 AM
Have you played Killzone 2? It looks a lot better than those demos, and there's way more going on onscreen in realtime. Faked/baked occlusion and all that stuff will always be the way to go for games, and real raytracing/radiosity/occlusion passes/caustics/etc. for film/viz. Realtime raytracing appeals to neither of these fields:

games: customers don't care whether it's realtime raytraced, they just want it to look good, and faking the basic bits lets you do stuff like 3D motion blur and post-processing in realtime, which you'd never get with realtime raytracing. Look at these screenshots ( and tell me I'd rather look at a physically accurate calculation of a few rays bouncing.
film/viz: it had better look good, and realtime raytracing doesn't

realtime raytracing is a tech demo appealing to clients whose clients hate realtime raytracing. you do the math.

08 August 2009, 02:03 PM
Huh?... You mean you're happy with multimillion-dollar renderfarms or slow rendering?
I don't care if it's realtime... I want render speed, and if it's raytraced, all the better.
C'mon, it's Renderman with interactive features, a possible competitor for mental ray once CUDA is implemented.
Oh... and with this system you don't have to have Quadros, only good ol' GeForces,
which is the main advantage for cheap GPU rendering. And once the advanced features are implemented as the paper suggested, it will be the perfect product for the small shop.

08 August 2009, 04:47 PM
Well, like I said, it appeals to you as someone who wants to save time, but not to your clients if it can't reproduce what the old tech is already doing. And as you said, Renderman isn't just a Reyes renderer, so call me when it can jam all the fast displacement and raytracing stuff onto the GPU.

08 August 2009, 07:51 PM
So... you don't believe in the future of GPU rendering?
I wonder what you think of Caustic One?

08 August 2009, 08:50 PM
Hmm... that video was quite unimpressive.
The car looks horrible, even by game standards (and games usually run at framerates in the 60s), and the rest looked no better than some average in-game sequences.

Nope, totally unimpressive.

08 August 2009, 09:30 PM
The Caustic RT demos were the same. Unconvincing.

What it doesn't do: realistic caustics (diffraction, attenuation, etc.) in RT...

I'm looking forward to SSSRT...

08 August 2009, 03:25 AM
Well, I'll give you that... it really sucks. But let's not forget that none of these products are out yet; they're all in early development, and since it's the programmers making the test scenes for demonstration purposes, not professional artists, of course it will suck!

I'd much rather take a peek at the early development stages than be completely clueless about the very thing we'll all be using in the future.

By the way, have you at least read the entire PDF? The developer even points out the algorithms he intends to use for raytracing, point-cloud occlusion and SSS integration, to name a few.

I'm actually very impressed that he's gotten that far... it makes me wonder what stage the mental ray port is at right now, but we have no news on that, do we?

Let's not be so harsh as to dismiss technology just because it's not pretty; making pretty things is our job, not theirs :beer:

08 August 2009, 07:37 AM
Programming this for the GPU is an impressive job. It can't handle all the fancy effects at the moment, but it will soon enough. Building something as large as a Renderman-compliant renderer is not something you do fast; it takes time, just like the invention of global illumination or fluids. Programming for parallel devices is not easy, and the speed gains he shows are clearly there. The vfx industry is still skeptical about the GPU, but nvidia is breaking into much bigger fields with CUDA, such as medical visualisation and engineering calculations. How long before the vfx industry starts to pay more attention to the GPU?

Even ILM is starting to go down the GPU route as you could read from the article on Harry Potter. They still have a huge impact on the direction the industry (and prman) will take.

I personally do find this an impressive demo that already shows quite a bit of potential. Games are still hacking their way through most steps, taking shortcuts and thereby gaining speed. But being able to accelerate rendering of RIB files while still being constrained by the rendering algorithms needed for accuracy, that is a great step forward. Thanks for posting it.

08 August 2009, 06:29 PM
What I am waiting for is a GPU-based renderer that has good integration, good tech support, good raytracing, good AO, good motion blur, and good DOF. These are the bare minimum for a production-ready renderer. This looks like a promising start, but it's got a long way to go. As excited as I am, I think it will take a little while for GPU-based computing to hit its stride.

08 August 2009, 08:26 PM
I actually saw a PDF of this a couple of months ago (judging by the format, I'd say it's by the same author); back then it was rendering Bumblebee from Transformers at very high speed. It was very impressive.

As far as this demo goes... I found it very impressive, and I'm not sure why people think it's not.
It pretty much shows it rendering RIB files more than twenty times faster than a CPU; how is that not impressive?

True, the examples are lacking, but a demo is never about that. It's about the numbers, the potential of the product, and how it can help companies that need to maximize rendering throughput in a film pipeline.

08 August 2009, 09:05 PM
Nah, that other paper is Lightspeed ( which is based on a different concept.

08 August 2009, 11:04 PM
Have you seen this?

Looks promising.

08 August 2009, 12:53 AM
Nah, that other paper is Lightspeed ( which is based on a different concept.

Yup, you're right :)

It's a little confusing; they seem to be using similar icons in the PDF, so I assumed it was the same author.

All in all, alternate rendering technology seems to be picking up big time as of late. Makes me kind of excited. :)

08 August 2009, 03:13 PM
I find all this alternative rendering tech really cool, though I'm a bit skeptical about how these renderers will behave in a production environment. And I agree, being that much faster is really impressive. If they can do one thing that much faster, I wonder how much of it will carry over to the other features when they are implemented.

08 August 2009, 07:19 AM
Holy mother of a 170-legged guinea pig, are we really that far along? A Reyes pipeline on a GPU? I wanna try it!

Yes, the test scenes are not that beautiful, but that's due to the technical background of the paper. It's possible to make waaay better pics with the things already implemented.

I think an early application for such a renderer is to grind through your hundreds of shadowmaps. No need for advanced features on this simple task.

Hmmm, displacement isn't mentioned in the paper.

08 August 2009, 03:53 PM
Yes, if they made Renderman compute all of its shadows with CUDA, that would be sweet. You could do awesome deep shadows quickly. It seems like it would be a big gain for a small rewrite.
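The shadow-map idea maps well to the GPU because both passes are independent per texel. Here's a minimal sketch of the plain depth-map case (illustrative names and a tiny resolution, not prman's actual implementation; a deep shadow map would store a visibility function per texel rather than a single depth):

```python
# Minimal shadow-map sketch (illustrative only). Pass 1 stores, per
# shadow-map texel, the depth of the nearest occluder as seen from the
# light; pass 2 compares each shaded point's light-space depth against
# that stored value. Each texel/point is independent, hence GPU-friendly.

RES = 4  # tiny 4x4 shadow map for the example

def render_depth_map(occluders):
    """occluders: list of (x, y, depth) point samples in light space, x/y in [0,1)."""
    zmap = [[float("inf")] * RES for _ in range(RES)]
    for x, y, z in occluders:
        i, j = int(x * RES), int(y * RES)
        if 0 <= i < RES and 0 <= j < RES:
            zmap[j][i] = min(zmap[j][i], z)  # keep the nearest depth
    return zmap

def in_shadow(zmap, x, y, z, bias=1e-3):
    """True if the point at light-space (x, y, z) lies behind the stored depth."""
    i, j = int(x * RES), int(y * RES)
    return z > zmap[j][i] + bias  # bias avoids shadow acne on the occluder itself

zmap = render_depth_map([(0.6, 0.6, 2.0)])
print(in_shadow(zmap, 0.6, 0.6, 5.0))  # True: the point is behind the occluder
print(in_shadow(zmap, 0.1, 0.1, 5.0))  # False: nothing is in front of it
```

Since every call to `in_shadow` touches only one texel, a GPU version is just one thread per shading point.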

08 August 2009, 06:19 PM
I think an early application for such a renderer is to grind through your hundreds of shadowmaps. No need for advanced features on this simple task.

:bounce: Great idea!! That takes up so much time.

CGTalk Moderation
08 August 2009, 06:19 PM
This thread has been automatically closed as it remained inactive for 12 months. If you wish to continue the discussion, please create a new thread in the appropriate forum.