Testing OpenCL path tracing with SmallptGPU & SmallLuxGPU (to become LuxRenderGPU)


#1

Not strictly a Blender topic, although I did use Blender 2.5 as a tool in the making of this little test video. Mostly I just had to post this on these forums because of my username and avatar! :wink:

OpenCL path tracing renderer testing, animation (and Bullet Physics)

All kidding aside, I’m very excited about the work David has been doing with OpenCL-accelerated rendering. It’s definitely good news for all CG and Blender users.

More information on LuxRender and OpenCL.


#2

Really neat. How much RAM do you have? I guess what I’m asking is whether it ran better with more RAM, or does it still consume mountains of it?


#3

In this case RAM isn’t a factor (max memory usage was low); there are no textures and not much data, it’s all math. Performance here is entirely a function of GPU computing power, not RAM.


#4

Wow nice work. Can’t believe how quickly these guys are picking up OpenCL. Very promising results.


#5

Thanks. Yes, I agree, “Dade” is making amazing progress at an impressive speed. And it’s very interesting in light of the fact that the GPU-accelerated commercial renderers out there (Refractive’s Octane and Random Control’s Arion) are limiting themselves to CUDA.

BTW, I attempted some animations with the latest SmallLuxGPU beta (created with Blender 2.5):

OpenCL path tracing SmallLuxGPU simple animation test (YouTube)


#6

And it keeps getting better. I rendered these with the very latest version of SmallLuxGPU… (models from Stanford)

And another…


#7

And the progress keeps on coming, taking it to a new level:

First sneak peek at LuxrenderGPU! (Luxrender public forums)


#8

Looking pretty awesome.
I currently only run a GeForce 8600; what sort of card (cheap, I have zero budget :p) would be sufficient to give me GPU/OpenCL? I’m clueless about this stuff, alas. :shrug:


#9

Apparently the GeForce 8600M GT does support it. Not sure if this relates to you though. The NVIDIA site should have detailed information.


#10

Oops, my bad, :blush: it isn’t an 8600, it’s an 8400 GS - I knew it was an 8… summat :banghead:


#11

I’m pretty sure the 8400 has CUDA support as well, so OpenCL support wouldn’t be a far-off guess, but getting ‘realtime’ results out of it would probably be a fantasy. :smiley:

I have an on-board 8200 mGPU, which is just an integrated form of the same chip as the 8400, but the CUDA boost it gave in Photoshop CS4 was rather horrible. I’m now running a 9800 GT, and that thing flies in comparison.

If upgrading for GPU computing, a 9800 GT would be the minimum I’d suggest.
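
If you want to double-check what your current card actually exposes to OpenCL before spending anything, something like the pyopencl Python bindings can list the devices your driver reports (just one option, assuming pyopencl and a working OpenCL driver are installed; the NVIDIA and AMD SDKs ship similar device-query samples). A rough sketch:

```python
# Rough sketch: list the OpenCL platforms/devices the driver reports.
# Assumes the pyopencl bindings and an OpenCL driver/ICD are installed.
import pyopencl as cl

for platform in cl.get_platforms():
    print("Platform:", platform.name, "-", platform.version)
    for device in platform.get_devices():
        print("  Device:", device.name)
        print("    Compute units:", device.max_compute_units)
        print("    Global memory:", device.global_mem_size // (1024 * 1024), "MB")
```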


#12

Yeah, not exactly expecting that :wink: I know it’s an entry-level jobbie, but my PC won’t handle a power-hungry card. Just been on the NVIDIA site; the latest drivers (196.21) have OpenCL and CUDA support as well as OpenGL 3.2, so I’m downloading them and will install today (once I’ve made a restore point, natch! :stuck_out_tongue: )


#13

Perhaps for affordable OpenCL you might want to consider the low-to-mid-end cards boasting the very latest generation of GPUs (be wary: new model names do not necessarily mean they contain the latest chips). It seems to me you don’t want older fast gaming chips that might have some OpenCL feature limitations.

Ideally, you might want to wait until hardware sites start including OpenCL benchmarking in their graphics card testing, just to be sure which card is the best value.


#14

Maybe a Luxrender OpenCL benchmark would help attract more users; certainly more people know about Cinema 4D because of Cinebench :wink:


#15

Yeah, either that, or a nice benchmark database with user-submitted data could be handy, kind of like the Cinebench database (http://www.cbscores.com/).
First we’d need a nice benchmark scene though :slight_smile:

As for cards, the AMD 5770s look like very good performance for the price; I built a system for a friend recently with one, and it’s fairly impressive.
Personally, I’m running an NVIDIA GTX 260 at the moment, so I don’t have much experience on the AMD side of things, but the 5770 is probably the card I’d go with if I were to upgrade on a budget right now.


#16

Actually, Dade has already included a benchmark mode in his SmallLuxGPU test program!

Dade is still updating SmallLuxGPU and has added some very nice features to the latest v1.3. He is also working on Luxrays and LuxrenderGPU at the same time (not yet ready for testing).

BTW, if anyone has a decent GPU with OpenCL support and wants to test SmallLuxGPU v1.3, I’ve written a Blender 2.5 render plug-in; versions are available for both Blender 2.5 Alpha 0 and Alpha 1. If you would like to test SmallLuxGPU and provide feedback, you can find it at the LuxRender GPU Acceleration private forums. Here’s a screenshot of the plug-in in action and a quick render made with SmallLuxGPU v1.3:

Credits: Michelangelo’s David scanned by Stanford for “The Digital Michelangelo Project”; the Sponza Atrium was modelled (I think) by Marko Dabrovic.
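
For anyone wondering how a plug-in like this hooks into Blender 2.5: the rough shape is a RenderEngine subclass that exports the scene and spawns the SmallLuxGPU binary. The following is only a hypothetical minimal sketch, not the actual plug-in code; the 2.5 Python API is still changing between alphas, and export_scene(), the "slg" binary name and "render.scn" are made-up placeholders:

```python
# Hypothetical minimal sketch of a render engine that hands the scene off to
# SmallLuxGPU -- not the actual plug-in code.  export_scene(), the "slg"
# binary name and "render.scn" are placeholders; registration details also
# differ between the 2.5 alphas.
import subprocess
import bpy

class SmallLuxGPUEngine(bpy.types.RenderEngine):
    bl_idname = "SMALLLUXGPU_RENDER"
    bl_label = "SmallLuxGPU (sketch)"

    def render(self, scene):
        # Write the Blender scene out in SmallLuxGPU's text scene format;
        # this exporter is the part worth borrowing from the plug-in.
        export_scene(scene, "render.scn")
        # Spawn SmallLuxGPU and let it render (and navigate) interactively;
        # the image stays on SmallLuxGPU's side, it is not fed back to Blender.
        subprocess.call(["slg", "render.scn"])

def register():
    bpy.utils.register_class(SmallLuxGPUEngine)
```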


#17

Damn, that captcha implementation on the luxrender forum registration page got me good, haha.
I hate those things so very, very much.

Hopefully the lockout time isn’t too long :slight_smile:


#18

With that, can we make movies without any problem?


#19

Unfortunately, no, not at this time. Right now the plugin spawns SmallLuxGPU and allows you to navigate around while rendering in real time (see David’s original v1.1 video); the rendered image does not return to Blender for post-processing/animation. However, if you are a programmer, you could extract the exporter from the plugin and write a batch render / export / convert-image loop as a Python script to make an animation with SmallLuxGPU; a rough sketch follows.
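
A minimal sketch of what I mean, assuming the exporter has been pulled out as an export_scene() helper and that SmallLuxGPU is configured to stop after a fixed number of passes (the "slg" binary name and the image.ppm output file are assumptions you’d adapt):

```python
# Hypothetical batch-animation loop around SmallLuxGPU: export each frame
# from Blender, render it externally, then collect the output image.
# export_scene(), "slg" and "image.ppm" are placeholders, not real plug-in names.
import shutil
import subprocess
import bpy

scene = bpy.context.scene
for frame in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(frame)                     # move Blender to this frame
    export_scene(scene, "frame.scn")           # exporter lifted from the plug-in
    subprocess.call(["slg", "frame.scn"])      # SmallLuxGPU renders the frame
    shutil.copy("image.ppm", "frame_%04d.ppm" % frame)   # keep the result

# The numbered .ppm files can then be converted and assembled into a video,
# e.g. in the Blender sequencer or with an external encoder.
```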


#20

Another SmallLuxGPU v1.3 render, 1280 x 2400 pixels (zoom in for detail)
model scan credit: Visual Computing Lab (Istituto di Scienza e Tecnologie dell’Informazione)



Edit: and another one testing bounced light / color bleeding (single light source).
model scan credit: Jotero