Pixar point clouds in maya


living_for_cg
06-17-2010, 08:09 PM
Hi all render artists.
I was reading this article http://features.cgsociety.org/story_custom.php?story_id=5615&page=1 about Pixar's new solution for point-based rendering.
I wonder if there is any way in Maya/mental ray to get a similar result.

Cuni
06-18-2010, 04:31 PM
Do you mean being able to see photons and Final Gather points?
There's a utility that loads those maps into your workspace...

beaker
06-27-2010, 06:04 PM
dneg has a nice open-source .ptc viewer plugin for Maya on GitHub:
http://github.com/dneg
http://dneg.github.com/dnPtcViewerNode/

ctrl.studio
06-28-2010, 01:58 PM
I think he's just looking for point-cloud-based rendering in Maya/mr; otherwise the best solution would simply be to have PRMan for Maya ;-) Btw, point clouds have nothing to do with GI or FG pre-calc points.

However, in mray there's a crucial issue that keeps point clouds from being a natural extension of the mray toolset (at least for the Pixar approach above): in PRMan you get micropolygons almost for free, while in mray you don't. I still have to see what kind of results we can obtain with 'ordinary' meshes. Following is a pcloud surfel implementation (points with a radius derived from the underlying triangle area) in mray.

http://img3.imageshack.us/img3/4391/mayamrpcloud.th.png (http://img3.imageshack.us/i/mayamrpcloud.png/)
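A rough sketch of the surfel idea in plain C++ (illustrative only, not the actual shader code; all names are made up): each point stands in for a small oriented disc whose radius is chosen so the disc has the same area as the triangle it replaces.

    #include <cmath>

    struct Vec3 { float x, y, z; };

    static Vec3  sub(const Vec3& a, const Vec3& b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static Vec3  cross(const Vec3& a, const Vec3& b) {
        return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
    }
    static float length(const Vec3& v) { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }

    // A surfel: an oriented disc standing in for a small piece of surface.
    struct Surfel {
        Vec3  position;   // triangle centroid
        Vec3  normal;     // unit geometric normal
        float area;       // area of the underlying triangle
        float radius;     // disc radius chosen so that pi * radius^2 == area
    };

    Surfel surfelFromTriangle(const Vec3& p0, const Vec3& p1, const Vec3& p2) {
        Surfel s;
        s.position = {(p0.x + p1.x + p2.x) / 3.0f,
                      (p0.y + p1.y + p2.y) / 3.0f,
                      (p0.z + p1.z + p2.z) / 3.0f};
        Vec3  n   = cross(sub(p1, p0), sub(p2, p0));
        float len = length(n);                        // assumes a non-degenerate triangle
        s.area    = 0.5f * len;                       // triangle area = |e1 x e2| / 2
        s.normal  = {n.x / len, n.y / len, n.z / len};
        s.radius  = std::sqrt(s.area / 3.14159265f);  // disc with the same area
        return s;
    }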


ciao, max

slipknot66
06-28-2010, 05:21 PM
That's interesting, Max. Is it possible to use that point cloud information to generate things like AO or color bleeding effects?

MaxTarpini
07-01-2010, 06:36 PM
Well yeah, that's the goal. For AO we already have everything we need for a brute-force, element-to-element approach. For color bleeding instead we need to bake irradiance to the points to have a chance of getting some bleeding effect. For both, however, it would be preferable to have some kind of acceleration structure for faster lookups. That's what I'm working on right now.
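A hedged, brute-force sketch of the element-to-element idea in plain C++ (illustrative only, not the actual shader): each surfel accumulates occlusion from every other surfel through a commonly used disc-to-point form-factor approximation; exact constants vary between write-ups, and a real implementation would use the acceleration structure mentioned above instead of the O(N^2) loop.

    #include <algorithm>
    #include <cmath>
    #include <vector>

    struct Vec3   { float x, y, z; };
    struct Surfel { Vec3 p; Vec3 n; float area; };

    static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    // Approximate form factor of emitter disc 'e' as seen from receiver 'r':
    //   F ~= A_e * cos(theta_e) * cos(theta_r) / (pi * d^2 + A_e)
    static float discFormFactor(const Surfel& r, const Surfel& e) {
        Vec3  v  = {e.p.x - r.p.x, e.p.y - r.p.y, e.p.z - r.p.z};
        float d2 = dot(v, v);
        if (d2 < 1e-8f) return 0.0f;
        float inv  = 1.0f / std::sqrt(d2);
        Vec3  dir  = {v.x * inv, v.y * inv, v.z * inv};
        float cosR = std::max(0.0f,  dot(r.n, dir));   // receiver faces the emitter
        float cosE = std::max(0.0f, -dot(e.n, dir));   // emitter faces the receiver
        return e.area * cosE * cosR / (3.14159265f * d2 + e.area);
    }

    // One brute-force pass: every surfel occludes every other surfel, O(N^2).
    std::vector<float> occlusionPass(const std::vector<Surfel>& cloud) {
        std::vector<float> occ(cloud.size(), 0.0f);
        for (size_t i = 0; i < cloud.size(); ++i) {
            float sum = 0.0f;
            for (size_t j = 0; j < cloud.size(); ++j)
                if (j != i) sum += discFormFactor(cloud[i], cloud[j]);
            occ[i] = std::min(1.0f, sum);              // clamp accumulated occlusion
        }
        return occ;
    }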

ciao, max

MaxTarpini
07-01-2010, 11:07 PM
Here is a mr point cloud with irradiance baked in, i.e. diffuse illumination + shadows.

http://img64.imageshack.us/img64/9568/mayamrpcloudirrad.png

From here it would be easy to go on; however, using that raw pcloud just for testing, as a shading map, you can see what the problem is: artifacts coming from poor tessellation (around half a million tris, i.e. points).

http://img30.imageshack.us/img30/3151/mayamrpcloudirradshadin.png

Here instead it is rendered with a pcloud of up to 3 million points, interpolating a bit over the point lookup. Et voilà. :-)
(The black splotches are bad geometry: no overlapping tris!! That's a requirement for any pcloud or lightmap render approach anyway.)

http://img38.imageshack.us/img38/3151/mayamrpcloudirradshadin.png

I think I'm going to try an SSS implementation. A fun one is simply to use a rapid falloff on the point map, as in the Lewis & Borshukov approach, but applied directly to the point cloud lookup.
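The "interpolating a bit over the point lookup" step amounts to something like the following sketch (illustrative C++ only; a real lookup would query an octree rather than scan the whole cloud): blend the irradiance of all points within a lookup radius with a smooth falloff instead of taking the single nearest point.

    #include <vector>

    struct Vec3       { float x, y, z; };
    struct Color      { float r, g, b; };
    struct IrradPoint { Vec3 p; Color irrad; };

    static float dist2(const Vec3& a, const Vec3& b) {
        float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return dx*dx + dy*dy + dz*dz;
    }

    Color lookupIrradiance(const std::vector<IrradPoint>& cloud,
                           const Vec3& shadePoint, float radius) {
        Color acc  = {0.0f, 0.0f, 0.0f};
        float wsum = 0.0f, r2 = radius * radius;
        for (const IrradPoint& pt : cloud) {
            float d2 = dist2(pt.p, shadePoint);
            if (d2 > r2) continue;
            float w = 1.0f - d2 / r2;          // smooth falloff: 1 at the point, 0 at the radius
            w *= w;
            acc.r += w * pt.irrad.r;  acc.g += w * pt.irrad.g;  acc.b += w * pt.irrad.b;
            wsum  += w;
        }
        if (wsum > 0.0f) { acc.r /= wsum; acc.g /= wsum; acc.b /= wsum; }
        return acc;
    }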

ciao, max

Sorath
07-03-2010, 07:12 AM
I'm keeping an eye on this thread!
Keep us updated, Max.

MaxTarpini
07-05-2010, 01:45 AM
Hi, I had some time to refine displacement and surface approximation support, as most of the time micropolys are needed to get fine detail from pclouds. So I changed subject ;) and applied a fine displacement noise to a body model just to test micropoly support, around 25 million micropolygons. As the overall base model is not offset by the displacement, I just render the pcloud with displacement data over the base mesh (750k polys).

http://img291.imageshack.us/img291/2441/mayamrpcloudirraddispla.png

It took a couple of minutes to bake the displaced mesh, while the render of the pcloud alone (the image above, where the pcloud is used as the shading base) takes 20 secs with minimal RAM use. The pcloud itself is around 1 GB.

ciao, max

MaxTarpini
07-07-2010, 02:21 PM
OK, here is the mr point cloud subsurface scattering implementation à la Pixar.

http://img682.imageshack.us/img682/1129/handssdiffusionpcloudma.png

http://img375.imageshack.us/img375/5374/handssdiffusion000.png

http://img180.imageshack.us/img180/3505/handssdiffusion001.png

http://img171.imageshack.us/img171/566/handssdiffusion002.png

http://img19.imageshack.us/img19/5311/handssdiffusion003.png

Here I use a coarse point cloud just for testing (around 100k points): 25 secs for pcloud generation (irradiance and SSS), 16 secs for the final render (1300 x 1000). Note that this is 3D-baked SSS (on the pcloud), and as long as object positions and lights do not change, one can re-render just the baked SSS with no extra cost beyond the map lookup (of course the detail is related to the pcloud density).

ciao, max

BigRoyNL
07-07-2010, 02:23 PM
Would love to see a step-by-step tutorial on how to achieve this with mental ray & Maya. So if anyone knows how, I would appreciate a thorough explanation in the end. :)

And nice results. So, now another question:
Is it worth it? (Usable for animation?)

dagon1978
07-07-2010, 02:30 PM
Max, that's peachy! :)
Any chance of a public shader for this?

MaxTarpini
07-07-2010, 11:56 PM
Well, 'pclouding' with this approach needs a whole set of tools, a workflow, to be consistent. It's not really a one-button solution, and I also have the feeling you need to be introduced to it quite a bit. We'll see; maybe some of that will emerge while developing the toolset. Of course I'll consider some beta testing at some point, and for that you're already on the list!

ciao, max

ndeboar
07-08-2010, 12:16 AM
FYI: 3Delight has full point-based occlusion, indirect diffuse and subsurface scattering, and the first license is free.

Also, I'm pretty sure RenderMan for Maya supports it; I know RenderMan Studio does.

MaxTarpini
07-08-2010, 01:10 AM
FYI: at this point it's trivial to go on and support AO and bleeding for mr pclouds. Plus I already have support for PRMan .ptc files, so one may create a pcloud with 3Delight and render it with mental ray. ;)

cavemen
07-08-2010, 05:31 AM
@Max

Count me in too. This topic is definitely interesting.
I've managed to render out point cloud info in Maya and use it in Nuke, but I'd love to know how to use it in Maya PRMan-style.

dagon1978
07-08-2010, 01:04 PM
Well, 'pclouding' with this approach needs a whole set of tools, a workflow, to be consistent. It's not really a one-button solution, and I also have the feeling you need to be introduced to it quite a bit. We'll see; maybe some of that will emerge while developing the toolset. Of course I'll consider some beta testing at some point, and for that you're already on the list!

ciao, max

Thank you, Max :bowdown:
Keep up the good work.


FYI: 3Delight has full point-based occlusion, indirect diffuse and subsurface scattering, and the first license is free.

Also, I'm pretty sure RenderMan for Maya supports it; I know RenderMan Studio does.

Thanks for the FYI :) but actually I don't use 3Delight or RfM, and I don't want to :)

MaxTarpini
07-08-2010, 10:14 PM
I had some time today to strengthen the map management to the point of having it stable for a workflow where, unlike in PRMan, we can build both the irradiance and SSS maps from the same shot and then go render them directly.

Following is a sort of grape mesh where a 750k-point pcloud map was first baked and then re-used on the fly to generate the SSS map. If we wanted to optimize the workflow, we would have first rendered the grape mesh with high tessellation to get a dense pcloud, then switched back to the original poly mesh and rendered the pcloud on top of that.

This workflow should also be effective for testing animations, where one would probably just skip writing pclouds to disk and reuse them on the fly frame by frame.

http://img29.imageshack.us/img29/1577/grape000.png

ciao, max

ristopuukko
07-09-2010, 07:57 AM
Hi Max

This looks _really_ interesting.

I've been working with a Maya/RAT/RMS/PRMan pipeline for years, but for the next
show we're stuck with mental ray, so I'd be very interested in beta testing this stuff.

Keep up the good work.

/risto

MaxTarpini
07-09-2010, 02:42 PM
thanks, appreciated !

ciao, max

MaxTarpini
07-09-2010, 07:07 PM
Hi, this is the last step for the SSS prototype material, and it's pretty important for mental ray.
As said above, we don't have micropolys for free as in PRMan, so sometimes we may end up
not having pclouds dense enough to avoid tessellation artifacts.

There are now two modes for the mr pcloud SSS implementation:

The first is the one we've seen so far: it does everything on pclouds, and we end up with 3D-baked SSS, but the detail will depend on the pcloud density.

The second mode is instead to compute irradiance on the pclouds and from there build an acceleration structure to support SSS in the render loop. That's pretty much what misss_fast does, for example, except that we use pclouds instead of lightmaps. As soon as we evaluate SSS in the render loop we lose the 3D SSS baking, but we gain much more detail. This may also turn out to be very good for fast testing.


Below: 3D-baked SSS (SSS is computed in advance, even offline, into an SSS pcloud and then looked up at render time).
Coarse tessellation = coarse SSS.
http://img340.imageshack.us/img340/1689/handssdiffusiontessella.png


Here instead we use the same irradiance pcloud and, after putting it into an octree, we leave it to the render loop to evaluate SSS per sample, based on the previously stored irradiance points. That's still pretty fast: the image above took 24 secs for the whole thing, here we have 33 secs (not counting the irradiance pcloud generation, which comes from the previous render).

http://img130.imageshack.us/img130/1689/handssdiffusiontessella.png
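The second mode boils down to a gather like this sketch (illustrative C++ only; a simple exp(-d / meanFreePath) falloff stands in for the full diffusion profile, and a brute-force loop stands in for the octree query):

    #include <cmath>
    #include <vector>

    struct Vec3       { float x, y, z; };
    struct Color      { float r, g, b; };
    struct IrradPoint { Vec3 p; Color irrad; float area; };

    static float dist(const Vec3& a, const Vec3& b) {
        float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return std::sqrt(dx*dx + dy*dy + dz*dz);
    }

    // Evaluate scattering at a render-time sample from the baked irradiance points.
    Color scatterAtSample(const std::vector<IrradPoint>& cloud, const Vec3& shadePoint,
                          float meanFreePath, float maxDist) {
        Color acc  = {0.0f, 0.0f, 0.0f};
        float wsum = 0.0f;
        for (const IrradPoint& pt : cloud) {
            float d = dist(pt.p, shadePoint);
            if (d > maxDist) continue;                        // an octree would cull this cheaply
            float w = pt.area * std::exp(-d / meanFreePath);  // crude diffusion falloff
            acc.r += w * pt.irrad.r;  acc.g += w * pt.irrad.g;  acc.b += w * pt.irrad.b;
            wsum  += w;
        }
        if (wsum > 0.0f) { acc.r /= wsum; acc.g /= wsum; acc.b /= wsum; }
        return acc;
    }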


Now I need to make the whole thing stable for multiple objects and general production scenes.
Once that's done, with some luck, I should be able to go for a first round of beta testing.


ciao, max

ristopuukko
07-09-2010, 10:54 PM
Looking good, Max.

I'm on my heels for this...

/risto

MaxTarpini
07-12-2010, 12:08 AM
The SSS material is shaping up. I added support for FG, but I plan to add pcloud color bleeding support ASAP;
i.e. rather than just supporting the existing workflow, the goal here is to offer an alternative one. :)

http://img4.imageshack.us/img4/1842/happybuddha.png

The Buddha model is pretty badly triangulated, so I use a fine surface approximation to get more points for our cloud (1 million).
Baking diffuse + FG takes around 20 secs. While building the material I used SSS computed on the fly instead of baking it
into another pcloud. FG is frozen for the final render (1 min 18 secs).


http://img821.imageshack.us/img821/2931/happybuddha02.png

Baking the SSS takes around 2 mins, then the render with the SSS pcloud takes 25 secs.

ciao, max

MaxTarpini
07-14-2010, 06:55 PM
First attempt at AO from pclouds! Raw AO method for now.

http://img4.imageshack.us/img4/4323/aobuddha000.png

ciao, max

ristopuukko
07-14-2010, 07:51 PM
Looks like you still need to do some refinements,
but you're clearly on the right path.

I'm dying to see your toolset for
creating these effects.

/risto

MaxTarpini
07-15-2010, 01:23 AM
Yep, more than a refinement of the algorithm itself, we just need to run it n times to get rid of overly dark areas.
These come from faces which cast shadows but are shadowed themselves.

Here is a two-pass AO on the Max Planck model; the lookup also interpolates up to 12 points.

http://img192.imageshack.us/img192/8172/aomaxplank005.png

2 mins to bake the geometry and the AO map, 8 secs to render the AO pcloud.
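The multi-pass correction amounts to something like this sketch (illustrative C++ only): in the second pass each emitter's contribution is scaled by how unoccluded it was found to be in the previous pass, which lifts the overly dark areas. The form factor is the same disc-to-point approximation as in the earlier AO sketch.

    #include <algorithm>
    #include <cmath>
    #include <vector>

    struct Vec3   { float x, y, z; };
    struct Surfel { Vec3 p; Vec3 n; float area; };

    static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    static float formFactor(const Surfel& r, const Surfel& e) {
        Vec3  v  = {e.p.x - r.p.x, e.p.y - r.p.y, e.p.z - r.p.z};
        float d2 = dot(v, v);
        if (d2 < 1e-8f) return 0.0f;
        float inv  = 1.0f / std::sqrt(d2);
        Vec3  dir  = {v.x * inv, v.y * inv, v.z * inv};
        float cosR = std::max(0.0f,  dot(r.n, dir));
        float cosE = std::max(0.0f, -dot(e.n, dir));
        return e.area * cosE * cosR / (3.14159265f * d2 + e.area);
    }

    // One extra pass: emitters that were themselves occluded in the previous pass
    // contribute proportionally less.
    std::vector<float> occlusionRepass(const std::vector<Surfel>& cloud,
                                       const std::vector<float>& prevOcclusion) {
        std::vector<float> occ(cloud.size(), 0.0f);
        for (size_t i = 0; i < cloud.size(); ++i) {
            float sum = 0.0f;
            for (size_t j = 0; j < cloud.size(); ++j)
                if (j != i) sum += (1.0f - prevOcclusion[j]) * formFactor(cloud[i], cloud[j]);
            occ[i] = std::min(1.0f, sum);
        }
        return occ;
    }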

ciao, max

cavemen
07-16-2010, 11:54 AM
@Max

Great going, man... Do let us know when you release the toolset for beta testing; definitely count me in :cool:
One small question though: have you generated the point cloud info with other software like RenderMan or 3Delight and then used it with mental ray, or is it all generated natively in Maya and mental ray?

MaxTarpini
07-16-2010, 06:06 PM
Nope, that's all mental ray, not even Maya. :)
Edit: I mean I'm working in Maya/mr, but I'm not using anything related to the Maya API.


Here I refined the AO stuff. I need to speed it up, however, as it's still single-threaded.

http://img155.imageshack.us/img155/9298/aobug000.png

ciao, max

ytsejam1976
07-16-2010, 06:23 PM
Hi max. Simply great. :bowdown:

cavemen
07-17-2010, 06:01 AM
@ Max

So you are generating this point cloud information completely in mental ray? That's definitely cool.

Btw, the AO stuff is impressive...

Jozvex
07-19-2010, 01:04 PM
Great work Max!

Count me in as someone eager to test!

:thumbsup:

Gabba
07-20-2010, 01:36 PM
Since this is a mental ray "feature", is there a way to use it in Softimage??? :rolleyes:
I would be more than excited to test it in Softimage :applause:

MaxTarpini
07-22-2010, 11:09 PM
Hey Joz, I see you also have 3Delight in your toolset; you're the best kind of candidate for the upcoming testing!

A skin attempt :) Of course, for high-poly-count models the geometry tessellation is usually already
fine enough for good, detailed pclouds. In fact, I believe this approach will be just great for any
model coming out of ZBrush or Mudbox!

http://img14.imageshack.us/img14/7253/angellightback060030013.png

ciao, max

ytsejam1976
07-23-2010, 06:42 AM
Max, please sign me up for testing?

chafouin
07-23-2010, 09:30 AM
Hey, could I also participate in the beta testing, please?
Thank you :)

BTW, how long did it take to render the last image?

THExDUKE
07-23-2010, 11:04 AM
Hmm, I don't get the purpose of this right now, especially for the AO; why would I want such a workflow for AO?
Based on the images of the SSS on the hand I could imagine that this kind of setup might be relevant for gaining more control in later compositing. Anyway, it looks pretty interesting.

Hezza
07-23-2010, 01:57 PM
It means you can reuse the point cloud data: instead of calculating raytraced AO or global illumination for each frame, which is CPU-heavy and prone to flickering, you can just calculate it once.

MaxTarpini
07-23-2010, 03:25 PM
Yep, and there's a lot of literature around this approach where you can also see its pros and cons. The best workflow in mr would be to get AO from pclouds for distant points and then refine it with raytracing for closer stuff; I believe that's the same approach PRMan uses for color bleeding.

Here is a temp video showcasing the simple workflow to get AO from pclouds:
http://www.ctrlstudio.net/__shaders/PClouds_Showcase/AO/pcloud_AO_000.html

Sorry for the bad quality; that's what I got from a free screen recorder.

This is the actual quality of what you see in the video:
http://www.ctrlstudio.net/__shaders/PClouds_Showcase/AO/CaptureAO.png


ciao, max

THExDUKE
07-23-2010, 03:37 PM
OK, so as far as I get this, it is more of use for animations than for stills, right? Is it comparable to the FG rebuild/freeze approach?
We did some tests with that for an FG map: render e.g. every 10th or 20th frame and let it build up an FG map so you can use the freeze option.
But this sounds like you don't even need a per-frame rendering, just a single one? But what if I change camera angles and so on? Does the stored information cover the whole scene, or just what's visible from the camera angle? I ask because if you, for instance, created an FG map for a shot and then changed the shot, the backside of your objects would differ in the render output compared to a map fully rebuilt on each frame.

Edit: you need to change the first link's ending from .html to .swf ;)

MaxTarpini
07-23-2010, 05:06 PM
OK, a couple of things. If you're still unsure about what you're doing with classic AO methods, stick with them and do extensive testing there.

Then eventually read the intro regarding point-cloud rendering in the article from the first post to see if it fits your knowledge and situation; I copy-and-paste some statements here:

The use of point-based rendering techniques is reserved for complicated scenes, scenes that have lots of geometry and displacement shaders. Keep in mind point-based rendering is not a silver bullet, but is rather another tool for the technical director's bag of tricks.
It means at least two things: first, if your scene fits into the usual raytracing workflow, then that's all fine: stay with that. Second, pcloud rendering needs a TD-level understanding of what you're doing, which means that you have already dealt with any technical problems in your renderer and are looking for production approaches that may add complexity to the whole, just to resolve an overall scene complexity that would not fit raytracing approaches.

ciao, max

Hezza
07-23-2010, 07:50 PM
http://graphics.pixar.com/library/

some good reading

yogeshsherman
07-24-2010, 08:02 AM
Thanks for the info, but unfortunately the link is not working. If possible, correct it and post it again so that people like me can understand it.

Yep, and there's a lot of literature around this approach where you can also see its pros and cons. The best workflow in mr would be to get AO from pclouds for distant points and then refine it with raytracing for closer stuff; I believe that's the same approach PRMan uses for color bleeding.

Here is a temp video showcasing the simple workflow to get AO from pclouds:
http://www.ctrlstudio.net/__shaders/PClouds_Showcase/AO/pcloud_AO_000.html

Sorry for the bad quality; that's what I got from a free screen recorder.

This is the actual quality of what you see in the video:
http://www.ctrlstudio.net/__shaders/PClouds_Showcase/AO/CaptureAO.png


ciao, max

ytsejam1976
07-24-2010, 08:38 AM
http://www.ctrlstudio.net/__shaders/PClouds_Showcase/AO/pcloud_AO_000.swf

bigbossfr
07-24-2010, 04:37 PM
This thread is very interesting.

:wavey:

ruchitinfushion
07-25-2010, 02:58 AM
From here:
http://dneg.github.com/dnPtcViewerNode/
I downloaded the *.zip file and it contains source code, so could somebody compile it for Maya 2009 & 2011?

cavemen
07-26-2010, 02:26 PM
WOW!
Max, that was definitely a lovely teaser... :applause:
I am new to this... One small question, Max: what happens when the objects animate?
E.g. the sphere moves away from the torus; do the point clouds update along with the AO,
or does it work only on static objects?

MaxTarpini
07-29-2010, 04:04 PM
God... I've finally resolved a bug with area lights in pcloud baking: I wasn't initializing the QMC sampler!! :)

http://img251.imageshack.us/img251/4414/mayamrpcloudarealightsh.png

http://img36.imageshack.us/img36/6430/mayamrpcloudarealightpo.png

In the meantime I added support for baking everything: just specify a channel while baking and the same channel while reading. I provide both a generic pcloud shading material and a point cloud visualizer (it can also display surfels, i.e. points with an area); respectively the first and second images here.

Now, back to color bleeding ! ;)

ciao, max


Edit: @cavemen, of course for moving stuff you need to bake multiple (per-frame) pclouds.

ytsejam1976
07-29-2010, 04:07 PM
Really great work, Max


Good job. :)

MaxTarpini
07-29-2010, 04:33 PM
thanks man ! :)

ciao, max

ristopuukko
07-29-2010, 06:13 PM
Great work Max.

Regarding pclouds and moving objects:
I gather that you always bake in world space; if so, would object space be very difficult to implement?

/risto

MaxTarpini
07-29-2010, 07:39 PM
Hey Risto,

Well, that depends on where we bake from. If we bake from a geometry shader, data comes in in object space; if from a lightmap or a regular material, it comes in in world space. Generally, in object space you then have an instance transform matrix to convert it to world space. Pclouds themselves are generally in world space, because state->point is the intersection point from which the lookup starts, and that is of course in world space. Even if we baked in object space, adding a global field matrix and then building an ad-hoc lookup... SSS is linked to lights and AO is by definition global, so I'm not sure we would get something consistent that way.
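If one did bake in object space, the lookup would only need the instance transform to bring the baked points back to world space (or its inverse to bring the lookup point into the cloud's object space). A minimal sketch, assuming a row-major 4x4 matrix; normals would additionally need the inverse-transpose:

    struct Vec3 { float x, y, z; };
    struct Mat4 { float m[4][4]; };   // row-major; points treated as column vectors with w = 1

    Vec3 objectToWorld(const Mat4& instanceXform, const Vec3& p) {
        Vec3 w;
        w.x = instanceXform.m[0][0]*p.x + instanceXform.m[0][1]*p.y + instanceXform.m[0][2]*p.z + instanceXform.m[0][3];
        w.y = instanceXform.m[1][0]*p.x + instanceXform.m[1][1]*p.y + instanceXform.m[1][2]*p.z + instanceXform.m[1][3];
        w.z = instanceXform.m[2][0]*p.x + instanceXform.m[2][1]*p.y + instanceXform.m[2][2]*p.z + instanceXform.m[2][3];
        return w;
    }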

Thanks for the feedback !

ciao, max

ristopuukko
07-30-2010, 04:21 AM
Hey Max

I understand your point when it comes to SSS.

Concerning AO, with Pixar's implementation I used to write shaders which bake and read
in object space, so I could generate a complete AO pcloud for (for example) a rigid moving
spaceship/car/tank/you-name-it in the first frame and re-use the same pcloud for the remaining frames again and again.

...but I guess this is different, since for starters you don't have
micropolygons for the creation of the pcloud.

I'm eagerly looking forward to seeing what you're up to.

Thanks in advance, Max

/risto

ndeboar
07-30-2010, 05:34 AM
Hey,

It's a two-pass process, so you have to render your whole scene twice. In PRMan this isn't so bad, because it can spew out geometry lightning fast, but in mental ray I would imagine this is a massive speed hit.

Secondly, PRMan/3Delight have a massive amount of support behind them, versus this, which is a third-party set of shaders with no core support from mental ray itself.

So I personally think that if you really want to take advantage of this technology, use a RenderMan renderer.

But maybe that's just me.

Good work though.

noizFACTORY
07-30-2010, 07:15 AM
Concerning AO, with Pixar's implementation I used to write shaders which bake and read
in object space, so I could generate a complete AO pcloud for (for example) a rigid moving
spaceship/car/tank/you-name-it in the first frame and re-use the same pcloud for the remaining frames again and again.


By storing Pref in the point cloud and looking that up instead of P during the point cloud occlusion evaluation? And the object shouldn't deform, right?

Or is it similar to the method outlined in this paper (http://www.renderman.org/RMR/Examples/srt2005/sorenTrick.pdf) ?

This is some really impressive work here for mental ray, Max. :thumbsup:

ristopuukko
07-30-2010, 07:31 AM
noizFACTORY:

This thread is _not_ about PRMan, but one off-topic post can be tolerated, I guess:



I just bake3d() with the "object" coordinate system and then read the baked data accordingly.

Using maya__pref, I can also "bend" the point cloud as if it were a normal
2D/3D texture, so my object _can_ deform, though within reasonable limits.



Sorry Max (and all you others interested in pclouds in mr), now I'm done
with PRMan in this thread ;-)

/risto

noizFACTORY
07-30-2010, 07:57 AM
noizFACTORY:

This thread is _not_ about PRMan, but one off-topic post can be tolerated, I guess:


I just bake3d() with the "object" coordinate system and then read the baked data accordingly.

Using maya__pref, I can also "bend" the point cloud as if it were a normal
2D/3D texture, so my object _can_ deform, though within reasonable limits.


Sorry Max (and all you others interested in pclouds in mr), now I'm done
with PRMan in this thread ;-)

/risto

Don't get hysterical, ristopuukko. I was merely confirming the method you used, which _you_ posted here in the first place! My post was more about how the same could be done with Max's method. This thread is called "Pixar point clouds in maya", for crying out loud.

If we can discuss some aspects of how it's done there, then probably something good can come out of it for doing the same in mental ray. There's no point turning this into a this-vs.-that thread. One can always learn from every platform and find good things to implement somewhere else.

ristopuukko
07-30-2010, 08:05 AM
Don't get hysterical, ristopuukko.

No I won't. I'm not trained enough in using these media,
and since you can't see the smile on my face, my
(non-native) English comes out too harsh.

I apologise.

;-)

/risto

MaxTarpini
07-30-2010, 02:20 PM
If we can discuss some aspects of how it's done there, then probably something good can come out of it for doing the same in mental ray. There's no point turning this into a this-vs.-that thread. One can always learn from every platform and find good things to implement somewhere else.

Agreed... in this sense I love hysterical people :)

About object vs. world space for pclouding: I have an early geometry shader which can read external geometry and bake directly from there. I was thinking, for example, of implementing an instancing system where we could use pclouds as proxies. For that we need object space and an instance transform matrix in the pcloud. As soon as I get back to that I'll look at your suggestions in detail.

ciao, max

Kel Solaar
07-31-2010, 08:02 PM
Impressive work, Max; I don't know what MR would be without guys like you :)

It's kind of sad that this solution is coming from a talented developer and not from NVIDIA/mental images, especially given how long it has been implemented in PRMan.

Are you planning to implement blurred refraction support as well?

Keep it up :)

KS

MaxTarpini
07-31-2010, 11:52 PM
Hi Thomas,

thanks for the :)

Btw, note that mental ray natively provides support for 'particle maps' with the RCMAP (render core map) module. That's a generic API for dealing with point clouds (organized, BVH-based pclouds), probably much as in PRMan. In PRMan, since most of its features come from production R&D, they have also implemented specific tools for AO, SSS and color bleeding, while mental ray provides only the pcloud class; specific implementations are up to anyone who can program them. It's the same with kd-trees: they are just there, thread-safe and robust, for anyone who knows what to do with them (I also use them for SSS). That's mental images' philosophy: provide the 'system' and support further user implementations. And frankly there's nothing to be sad about. :)

About blurred reflections/refractions, I'll look into those ASAP along with area light support.

Btw, are you on Maya win64?

cheers, max

MaxTarpini
08-02-2010, 06:04 PM
Phew... occlusion computation is now multi-threaded!!

Frankly, I thought I was going to make my PC explode trying to do that from an _exit_ function, but instead it seems to work quite robustly! For those interested, the situation is that the _init_ and _exit_ shader functions are locked down to avoid concurrency, so they are single-threaded. However, this gives us a safe place to create our own threads, process data on them, close them, and return to the main thread, which then continues exiting the function.
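The trick described above can be sketched with plain C++11 threads, nothing mental ray specific: because the caller is already serialized, it is safe to spawn workers, split the point range between them, and join them all before returning.

    #include <algorithm>
    #include <cstddef>
    #include <functional>
    #include <thread>
    #include <vector>

    // Run work(begin, end) over [0, count) split into roughly equal chunks,
    // one thread per chunk, and join them all before returning to the caller.
    void parallelFor(size_t count, unsigned numThreads,
                     const std::function<void(size_t, size_t)>& work) {
        if (numThreads == 0) numThreads = 1;
        std::vector<std::thread> workers;
        size_t chunk = (count + numThreads - 1) / numThreads;
        for (unsigned t = 0; t < numThreads; ++t) {
            size_t begin = t * chunk;
            size_t end   = std::min(count, begin + chunk);
            if (begin >= end) break;
            workers.emplace_back(work, begin, end);
        }
        for (std::thread& w : workers) w.join();   // every thread is done before we return
    }

The brute-force occlusionPass from the earlier AO sketch, for example, could be run this way, with each chunk of receivers filling its own slice of the output array.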

Now we're up to PRMan standards. After that, what virtually remains is to multi-thread the octrees. That's not really trivial stuff; in fact, even in PRMan octree building is single-threaded. However, it's not the most time-consuming operation, so getting involved in parallel octree creation would be worth it only for, e.g., CUDA support, i.e. building an octree completely on the GPU, but I don't think I'll get there for v1.

http://www.ctrlstudio.net/__shaders/PClouds_Showcase/AO/capture-1.rar

ciao, max

BillSpradlin
08-03-2010, 08:08 AM
DD did a very similar implementation several years ago for "Speed Racer". Obviously I can't go into details regarding it, but it's great to see this stuff out in the wild. Keep up the great work Max, cheers!

ytsejam1976
08-03-2010, 08:53 AM
Great. Max, when will a beta or a release of these shaders be out?

Bitter
08-10-2010, 06:57 AM
This is some amazing stuff, keep it up!

I use RenderMan and 3Delight as well as mental ray and V-Ray, but no single renderer gives me ALL of the better options in a single package. Point clouds are something I truly miss when moving to a raytracer.

As with Arnold, a lot of rendering is moving towards a unified sampling system that's fast and handles lots of samples for things like DOF and motion blur really well. You can use such a system to bake other expensive effects and combine everything into an overall workflow.

As a whole, that approach would be amazingly fast at resolving detail well while eating very little memory. I am very much interested in seeing this in action more and more!

AtrusDni
08-11-2010, 11:04 PM
Nice work Max!! Count me in as subscribed to this thread. So CooL! :applause:

Devils1stBorn
08-13-2010, 12:40 AM
DD did a very similar implementation several years ago for "Speed Racer". Obviously I can't go into details regarding it, but it's great to see this stuff out in the wild. Keep up the great work Max, cheers!

Hey super star...what's been going on with you?

sharktacos
08-16-2010, 06:31 PM
Count me in too. Very interesting stuff!

MaxTarpini
08-17-2010, 11:54 PM
Hey, thanks for the support. I'm slowly coming back from holidays; in the meantime I've multi-threaded the SSS and I'm now playing a bit with it to smooth out the workflow. I think by the end of August or so I should be ready to begin releasing beta stuff with at least SSS and the generic baking materials.

Here is a cheap scene to test the latest advancements, mainly the introduction of an 'irradiance' parameter to smooth out artifacts when dealing with short SSS radii (mean free path lengths). In fact, the required point cloud density is related to the SSS radius: for short radii we need to increase the pcloud density. To avoid this we can now blend the SSS with the baked irradiance to obtain the same effect without actually digging into the SSS radius (and the pcloud density, i.e. the object tessellation).

http://img13.imageshack.us/img13/5518/sssnerdscoolsssbig.png
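The 'irradiance' blend described above reduces to a simple mix (illustrative C++ only): at blend = 0 you get the pure SSS lookup, at blend = 1 the baked irradiance alone.

    struct Color { float r, g, b; };

    Color blendSssWithIrradiance(const Color& sss, const Color& irradiance, float blend) {
        return { sss.r + (irradiance.r - sss.r) * blend,
                 sss.g + (irradiance.g - sss.g) * blend,
                 sss.b + (irradiance.b - sss.b) * blend };
    }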

ciao, max

Bitter
08-18-2010, 01:12 AM
The SSS and baking some illumination per frame would be nice starts. I have a few ideas in mind to try, mostly for illumination and reflections, etc. Those are the more expensive effects I want to mitigate.

Hopefully I'll be able to use an RMan-type "baking" workflow with some raytrace features like they do, then leverage some newer things like stereo rendering and the progressive renderer for motion blur. That should give some exceptionally fast results.

ristopuukko
08-18-2010, 05:02 AM
Nice going Max.

Can't wait to get my hands on this...

/risto

zerogee
08-18-2010, 08:48 AM
Max, this is going to be awesome!:bowdown:

Is there a way to implement this for fur (Shave and a Haircut) as well? Baked occlusion like in King Kong would be fantastic...

living_for_cg
08-18-2010, 04:37 PM
I checked the day after starting this thread and didn't find any suitable response, but now I've just come across it again. So much conversation here. Max, thanks for the responses; I still need to read it all. I'm sure there is great information here.
Go on, guys.

MaxTarpini
08-18-2010, 07:00 PM
A cheap test on some foliage. Here is where the SSS pcloud-based approach should shine in all its glory. For a large forest, for example, one may just bake it to a point cloud and reuse that for every frame of a fly-through; the same goes for static snow on landscapes and so on.

http://img85.imageshack.us/img85/5916/ssstreewithsss.png
No GI has been used; that's all light diffusion.

http://img839.imageshack.us/img839/6427/ssstreeirradpcloud.png
The point cloud with irradiance baked in. I used SSS evaluation at render time for fast testing instead of baking it into another pcloud (this is not available in PRMan); it means that for every sample at render time, instead of computing SSS on the actual irradiance, we do it on the baked illumination, which comes as an octree for fast lookups.

ciao, max

kanooshka
08-19-2010, 12:48 PM
Very intriguing. I wondered how long it would take before someone developed a ptc implementation in mental ray. When you get to the beta testing, count me in.

MaxTarpini
08-24-2010, 12:40 AM
Hi there, it was a good weekend here!

I had to smooth out FG support a bit, as it sometimes took too long to bake FG due to mesh density. Now I fully support FG pre-baking, with an option to place FG preprocess points at every vertex, at every triangle, or at every n-th triangle, with or without considering backfacing.
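The placement options amount to a filter like this sketch (illustrative C++ only; names are made up): take every n-th triangle centroid as a candidate bake/FG point, optionally skipping triangles that face away from a given direction.

    #include <vector>

    struct Vec3 { float x, y, z; };
    struct Tri  { Vec3 centroid; Vec3 normal; };

    std::vector<Vec3> placePoints(const std::vector<Tri>& tris, size_t everyN,
                                  bool skipBackfacing, const Vec3& viewDir) {
        std::vector<Vec3> points;
        if (everyN < 1) everyN = 1;
        for (size_t i = 0; i < tris.size(); i += everyN) {
            const Tri& t = tris[i];
            if (skipBackfacing) {
                float facing = t.normal.x*viewDir.x + t.normal.y*viewDir.y + t.normal.z*viewDir.z;
                if (facing >= 0.0f) continue;   // normal points away from the viewer: backfacing
            }
            points.push_back(t.centroid);
        }
        return points;
    }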

Other enhancements: the SSS pcloud map now also contains a diffuse irradiance channel for an improved workflow, and the FG baking process is now transparent; it is no longer a two-step process.

I also added .mel AE templates to support the material interfaces (damn, I hate MEL!!).
Plus, the scene below will come with a set of sample scenes for discovering pclouding in mray.


PCloud_bake_SSScattering:
http://img201.imageshack.us/img201/2345/sssmayaworkflow002.png

PCloud_SSScattering:
http://img409.imageshack.us/img409/2287/sssmayaworkflow001.png

You see, besides mib_color_mix and misss_skin, which I use to add speculars, we have just the SSSbake and SSSread nodes plus the actual diffuse node to bake. SSSbake is used as a lightmap shader and is plugged into the lightmap port; SSSread can be used anywhere, just like a material/texture node.

http://img138.imageshack.us/img138/2581/sssmayaworkflowtree.png

ciao, max

Bitter
08-24-2010, 01:00 AM
That's great! So I assume this means I can run mray in Lightmap mode only and then render after baking, correct?

MaxTarpini
08-24-2010, 01:13 AM
You can simply bake and render in one go :)

http://img37.imageshack.us/img37/9628/sssmayaworkflowverboses.png


But you may also adopt a lightmapping workflow and just save out your pclouds.

ciao, max

Bitter
08-24-2010, 01:29 AM
There will be situations where I will want to bake a map per frame and render afterwards. This will save me some time on multiple machines for re-rendering particular objects.

Bitter
08-24-2010, 06:14 AM
Something else... you might have said it but I missed it...

Are you only raytracing from the point clouds, or have you also implemented raster cubes?

SebKaine
08-24-2010, 09:08 AM
Max, this is killer stuff you are making here! You've done PTC integration for occlusion, indirect and SSS in less than two months, and alone!!! So here is my suggestion: would you mind working at Autodesk on the mental ray integration team? Because I heard the team has been on holiday for 4 years now... ;)

I'm also wondering why the Italian guys are so active in the mental ray community: Francesca Luce, Alex Sandri, Matteo Magnazzi, Max Tarpini... Are you guys forming a secret club or something? ;)

ytsejam1976
08-24-2010, 09:43 AM
:D

Max, great. I have a question: for a real production test, do you want a big scene with instanced leafy trees to test the speed?
I'll send it to you. :)

MaxTarpini
08-24-2010, 12:59 PM
So here is my suggestion: would you mind working at Autodesk on the mental ray integration team? Because I heard the team has been on holiday for 4 years now... ;)
Touché... I've worked for Autodesk for the past 3 years, and I can assure you no one is ever on holiday there! ;)

Are you only raytracing from the point clouds, or have you also implemented raster cubes?
Both are used for color bleeding in the Pixar papers; however, I started with the Bunnell implementation, so I still need to implement the spherical harmonics stuff before I have a chance to go down that route. That's why SSS and generic baking will be out before AO and indirect.

Max, great. I have a question: for a real production test, do you want a big scene with instanced leafy trees to test the speed?
Thanks, but at the moment I'm done with foliage and tree stuff; I'm looking more for multi-million-poly displaced models.

Edit: I forgot to say I've added a parameter to control the number of processing threads. We're almost gold.
http://img651.imageshack.us/img651/2762/sssmayaworkflow003.png

cheers, max

Bitter
08-24-2010, 07:04 PM
Fair enough. :-) I look forward to more progress on this!

Side note: if you do work for Autodesk, it would be great if this saw the light of day in an integrated release, even if it's not "exposed", like the production shaders (for, you know, 6 years now).

MaxTarpini
08-25-2010, 02:15 AM
Another scene I'm going to include with the shaders, an ideal one: the iceberg is already 1.2 million polys displaced, so I go straight to baking it into a point cloud and computing SSS from there.

http://img831.imageshack.us/img831/3751/sssglacier002.png
http://img713.imageshack.us/img713/7824/sssglacier001.png
http://img138.imageshack.us/img138/8251/sssglacier004.png
3 mins for the first image (where we bake diffuse + FG), 20 secs for the others.

ciao, max

cavemen
08-25-2010, 04:23 AM
My God!
Max, you are really on a roll here, teasing us with all the wonderful breakthroughs you are achieving.
Good going!

mercuito
08-25-2010, 09:08 PM
This is really impressive, great work Max! I have a few questions though:

- What is the ideal scenario for using this? I'm sure it wouldn't be suitable all the time, given that you need a lot of polys?

- Is it really faster than using mental ray's SSS shaders or Final Gather? If not, what is the advantage? Memory?

- Is it a suitable workflow for animation?

MaxTarpini
08-26-2010, 12:42 AM
The scenes above are just good examples of where you may fire up this feature (point-based stuff) to smooth out your render workflow. It would be impractical to bake a lot of foliage to 2D, and when animated it is also very prone to flickering; plus you already have a lot of polys from the leaves. For the iceberg, instead, it would be just crazy to parametrize it (UV map it) and bake it to textures, while fully raytracing it every time would be slow and flickery. Baking it into a 3D point cloud simply resolves these problems in those situations.

For example, here is a very cheap camera animation with the SSS iceberg: 1 min for the first frame (where the diffuse is baked), 20 secs for each remaining frame; 400 frames rendered in a couple of hours.

http://www.ctrlstudio.net/__shaders/PClouds_Showcase/movies/SSS_Icerberg.rar

ciao, max


As for the rest, there are no better words about point-based stuff than those already used in the article from the first post, and you'll have a chance to test it yourself ASAP:

The use of point-based rendering techniques is reserved for complicated scenes, scenes that have lots of geometry and displacement shaders. Keep in mind point-based rendering is not a silver bullet, but is rather another tool for the technical director's bag of tricks.

At Pixar it is the supervising lighting TDs who must make the decisions as to which rendering features are most appropriate for a particular shot. Color bleeding can still be faked when possible, but when a scene is complex enough, point-based color bleeding is used. Take the case of ambient occlusion, both point-based rendering and ray tracing are used at Pixar to create this effect, and it just depends on the specific details of a shot.

Bitter
08-26-2010, 06:48 AM
Out of curiosity, are you displacing this with something view-dependent? I saw it pop; God knows I have issues with that at times.

The render time is great. I can certainly imagine how much is saved if you render to stereo in one go (which doesn't happen often).

Anything that is baked for complexity makes sense in a pipeline where it's tedious to light manually or unrealistic to shove everything down a renderer's throat, which happens fairly often. Our render farm has massive amounts of memory and still occasionally strains on a scene with many objects, etc.

MaxTarpini
08-26-2010, 12:32 PM
Out of curiosity, are you displacing this with something view-dependent? I saw it pop; God knows I have issues with that at times.
View-dependent stuff is not really welcome with point clouds. The pop you saw is because at some point I switched to a different machine, though that shouldn't happen anyway; it's not related to pclouds, I think it's a displacement issue.

Oops, something I forgot to tell mercuito: SSS does not require too many polys (points), as its result diffuses, and actually smooths out, the tessellation or disc artifacts.

So for general high-quality sub-d or NURBS surfaces you should already be all set. If needed, one can add detail with surface and displacement approximations; since mray can do flushable micropolygon displacement, that's a minimal thing to get involved with.

Be aware that, in any case, an x64 system with at least 4 cores and 6 GB of RAM is highly recommended for easy point clouding! I'm not even going to release a 32-bit version.

ciao, max

Bitter
08-27-2010, 06:56 PM
I s#!t you not, I keep checking this page twice a day. I'm ready for a PayPal link for donations!

Ever since they created the map API I have been hoping for some "brickmap" love.

How dense has the point data been for some of these tests? Does it generate a massive file each time? And have you tried something deforming and created a cache per frame yet?

Kzin
08-27-2010, 11:45 PM
Your last examples are great; the iceberg render time is really nice.
How fast is the pcloud SSS compared to the standard mental images one? Would it make sense to use it under dynamic lighting conditions, render-time wise?

Edit: and how about the RAM usage? How much can be saved compared to the mental images one?

MaxTarpini
08-28-2010, 01:10 AM
Memory is a problem only while baking: 1M points with irradiance and SSS data are around 30 MB once saved out, 25M points are around 1 GB. While reading, the memory footprint is minimal because the whole map doesn't need to stay in memory.

For scenes like the iceberg one, the pcloud approach is maybe 10x faster than anything else, because we bake the whole model with SSS and diffuse (plus FG), so we don't really need FG at all on the remaining frames, and the SSS has already been computed.

However, for scenes like the CG character, once animated, it doesn't make much sense to use pclouds. Luckily that scene is heavily diffused (area lights and FG), so we don't need much density in the pcloud, but it is still around half a million points; so pclouds would only be worth it for much more detailed characters. That said, for me, having to produce just the single shot, it was convenient to use a pcloud anyway, because I kept the diffuse light baked on the skin while changing other things (including the camera view), which gave good overall feedback without having to re-compute the SSS every time.

ciao, max

Bitter
08-28-2010, 01:39 AM
For me, most of the time saved will be on lighting and materials work. FG and glossy calculations over multiple frames eat most of my time in a render; especially with motion blur and some static baked elements, this will dramatically lower some of those calculations.

As you said, the characters can then be left unbaked. However, there are times when my "character" is a 3.2-million-poly ship that doesn't deform, has lights, reflections, etc.

The time saved by baking would be amazing, especially since it can be divided by material on the character and I can bake more easily than with the standard 2D lightmap approach.

Also keep that in mind for the workflow. Being able to bake on a transform node would be great, but that's possibly a way off. We have encountered assets with upwards of 20 materials applied to a massive, scene-hogging object and multiple layers.

Daniel-B
08-31-2010, 02:29 PM
Max, wonderful work! Do you think this method would be possible using MR in 3ds Max as well?

Farins
09-01-2010, 08:33 AM
I'm following this thread with much interest and curiosity... and I just hope it can also be made available for 3ds Max mental ray users.

Anyway... you guys are really good!

MaxTarpini
09-01-2010, 12:35 PM
Note that I've started my own thread, with the shader download, here:
http://forums.cgsociety.org/showthread.php?f=87&t=915427

I won't be able to follow up on this thread, so for any questions or anything else please post in the other thread.

ciao, max
