The delusional world of PBR

  3 Weeks Ago

I know this is going to be very controversial, but I have to let it out, and I want every CG artist doing texture/look dev/lighting to think about it.

So this thing called the "PBR workflow" has taken over texture/shading work in the last few years, mostly through Substance and similar tools.

I'm not going to explain what PBR is, because that information is already out there. But I will claim that PBR is not something completely new. I feel like people only started saying "PBR" after game engines picked up the idea.

ALL the theories and aspects of PBR existed before it took over:

Energy conservation
Roughness / Glossiness / Specular / Metallic
Fresnel
Microsurface
Diffuse / Albedo
Linear Workflow

These concepts were all there 10 years ago, when we were using Mental Ray and REYES. They just weren't in game engines. It's just slightly different terminology and buttons.
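
To make a couple of those concrete, here is a minimal Python sketch (purely illustrative, not any renderer's actual shader code) of Schlick's Fresnel approximation and an energy-conserving diffuse/specular split. Both formulas predate the "PBR" label by a long way.

[code]
def fresnel_schlick(cos_theta, f0):
    # Schlick's approximation: reflectance climbs toward 1.0 at grazing angles.
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def split_energy(albedo, f0, cos_theta):
    # Energy conservation: the fraction reflected as specular is not
    # available to the diffuse term, so the two never sum above 1.
    ks = fresnel_schlick(cos_theta, f0)
    kd = (1.0 - ks) * albedo
    return kd, ks
[/code]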

So doing "PBR" with Substance painter doesn't mean previous way of work is not "PBR"
It's just different materials need different input values. That's all

If a shader needs a metallic map input, it's better to make one. If a shader has spec/gloss inputs, make those too. You can get identical results both ways.
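
For illustration, here is the commonly published mapping from metallic/roughness to spec/gloss as a hedged Python sketch (scalar colors for simplicity; 0.04 is the usual dielectric reflectance assumption):

[code]
def metal_rough_to_spec_gloss(base_color, metallic, roughness):
    # Dielectrics reflect roughly 4% at normal incidence; metals tint
    # their reflection with the base color and have no diffuse term.
    specular = 0.04 * (1.0 - metallic) + base_color * metallic
    diffuse = base_color * (1.0 - metallic)
    glossiness = 1.0 - roughness  # the two sliders are simply inverses
    return diffuse, specular, glossiness
[/code]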

I heard one guy saying, "Oh, because the spec/gloss workflow is not PBR," and... that's just not true.

And even if something isn't the PBR you (think you) know, that doesn't mean it's wrong or inefficient.

I can make whatever material I want in the spec/gloss workflow as well as in the metallic/roughness workflow.

It's wrong to think the PBR workflow is the only way to create realistic materials.

To me, making only one material for a whole asset (which usually contains many different materials) is not efficient at all. Why? Every time you get feedback, which happens all the time in production, you have to export everything again.

It's delusional to think PBR is actually physically accurate. It's not. Everything in CG is just an approximation of what we see in the real world. Raytracing is not physically accurate. Pathtracing isn't either. Countless shortcuts and assumptions are made the moment we click the render button.

Let me give an example.



[Image: the same wood in two shots: a simple wooden box on the left, and a thin wooden stick on the right with light passing through it.]

Same wood, different look. The one on the left is just a simple wooden box; the one on the right is a thin wooden stick, with light passing through it because it's so thin.

This means that when a very strong light hits a wooden surface, it penetrates a thin layer of the surface and then scatters, like an SSS material. Of course, there's diffuse and spec reflection too.
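
As a back-of-the-envelope illustration of why thickness matters so much, here is Beer-Lambert attenuation through a slab in Python (the extinction value is made up purely for illustration):

[code]
import math

sigma_t = 2.0  # extinction per mm; an assumed, made-up value for wood

for thickness_mm in (0.5, 2.0, 20.0):
    transmitted = math.exp(-sigma_t * thickness_mm)
    print(f"{thickness_mm:5.1f} mm slab passes {transmitted:.2%} of the light")
# ~37% makes it through half a millimetre, ~1.8% through 2 mm,
# and effectively nothing through a 20 mm box wall.
[/code]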

But think about what you would do when creating the wooden box on the left.

Would you really create that thin SSS layer, or just use a standard material?
Just a standard material. Right?
But to make it more "physically accurate", you would need to build that thin SSS layer, make it almost invisible, and then use a standard material underneath. I have never seen anyone make a wood shader like this.

The same principle goes for rubber materials, cloth materials, and two-sided leaf materials.
It's all cheating, but it's very efficient for achieving a certain look.

We're living in a world where we create dirt layers with simple black-and-white maps. So why get so caught up in how "accurately" you create your maps?
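
That dirt layer is nothing more than a linear blend driven by the grayscale map, something like this sketch (names are illustrative):

[code]
def apply_dirt(base_color, dirt_color, mask):
    # mask is the grayscale map: 0 = clean base, 1 = full dirt.
    # Nothing physical about it; it just looks right.
    return base_color * (1.0 - mask) + dirt_color * mask
[/code]
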
And another reason PBR doesn't really matter: our light rigs are almost never as accurate as real-world lighting. All those area lights and that HDRI environment you set up will give a decent look, but it's total guesswork and nowhere close to real-world lighting. I don't even need to mention GI.

Look development is not about being physically accurate or following the rules. It's about creating what we want to see on the screen. Whatever cheating you need, do it. Don't let this funny little concept of PBR hold you back. Put that damn specular wherever you want!

I've worked at two animation studios and two visual effects studios so far, and they cheat all the time. They turn off GI (very often) and reflections, set different diffuse/specular contributions for certain lights, and they even use ambient lights. Physical accuracy? Nah, man.

My suggestion? Spend more time training your eyes to tell what looks real and what doesn't, rather than holding the false belief that some fancy software will do it for you.

Tell me your thoughts on this.

----------------------------------------------------------------------------------------------------------------------------------------------
Original post :http://kgs716.wixsite.com/gunsik/si...al-world-of-PBR
----------------------------------------------------------------------------------------------------------------------------------------------

Last edited by kgs7165 : 3 Weeks Ago at 01:55 PM.
 
  3 Weeks Ago
The main benefit for me is the simplicity of it: instead of 10+ nodes I can have it mostly all in one. I also like that I can use a really wide range of software/render engines and have the same settings in basically the same place with the same names. I agree that PBR as a name is misleading, though.
 
  3 Weeks Ago
It's all hype and nothing new (except in game engines)...
Most people in the industry know that.
A bit like 'the cloud'. It's more marketing than anything else.

Annoying, but people fall for it, so...
 
  3 Weeks Ago
Buzzwords like PBR are intended to mask the fact that CG research has really slowed down in recent years.

There just isn't much new being invented in 3D/CG, and these people need to sell CG software to a new generation of CG artists.
 
  3 Weeks Ago
I'm not sure PBR is about thinking the software will do it for you. PBR is used mainly in games for materials, which now use a proper reflection model instead of old tricks like reflections painted into the diffuse map or plain specular models. And the diffuse is now separated into occlusion and albedo, so shadows are less baked into the diffuse map and instead calculated via occlusion or shadow passes.
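
In other words, something like this sketch (illustrative names only): the albedo map stays shadow-free and occlusion is applied at shading time.

[code]
def shaded_diffuse(albedo, occlusion, irradiance):
    # Old way: shadows baked into the albedo map itself.
    # New way: a clean albedo, darkened at shading time instead.
    return albedo * occlusion * irradiance
[/code]
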
I agree it's just a fancy name. Anyone who has worked with rendering knows those "new" concepts. But for those who didn't, and only painted textures for old game engines, it might be new.
 
  3 Weeks Ago
Isn't recent PBR buzz about it happening real-time?

The &*#! at Autodesk faked this for one of their Maya demos a few years ago with some robots on a basketball court, but in 2018 Blender 2.8 will have it for real!
 
  3 Weeks Ago
Originally Posted by SD3D: Isn't recent PBR buzz about it happening real-time?


GPUs support shader programming languages like HLSL (DirectX) and GLSL (OpenGL) and have done so for years. You can write pretty much any kind of shader for games with these languages.

GPUs can also run shaders MUCH faster than any CPU implementation - a lot can be done in realtime.

So being able to "realtime shade" on GPUs with custom-written material shaders is not new at all.

It's just that most game developers didn't really take the shading capabilities of GPUs to "extremes" before.

I guess they thought that "what looks OK will sell decently", rather than "how do we create the most incredible game graphics out there".

Thank your lucky stars that Crytek made the first Crysis game - that was all realtime GPU shader effects and they were the first to really PUSH GPU effects hard.

If it hadn't been for Crysis and maybe also Bioshock, games would look a lot worse than they do today.
 
  3 Weeks Ago
Originally Posted by skeebertus: GPUs support shader programming languages like HLSL (DirectX) and GLSL (OpenGL) and have done so for years. [...]

In the past everything was faked. PBR is perhaps an inaccurate acronym, but the buzz to which I refer is about real-time ray-tracing, "physically based materials" such as metals and glass being the ones that benefit most.
 
  3 Weeks Ago
Originally Posted by SD3D: In the past everything was faked. PBR is perhaps an inaccurate acronym, but the buzz to which I refer is about real-time ray-tracing, "physically based materials" such as metals and glass being the ones that benefit most.


PBR in games does not refer to realtime ray-tracing. It's just a shading technique for materials. Games don't ray-trace currently.

The only games I know of that do ray-trace are a few mobile games that run on Imagination's real-time raytracing mobile GPUs.

https://youtu.be/rjvaxcM4g7c?t=2

But here is a cool video of Quake 2 with realtime pathtraced lighting:

https://www.youtube.com/watch?v=Bi1XJ9q1lGo

If realtime ray-tracing or pathtracing hardware makes it into future GPUs, this could well be the future of realtime rendering.
 
  3 Weeks Ago
I don't think PBR is a buzzword. In practical terms in VFX I think it's had a very strong impact on how people work. And I'm not talking about the technical definition of PBR, or going into unbiased vs biased etc, I'm talking about the practical impact these technologies are having on the workflow of asset development, signoff, lighting and comp.

The simplest way to explain the massive change PBR (let's just call it that) has had on VFX is to look at Arnold and the huge take-over that's happened in the past 5 years. Almost everyone in feature film has moved to Arnold, or is using something custom. The exceptions to this rule get fewer every day. That is a direct impact the PBR movement has had on the industry.

The other thing that's happening, and it's a trend that will continue, is that we're headed for a major reduction in AOVs. This requires a rethink of how you approach your workflow. True PBR workflows no longer provide reflection passes as separate AOVs; this is true, for example, with Manuka. If you work with Weta you can't just ask them to crunch the reflection/spec on a render, because they literally don't have that as a separated pass. And the reason, I believe, has to do with the philosophy of photorealism. If comp adjusts the reflection AOV of an asset, they are breaking the rules of energy conservation (a toy example follows below). So, instead of doing all this in comp, Weta does it all in lighting. Want more spec hits? Adjust the lights so they're tighter, for example.
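
Here is that toy example (made-up per-pixel values; real AOV splits are more involved):

[code]
diffuse, specular, sss = 0.50, 0.30, 0.05  # per-pixel AOV values (made up)
beauty = diffuse + specular + sss          # 0.85, what the renderer computed

specular_crunched = specular * 2.0         # comp doubles the spec AOV
rebuilt = diffuse + specular_crunched + sss
# rebuilt == 1.15: the "graded" surface now returns more energy
# than the renderer ever gave it, so energy conservation is broken.
[/code]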

I know the OP used the demonstration of wood in different lighting conditions to explain how we fake materials, but I think that's actually on the way out too. Wood will have subsurface/translucency in the future; some places already do this. You might turn it off in certain situations because its impact is so small, like when the frame is moving fast or the wood is densely packed, but it will have a contribution that's mappable. If you model a thin piece of wood, though, you'll definitely use it.

And this is going to have a big impact on how assets are developed and signed off. We'll start using lighting setups for asset look-dev that show assets in a broad range of environments; basically, we'll treat them more like real objects and see how they work in a dark speccy environment, in flat lighting, in stark daylight, etc. We'll need to do more of this because our lighters will expect objects to behave like real objects, so we need sign-off on them as we would with a real object, which means knowing how they behave in all lighting environments. Lighters will quickly match reality, and the creative part of lighting will become much more like creative lighting on-set (even more so than it is now with Arnold, I believe). And asset sign-off needs to become this more complicated thing because, if the lighters can't get the result with their physically accurate lights, you need to go back to the asset and change the shaders. I think asset sign-off is going to become an artform. Also, I want to point out again that a thin piece of wood and a thick box of wood, even when made of the same material, are different assets. Thickness is a problem we're still solving in VFX.

Ok, so that brings us to the next part: why are we going down this path? Well, it's about adherence to reality, and about common ground for discussion, really. It's a direct outcome of moving to linear rendering. Now that we realise we can map real-world values, we want to use that as the first pass of lighting. In many ways it's about being able to say, "This is what this object would look like, mathematically provably." I'm not sure this is a good thing. I like creative lighting, and I don't always want to match the plate. But I do like being able to talk about lights in a practical sense, not in an abstract CG sense. It's also good to see first-pass renders come fucking close to reality; that gives you a solid base to be creative from, rather than the creative part being working out just how to make things look real. But it comes with limitations (seriously, we have to go back and relight just to get a slightly stronger spec hit here?), and sometimes I wish we could go back a step.

For what it's worth, I agree that PBR is over-hyped and that we've been doing many of the things under that banner for 10 years. It's also true that we could make things look real with or without PBR workflows. But I don't think that's the point of this grand experiment. The point is that we want to start accurate, with assets and lighting, and have a shared vocabulary between the practical world and the digital world that lets us get to realism faster. What we do from there is the fun part.

Finally: I don't know if I like this. I'm extremely sympathetic to the OP's points. But the writing is on the wall; go work at a place that's embracing this deeply and you'll see the impact it's having.
__________________
Critical feedback example #62: "Well instead of the Stalinist purges and the divorce and the investigation ... it could be about losing a balloon."

Last edited by axiomatic : 3 Weeks Ago at 09:13 AM.
 
  3 Weeks Ago
Originally Posted by axiomatic: I don't think PBR is a buzzword. In practical terms in VFX I think it's had a very strong impact on how people work. [...]

Yes, we all agree that Arnold (and even before that V-Ray, Maxwell, ...) already had that workflow for some years.
It's just recently that it became a buzzword, though... to the point where they confuse people by never really explaining what it is. "Just trust me, it's awesome and you must have it."
It's not new; before this, it was just called physically correct shading.

Also, PBR has nothing to do with biased vs unbiased.
PBR is more something that prevents the artist from using physically incorrect properties (a common example would be a material that reflects more light than it receives) by restricting the parameters and internally adjusting how they interact, toward realism.
A non-physically-based shader can do the same thing, but since it's less restricted toward realism, you need to actually know what you're doing to get a physically correct result.
And since most of the time people aim for realistic, real-world behaviour, it's much easier and faster to use a shader that was designed for that. It's almost like a preset, but not quite... (a sketch of that "reflects more than it receives" check follows below).
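
That check has a classic form, the white furnace test: integrate BRDF times cosine over the hemisphere and verify the total stays at or below 1. A minimal Monte Carlo sketch in Python (assuming a simple Lambertian BRDF; purely illustrative):

[code]
import math, random

def hemisphere_reflectance(brdf_value, n=200_000):
    # Uniform hemisphere sampling: cos(theta) is uniform on [0, 1],
    # and the pdf is 1 / (2*pi), hence the 2*pi weight below.
    total = 0.0
    for _ in range(n):
        cos_theta = random.random()
        total += brdf_value * cos_theta * (2.0 * math.pi)
    return total / n

print(hemisphere_reflectance(1.0 / math.pi))  # albedo 1.0 -> ~1.0, conserving
print(hemisphere_reflectance(1.5 / math.pi))  # albedo 1.5 -> ~1.5, creates energy
[/code]
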
I think it'll be quite some time yet before we get unbiased rendering in real time. Games still have to use a lot of biased trickery and approximations.

Last edited by ACiD80 : 3 Weeks Ago at 02:53 PM.
 
  3 Weeks Ago
>Look Development is not about being physically accurate or following the rules.. You want to put that danm specualr wherever you want!

Take it from someone who has had to live through the horrors of a lighting department that routinely created sets with 25,000+ individual lights, all lovingly placed by hand: you absolutely do not want to put that damn highlight wherever you want. Humans are extremely good at identifying materials from a very limited number of visual cues, but ironically are absolutely terrible at converging the parameters of an empirical model. The material will typically look good under the conditions the shading TD used to set the parameters, then start breaking everywhere else. It's not entirely surprising, though: most people have trouble visualizing 3 dimensions, and a BRDF is a function of 4 (two directions, each described by two angles); that, and shading artists typically don't know much about the physics involved either. Constraining the model to be energy-conserving is the only way to guarantee that materials behave consistently in every shot. To put this in context: switching to PBR saved over 60% in lighting production costs.

>These concepts were there 10 years ago when we were using Mental ray and Reyes. But not in game engines. It's just bit different terminology and buttons.

Energy conservation has been around since before the Kajiya rendering equation, because you can't compute global illumination without it. GI and path tracing just weren't used much outside of academia until recently (~10 years). Ironically, although adoption started roughly around the same time in both industries, AAA gaming converted much faster than film. My theory is that artists in gaming studios were comparatively both much younger and had substantially less political power than their counterparts, but I could be wrong.
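
For reference, the Kajiya equation in its standard form (LaTeX notation); energy conservation of the BRDF f_r is what keeps the recursion bounded:

L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_i(x, \omega_i) \, (\omega_i \cdot n) \, \mathrm{d}\omega_i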

>Games don't ray-trace currently.

Considering the number of techniques that involve intersecting visibility paths (SSAO, SSR, VXAO, VXGI, HFTS, ...), I would argue that games have been "ray-tracing" for a while now; they just aren't classical recursive Whitted-style ray tracers. I even vaguely remember a SIGGRAPH talk about the game "Split/Second" describing how they trace a few thousand rays each frame to update a GI deferred-rendering buffer for the static geometry.
 
  3 Weeks Ago
Thank you for all your comments! It's very interesting that people have different thoughts on this.

Some I agree with, some I don't.

I get the point about how PBR helps the lighting process in production (because it's less work). But I disagree on the point about putting reflections where you want.

There's a risk that you let ugly images pass and justify them because they're "physically accurate". If you look at the real world, sometimes you can't tell the material of certain objects because you're viewing them from an odd angle, or the lighting doesn't give you those nice reflections. That happens all the time in 3D too. There are cases where you can't really change anything else, but you need to add a specular hit on a certain spot. So you need to 'draw' that specular onto your render by cheating. You want to take control over your image, not just put lights here and there and hope they give you nice reflections.
It's literally the same thing photographers do when lighting on set. You shape your objects with your lights. Think about how studio lighting is done for a car magazine. They put reflections wherever they want.

And speaking of energy conservation in the 3D world, I'd say it's also false physics. Sure, our shaders will conserve energy, but our lighting is all guesswork. You just put in whatever numbers look good in your render. HDRIs don't capture real-world dynamic range, area fills are fake, a directional sun is also super fake. And the look of a shiny surface like metal is all about lighting. So all the shaders are conserving the "fake" energy input values you blindly put in. The term "physically based" sounds like too much in this case. It's more physically plausible than before, but way off from accurate.

PBR gives you some shortcuts and a productivity boost, and those are nice things to have. My point is, you don't have to get caught up in it too much. Utilize its features, but if you want to break it, I don't think there's anything wrong with that.
 
  3 Weeks Ago
>There's risk you let ugly images pass and justify it because it's physically accurate.

That is absolutely not what is happening: rest assured, the day an art director at Disney or DreamWorks lets a single pixel go unmolested "because physics" has yet to come. There are many ways to control reflections without allowing broken materials back into the toolbox. As far as I can tell, this level of fine control is only available in proprietary software at the moment (and mostly not at all in game engines). As you point out, a DP knows how to cheat these problems, so it is possible to work this out in the real world, without black magic.

>lighting is all guess work.

It's a little bit of both. Eventually, the equation has to balance out: if the material is right, then the lighting will be right too. The main problem is that most studios aren't very rigorous with their pipeline implementations. Models need to be at scale with consistent units, light decay needs to be set to a square falloff, textures and colors need to be used in energy-linear color spaces, etc. What I see in a lot of the "HDR" "PBR" AAA game engines today is that someone copy/pasted the correct formulas for the BRDF, stole a filmic tone-mapping curve and called it done, while some other aspect is still very wrong. Unfortunately, it only takes one mistake to unravel the whole shebang. Because eventually the sausage has to work, the incorrect data has to be compensated for somehow; since lighting is where the buck usually stops, those guys are often the ones forced to butcher the numbers. The same applies to VFX: it usually takes a studio a few iterations before all the kinks get ironed out and everyone is hip to best practices (it's obviously easier when you don't lay off all your artists in between...)
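
Two of those pipeline basics in a minimal Python sketch (the formulas are standard; the function names are mine):

[code]
def light_intensity(power, distance):
    # Physically plausible point-light decay: inverse-square falloff.
    return power / (distance * distance)

def srgb_to_linear(c):
    # The sRGB decoding curve (IEC 61966-2-1), for components in [0, 1].
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
[/code]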

>you don't have to caught up on this too much. Utilize it's function, but if you want to break it, I don't think there's anything wrong about it.

I think you will change your mind if you get the chance to work at a place that has this figured out. With the tools getting better and more users properly trained, it won't be long until we look back on the dark ages of Phong and wonder how people put up with those broken methods.

Disclaimer: I have been grinding this axe since ~2001, starting with better BRDFs for human skin. There's still plenty of edge left.
 
  3 Weeks Ago
Originally Posted by ACiD80: Yes, we all agree that Arnold (and even before that V-Ray, Maxwell, ...) already had that workflow for some years.


I don't think adherence to PBR is just about the renderer. It's a whole methodology of doing VFX that starts on-set and in pre-production and goes through to DI, where it's represented by ACES.

While some big places have been using PBR on some sequences for years (although I'd argue ~5 at most), I think it's fair to say that no one was really using PBR from asset through to comp until Gravity came along. Even things like Avatar were faking it for the most part. And the adoption of Arnold, which is a good indicator of PBR adoption (nothing more), has really only been huge in the last four years, since about 2013. In 2014 I remember having a conversation with a bunch of VFX supes that basically went, "Yeah, Arnold's great, but everyone loses money on their first show with it as they adapt." REYES was still ever-present, as was rasterising (hell, they both still see a lot of sneaky use).

So I don't agree that we've had the workflow, as an actual tried and true industry practice, for many years. To me that's sweeping our dirty secrets under the rug.

But yes, I absolutely agree that PBR is a buzzword. Yes, it's overused and talked about as some sort of promised second coming. But at its heart PBR has had, and continues to have, a massive impact on the VFX industry. Walk into any facility right now and you'll see people pushing fewer AOVs, PBR all the way, conservation in comp, real-world lighting equivalents (down to blacks, reflectors and digital analogs of real-world lights). Everyone now has had experience with ACES, an extension of PBR into the DI world. And people are still making these changes.


Originally Posted by ACiD80: Also, PBR has nothing to do with biased vs unbiased. [...]


An unbiased renderer will eventually converge on a realistic result (given realistic scene values), while a biased renderer is selective about which interactions it simulates; even if it runs forever, it may not converge on a realistic result. The difference is one of consistency.

If you want PBR, you will embrace an unbiased renderer eventually. It goes with the mindset of PBR: knowing that something has these real-world values, so under a light with these real-world values it will eventually look like the real object. Could a biased renderer do the same? Absolutely. Yes. We might never be able to tell the image wasn't realistic. But if someone feels something's wrong with a biased render, they aren't only questioning scene elements; they are also questioning the bias of the renderer, the settings that led to the optimisation.
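
A toy numerical illustration of that point (the "biased" shortcut here is a deliberately crude clamp, chosen only to show the behaviour): both estimators get smoother with more samples, but only the unbiased one converges to the true answer.

[code]
import random

def f(x):
    return x * x  # true integral over [0, 1] is exactly 1/3

def unbiased(n):
    # Plain Monte Carlo: the expected value IS the true integral,
    # so more samples always means closer to 1/3.
    return sum(f(random.random()) for _ in range(n)) / n

def biased(n, clamp=0.25):
    # Clamping bright samples (like firefly clamping) kills variance,
    # but the estimate converges to ~0.167 instead of 1/3, forever.
    return sum(min(f(random.random()), clamp) for _ in range(n)) / n

print(unbiased(1_000_000), biased(1_000_000))
[/code]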

This is why AOVs are being reduced. People don't want to tweak the render through its elements anymore; they want to tweak the lighting, because that's how real images are made. Tweaking AOVs in comp breaks energy conservation, and we don't want that. The industry is changing from wanting optimised solutions (which is what AOVs are: an attempt to avoid re-rendering) to making sure solutions are provably realistic. If that's how you think, then of course you eventually move toward unbiased renderers. This is all about core handling of realism vs. faking realism.

I'm not bashing bias in renderers; I'm just pointing out that what I see as PBR is more of an 'all in' for realistic, consistent, physically accurate (and mathematically provable) results. And where you go with that seems obvious. And that's the mindset thing here: consistency, a fundamental base of realism.
__________________
Critical feedback example #62: "Well instead of the Stalinist purges and the divorce and the investigation ... it could be about losing a balloon."
 