PBR is finally getting more and more attention. Unreal and Unity both support it now. But how do you render it with V-Ray in Max or Maya? Does V-Ray have a material or shader for it?
Thanks.
As far as I know V-Ray has had physically based materials and lights from the beginning.
So as long as you are using their shader (VRayMtl) and lights, you are more or less there.
I see VRayMtl has many map slots, but I am not sure whether PBR's Roughness & Metallic maps correspond to VRayMtl's Roughness & Glossiness. Will they give the same result, since they are not exactly the same shader?
I think the buzz came recently mostly because the game industry finally made a few moves.
Because a few game engines are starting to implement PBR algorithms in real time, people think it's a revolution in the CG industry…
It's not, but games will look nicer.
Pretty much all renderers around, except for RenderMan, the Modo renderer, or jokes like FurryBall, are physically based. They always were. Even Mental Ray, if you use the material you were supposed to use all along (mia_material, or Arch & Design in 3ds Max).
In V-Ray, it's simply VRayMtl. PBR has been here for years, and most experienced users have been using it for years. Even the usual "artistic control" argument does not cut it as an excuse for using a legacy workflow anymore. Whatever you were able to achieve 10 years ago can still be achieved with PBR; it just looks way better and requires way less effort to make it look good.
I don't know much about PBR, but as I understand it, it's a games thing: game engines are now using maps that render engines like V-Ray have been using for years. As for implementing the workflow in V-Ray, it comes down to setting up the shader with the same maps in their equivalent slots.
A PBR roughness map would be used in the reflection glossiness slot in V-Ray. I have no idea how PBR metallic maps work, though; it seems they just make things look more metallic, but that is achieved in totally different ways with V-Ray. Also, be aware that V-Ray's own "roughness" parameter is not the same thing as PBR roughness.
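For what it's worth, the usual trick when feeding a game roughness map into a glossiness slot is simply to invert it (glossiness ~ 1 - roughness). A minimal sketch with Pillow, assuming an 8-bit greyscale roughness texture; the filenames are just placeholders:

    # Convert a game-style roughness map into a glossiness map by inverting it
    # (glossiness ~= 1 - roughness). Assumes an 8-bit greyscale roughness texture;
    # filenames are placeholders.
    from PIL import Image, ImageOps

    roughness = Image.open("roughness.png").convert("L")  # load as greyscale
    glossiness = ImageOps.invert(roughness)               # 255 - pixel value
    glossiness.save("glossiness.png")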
I’m replying to this post since I’ve looked around online and not found much on the topic.
Yes, you can use game-style PBR textures in V-Ray and get a very similar look out of those same maps, assuming you do some maths on the input data and add extra shading nodes to the network.
I have a discussion and an initial proof of concept available at the link below. If anyone has any suggestions for improvements, please let me know, and if they work I may be able to integrate them. Also, I plan on adding a feature to my MmmmTools Maya package that will automate the creation of such shading networks.
-Game PBR Shaders In VRay-
https://docs.google.com/document/d/1z8BP1_3gvpsJLswUVZX8lWUyU0hpZ55Dd-LCHo26c_g/view
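Not part of the doc above, but here is a rough idea of what such an automated network-builder could look like in Maya with Python. Node and attribute names (VRayMtl, reflectionGlossiness, etc.) are from one V-Ray install and may differ in yours, so treat it as a sketch rather than a drop-in tool:

    # Sketch: wire game-style PBR maps into a VRayMtl in Maya.
    # Assumes the V-Ray for Maya plugin is loaded; attribute names such as
    # reflectionGlossiness may vary between V-Ray versions. Filenames are placeholders.
    import maya.cmds as cmds

    shader = cmds.shadingNode("VRayMtl", asShader=True, name="pbr_vray_SHD")

    # Albedo map straight into the diffuse color slot.
    albedo = cmds.shadingNode("file", asTexture=True, name="albedo_file")
    cmds.setAttr(albedo + ".fileTextureName", "albedo.png", type="string")
    cmds.connectAttr(albedo + ".outColor", shader + ".color")

    # Roughness map: invert it with a reverse node, then drive reflection glossiness.
    rough = cmds.shadingNode("file", asTexture=True, name="roughness_file")
    cmds.setAttr(rough + ".fileTextureName", "roughness.png", type="string")
    inv = cmds.shadingNode("reverse", asUtility=True, name="roughness_invert")
    cmds.connectAttr(rough + ".outColorR", inv + ".inputX")
    cmds.connectAttr(inv + ".outputX", shader + ".reflectionGlossiness")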
Hi Joe, thank you so much for your great work and for sharing it with us.
Do you happen to have something similar for 3ds Max, so I can play with it and give you some feedback?
Thanks again. :applause:
Use an OSL or GLSL shader with the matching Vray material. Or create one in ShaderFX and use the DirectX Material.
PBR is realtime/game technology so use realtime/game shaders that Vray supports.
-Eric
Could you please post a screenshot briefly showing what to do, such as where to get OSL or GLSL shader code files?
Deeply appreciate it.
Sorry, I don't have any experience working with realtime shaders. However, V-Ray has various ways to render them, including DirectX, OSL, and GLSL.
-Eric
I don't mean to necro this post, but it's for a good reason. There seems to be a large disconnect between CGI artists and game engine artists, and even though I'm not a master on this topic I will attempt some detailed translations. Even though they are both called Physically Based Rendering (or PBR for short), they take a bit of a different approach. Physically based rendering is nothing new to CGI veterans, but thanks to improving GPUs it is catching more attention in the game industry. So please don't be mad at the new kids on the block; just be happy that PBR can be used in real time both in games and in CGI films. Heck, Disney used it in Tangled to get faster render times, and the Unreal 4 Engine was even used to render parts of Star Wars: Rogue One in real time, because game engines now use PBR in their pipeline. But I digress.
In the CGI world, since they have the luxury of taking the time to fully render materials in a scene based on actual physical properties, they can and do. In this world renderings can take hours or days, as most of you know, and have an astoundingly accurate lighting model, depending on your knowledge of the render engine and how long you are willing to wait. V-Ray uses the Anders Langlands shading approach for some of its materials (i.e. alSurface). It's a heavy material but accurate, and this is what CGI artists want. In the game industry it's all about real time and realism, and you can't have one without the other coming at a cost.
The initial question is asking about using GGX PBR shaders that utilize the Cook-Torrance BRDF, which acts on a microfacet model, as opposed to the old Blinn-Phong/Lambert method of handling shading. In the old days most engines used Blinn-Phong, after upgrading from the even older Gouraud method, to handle shiny surfaces. Lambert was used to handle the diffuse of the actual texture blended in under the shine. These methods required a Diffuse and a Specular map to control how shiny the model was, plus a grayscale Height map (later a Normal map) for light-reactive detail. Since then, Blinn-Phong has aged and people demanded more, so graphics programmers moved to the more demanding Cook-Torrance lighting method for handling specularity. Because of this, a different set of textures is required to achieve the same effect. Artists now need an Albedo, a Metallic, and a Roughness map to interact with a GGX shader, along with a Normal map of some sort.
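If you are curious what "microfacet" means in practice, the core of it is the GGX normal distribution term that Cook-Torrance style shaders evaluate. A small Python sketch of the standard Trowbridge-Reitz/GGX form, using the common alpha = roughness² remapping (an assumption; not every engine remaps this way):

    # GGX (Trowbridge-Reitz) normal distribution term used by Cook-Torrance
    # style microfacet BRDFs. n_dot_h is the cosine between the surface normal
    # and the half vector; roughness is the game-style [0, 1] value.
    import math

    def ggx_distribution(n_dot_h, roughness):
        alpha = roughness * roughness          # common roughness-to-alpha remapping
        a2 = alpha * alpha
        denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
        return a2 / (math.pi * denom * denom)

    # Rougher surfaces spread the highlight energy over a wider, dimmer lobe:
    print(ggx_distribution(1.0, 0.1))   # tight, bright highlight peak
    print(ggx_distribution(1.0, 0.8))   # broad, dim highlight peak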
Since GGX has caught on heavily in the game design world, people have started writing shaders that work with render engines like V-Ray so they can get more accurate representations.
Here is a list of GGX based shaders for popular render engines.
V-Ray: http://www.shlyaev.com/rnd/37-cpp-category/54-ggx
RenderMan: https://renderman.pixar.com/view/cook-torrance-surface-shader-code
Modo( requires a little work ): https://www.youtube.com/watch?v=EfjyqZab9l4
So to answer the initial question…
Based on this video I gather that you can work backwards to bake your textures to be compatible with the new GGX BRDF shaders most game engines have adopted.
In the video they use a standard VRayMtl and, under the BRDF section, set it from Ward to Microfacet GTR (GGX).
Reflect = Diffuse Texture
RGlossiness = Roughness Texture (with inverted output)
Metallic texture gets used as a blend mask on a VRayBlendMtl masking the original VRayMtl
Honestly it might be easier to watch the video and work backwards. Through some trial and error it is possible to bake for a GGX pipeline using V-Ray. Though today, it might be easier to bake with Substance Painter in the end. Hope this helps you and future search engine enthusiasts!
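Not from the video itself, but roughly what that VRayMtl setup could look like if you scripted it in 3ds Max through pymxs. The property names (texmap_diffuse, texmap_reflectionGlossiness, the Output rollout's invert flag) are assumptions from one V-Ray build, so check them with showProperties in the MAXScript listener before relying on this:

    # Rough pymxs sketch of the VRayMtl setup described above (3ds Max + V-Ray).
    # Property names are assumptions from one V-Ray build and may differ in yours.
    from pymxs import runtime as rt

    mtl = rt.VRayMtl()
    mtl.name = "pbr_base"

    # Base (dielectric) layer: albedo into the diffuse slot.
    mtl.texmap_diffuse = rt.Bitmaptexture(fileName="albedo.png")

    # Roughness map into reflection glossiness, inverted via the bitmap's
    # Output rollout (glossiness ~= 1 - roughness).
    rough = rt.Bitmaptexture(fileName="roughness.png")
    rough.output.invert = True
    mtl.texmap_reflectionGlossiness = rough

    # In the BRDF rollout, switch from Ward to Microfacet GTR (GGX); the exact
    # brdf_type enum value differs between V-Ray versions, so set it in the UI.

    # For the metallic layer described above, you would build a second VRayMtl
    # with the diffuse texture in its reflect slot and layer it over this one
    # through a VRayBlendMtl, using the metallic map as the blend mask.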
PBR - Physically Based Rendering
PBR Workflow - use maps instead of numerical values for your Diffuse, Rough, Spec, Fresnel & Normals
Using maps means the material will look the same across applications. It takes out the guesswork.
Redshift has had the option to set their uber material to use a PBR / metalness workflow for a year now. I honestly don't see the big fuss, unless you use Substance to make all your materials, of course. But when creating materials from scratch, it doesn't really matter as long as you don't fudge the IOR up.
Right, but they haven't been able to do it in real time until around 2014, when game engines started to adopt the Cook-Torrance method of shading, which is an approximation and not a true representation. Whether PBR workflows are new or not has nothing to do with the initial question. I was explaining why a PBR workflow has caught the eye of the game industry while trying to answer the initial question.