Is the future of Rendering Game Engines? Using Unreal/Unity as a primary DCC Tool

  03 March 2015

I'll post this here and walk away..
Unreal Engine 4 - Kite Trailer (Tech Demo) running on GTX Titan X
https://www.youtube.com/watch?v=w6EMc6eu3c8#t=62
__________________
LW FREE MODELS:FOR REAL Home Anatomy Thread
FXWARS
:Daily Sketch Forum:HCR Modeling
This message does not reflect the opinions of the US Government

 
  03 March 2015
Looks great. Game engines have really started reaching that critical mass level where they're good enough for many things now.

For the stylized, standard Pixar-esque animation look, or even other simplified art directions, I can see people using real-time engines for production, possibly for quick-turnaround productions.

For matching realism and for things that require high polygon counts, you're obviously going to need a software renderer.
 
  03 March 2015
Man, that looks great. Games won't need pre-rendered cutscenes to make them look super nice anymore.
 
  03 March 2015
Yeah, there are fewer reasons to have a CG-produced cutscene with the capabilities the new real-time engines have.

Games already have fewer pre-rendered cutscenes these days. The main reason to have them versus doing them in-engine is to hide loading times.

A lot of the pre-rendered cutscenes are slightly higher-quality versions of the game engine's output, so they look seamless with the real-time graphics... and they're cheaper to produce.

Last edited by sentry66 : 03 March 2015 at 07:31 AM.
 
  03 March 2015
Originally Posted by sentry66: For matching realism and for things that require high polygon counts, you're obviously going to need a software renderer.

But how do you respond to clients that will say something like, "the other company could do all this in real time, and it looked just fine to me?"
 
  03 March 2015
Originally Posted by Sthu: But how do you respond to clients that will say something like, "the other company could do all this in real time, and it looked just fine to me?"


Or for 10 bucks? That's not an argument. For some projects it might work, for some not. I don't know much about the engine, but the lighting is quite ordinary, just a regular skylight outdoors. It looks just like in-game footage. The animation and direction are good, but the lighting and shading aren't up to today's standards. It might become better in five years, though.
 
  03 March 2015
https://www.youtube.com/watch?v=wdZOSD1eT6k
I find The Order: 1886 even more impressive. After all, it's an actual game :)
 
  03 March 2015
Originally Posted by ViCoX: https://www.youtube.com/watch?v=wdZOSD1eT6k
I find The Order: 1886 even more impressive. After all, it's an actual game :)


It's artistically great, but technically it's 1990s scanline rendering.
 
  03 March 2015
What is possible nowadays with real-time is stunning, for sure. But there will always be a need for both offline rendering and real-time rendering. Those are simply two different goals, which lead to two different solutions.

Real-time content is optimized for frame rate. The goal here is 60 fps, not the very best realistic look. You have to fake a lot: GI, prebaked lightmaps, normal maps, etc.

Stills and movies aim for realism. Time does not matter so much here. You can work with huge polygon counts, use time-consuming render calculations, and so on.
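The baked-versus-brute-force contrast can be sketched in a few lines. This is a toy illustration, not engine code: the scene, the lightmap value, and the sampling scheme are all made up for the example.

```python
import math
import random

# Hypothetical scene: one diffuse surface point lit by a uniform sky dome.
SKY_RADIANCE = 1.0   # radiance of the sky in every direction
ALBEDO = 0.5         # diffuse reflectance of the surface

# Real-time approach: irradiance was precomputed offline and baked into a
# lightmap; at runtime the shader performs a single constant-cost lookup.
lightmap = {(0, 0): 0.42}  # texel -> baked irradiance (made-up value)

def shade_realtime(texel):
    return (ALBEDO / math.pi) * lightmap[texel]  # Lambertian BRDF = albedo/pi

def shade_offline(samples=20000):
    # Offline approach: Monte Carlo estimate of irradiance by sampling the
    # hemisphere per shading point; cost scales with sample count, so does quality.
    total = 0.0
    for _ in range(samples):
        cos_theta = random.random()  # uniform hemisphere sampling: cos(theta) ~ U[0,1)
        total += SKY_RADIANCE * cos_theta
    irradiance = 2.0 * math.pi * total / samples  # estimator with pdf = 1/(2*pi)
    return (ALBEDO / math.pi) * irradiance
```

The real-time path is one table lookup per pixel regardless of scene complexity; the offline path converges to the exact answer but pays per sample, which is the whole trade-off in a nutshell.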
__________________
Free Gamegraphics, Freeware Games http://www.reinerstilesets.de
Die deutsche 3D Community: http://www.3d-ring.de
 
  03 March 2015
Originally Posted by Sthu: But how do you respond to clients that will say something like, "the other company could do all this in real time, and it looked just fine to me?"


A workflow that allows you to preview in real time and then render in high quality? I'd show them the real-time version first; if they say it looks good, you've saved yourself further work. If they can spot the difference, show them the high-quality version and explain the time requirements for each.
 
  03 March 2015
Originally Posted by mister3d: It's artistically great, but technically it's 1990s scanline rendering.


"it extensively uses global illumination and physically based rendering... Combine that with the dynamic lighting, which has been rasterized twice for depth pre-pass, and a custom ambient occlusion solution which rivals HBAO, delivering soft shadows without any dithering"

I don't remember seeing anything like that in 1990 that was not ray-traced, and definitely not on the desktop. Trying to think of a single example.
 
  03 March 2015
Originally Posted by moogaloonie: "it extensively uses global illumination and physically based rendering... Combine that with the dynamic lighting, which has been rasterized twice for depth pre-pass, and a custom ambient occlusion solution which rivals HBAO, delivering soft shadows without any dithering"

I don't remember seeing anything like that in 1990 that was not ray-traced, and definitely not on the desktop. Trying to think of a single example.


It doesn't say whether it could be pre-baked, which it probably is. But if it's UE4, it uses some clever techniques. It's just obvious it doesn't use raytracing at all: all diffuse and shadows. AO also helps to hide the obvious lack of global illumination.
That said, lighting has definitely made a great step forward. I personally enjoyed the MGS 5 prologue. It's just still so far away from offline rendering.
I perhaps should have said the 1990-2000 timeframe, and the end of it, when raytracing techniques were still too expensive to use even for cinematics.
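The ambient occlusion term discussed here is, at its core, a hemisphere visibility estimate. Below is a minimal offline-style Monte Carlo sketch (not HBAO, which approximates this in screen space), with a made-up occluder standing in for real scene geometry:

```python
import random

def sample_hemisphere():
    # crude rejection sampling of a unit direction in the upper hemisphere (z >= 0)
    while True:
        x = random.uniform(-1.0, 1.0)
        y = random.uniform(-1.0, 1.0)
        z = random.uniform(0.0, 1.0)
        d = (x * x + y * y + z * z) ** 0.5
        if 0.0 < d <= 1.0:
            return (x / d, y / d, z / d)

def ambient_occlusion(occluded, samples=5000):
    # visibility term: fraction of sample rays that escape without hitting geometry
    hits = sum(1 for _ in range(samples) if occluded(sample_hemisphere()))
    return 1.0 - hits / samples

# hypothetical occluder: a wall blocking every direction with positive x,
# so roughly half of the hemisphere is blocked and AO comes out near 0.5
wall = lambda direction: direction[0] > 0.0
```

Screen-space techniques like HBAO trade this ray casting for cheap depth-buffer lookups, which is exactly the kind of plausible fake real-time engines rely on.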
 
  03 March 2015
Originally Posted by Tiles: What is possible nowadays with real-time is stunning, for sure. But there will always be a need for both offline rendering and real-time rendering. Those are simply two different goals, which lead to two different solutions.

Real-time content is optimized for frame rate. The goal here is 60 fps, not the very best realistic look. You have to fake a lot: GI, prebaked lightmaps, normal maps, etc.

Stills and movies aim for realism. Time does not matter so much here. You can work with huge polygon counts, use time-consuming render calculations, and so on.


The Kite demo was running on a Titan X; they were shooting for quality first and performance after. It wasn't designed for the average PC to run at 60 fps.
__________________
The Z-Axis
 
  03 March 2015
There are doubts that, for "serious" rendering, game engines will become the norm of the future.

Basically, for something like Marmoset or Unreal, I believe there is no full raytracing implementation save for reflections. Everything else is based on DirectX or OpenGL. This is sort of viewed as a "lower benchmark" against true full-calculation rendering.

So I doubt that's going to happen. What has happened is a shift of resource usage to capitalize on GPUs. But this does not mean an active chase towards the kind of OpenGL/DirectX solutions used by games to "fake" things instead of using actual calculations (which will always be the goal of full renderers).

That said, there will be elements that make the transition. For example, the Blender Foundation just announced that a PBR mode will be implemented in Blender's viewport, which may allow more detailed real-time rendering of models in the application's viewport and lead to a faster lighting model.

There are also developments to enable PBR for finished renders.

PBR, of course, got its start in games. Or at least, it was in video games where the rendering method became popular.
__________________
"Your most creative work is pre-production, once the film is in production, demands on time force you to produce rather than create."
My ArtStation
 
  03 March 2015
Originally Posted by CGIPadawan: Basically, for something like Marmoset or Unreal, I believe there is no full raytracing implementation save for reflections. Everything else is based on DirectX or OpenGL. This is sort of viewed as a "lower benchmark" against true full-calculation rendering.

Hmm, no, it's not.
The fact that it uses OpenGL or DirectX for certain things doesn't mean everything the engine does is predicated on those libraries; that's a pretty gross misunderstanding of what the libraries do, what the tech does, and how those things mesh and are implemented.

Quote: So I doubt that's going to happen. What has happened is a shift of resource usage to capitalize on GPUs. But this does not mean an active chase towards the kind of OpenGL/DirectX solutions used by games to "fake" things instead of using actual calculations (which will always be the goal of full renderers).

You haven't really seen or used much of the tools discussed here, have you?

Quote: That said, there will be elements that make the transition. For example, the Blender Foundation just announced that a PBR mode will be implemented in Blender's viewport, which may allow more detailed real-time rendering of models in the application's viewport and lead to a faster lighting model.

That is what the Unreal rendering engine already does, with the Disney approximation model for shader description.
There is also nothing in the new Unreal PBR/PPS engine, or Unity's counterpart, that forces you to run it at 60 fps or anything like that. They can be used offline if you so wish.

Quote: PBR, of course, got its start in games. Or at least, it was in video games where the rendering method became popular.

It certainly didn't. None of the acronyms people like to toss around (PBR, IBL, PPS) got their start in games, or even early adoption there.
Debevec and others were doing those things offline a full decade before games even caught a whiff of them, and film was adopting them almost as fast as they came out of the oven.
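For reference, the Disney/UE4 shading model being discussed is built from a handful of analytic terms. Here is a minimal sketch of two standard ones, Schlick's Fresnel approximation and the GGX normal distribution; the alpha = roughness^2 remapping follows UE4's published approach, and the exact constants are illustrative:

```python
import math

def fresnel_schlick(cos_theta, f0):
    # Schlick's approximation to Fresnel reflectance: f0 at normal incidence
    # (cos_theta = 1), rising to 1.0 at grazing angles (cos_theta = 0)
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def ggx_ndf(cos_h, roughness):
    # GGX / Trowbridge-Reitz normal distribution term; a2 = alpha^2 with the
    # alpha = roughness^2 remapping popularized by the UE4 shading model
    a2 = roughness ** 4
    denom = cos_h * cos_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)
```

The point is that these are the same analytic terms whether they run in a 60 Hz fragment shader or an offline renderer, which is why the model ports between the two so cleanly.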

The title of the thread is a silly question to begin with. What is happening is, quite simply, a mix of convergence in technology, and the world catching up to heterogeneous hardware paths. Rendering is rendering these days, and a lot of efforts are going towards unifying things (with varying degrees of success) in terms of models and descriptions. The main issue is these efforts started late, and there is still a big gap between a lot of players, but something is better than nothing, and it will eventually smooth itself out.
Eventually things will converge and scale and the occasional specialization will determine what product you use on what platforms.

None of those problems have much to do with the games vs film approach people have when they look at this thing (although there is obviously such a gap because needs are different, but certainly not in the form or for the reasons perceived).
Common models, universal descriptors, ideal hardware path abstraction etc. are the current set of problems preventing full convergence. The whole games vs film thing or CPU vs GPU is being transcended and is only going to remain incidental for a few more years.

The current distinction, if you really want to draw one, is more across the shaders than the engine, and things like post-processing and how it meshes with deferred rendering shaders, which is something offline rendering doesn't care much about, while it's pretty important at 30/60 Hz.
__________________
Come, Join the Cult http://www.cultofrig.com - Rigging from First Principles

Last edited by ThE_JacO : 03 March 2015 at 02:09 AM.
 
CGSociety
Society of Digital Artists
www.cgsociety.org
