Is the future of Rendering Game Engines?


#1

I’ll post this here and walk away…
Unreal Engine 4 - Kite Trailer (Tech Demo) running on GTX Titan X
https://www.youtube.com/watch?v=w6EMc6eu3c8#t=62


#2

Looks great. Game engines have really reached that critical-mass point where they're good enough for many things now.

For the stylized, standard Pixar-esque animation look, or even other simplified art directions, I can see people using realtime engines for production, possibly for quick-turnaround productions.

For matching realism and things that require high polygon counts, you're obviously still going to need a software renderer.


#3

Man, that looks great. Games won't need pre-rendered cut scenes to make them look super nice anymore.


#4

Yeah, there are fewer reasons to have a CG-produced cut scene given the capabilities of the new real-time engines.

Games already have fewer pre-rendered cut scenes these days. The main reason to use them instead of doing everything in-engine is to hide loading times.

A lot of the pre-rendered cut scenes are just slightly higher-quality versions of the in-engine look, so they blend seamlessly with the real-time graphics… and they're cheaper to produce.


#5

But how do you respond to clients who say something like, "the other company could do all this in real time, and it looked just fine to me"?


#6

Or for 10 bucks? That's not an argument. For some projects it might work, for others not. I don't know the engine well, but the lighting is quite ordinary, just a regular outdoor skylight. It looks just like in-game footage. The animation and direction are good, but not the lighting or shading by today's standards. It might get better in five years, though.


#7

https://www.youtube.com/watch?v=wdZOSD1eT6k
I find The Order: 1886 even more impressive. After all, it's an actual game : )


#8

It's artistically great, but technically it's scanline rendering circa 1990.


#9

What is possible nowadays with realtime is stunning, for sure. But there will always be a need for both offline rendering and realtime rendering. They are simply two different goals, which end in two different solutions.

Realtime content is optimized for frame rate. The goal there is 60 fps, not the very best realistic look. You have to fake a lot: GI, prebaked lightmaps, normal maps, etc.

Stills and movies aim for realism. Time doesn't matter as much there: you can work with huge polygon counts, use time-consuming render calculations, and so on.
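To make that contrast concrete, here's a minimal sketch of the lightmap idea (a toy CPU-side illustration; the function names and the one-light scene are my own, not from any particular engine). The expensive lighting sum runs once, offline, per texel; the runtime shade reduces to a single multiply:

```python
def lambert(normal, light_dir, light_color, intensity):
    """Direct diffuse contribution of one light (clamped N.L)."""
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(c * intensity * n_dot_l for c in light_color)

def bake_lightmap(texels, lights):
    """Offline pass: sum every light per texel and store the result.
    A real baker would also gather indirect bounces (GI) here."""
    lightmap = []
    for normal in texels:
        total = (0.0, 0.0, 0.0)
        for light_dir, color, intensity in lights:
            contrib = lambert(normal, light_dir, color, intensity)
            total = tuple(t + c for t, c in zip(total, contrib))
        lightmap.append(total)
    return lightmap

def shade_runtime(albedo, baked):
    """Runtime pass: one multiply per channel, no light loop at all."""
    return tuple(a * b for a, b in zip(albedo, baked))

# Toy scene: two texels facing up, one half-intensity light from above.
texels = [(0.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
lights = [((0.0, 1.0, 0.0), (1.0, 1.0, 1.0), 0.5)]
lm = bake_lightmap(texels, lights)
print(shade_runtime((0.8, 0.2, 0.2), lm[0]))  # (0.4, 0.1, 0.1)
```

However many lights (or bounces) you bake in, the runtime cost stays constant, which is exactly why it's a "fake": move anything in the scene and the stored answer is stale.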


#10

A workflow that allows you to preview in real time and then render in high quality? I'd show them the real-time version first; if they say it looks good, you've saved yourself further work. If they can spot the difference, show them the high-quality version and explain the time requirements for each.


#11

“it extensively uses global illumination and physically based rendering… Combine that with the dynamic lighting, which has been rasterized twice for depth pre-pass, and a custom ambient occlusion solution which rivals HBAO, delivering soft shadows without any dithering”

I don’t remember seeing anything like that in 1990 that was not ray-traced, and definitely not on the desktop. Trying to think of a single example.


#12

It doesn't say whether it's pre-baked, which it probably is. But if it's UE4, it uses some clever techniques. It's just obvious it doesn't use raytracing at all: it's all diffuse and shadows. AO also helps mask the obvious lack of global illumination.
No, lighting has definitely made a great step forward. I personally enjoyed the MGS 5 prologue. It's just still so far from offline rendering.
I should perhaps have said the 1990-2000 timeframe, toward the end of it, when raytracing techniques were still too expensive to use even for cinematics.


#13

For the Kite demo, it was running on the Titan X; they were shooting for quality first and performance after. It wasn't designed for the average PC to run at 60 fps.


#14

There are doubts that game engines will become the norm of the future for "serious" rendering.

For example, in something like Marmoset or Unreal, I believe there is no full raytracing implementation except for reflections. Everything else is based on DirectX or OpenGL, which is sort of viewed as a "lower benchmark" against true full-calculation rendering.

So, I doubt that's going to happen. What has happened is a shift of resource usage to capitalize on GPUs. But this does not mean an active chase toward the kind of OpenGL/DirectX solutions games use to "fake" things instead of running actual calculations (which will always be the goal of full renderers).

That said, there will be elements that make the transition. For example, the Blender Foundation just announced that a PBR mode will be implemented in Blender's viewport, which may allow more detailed, real-time rendering of models in the application's viewport and lead to a faster lighting model.

There are also developments to enable PBR for finished renders.

PBR, of course, got its start in games. Or at least, it was in video games where the rendering method became popular.


#15

Hmm, no, it's not.
The fact that it uses OGL or DX for certain things doesn't mean everything the engine does is predicated on those libraries; this is a pretty gross misunderstanding of what the libraries do, what the tech does, and how those things mesh and are implemented.

> So, I doubt that's going to happen. What has happened is a shift of resource usage to capitalize on GPUs. But this does not mean an active chase toward the kind of OpenGL/DirectX solutions games use to "fake" things instead of running actual calculations (which will always be the goal of full renderers).

You haven't really seen or used many of the tools discussed here, have you? :slight_smile:

> That said, there will be elements that make the transition. For example, the Blender Foundation just announced that a PBR mode will be implemented in Blender's viewport, which may allow more detailed, real-time rendering of models in the application's viewport and lead to a faster lighting model.

That is what the Unreal rendering engine already does, with the Disney approximation model for shader description.
There is also nothing in the new Unreal PBR/PPS engine, or Unity's counter-offer, that forces you to run it at 60 fps or anything like that. They can be used offline if you so wish.
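For reference, the "Disney approximation" mentioned here is usually taken to mean the metalness parameterization (Burley's Disney model, adapted by Karis for UE4) paired with a GGX specular lobe. A rough sketch of just those isolated terms (my own function names; a toy illustration, not a full shader):

```python
import math

def metalness_workflow(base_color, metalness):
    """Split one artist-authored color into diffuse albedo and specular F0.
    Dielectrics get a fixed ~4% reflectance; metals tint the specular with
    base_color and contribute no diffuse at all."""
    dielectric_f0 = 0.04
    f0 = tuple(dielectric_f0 * (1.0 - metalness) + c * metalness
               for c in base_color)
    diffuse = tuple(c * (1.0 - metalness) for c in base_color)
    return diffuse, f0

def ggx_ndf(n_dot_h, roughness):
    """GGX/Trowbridge-Reitz normal distribution, with alpha = roughness^2."""
    a2 = (roughness * roughness) ** 2
    d = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * d * d)

def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation to Fresnel reflectance."""
    w = (1.0 - cos_theta) ** 5
    return tuple(f + (1.0 - f) * w for f in f0)

# A pure metal keeps no diffuse; its F0 becomes the base color itself.
diffuse, f0 = metalness_workflow((1.0, 0.85, 0.57), metalness=1.0)
print(diffuse)  # (0.0, 0.0, 0.0)
print(f0)       # (1.0, 0.85, 0.57)
```

Nothing in these equations assumes 60 fps; the same lobes evaluate identically whether a GPU runs them per frame or an offline renderer integrates over them for minutes.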

> PBR, of course, got its start in games. Or at least, it was in video games where the rendering method became popular.

It certainly didn't. None of the acronyms people like to toss around (PBR, IBL, PPS) got their start in games, or even saw early adoption there.
Debevec and others were doing those things offline a full decade before games even caught a whiff of them, and film was adopting them almost as fast as they came out of the oven.

The title of the thread is a silly question to begin with. What is happening is, quite simply, a mix of convergence in technology, and the world catching up to heterogeneous hardware paths. Rendering is rendering these days, and a lot of efforts are going towards unifying things (with varying degrees of success) in terms of models and descriptions. The main issue is these efforts started late, and there is still a big gap between a lot of players, but something is better than nothing, and it will eventually smooth itself out.
Eventually things will converge and scale and the occasional specialization will determine what product you use on what platforms.

None of those problems have much to do with the games vs film approach people have when they look at this thing (although there is obviously such a gap because needs are different, but certainly not in the form or for the reasons perceived).
Common models, universal descriptors, ideal hardware path abstraction etc. are the current set of problems preventing full convergence. The whole games vs film thing or CPU vs GPU is being transcended and is only going to remain incidental for a few more years.

The current distinction, if you really want to draw one, runs across the shaders more than the engine: things like post-processing and how it meshes into deferred rendering shaders, which is something offline rendering doesn't care much about, while it's pretty important at 30/60 Hz.
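To illustrate the deferred split referred to above (a toy CPU-side sketch with names of my own invention, not engine code): geometry is rasterized once into a G-buffer of per-pixel attributes, and lighting then runs as a separate pass over those attributes, so its cost scales with pixels times lights rather than with scene geometry:

```python
def geometry_pass(fragments):
    """Pass 1: write albedo and normal per pixel (the G-buffer).
    No lighting happens here."""
    return {pix: (albedo, normal) for pix, albedo, normal in fragments}

def lighting_pass(gbuffer, lights):
    """Pass 2: light every stored pixel. Cost = pixels x lights,
    independent of how much geometry produced the G-buffer."""
    image = {}
    for pix, (albedo, normal) in gbuffer.items():
        lit = 0.0
        for light_dir, intensity in lights:
            n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
            lit += intensity * n_dot_l
        image[pix] = tuple(a * lit for a in albedo)
    return image

# One red, upward-facing pixel; one white light from straight above.
gbuf = geometry_pass([((0, 0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))])
img = lighting_pass(gbuf, [((0.0, 1.0, 0.0), 1.0)])
print(img[(0, 0)])  # (1.0, 0.0, 0.0)
```

The catch, and the reason it matters to shading in particular, is that anything not stored in the G-buffer (transparency, exotic BRDF inputs) can't be lit in pass 2, which is one of the tricks a 30/60 Hz budget forces and an offline renderer simply doesn't need.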


#16

While I have an interest in all the technology and acronyms and try to learn them, I will readily admit I am “more driver than mechanic”. :slight_smile:

That said, the feedback I quoted above was passed on to me by naysayers of game engines (when I was actually asking why we can't yet have Marmoset-style "press F12 to take a beauty-render snapshot" renders in Blender).

I guess your point is “all roads lead to Rome”. Regardless of where it comes from, if it makes a pretty picture, we all “get there eventually”?

As always, it is nice to hear from one more knowledgeable on the matter. :slight_smile:


#17

It seems this demo has fully dynamic lighting, using the following tech:

- [Ray Traced Distance Field Shadows](https://docs.unrealengine.com/latest/INT/Engine/Rendering/LightingAndShadows/RayTracedDistanceFieldShadowing/index.html) for the Sun *1
- [Distance Field Ambient Occlusion](https://docs.unrealengine.com/latest/INT/Engine/Rendering/LightingAndShadows/DistanceFieldAmbientOcclusion/index.html) for the Skylight *1 (medium-scale AO which doesn't suffer from screen-space artifacts)
- Heightfield Global Illumination *1
       
So apart from the distance field calculations for the meshes, there was no baking involved for the lighting. They even removed the lighting from the photogrammetry scans through delighting, to get correct albedo textures without shadows.

I recommend watching the Making Of, which is even more impressive than the short, in my opinion:

https://youtu.be/clakekAHQx0

Notable times in the video: [0:53](https://www.youtube.com/watch?v=clakekAHQx0&feature=player_detailpage#t=52); [17:51](https://www.youtube.com/watch?v=clakekAHQx0&feature=player_detailpage#t=1064); [20:28](https://www.youtube.com/watch?v=clakekAHQx0&feature=player_detailpage#t=1227); [35:17](https://www.youtube.com/watch?v=clakekAHQx0&feature=player_detailpage#t=2117)

And in the Unreal Engine forums, Tim Sweeney himself stepped in and said the following:

Source: https://forums.unrealengine.com/showthread.php?60653-Large-Open-World-Kid-Kite-Demo&p=238888&viewfull=1#post238888

**\*1:**

Source: https://forums.unrealengine.com/showthread.php?60653-Large-Open-World-Kid-Kite-Demo&p=241581&viewfull=1#post241581
Note: "DF Shadows from 100m - 1200m" means that from the near clipping plane to 100m, shadows are shadow-mapped (probably cascaded shadow maps from the same directional light source).
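The distance-field shadows in that list work by sphere tracing a signed distance field toward the light; the well-known soft-shadow variant (popularized by Inigo Quilez) tracks the closest-approach distance relative to how far the ray has travelled to get a cheap penumbra. A toy sketch, with a single sphere occluder of my own choosing standing in for the real per-mesh distance fields:

```python
import math

def sphere_sdf(p, center=(0.0, 3.0, 0.0), radius=1.0):
    """Signed distance from point p to a single sphere occluder."""
    d = [a - b for a, b in zip(p, center)]
    return math.sqrt(sum(x * x for x in d)) - radius

def soft_shadow(origin, light_dir, sdf, k=8.0, t_max=20.0):
    """March toward the light. The smallest ratio of miss distance to
    distance travelled approximates the penumbra (Quilez's trick)."""
    res, t = 1.0, 0.02
    while t < t_max:
        p = tuple(o + d * t for o, d in zip(origin, light_dir))
        dist = sdf(p)
        if dist < 1e-4:
            return 0.0              # ray hits the occluder: full shadow
        res = min(res, k * dist / t)
        t += dist                   # sphere tracing: always a safe step
    return res                      # 1.0 = fully lit, <1.0 = penumbra

up = (0.0, 1.0, 0.0)
print(soft_shadow((0.0, 0.0, 0.0), up, sphere_sdf))  # 0.0, directly below
print(soft_shadow((5.0, 0.0, 0.0), up, sphere_sdf))  # 1.0, well clear
```

Because the shadow term comes from cone-like coverage of the distance field rather than a depth-map comparison, it needs no dithered filtering and no baking, which matches what the Kite demo claims for its sun shadows.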


#18

Are the naysayers competent when it comes to rendering tech? Personally, I would suggest that, as a driver, you don't listen to naysayers when you see something inspiring; instead, you try to drive.
With Unreal being massively supported, entirely free (including royalty-free for pre-rendered content), and actually quite approachable, there is no reason not to if you have the time.

> I guess your point is "all roads lead to Rome". Regardless of where it comes from, if it makes a pretty picture, we all "get there eventually"?

Before, it was a world of tricks held together with duct tape. When the world moves toward a physical model, unification and convergence of efforts and results is no longer an impossibility. That's happening.
Games and film have moved a long way from their distant and disjointed relationship and now take inspiration from each other on a daily basis, which means a lot more of what's done in one ends up available in the other.

The difference isn't game vs film any longer; it's simply how long you're willing to wait for a frame. If it's one 30th of a second on a PS4, then you still have to cling to some tricks, especially with shading (deferred vs inline and instant use of the sampling), but that doesn't mean the underlying tech for everything else remains as removed from your needs as it used to be.

Look at the content creation demo linked above, and get the latest UE. The naysayers are probably the same people who, in 2003, were telling me that sculpting and retopo would never catch on, or the same who, ten years before that, thought consoles would always use cartridges.


#19

To be fair, I think these guys weren’t even the true devs at Blender Foundation. I’d imagine the real devs and Ton Roosendaal have always had something closer to your line of thinking.

Personally, as a "driver", I try to take in everything I think is useful to get the quickest time around a bend.

Anything pretty from as far back as Unreal 3 is useful to me, and any image I see in UE4, Unity 5, etc. I consider inspiring.

I try to figure out how things like Image Based Rendering, for example, are done, and I try to "hackjob" them into our projects anyway. :slight_smile:

Never tell a driver he can’t take that kerb or cut that corner. :smiley:
So you might say I only give them limited credence. But I do admit I’m “no mechanic”. :slight_smile:

That all definitely sounds delightful! :slight_smile: I had been yearning for a time when everything “just renders as it should and does so quickly” everywhere.

I think part of my confusion really is wondering whether I should wait for Blender Cycles to reach this point, or use an engine like UE4 or Unity 5 as early as now. I'd already felt that the "convergence", as you call it, was going to happen at some point, and I'd recently gotten into a few arguments with people about it, but I never had the technical knowledge to back up my feeling. :slight_smile:

If it’s about getting there earlier, I have been thinking about Unity 5 precisely for that role. :slight_smile:


#20

If Cycles or whatever catches up to them, use it, switch to it, or do so on the side; until then, why not use what already does the stuff you need?
They aren't hard pieces of kit to learn, and an understanding of the physically based model, approximated the way UE does it (I don't know about Unity, tbh), won't hurt when you move to other software. Sure, Disney's "metalness" might be dropped for something else, and crushed-edge transmission/suppression tricks might differ, but the general feeling for, and understanding of, lighting in an engine like that will carry across just fine.

I would understand waiting for the free alternative if the current offers were expensive, but we’re talking about free stuff and with no legal obligations of any sort for your use case.

I would also suggest, when you relay someone else's opinion from another forum like you did in your first post, that you mention it ahead of the content, and, if you feel it's authoritative, also link the source. It came across as pretty peremptory and personal instead of a second-hand pass-it-on, which doesn't work out great unless you absolutely trust the competence of those it came from, especially the whole DX/OGL blurb, which is usually an index of a gamer-level understanding rather than a developer's, or even a competent driver's.