Unreal Engine 5... Billions of polygons, real-time GI


#1

I guess in 2022, the question of Corona vs. Redshift vs. V-Ray or Arnold will be over… We’ll all be rendering stuff in a game engine:

16 billion polygons, straight from ZBrush and Megascans, with multi-bounce dynamic GI in real time, on a PlayStation 5 (which means AMD hardware support too!).


#2

Oh my actual God. That is beyond amazing. Makes the viewport in C4D look like I’m drawing it myself with crayons. It is, indeed, unreal.


#3

Unbelievable, thank you for sharing!


#4

Holy %$^&*&^% Crap.
Amazing.


#5

Jaw drops. Speechless…


#7

For this and other reasons I’m wondering if I should have invested in learning Unreal rather than Unity. Stunning.


#8

I wish there were an “Unreal Engine 5” render engine that could be used like Redshift, V-Ray and co. directly inside the 3D app.


#9

A lot of the gorgeous real-time look you get in game engines is due to light/AO baking. You can do the same thing in C4D as far as static baking goes, but C4D doesn’t have the equivalent of light probes.

In Unreal and Unity you spend quite a bit of time preparing and baking that data… and re-baking.
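
To illustrate what those probes actually do at runtime, here’s a rough, engine-agnostic sketch – every type and name in it is invented for illustration, this is not Unity’s or Unreal’s actual API. The point is that a dynamic object can’t read a lightmap (it wasn’t there at bake time), so it blends the baked probes around its position instead:

```cpp
// Rough sketch of light-probe sampling – NOT Unity or Unreal API;
// all types and names are invented for illustration.
struct Vec3 { float x, y, z; };

// Baked offline, like a lightmap, but at sparse points in space
// instead of per-texel on static surfaces.
struct Probe { Vec3 position; Vec3 irradiance; };

// Blend nearby baked probes at a dynamic object's position.
// Real engines use tetrahedral interpolation and spherical harmonics;
// inverse-squared-distance weighting just keeps the idea visible.
Vec3 SampleProbes(const Probe* probes, int count, Vec3 p)
{
    if (count == 0) return {0.0f, 0.0f, 0.0f};
    Vec3 sum{0.0f, 0.0f, 0.0f};
    float totalWeight = 0.0f;
    for (int i = 0; i < count; ++i) {
        float dx = p.x - probes[i].position.x;
        float dy = p.y - probes[i].position.y;
        float dz = p.z - probes[i].position.z;
        float w = 1.0f / (1e-4f + dx * dx + dy * dy + dz * dz);
        sum.x += probes[i].irradiance.x * w;
        sum.y += probes[i].irradiance.y * w;
        sum.z += probes[i].irradiance.z * w;
        totalWeight += w;
    }
    return {sum.x / totalWeight, sum.y / totalWeight, sum.z / totalWeight};
}
```

And all of that probe data is exactly the stuff you have to re-bake every time the lighting or the level changes.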


#10

This is one of the major changes in Unreal 5, though: using Lumen, you get fully dynamic real-time global illumination. No lightmaps and no baking necessary.
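
If you want to poke at it from code, here’s a minimal sketch of switching the renderer to Lumen via console variables. The cvar names and the “1 = Lumen” mapping are my reading of UE5’s renderer settings, so verify them against your engine build – in practice you’d normally just set this in Project Settings or DefaultEngine.ini:

```cpp
// Hedged sketch: switching UE5's GI and reflections to Lumen from
// game code. The cvar names and value mapping are assumptions based
// on UE5's renderer settings – check them for your engine version.
#include "HAL/IConsoleManager.h"

void EnableLumen()
{
    if (IConsoleVariable* GI = IConsoleManager::Get().FindConsoleVariable(
            TEXT("r.DynamicGlobalIlluminationMethod")))
    {
        GI->Set(1); // assumed: 1 selects Lumen for diffuse GI
    }
    if (IConsoleVariable* Refl = IConsoleManager::Get().FindConsoleVariable(
            TEXT("r.ReflectionMethod")))
    {
        Refl->Set(1); // assumed: 1 selects Lumen reflections
    }
}
```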


#11

These demos are incredible, no doubt, but they’re also highly reliant on Megascans data, which limits a lot of creativity to existing assets that can be repurposed.

What they’re showing here is nothing short of phenomenal, especially for the virtual production crews gaining traction in the market. But at the same time, it raises the question: what if I want to do something outside the realm of Megascans? And I think the answer there is simple: make the assets yourself.

So if I’m tasked with creating something unique that isn’t based on this data, the requirement to make those base assets would be a massive, massive undertaking.

I’m actually confused as to how Megascans were even used here. Simple displacement on newly constructed rock faces would seem to be the case, unless there’s a bigger library we’re not allowed to access yet.

Either way, it’s a game-changing tech demo, unlike anything ever seen before in real time.


#12

They use Megascans, sure, but the demo also has ZBrush sculpts straight from the app, plus other photogrammetry models. There’s nothing here that means you have to stick with Megascans – it’s just that Epic also owns that library now. The engine simply accepts huge-polycount assets, irrespective of where they’re from.

Also, I think they’re raw LOD0 scans – probably with no displacement (do game engines even support it?).

Can you imagine if Unreal 5 powered C4D’s viewport (sigh).


#13

I’m thinking the insane amounts of money Epic has been making with Fortnite have led to some serious R&D budgets.

I do have to say, after doing lighting in games for so many years, that I’m looking forward to real-time bounce lighting!


#14

We’re definitely going to test how much of our pipeline we can move to it. I wonder if there’s any leftover Fortnite dough to speed up the compiler? I hope?


#15

Btw, if you use Unity, make sure you get the Heretic guy they released on the Asset Store for free, or on GitHub.


#16

Yeah, that demo is really amazing. From what I understand, this is not using ray tracing. It would be interesting to see how, and if, this can be combined with ray-traced effects (reflections, caustics, …). I know this is really exciting, but I’m still not convinced it’s as flexible as a viewport needs to be, especially if you’re not just dealing with fixed geometry (yeah, I know animated meshes aren’t fixed in that sense, but you know what I mean) but with procedural stuff. I hope I’m wrong, though…
As an additional renderer with a close integration, this seems to be a no-brainer, though.