A thought on future Viewport Development

  02 February 2018
A thought on future Viewport Development

I know, I know... we've talked this topic to death; but a thought occurred to me the other day.

Many of us dream of using VR or AR to work within our scene: to tilt our heads to look around our models, or to stand up and physically walk around a scene in our office. There are many intriguing possibilities.

One extremely important aspect of both VR & AR is the need for very high frame rates, as low rates lead to confusion, motion sickness, and just irritating lag. Currently, frame rates in the C4D viewport aren't ideal - but no matter how fast the viewport gets, we'll always find a way to throw a heavy scene at it that brings frame rates down to single digits (or fractions thereof), which is clearly unacceptable for VR.

Since no one wants to have their entire VR screen brought to a halt due to scene processing, viewport technology will have to be developed that decouples the processing of the scene's camera from the processing of the rest of the scene's contents - allowing the viewer to tumble around the viewport fluidly at 100+ FPS while the rest of the scene's contents are animated/updated at a lower frame rate. This would be a huge help for usability even if you're not using VR, as the user's perception of speed would be significantly improved.
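Roughly, what I have in mind looks like the loop below - just a sketch in plain Python, with hypothetical scene and camera objects standing in for whatever the viewport actually uses (this is not C4D's API):

[CODE]
import time

CAMERA_FPS = 90        # target rate for head tracking / camera moves
SCENE_EVAL_FPS = 15    # rate at which the heavy scene contents get re-evaluated

def decoupled_loop(scene, camera, run_seconds=5.0):
    """Sketch: camera updates every display frame, scene evaluation at a lower rate.
    `scene` and `camera` are placeholder objects, not real C4D classes."""
    last_eval = 0.0
    start = time.perf_counter()
    while time.perf_counter() - start < run_seconds:
        frame_start = time.perf_counter()

        camera.update_from_tracking()      # cheap: read the HMD / mouse pose

        # Only pay for the expensive evaluation (rigs, generators, deformers)
        # at the lower rate; in between, keep drawing the last cached state.
        if frame_start - last_eval >= 1.0 / SCENE_EVAL_FPS:
            scene.evaluate()
            last_eval = frame_start

        scene.draw(camera)                 # GPU draw of the cached geometry

        # Sleep off whatever is left of the camera frame budget.
        elapsed = time.perf_counter() - frame_start
        time.sleep(max(0.0, 1.0 / CAMERA_FPS - elapsed))
[/CODE]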

Maybe that thought is old hat, but it just struck me for the first time, and I thought it was interesting. Maybe we'll see a development like this in R20 (fingers crossed).

-- Luke
 
  02 February 2018
Edited.

Seems I made a mistake with my post, which was not much related to the subject. Apologies.

Last edited by Bullit : 02 February 2018 at 12:02 PM.
 
  02 February 2018
That decoupling should actually not be too hard to achieve.

In any kind of animation software, you have a variable that tells the software which frame on the timeline it should evaluate and put on the screen at any given time.

There is a function that increments that variable at a time interval taken from your PC's system clock, and that function can be manipulated.

So for example, if evaluating a character rig or other complex objects is causing slowdown, you could get C4D to evaluate and update that 3D element only every 3 to 4 frames instead of every frame.

The animation would look a bit more staccato - a fast-moving object might skip through space instead of moving smoothly, kind of like you've messed with the shutter of a camera - but you'd get higher FPS in the viewport.
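A minimal sketch of that frame stepping, in Python with a made-up element interface (not C4D's actual evaluation API), just to show the idea:

[CODE]
EVAL_STEP = 4  # re-evaluate heavy elements only every 4th timeline frame

class SteppedElement:
    """Wraps a heavy scene element (placeholder interface) so it is only
    re-evaluated every EVAL_STEP frames; in between, the last result is reused."""

    def __init__(self, element, step=EVAL_STEP):
        self.element = element
        self.step = step
        self.last_evaluated_frame = None

    def update(self, frame):
        # Snap the requested frame down to the nearest multiple of the step.
        stepped_frame = (frame // self.step) * self.step
        if stepped_frame != self.last_evaluated_frame:
            self.element.evaluate_at(stepped_frame)  # the expensive call
            self.last_evaluated_frame = stepped_frame
        # Otherwise keep showing the cached result: staccato motion for that
        # element, but the viewport as a whole stays responsive.
[/CODE]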

This might make heavy scenes much easier to play back with a smooth viewport camera even if you are not using VR, so it's not just useful for head-mounted displays.

Of course things like particle simulations, fluids, rigid bodies, hair dynamics and cloth that DO need to be evaluated one frame at a time would be a problem with this. Skipping 3 evaluated frames and evaluating on the 4th would throw the simulation's behavior off.

But you could have a function/button that caches the animation of objects with this particular need before you put on your VR helmet.

You could also have a function that manipulates the evaluation of elements based on viewport FPS. When slowdown begins to happen and FPS drops, the function would automatically evaluate the 3D elements that are causing the slowdown in steps of 3 to 4 frames or larger to allow FPS to stay high.
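Something like this could pick the step size automatically from the measured viewport FPS (again just a sketch, with made-up numbers for the target and the clamp):

[CODE]
def choose_eval_step(viewport_fps, target_fps=60.0, max_step=8):
    """Pick how many frames to skip between evaluations of the heavy
    elements, based on how far the viewport is below its target FPS."""
    if viewport_fps >= target_fps:
        return 1   # fast enough: evaluate every frame
    # The slower the viewport gets, the larger the step, clamped to max_step.
    step = int(round(target_fps / max(viewport_fps, 1.0)))
    return min(max(step, 1), max_step)

# e.g. a viewport crawling at 15 fps gets a step of 4,
# one at 30 fps a step of 2, and anything at 60+ fps a step of 1.
print(choose_eval_step(15.0), choose_eval_step(30.0), choose_eval_step(90.0))
[/CODE]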

Another option is the good old bounding box - if something in particular is slowing everything down, display it as a bounding box (no evaluation), or display the element in its last fully evaluated state. You'd see where the object is in the 3D scene and what its shape is. But whatever slow thing is being done to it - a fracture and shatter on a nice big vase, for example - would not happen.
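That fallback could be as simple as a per-element budget check, sketched here with a hypothetical draw interface and an assumed budget value:

[CODE]
EVAL_BUDGET_MS = 5.0   # assumed per-element evaluation budget per frame

def draw_element(element, viewport, last_eval_ms):
    """If an element blew its budget last frame, draw a cheap stand-in
    (bounding box or last fully evaluated mesh) instead of re-evaluating it."""
    if last_eval_ms > EVAL_BUDGET_MS:
        viewport.draw_box(element.bounding_box())   # position + rough shape only
    else:
        element.evaluate()                          # the slow part
        viewport.draw_mesh(element.mesh())
[/CODE]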

So your idea is probably pretty doable technically.

It is also something that would be great for games.

In a game where 50 soldiers are running across a battlefield, keeping the framerate smooth while not updating the soldiers themselves every frame might not look too bad actually.
 
  02 February 2018
Originally Posted by LukeLetellier: Many of us dream of using VR or AR to work within our scene: to tilt our heads to look around our models, or to stand up and physically walk around a scene in our office. There are many intriguing possibilities.


Thought I would mention, in case you weren't aware, many artists are already doing this using my plugin: https://www.4dpublish.com/vr-viewer

It has its own OpenGL implementation due to the same issues you mentioned. It allows you to stand up and walk around your scenes, teleport as well as grab and inspect objects in your scenes. And it also has an in-built baking system to add ambient occlusion to your objects.
 
  02 February 2018
Originally Posted by kbar: Thought I would mention, in case you weren't aware, many artists are already doing this using my plugin: https://www.4dpublish.com/vr-viewer

It has its own OpenGL implementation due to the same issues you mentioned. It allows you to stand up and walk around your scenes, teleport as well as grab and inspect objects in your scenes. And it also has an in-built baking system to add ambient occlusion to your objects.

I've used Kent's VR features and can vouch for them. Obviously to achieve reasonable playback there are some compromises. But I got some simple c4d animations working within VR.

A few Saturdays ago I created a c4d model that corresponds exactly to my VR play space and adjoining desk. I have a colored floor plane in c4d that matches my Vive play space. If I go up to the edge of the wall in c4d...I hit my real wall in the physical space. If I touch the computer desk in c4d/VR...I'm touching it in real-world space....as I've constructed an exact real-world/VR match. It's a little trippy to have VR and physical world unite. I won't explain what I'm exploring but I will say that it was a fun exercise in itself. I might even try to model my keyboard and see if it's accurate enough to type on without removing the HMD. (I'll need to pin the keyboard down so it stays aligned with c4d model/VR).

One evening this week I'll be building in a green screen for a second wall, so when filming there will be more latitude to move around.
__________________
C4D R19 Studio, MODO 902, VRAY, Octane, Cycles. PC/Mac.
 
  02 February 2018
I agree with the decoupling. Most of my viewport slowdown seems to happen not because the GPU is overloaded or it's a massive scene, but, I suspect, because it's trying to refresh OpenGL textures, find bitmaps, or make generators work. I can have a huge scene move really smoothly one day and a relatively simple scene stall the next. Not to mention the simple regeneration of viewports when moving from quad to single view or changing workspaces.

On an unrelated matter, I hate the way the Object Manager resets itself when moving between workspaces. Completely pointless.
 
  02 February 2018
What will slow down the GPU is mostly the number of polygons, textures, lights and OpenGL materials in a scene.

The rest is Cinema 4D on the CPU evaluating things like generators or a character rig.

For example, when a character's arm bone is rotated, the CPU has to calculate how each vertex of the mesh being deformed is translated in 3D space.

Same with things like cloth. Every cloth vertex is tied to other cloth vertices with something like a virtual spring.

The more vertices and springs there are, the more calculations the CPU has to do.
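To make that concrete, here's a toy mass-spring step in Python/NumPy - nothing like C4D's actual cloth solver, but it shows why the cost grows with the number of vertices (N) and springs (M):

[CODE]
import numpy as np

def spring_step(positions, velocities, springs, rest_lengths,
                stiffness=50.0, dt=1.0 / 30.0):
    """One explicit Euler step of a toy mass-spring cloth.
    positions/velocities: (N, 3) float arrays, springs: (M, 2) int array of
    vertex index pairs, rest_lengths: (M,) array. Work scales with N and M."""
    forces = np.zeros_like(positions)
    i, j = springs[:, 0], springs[:, 1]
    delta = positions[j] - positions[i]                        # (M, 3)
    length = np.linalg.norm(delta, axis=1, keepdims=True)      # (M, 1)
    direction = delta / np.maximum(length, 1e-8)
    # Hooke's law: force proportional to stretch beyond rest length.
    f = stiffness * (length - rest_lengths[:, None]) * direction
    np.add.at(forces, i, f)      # accumulate spring forces per vertex
    np.add.at(forces, j, -f)
    velocities += forces * dt    # unit mass assumed
    positions += velocities * dt
    return positions, velocities
[/CODE]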

Decoupling is a nice idea. It could do wonders for some types of work.
 
  02 February 2018
Originally Posted by kbar: Thought I would mention, in case you weren't aware, many artists are already doing this using my plugin: https://www.4dpublish.com/vr-viewer

It has its own OpenGL implementation due to the same issues you mentioned. It allows you to stand up and walk around your scenes, teleport as well as grab and inspect objects in your scenes. And it also has an in-built baking system to add ambient occlusion to your objects.

Yes, that was actually a tool that I was thinking of as I wrote the post. I don't own a VR headset yet (my brain's natural stereoscopic processing has issues, and I'm waiting for tech to improve before I dig in. Fingers crossed for Magic Leap) - so I haven't been able to try it out - but the demos I've seen have scenes that are much simpler than what I'm normally working with. But it's a fantastic start, and I'm looking forward to seeing how it develops.
 
  02 February 2018
Originally Posted by LukeLetellier: Yes, that was actually a tool that I was thinking of as I wrote the post. I don't own a VR headset yet (my brain's natural stereoscopic processing has issues, and I'm waiting for tech to improve before I dig in. Fingers crossed for Magic Leap) - so I haven't been able to try it out - but the demos I've seen have scenes that are much simpler than what I'm normally working with. But it's a fantastic start, and I'm looking forward to seeing how it develops.

Luke, you should calibrate your expectations. People use the term 'real-time' rendering in many ways. In the context of Blender Eevee someone might say 'real time' when describing anything from 1 frame per 2 seconds up to a nimble 20 frames per second. For rendering to video that's fast. But VR needs frame rates from 30fps up to 90fps (or more). 90fps is the expected norm for assured user comfort if one is talking about a shipping game/experience.

VR achieves that kind of frame rate through a whole host of tricks and compromises. With Kent's product, for instance, you are going to get modest anti-aliasing, no ambient occlusion (unless baked), and you have to be modest about scene complexity. You'll be limited in the material channels you use and in the types/features of lights. You won't get depth of field or motion blur.

VR games are currently inferior in graphics to other games because of the frame-rate requirements, but they make up for it (and, IMO, surpass traditional games) through the sense of immersion and the more human interplay with scene elements.

Otoy is claiming that Octane 4--integrated with Brigade--will allow for 90-120 fps VR playback. The company has over-promised what they can deliver (or perhaps I should say when they can deliver) in the past. We'll see. Even if they can deliver that....it likely will be limited to what can be achieved in Unity and Unreal.
__________________
C4D R19 Studio, MODO 902, VRAY, Octane, Cycles. PC/Mac.

Last edited by IceCaveMan : 02 February 2018 at 08:37 PM.
 
  02 February 2018
Originally Posted by IceCaveMan: Otoy is claiming that Octane 4--integrated with Brigade--will allow for 90-120 fps VR playback. The company has over-promised what they can deliver (or perhaps I should say when they can deliver) in the past. We'll see. Even if they can deliver that....it likely will be limited to what can be achieved in Unity and Unreal.


Brigade is a distributed realtime pathtracer as far as I know.

In other words, you'll get your 90FPS VR playback, but only if you have a room full of 32-core Xeon machines networked together.

That has BIG uses - in high budget filmmaking for example. James Cameron could shoot the next Avatar with actors on mocap and see the final rendered CG result in realtime on a monitor on set.

It'll make making 100 Million Dollar films a lot easier than it is today, because you see the CG as you are shooting, not after the VFX studio has added it in post.

But you'll need deep pockets to do that. Unless OTOY has found a crazy way to speed up path tracing, your VR helmet will probably be fed by $100K worth of CPU rigs or thereabouts.
 
  02 February 2018
Originally Posted by IceCaveMan: Luke, you should calibrate your expectations. People use the term 'real-time' rendering in many ways. In the context of Blender Eevee someone might say 'real time' when describing anything from 1 frame per 2 seconds up to a nimble 20 frames per second. For rendering to video that's fast. But VR needs frame rates from 30fps up to 90fps (or more). 90fps is the expected norm for assured user comfort if one is talking about a shipping game/experience.

Exactly. This is the point I was making at the beginning. If I'm working within the C4D viewport in a VR environment, I'm perfectly alright with the scene contents updating their animation at a stuttery 10-20 fps as long as my camera can move around the scene at 100 fps. Think of it like having a 12 fps clip alongside a 60 fps clip in the same AE comp. It's the only way the entire concept of working within VR will even be practical.

(side note - I'm not looking for fully rendered scenes like this - just the standard viewport.)
 
  02 February 2018
Originally Posted by skeebertus: Brigade is a distributed realtime pathtracer as far as I know.

In other words, you'll get your 90FPS VR playback, but only if you have a room full of 32-core Xeon machines networked together.

That has BIG uses - in high budget filmmaking for example. James Cameron could shoot the next Avatar with actors on mocap and see the final rendered CG result in realtime on a monitor on set.

It'll make making 100 Million Dollar films a lot easier than it is today, because you see the CG as you are shooting, not after the VFX studio has added it in post.

But you'll need deep pockets to do that. Unless OTOY has found a crazy way to speed up path tracing, your VR helmet will probably be fed by $100K worth of CPU rigs or thereabouts.

Otoy's products don't employ CPUs. VR as a rule doesn't use the CPU much at all...nor do game engines or Blender's Eevee.

Octane features different kernels with algorithms targeted at different use cases. The kernel for VR would obviously be optimized for speed, speed, speed, and might be useful on a Hollywood set for live previews. But of course final renders will use a different kernel - or an entirely different renderer - and won't be close to real time.

The approach Otoy is taking to VR employs several techniques, including light fields. Their vision of the future also includes AI-driven de-noising, volumetric rendering, and foveated rendering. None of it would be possible without modern GPUs.

These videos/demos are from a couple of years ago, but one can learn about Otoy's light fields here:
https://home.otoy.com/render/light-.../

Perhaps a better demo/explanation is here:
https://www.youtube.com/watch?v=n9oILsWlW1U
__________________
C4D R19 Studio, MODO 902, VRAY, Octane, Cycles. PC/Mac.
 
  02 February 2018
Originally Posted by LukeLetellier: Exactly. This is the point I was making at the beginning. If I'm working within the C4D viewport in a VR environment, I'm perfectly alright with the scene contents updating their animation at a stuttery 10-20 fps as long as my camera can move around the scene at 100 fps. Think of it like having a 12 fps clip alongside a 60 fps clip in the same AE comp. It's the only way the entire concept of working within VR will even be practical.

(side note - I'm not looking for fully rendered scenes like this - just the standard viewport.)
I would recommend 4DPublish. The VR feature is useful and a lot of fun, and will likely see some further development. Kent is perpetually enhancing the toolset. That, and there are so many other great uses for 4DPublish.
__________________
C4D R19 Studio, MODO 902, VRAY, Octane, Cycles. PC/Mac.
 