Maxon Acquires Redshift


I’m hoping this amounts to a solution other than Cycles 4D for XP rendering. Cycles 4D seems quite powerful in this regard, but for some reason the demo and the demo content didn’t inspire me enough to want to learn it. I find it very slow, even on a multi-GPU setup and even with 40 cores on a dual Xeon Gold; it crashed constantly in my own tests and made me want to turn my attention, should I be in the mood for a steep learning curve, toward Houdini and VEX.

Give me the same power as Cycles 4D in a fast GPU engine like Redshift, throw in point-based particle rendering with motion blur, and give us the ability to really tap into XP, MoGraph, and fields (things the ugly C4D native render engine can “kinda” do) with all the beauty-pass power of Redshift, and I for one will be more than happy to throw money at them.


Good point on sketch/cel, I forgot it.


I used Cycles on something recent and was very happy with it. I don’t have any GPU facility, so it rendered on the CPU, averaging about 2 minutes per 1920x1080 frame.

But Cycles’ recent announcement and the Redshift acquisition have really got me thinking seriously about a PC. I think it’s highly unlikely Apple will come up with an Nvidia-friendly machine.





I came to the same conclusion four years ago…and built my first PC.

Yesterday I built my third PC and it only took about two hours. I have poor vision so it’s more challenging for me. If I can do it…anyone here can too.

This system will serve primarily as a database server, but I might occasionally deploy it for some 3D network rendering (sketch-and-toon stuff). This was my first time with AMD Ryzen (the 6-core 2600 model). All the system parts cost only about $600. For grins I ran Cinebench R20 and it pulled a score of 2862*. I put an old Nvidia 970 card in it.

The system boots up fully in less than 15 seconds. Even under load, temps are good with the stock cooler and just one case fan. It’s crazy quiet and has been glitch-free so far.

Windows 10 has become a nice platform, and with Ryzen you can build the base system real cheap and save funds for the GPUs.

Now, I wouldn’t recommend this build’s parts for a real 3D creation PC. You’d want a robust power supply and a bigger motherboard. But I estimate a guy could build a system with two 2080 Tis for around $3,800.

*The CB score is 20% faster than my 3-year-old 6-core Intel system, whose CPU cost 4x as much. It also nudged past my 8-core Intel daily-driver 3D system, LOL.


Re. Cycles speed…
I just did a comparison with a very simple scene — same scene, same computer, CPU versus GPUs. Render times:

With new 6-core CPU: 218 seconds
With 1080 Ti and 1080: 21 seconds

Edit/update: I tried the 970 by itself on a different computer; it took 88 seconds.


IceCaveMan, that’s good to hear about the PC building. It seems doable and would save some $, but the thing holding me back is not knowing a thing about what to do after physically building the rig.

Doesn’t the motherboard BIOS have to be flashed or something, and then you have to have the correct firmware and drivers running for everything?

I’m completely clueless about that final set of steps.


Been doing a lot of testing here with Cycles and I’m extremely impressed with the speed and quality: denoising, extremely fast DOF and motion blur, Random Walk SSS, and it Team Renders solidly. The node material system is very intuitive. I’m surprised it’s not getting included in the same lists as Redshift and Arnold. Is it because it’s so new?


Also, Cycles lets me team render using my Mac CPUs and Windows GPUs at the same time. Really impressive.



I’ll reply in a new thread as we are moving away from the thread topic.



Some quick Cycles tests here:

PC 1… 6-core CPU and Nvidia 970:
GPU: 94 seconds
CPU + GPU: 65 seconds

PC 2… slower 6-core CPU but with Nvidia 1080 + 1080 Ti:
GPUs: 20 seconds
CPU + GPUs: 74 seconds

Depending on your scene and system, you might get a little boost adding the CPU, or actually incur a big penalty.
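For what it’s worth, here is the arithmetic on those timings — a throwaway sketch (`hybrid_factor` is just a name I made up), where a factor above 1 means adding the CPU helped and below 1 means it hurt:

```python
def hybrid_factor(gpu_seconds, hybrid_seconds):
    """Ratio of GPU-only time to CPU+GPU time: >1 is a boost, <1 is a penalty."""
    return gpu_seconds / hybrid_seconds

# Timings from the tests above.
print(f"PC 1: {hybrid_factor(94, 65):.2f}x")  # ~1.45x: modest boost from adding the CPU
print(f"PC 2: {hybrid_factor(20, 74):.2f}x")  # ~0.27x: big penalty from adding the CPU
```

The likely culprit on PC 2 is that the much slower CPU drags out the last tiles while the fast GPUs sit idle.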


I agree. I really like Cycles. It was really robust on a recent job, and yes, the node system is super powerful and very easy to understand.


Ok, the example was indeed extreme, but I still believe that C4D is the most forgiving app in 3D, which is a technical field that often requires a lot of trial and error to get what you want.


Does anyone have any info on what to expect from Redshift RT? Are there demo videos or anything? I have seen it mentioned in forums but can’t find any more info on it.


Some limited info here:


Thanks for the link - good to see that RTX is getting some love on this front.


As far as AMD vs Nvidia’s CUDA is concerned, AMD is trying to make it easy for developers with a tool that converts existing CUDA code to C++ (which can be compiled for AMD hardware) with as little manual effort as possible.
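The tool being described is presumably AMD’s HIPIFY, and much of what it does is a mechanical rename of CUDA API calls to their HIP equivalents (the HIP names below are real; the `hipify_line` helper is just a toy illustration, not AMD’s actual tool):

```python
# A few representative CUDA -> HIP renames; the HIP API deliberately
# mirrors CUDA's, so most host code converts one name at a time.
CUDA_TO_HIP = {
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaFree": "hipFree",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
}

def hipify_line(line: str) -> str:
    """Toy sketch: rewrite one line of CUDA host code into HIP, name by name."""
    for cuda_name, hip_name in CUDA_TO_HIP.items():
        line = line.replace(cuda_name, hip_name)
    return line

print(hipify_line("cudaMalloc(&d_buf, n * sizeof(float));"))
# -> hipMalloc(&d_buf, n * sizeof(float));
```

The resulting HIP code is plain C++ that compiles for AMD hardware via ROCm, and can also compile back to CUDA on Nvidia hardware.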


Pretty disappointed that rolling it into the main package seems to be off the cards for now. As a Cinema 4D user, why should I be excited about this?
If it’s really because they have to get it working on the Mac first (I know this was speculation), then that seems a pretty poor decision to me. As a Windows Cinema 4D user, I’m tired of being held back by the need to accommodate the Mac platform, which looks less like a platform for 3D work every day.


Full integration into Cinema 4D is actually a major project, not something you can do within a few months, at least not if you want something that isn’t just a hack.
Cinema users should be excited because Maxon just made a big commitment to GPU rendering.


I was meaning more bundling the existing plugin with the Cinema 4D Studio license.
It becomes a much more attractive option for us when we know that our whole team and our clients can open and render the scenes.
I guess I will have to be patient; I do trust Maxon’s record of making good, thorough integrations. But some competitors have had good bundled GPU options for a few years now.