Switch to PC?


I've been a Mac user since starting with the 13" black-and-white "Macintosh" in 1988, and I'll always be a Mac user to some extent. I hate Windows and prefer macOS.

But I just ordered a monster PC stuffed with three 2080 Tis so I can keep up with the rest of the team's Redshift production work. It's been over 5 years since there's been a reasonable high-end Mac Pro release, and even that wasn't an ideal 3D workstation. I'll keep my eyes open for whatever the future holds for Apple, but I needed to make a choice for what will get the work done today.


There’s a certain bizarreness to the fact that Apple is seemingly locked into the poorer choice for professionals in both CPUs (Intel) and GPUs (AMD). If they instead paired AMD CPUs with Nvidia cards, they could absolutely clean house.


In 1985 I purchased one of the first Macs. I was certainly a big Mac fanboy: I went to Apple trade shows, saw Jobs deliver a few of his amazing keynotes, and owned a good number of Macs over the years.

Four years ago I decided to move towards GPU rendering (Octane) and built my first PC. A year later I built a second.

I still have a MacBook for some of my work, but my 3D work is 100% Windows. I'm glad I switched… no reason to elaborate. The reasons are now obvious to anyone who is paying attention.

Welcome, Joel, to the “Dark Side.”


When companies reach a certain size/market value, they become immutable in their confidence that they know better than the great unwashed masses.

Apple has so much cash in reserve that they are very nearly a sovereign nation.

The mighty United States Department of Justice demanded that Apple unlock one single iPhone belonging to a terrorist after his massacre in California.

Apple blithely said "sod off", and the FBI scampered off and procured their own third-party hacker to perform the task.

That petition for NVIDIA driver support is the very definition of “an exercise in futility.”


Oh, I don’t think for one second it will get Apple to change its mind. But at least it signals that a lot of people are unhappy with the direction things are going. There are nearly 7,000 signatures on there – many of which could be potential 2019 Mac Pro buyers (or not, if Nvidia GPUs aren’t an option).

The story also cites some high-profile Mac users unhappy with Apple’s stance. Signing a petition might be futile in the long run, but doing nothing feels even worse.

For what it’s worth, I’ll be voting with my wallet if the new Mac Pro isn’t a worthy successor to the ‘cheesegrater’ MP. I’ll switch to a PC for CG and just keep the Mac for daily mail, web and writing duties – plus maybe a bit of noodling/rendering. I’ve been a Mac fan forever, but like many, I’m running out of patience with Apple’s current inactivity, stubbornness and lack of vision.


I believe I heard that Apple’s iPhone revenues are in the trillions. I don’t really think I need to say much more than that :slight_smile:

And icecaveman – thanks :slight_smile: Yeah, I’m not crazy about it; I worked on Windows as needed in a previous job and it was quite frustrating at times. Just little things you take for granted as a Mac user that you need special software for, or whatever. But I’m going kicking and screaming into the future (really the present, actually).


While I completely understand professionals not trusting Apple on this one, it would make no sense at all for them to release another expensive, non-upgradable, AMD-graphics-limited “pro” machine after publicly admitting they screwed up the 2013 Mac Pro. They would be savaged in the press, nobody would buy them, and that would be that. The 2013 Mac Pro used up what remained of their professional customer loyalty. There are a few people left waiting to see what happens with the 2019 Mac Pro, but if it’s not what it needs to be, even the die-hard Mac fans will leave it for dead and buy PCs. Even if Apple delivers, it’s going to take a long time for a lot of pros to trust them again, since they haven’t exactly had our backs over the last several years. My bet is that they will release a very expensive but powerful machine, with an Nvidia graphics option and/or some new technology that sets the machine apart. Otherwise I just don’t see the point in them bothering.

As for switching to PC now, it’s the obvious best bet for Cinema 4D work if you need a new machine. If the 2019 Mac Pro turns out to be a beast, you can let them work the bugs out and switch back to Mac on the next upgrade cycle. If Apple blows it again, you haven’t wasted a year waiting on them.


After 3 years on the Mac I’m going back to Windows. Too many hardware problems, and service on Apple’s side was too poor.

Today I ordered my new computer. I’m looking forward to the performance of modern hardware. I will miss Final Cut and the Retina display.


Apple have never once had the back of the professional 3D market, even though many of us have stuck with the platform and hardware and used it successfully for 3D for decades. If it weren’t for the classic Mac Pro, typically the 12-core option, I suspect most of us would have jumped ship many years ago. It’s a testament to the build quality and flexibility of that one Mac design that it stayed in daily use for many of us, some 7-9 years after its release. This Mac Pro has now reached ‘Vintage’ status in the US, whilst for the rest of the world it’s classed as obsolete, so this really is the end of the road for these machines in terms of future OS support, even though I suspect some of us will still hold on and use them for as long as our frustrations can be tolerated!

That said, unless the new 2019 Mac Pro has ALL the flexibility of a modern PC, regardless of the design and build quality, I can’t imagine too many 3D pros will go for it compared to what I suspect will still be a significantly cheaper PC build. We could be amazed and shocked, but let’s be honest: Apple has been lacking innovation, while still increasing their profit margins enormously, for a few years now. I still love the Apple products I have and use, but I have absolutely no love or respect for the company they have become.


Have you guys seen the new AMD card? It looks like a serious competitor to the 2070/2080.
If only Octane/Redshift were ported to Metal, I would not even consider jumping ship.
I watched Nick’s PC unboxing on Greyscalegorilla and couldn’t help but think that this was a wrong move: the noise, the heat issues… then the disappointing Cinebench score that forced him to look into the configuration settings…
I’m through with all that. I just want something plug and play.


Yeah, I saw that card. 16 GB of VRAM (!), great performance, $700.

If Otoy ever finishes porting Octane to the Mac using Metal, instead of showing off PowerPoint slides of it running on Macs and iPhones, I’ll be in heaven.

As cool as that card is, we’d also have to hope Apple supports it in eGPUs. Who knows if/when that will ever happen. My AMD-based eGPU really works smoothly: plug and play, no issues. I don’t want to hack any part of my setup, either.


What renderer do you use alongside your AMD eGPU? ProRender?


Yup! Just fooling with it at this stage as I continue to learn. I'm not super happy with the speed (certainly better than the RX 580 in my iMac), but I’m hoping for a 20.5 ProRender bump early this year, like they did for R19 last year.

The 2018 C4D update brought Apple Metal support and a 10-20% boost in speed. I’m hoping Maxon adds the AMD denoising tech that is already out for the other ProRender releases. It needs it.

Honestly, I’m super close to installing Windows on my iMac and using U-Render, the OpenGL-based C4D render engine, with my eGPU for extra oomph, and just leaving ray tracing behind.


Noise is something you can choose to have or avoid for the sake of $50. Silent cases are available and much quieter fans can be fitted; it’s pretty much a $50 difference in parts. Heat? There’s no difference. A Mac and a PC can use the same components: a 130-watt 18-core CPU on one platform is the same on the other, and a 300-watt GPU on one platform is the same as on the other. Plus, if you’re considering external GPU cases, these will make much more noise than a reasonably configured machine.


Sure, but… I just don’t want to bother. This is (was) the beauty of Apple computers and the reason why lots of pros agree to pay a higher price. These machines are elegantly built and they work, with a longer life span than that of a PC.
Nick’s low Cinebench scores were apparently caused by a driver problem. This is exactly the kind of thing I don’t want to tinker with.


Re: noise. Buy a cheap machine, get a cheap machine. Buy a quiet machine, get a quiet machine. I'm not sure how lack of choice is the superior choice.

Anything to back up the claim that they last longer? Just asking as a guy who has had 2 iPhones die due to exploding batteries, gone through 3 flimsy Apple-made iPhone cases before giving up on their 4-month life cycle, 4 iPhone charging cables over the same time, and 3 MBP power adapters with frayed cables. A MBP died from an exploded battery, a Retina MBP died from overheating, a 30" Cinema Display filled up with dead pixels, and 2 cheesegrater Mac Pros shorted out 6 mechanical hard drives between them. But yeah, they totally last longer. cough

Re: drivers. Updating a graphics driver for better performance vs. having to download CUDA libs and hack drivers to get an external card working – I'm not sure how the latter, on a Mac, is a better experience.


We all have different hardware experiences. I’m in the same boat as imashination: two MBPs, both lemons with exotic hardware issues that rendered them unusable; both had logic boards replaced, only to have the exact same symptoms reappear in days. My cheesegrater Mac Pro no longer boots. The Windows boxes keep chugging away.

We work on computers every day; we should probably be able to get our hands a little dirty with drivers and cables and such, just as it’s good for the average vehicle owner to know how to change a tire, get a jump, and even change the oil.


Sorry – I created this thread to better understand the perspectives on the Mac, rendering-wise. PCs are great machines, so I’m not debating which platform is better, although the current state of Mac hardware and the software environment points toward PCs and Nvidia cards.
I have been a Mac user for 20 years, after using PCs for a decade. My personal experience is that Macs simply work better, with less or easier maintenance.
My 2009 cheesegrater is still up and running. My G4 lasted almost as long with just a graphics card replacement (ATI). That’s my personal story. I don’t try to convince anybody, and I respect other people’s choices. My takeaway from this thread is that GPU rendering will very much stay a CUDA thing, at least for another year or so.


My 2007 white plastic-body Intel MacBook has been rendering movie frames for the last five years, with a reboot every few days to make me “feel” I’ve truly cleared the RAM.

It survived two pre-explosion “battery swellings” that I managed to notice and remove before catastrophe. I no longer bother replacing the battery and just run it off the AC power adapter, as it never leaves the house.

I had a heavy object drop onto the trackpad and kill it off. I still use it for my old non-cloud Adobe CS suite as well, attached to a Samsung 26-inch SyncMaster NC240 monitor.

So indeed, longevity varies from user to user.

I have two Windows PCs for my character animation work.


Hey Mash. Just throwing my experience in there for no other reason than to argue, because after all, this is what forums are for :slight_smile:

I have a 24" Apple display that I bought in 2003. It still works great; color and brightness are good, and there are no dead pixels. When I bought it, the expectation was that it would last about 5 years.

My 2008 Mac Pro is also still chugging away fine.

I've been using iPhones since they came out and have never had a problem with one.

I love my Mac Pro trashcan and the Mac OS X experience. It’s pretty robust for most jobs. It’s just the eGPU thing that was the deal-breaker for me in moving to Windows. I’m not going to get into all the analogies people use (fancy car vs. jalopy with a powerful engine, etc.) to describe the experience of Mac vs. PC, but I will say the ONLY thing I am enjoying so far about the PC experience happens inside of C4D. :slight_smile: