Macs could soon take performance leadership for 3D work


#21

For a moment there I thought you were trying to disingenuously use false equivalence and straw-man arguments. Oh wait, you did!

Shipping millions of phone-sized SOCs is no measure of how Apple will fare when shipping workstation-grade and -sized SOCs. I saw Apple’s leading shill suggest 128-core ARM chips may be planned; that won’t be small, and it will likely be a monolithic design, such that a defect in the wrong place could write off the whole chip rather than just one chiplet, as with AMD’s efficient chiplet design. AMD, nVidia and Intel are all going down the chiplet route for CPUs and GPUs; there’s a reason for that.

To my knowledge no one here said shared memory was a ‘terrible idea’, but how does that work with the discrete GPUs Apple have said they’re developing? Surely dGPUs which are not part of the SOC will not have shared memory. If I’m wrong and they do have shared memory, then that rules out PCIe-based discrete GPUs. Trashcan 2.0 incoming?

Who is James Orbach? Is that Jules’ less well known brother? I’d take anything OTOY says with a massive pinch of salt. Didn’t he say Octane on an iPhone was amazing too?

The 2019 Mac Pro looks like an anachronism, yet it is priced at two to three times the price of a well-engineered PC workstation that is objectively much faster. If Apple thinks it can rock up, pull the same silly pricing stunt with ARM chips and Apple GPUs, and manage the odd benchmark win, it’ll end up being just as big a flop as the 2019 Mac Pro. The fanboys on forums might declare victory (none of them actually buy this stuff anyway), but everyone with sense will still go for the bang-for-buck option, and VFX shops will still buy PCs from suppliers that offer on-site support. More and more indie shops will continue their switch to the PC because that’s where the value is.


#22

Infograph, I wasn’t responding to you, FWIW.

Workstation-sized CPUs and SOCs tend to be very expensive, yes, often because their large die size leads to low yields. But that is true throughout the industry. And Apple will happily charge very high prices for their premium products, as I’m sure we can all agree.
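The die-size/yield relationship is easy to sketch with the classic Poisson defect model, where yield ≈ e^(−defect density × die area). A minimal sketch; the defect density below is an assumed illustrative round number, not a published fab figure:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Fraction of dies expected to be defect-free under the Poisson model."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

# 0.1 defects/cm^2 is an assumed illustrative value
for area_mm2 in (120, 400, 700):
    print(f"{area_mm2} mm^2 die: ~{poisson_yield(area_mm2, 0.1):.0%} defect-free")
```

On these assumed numbers a 700 mm² die comes out around 50% defect-free versus nearly 90% for a 120 mm² die, which is exactly why large monolithic workstation dies carry a price premium.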

And quite frankly, no one knows how Apple will deal with building their upcoming processors. Will they use chiplets like AMD? 3D packaging? Or simply leverage the expertise they have developed over the past decade?

Apple’s phone SOCs may have started out small, but they are hardly so now. And they seem to have transitioned from the 88 mm² A14 to the 119 mm² M1 fairly handily.

Maybe this effort will fail. Who knows. But I’m looking forward to seeing what happens.

[And my apologies to Jules Orbach for getting his first name wrong].


#23

Good post, NWoolridge. You are correct in noting Apple’s position. No vendor has anywhere close to the experience Apple does with modern chip design.

x86 is not modern and it’s on the way out. We’ll still see x86 chips selling en masse for 3-5 years, but the writing is on the wall. AMD made a big splashy announcement this week: “Hey, we’ve got an ARM-based processor! It’s ‘almost ready.’” Meanwhile Nvidia goes out and BUYS Arm!

ARM can simply do far more with less juice. And it can do more computations per cycle. Apple is way ahead right now with this chip architecture and their integrated business model is a massive advantage.

Yes Orbach from Octane has praised Apple’s chip architecture, saying it’s the performance-per-watt champ. No one who knows anything argues the point. Can they scale that? Wouldn’t bet against them.

All that said, I’m buying an x86 PC for my next computer. It’ll be my last x86 purchase.


#24

I’ve been singing the praises of Apple’s SOCs on other forums. When I said Apple would ditch AMD GPUs and go their own way with in-house designs, I was descended upon by Apple fanboys saying I hadn’t got a clue what I was talking about, despite Apple making it very clear in their own developer documentation.

I’ve said many times that Apple’s SOCs are extremely interesting and will not be lacking in performance. I’ve had to explain to Apple fanboys on Blender Artists that integrated GPUs do not mean slow, using the example of the next-gen consoles.

When someone writes ‘No vendor has anywhere close to the experience Apple does with modern chip design’ you know they’re a clueless imbecile. This is not to trash Apple, but it’s simply an untrue statement. Apple license designs from ARM and their GPU tech has been acquired because they destroyed a British company to get it. Yay Apple, you’re so wonderful; let’s not let the truth about this vile company get in the way of the virtuous image it likes to project.

To say x86 is not modern, as if ARM is vastly more modern, is again utterly clueless; both architectures have their roots in the 1980s, with ARM marginally younger by about five years.

Let’s bring some realism to the conversation and explore how Apple can scale up their designs from impressive phones and laptops to workstations. Can an SOC be made out of chiplets? It doesn’t look like it, as it is a system-on-chip which includes the system memory. Apple’s workstation SOCs may be very different from their portable SOCs and dispense with certain parts of the SOC. My guess is they’ll use multiple SOCs connected via some high-speed bus to scale up, but that’s just a guess. I struggle to see how massively high-core-count SOCs can be fabricated affordably in such low volumes; the iMac Pro and Mac Pro are rounding errors in Apple’s computer sales, let alone the wider PC market.

Maybe Apple’s wafer costs will be spread over all their 5nm products, and huge volumes at TSMC can drive down the cost, but how do they overcome yield issues as the SOCs grow in size? The only thing that makes sense is multiple SOCs, unless they are not going to be bound by price and the sky is the limit in pricing for the Pro Mac models, though any performance victory is a pyrrhic one if it comes at an absurd cost to the end user.
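To put rough numbers on that economics question, here is a back-of-the-envelope sketch combining gross dies per 300 mm wafer with a Poisson yield model. The wafer cost and defect density are assumed round figures for illustration, not actual TSMC pricing:

```python
import math

WAFER_DIAMETER_MM = 300.0

def gross_dies(die_area_mm2: float) -> int:
    """Crude gross die count: wafer area / die area, with ~10% edge loss."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(0.9 * wafer_area / die_area_mm2)

def cost_per_good_die(die_area_mm2: float, wafer_cost_usd: float,
                      defects_per_cm2: float) -> float:
    """Wafer cost spread over the dies that survive a Poisson yield model."""
    yield_fraction = math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)
    return wafer_cost_usd / (gross_dies(die_area_mm2) * yield_fraction)

# Assumed: $17,000 per wafer, 0.1 defects/cm^2 -- illustrative only
for area_mm2 in (120, 400, 700):
    cost = cost_per_good_die(area_mm2, 17_000, 0.1)
    print(f"{area_mm2} mm^2: ~${cost:,.0f} per good die")
```

On these assumptions cost per good die grows much faster than die area (the 700 mm² part comes out roughly ten times the price of the 120 mm² part, not six), which is the economic case for chiplets or multi-SOC packaging.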

I’m interested to see how this pans out not because I want to go back to the Mac but because I just don’t understand the economics of it.


#25

On the other hand, Nvidia is also making SOCs, certainly knows a thing or two about general computing, parallelism, graphics and AI acceleration, and now owns ARM.


#26

Yes. Nvidia has made a very interesting pivot.
-They’ve been the lead GPU player
-They are effectively pushing AI
-They’ve learned to tie in powerful software layers like CUDA, RT, GRID
-Now they own ARM

I’d put them ahead of Intel and AMD as the company most likely to dominate independent chip design as we move into the new era.

AMD is smoking it w/CPUs now, and will kick ass for the next year or two at least. But they will hit the wall hard if they can’t transition effectively away from x86.

It’s hard to see a path for Intel unless they do something crazy with Graphene or some new unforeseen technology.


#27

-Five years ago I was here yelling that 3d CPU rendering was doomed. It was.

-Four years ago I told you Nvidia would grow bigger than Intel. It did. Intel was 4x bigger then. Investors now say Nvidia could be the next trillion-dollar company.

-Now here to say that x86 is in countdown mode. It is.

You want me on that wall. You need me on that wall. :laughing:

The future battle?
-Apple Silicon
-Nvidia’s Progress with ARM and integration
-Amazon’s ARM-based Graviton2 (server market)
-Open-source RISC-V
-Qualcomm
-Ampere
-AMD if they haul ass in transition from x86
-Fujitsu’s ARM (server market)
-Microsoft gets serious in custom chip market?
-Samsung mimics Apple again?
-Perhaps others in RISC…it’s a strange new world.


#28

Someone is trying to reinvent themselves as some sort of techno prophet and appears desperate for attention.

Reading tech press and regurgitating company PR a Prophet does not make.

CPU rendering doomed? LOL, what an arse you’re making of yourself; the CPU is still the most used device for 3D rendering by a LONG way. The future is cloud-based hybrid rendering; any renderer limited to the GPU is walking dead long term.

So you’re the guy telling us x86 is in countdown mode. And? What do you want, a fucking medal? :medal_sports:


#29

Apple license designs from ARM and their GPU tech has been acquired because they destroyed a British company to get it.

The first clause is false, and the second is tendentious.

Firstly, it is important to know that ARM has two major sets of IP: its instruction set architecture (ISA), and its reference CPU designs (microarchitecture). Apple does not license ARM CPU designs. Apple has an ARM architecture license (the most expensive and all-encompassing one), which allows them to implement the ISA on their own microarchitecture designs. Since 2012 they have been designing and improving their own microarchitecture.

This is why Apple was not interested in buying ARM, and why they couldn’t care less who bought ARM: they already have everything they need since they have an in-perpetuity right to use the ARM ISA in their own designs.

Apple’s microarchitecture is completely independent of ARM’s reference designs, diverging notably in the depth of its out-of-order execution queues and the width of its decoder blocks. Apple has also been able to add many special-purpose blocks to the processor, like ML acceleration units, image processing, and other capabilities integrated with their software stack.

Contrast with, say, Qualcomm, who despite having an architecture license just released the Snapdragon 888, which simply packages ARM reference designs (one X1, three A78 cores, and four A55 cores). They haven’t made an equivalent investment in new design work.

The tendentious part involves, I assume, Imagination Technologies. Apple licenses some of their IP, which I assume is somehow destroying them…?


#30

I’m not going to play semantics with you, the ARM ISA is still a design that is licensed! FFS. Apple did not design this, it’s not their own work is it?

If Apple were as set up for the future as you like to present, they wouldn’t need to keep relicensing from ARM. And given the history between nVidia and Apple, Apple would not have been as unconcerned about the nVidia takeover of ARM as you like to present. nVidia have said licensing terms will not change, but once the merger has been approved by regulators, watch things change.

Before throwing accusations of biased viewpoints around you should really do your homework better: Apple destroyed Imagination Technologies when it attempted an aggressive takeover of the company. Imagination Technologies is no longer a British company; it is now owned by Chinese venture capital and is a shell of its former self, thanks to the sanctimonious corporation you’re such a fan of.

Despite failing in their takeover bid and destroying Imagination’s value, Apple still have to come cap in hand to license certain IP, because they couldn’t get hold of the patent portfolio at a knock-down price.


#31

I’m not going to play semantics with you, the ARM ISA is still a design that is licensed!

Well, semantics are important, and in the context of a semiconductor discussion, it would surprise me if anyone knowledgeable about the area would read “Apple license designs from ARM” and think that the object of that sentence is the ISA.

Apple did not design this, it’s not their own work is it?

Well, since Arm Ltd. was founded as a joint venture between Apple and Acorn Computer, and Apple initially licensed the ISA when they had a huge financial stake in the company, in a sense it was.


#32

Are you suggesting the ARM ISA was not designed and therefore not a design licensed by ARM?

A financial stake is not a design; it’s just a financial stake. Maybe you think Apple was preparing for this moment 30-odd years ago? Prescient, right?

And now nVidia will likely own ARM and take it down a path that benefits nVidia first and foremost. nVidia likes to sell silicon, and my strong bet is that licensing terms will change to reflect that. nVidia will want nVidia CPUs and GPUs in mobile tech, not licensed IP. nVidia is not ARM, and they are big enough to do to Apple what Apple does to others: our way or the highway.


#33

Are you suggesting the ARM ISA was not designed and therefore not a design licensed by ARM?

Of course not. I would have thought that my meaning was clear, but I apologize if I was not specific enough. In the context of a discussion of CPUs and SOCs, where the whole focus of the discourse has been around how specific CPU and SOC designs have been derived and implemented, when someone makes reference to licensed “designs” (note the plural), the natural inference is that the speaker is referring to a set of specific microarchitecture designs. While an ISA is a designed spec, I would never refer to it as “a design” or “designs”. To do so would be sloppy, and I was giving you more credit than that.

And, as much as they might like to, NVidia cannot retroactively alter the terms of an agreement their subsidiary entered into in good faith. Apple has the right to use the ARM ISA for as long as they see fit. If NVidia adds to or changes the ISA, Apple has no need to implement those changes unless, once again, they feel the need to. The fact that they use their own microarchitecture means that they are more or less completely independent of whatever NVidia does.


#34

The entire world of computing is experiencing upheaval.

Google, Tesla, Facebook and Amazon weren’t making hardware some years back. They are now, along w/numerous other international monoliths. This will massively impact Intel and AMD as their biggest customers won’t be buying nearly the same volume.

New players like Ampere are emerging with compelling offerings in the server market.

And with RISC-V emerging, companies like Whirlpool can suddenly enjoy boutique (incredibly custom) chip designs of their own.

The common theme? These emerging players are ARM-based (or, in RISC-V’s case, similarly RISC-derived). Such designs are more affordable to produce and more power-efficient. Power efficiency = huge energy cost savings.

Meanwhile most personal computer users prefer laptops, where energy efficiency is vital. ARM again is the obvious choice.

All these trends point towards the rise of ARM and the death of x86. And currently no one is close to Apple in performant Arm chips. (the aforementioned server vendors are focused on low-price, low-energy, low-performance chips operating en masse)
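The energy-cost claim is easy to put numbers on. A minimal sketch, where the per-server wattages, fleet size, and electricity price are all assumed round figures purely for illustration:

```python
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.10  # USD per kWh, assumed

def annual_energy_cost(watts_per_server: float, servers: int) -> float:
    """Yearly electricity cost for a fleet running flat out."""
    return watts_per_server / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH * servers

x86_fleet = annual_energy_cost(250, 10_000)  # assumed 250 W per x86 server
arm_fleet = annual_energy_cost(150, 10_000)  # assumed 150 W per ARM server
print(f"Annual saving for a 10,000-server fleet: ${x86_fleet - arm_fleet:,.0f}")
```

Even a modest assumed 100 W saving per server works out to $876,000 a year at these figures, before counting cooling, which scales with the same wattage.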


#35

I came across this post on an ultra-geeky hardware forum:

The windows world is going to follow because MS is already itching knowing their surfaces now look like even more hot garbage compared to what Apple is doing. They just need a proper chip to compete with M1. Yes Nvidia sees it coming… no other reason to spend what they did picking up ARM. They could have made a chip with just a license. They are looking for the mind share that comes with being the modern day “intel” controlling the cloners licenses. Its all about the mind share… they want ARM to = Nvidia. And considering what they paid for that… I have ZERO doubt they already have a new chip in the works, and it may be a lot closer to prime time then people would think. I suspect by this time next year we will be talking about M2 vs Nvidias new super duper Windows friendly octo Core ARM + Ampere+ graphics solution.

It’s just one guy’s opinion, but his take could prove prescient. He’s certainly not alone in anticipating this. Might be off 6-18 months in the timing…but seems like a slam-dunk, viewed from my task chair.


#36

Another fine example of your propensity to indulge in strawman arguments. Where did I say nVidia would seek to retroactively alter terms?

In the same way Maxon shafted long-term MSA owners by either shifting them to less favourable rental terms or even more unfavourable perpetual pricing, nVidia will lose no time in moving licensees onto less favourable terms. It’ll be tough for Apple to forgo new instruction set extensions that their competition will have access to just to stay on better terms. So you can delude yourself that Apple is in control all you want, but the reality will prove quite different. nVidia are the shithouse company of shithouse companies; they will pull any stroke necessary, and Apple is in no place to dictate terms. Apple couldn’t dictate to nVidia, which is why nVidia removed their GPUs and driver support from the Mac. As far as I’m concerned they deserve each other.

Apple and nVidia will just be fighting over scraps; the ultimate winners will be Chinese tech giants. In the same year that China has eradicated extreme poverty, we see enormous food lines and 30 million more people without jobs or health insurance, and soon to be without homes, in the US. Get used to it; this is just an early view of the future, when Silicon Valley is supplanted by Huawei and co. More geniuses are born in China each year than anywhere else on the planet, their education system is world-leading, and no amount of lies about security or sanctions will stop China ending the unipolar world. The greed of the western neoliberals who shipped manufacturing jobs to China for their own profit sealed their own nations’ demise.

ARM or x86 won’t matter to most of the world; we’ll be using Chinese-designed and manufactured tech on FOSS. China has launched the first 6G test satellite before the average backwater hick has reliable 3G.


#37

Die size of the Intel chip in the Mac Pro: 698 mm²
Die size of the Apple M1: 120 mm²

There is so much room and heat envelope to add more cores. Exciting times.
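A back-of-the-envelope sketch of that headroom; the 30 mm² per 4-core performance cluster figure is purely an assumption for illustration, not a measured Apple number:

```python
# Die areas quoted in the post above
M1_DIE_MM2 = 120
MAC_PRO_XEON_DIE_MM2 = 698

# Assumed: one 4-core performance cluster occupies ~30 mm^2 (illustrative)
CLUSTER_MM2 = 30
CORES_PER_CLUSTER = 4

extra_mm2 = MAC_PRO_XEON_DIE_MM2 - M1_DIE_MM2
extra_cores = (extra_mm2 // CLUSTER_MM2) * CORES_PER_CLUSTER
print(f"{extra_mm2} mm^2 of headroom -> room for ~{extra_cores} more cores")
```

Real scaling is nowhere near that linear (interconnect, memory controllers, and power delivery all eat area), but it shows why the die-size gap leaves meaningful headroom.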

U.S. working with the amazing free folk of Taiwan = CRUSHING IT AGAIN.

The real question these days is can AMD, Nvidia, and Apple manufacture enough product to meet the enormous global demand.

Taiwan’s TSMC will be building a $12 billion chip plant right around the corner from me here in Arizona. (Just five miles from my home)

Tesla, meanwhile, now based in Texas, is busy saving the world from fossil fuels. Its market cap has reached half a trillion dollars. I hope to buy one of their vehicles in ’21 or ’22, if I’m lucky enough to move up their list. Did I mention the ARM chips in their vehicles?