Interest in New GPUs?


#1

After doing no 3d for six months, I’ve suddenly been doing a lot of it lately…and have been using a combo of C4D R20 and Blender. I’ve got my eye on the GPU announcements that are looming. With diminished income I probably have no business even thinking about a new card. But I can’t help myself.

Here’s what I’ve heard from the rumor sites:

-Countdown timer to the next-gen announcement is here:

Announcement: Sept 1 / Release: Sept 17 (the RTX 3070 will follow in October, and the RTX 3060 in November)

-The live stream of the event is, I believe, only on Nvidia’s web site.

-Rumors are that the naming will change, and the replacement for the 2080 Ti will be dubbed the 3090. Or perhaps the 3090 replaces the Titan.

-Cards are purportedly 40-50% faster than the previous generation.

-Though at one time Nvidia was going to jump to TSMC’s 7 nm process, it now looks like they are opting for Samsung’s faux 8 nm process. I say ‘faux’ because in truth it is an adapted 10 nm process.

-VRAM bumps, both in memory type and amount. Some are saying the 3090 will feature 24 GB of blazing GDDR6X.

-In the past Nvidia has brought a lot of innovation to the market: CUDA, accelerated physics, AI-based denoising and upsampling, and then, two years ago, RT cores. Now there are rumors of something called a ‘traversal coprocessor.’ Purportedly this dedicated hardware will also be targeted at ray tracing. So something to keep an eye on, though I’d suspect it will take the Otoy, VRAY and Redshift technicians some time to fold it into their rendering tech, just as it did with RT cores.
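
For anyone wondering what ‘traversal’ means here: ray tracing spends much of its time walking a bounding volume hierarchy (BVH) to find the few triangles a ray might actually hit, and that walk is what RT cores (and, if the rumor is true, this coprocessor) accelerate in hardware. Here’s a minimal Python sketch of the idea; the node layout is made up purely for illustration, and real GPU BVHs are packed, stackless and far more clever:

```python
# Minimal sketch of BVH traversal -- the workload a 'traversal
# coprocessor' would supposedly accelerate. Node layout is invented
# for illustration only.

def slab_test(origin, direction, box_min, box_max):
    """Ray vs axis-aligned box (slab method). Assumes no zero
    direction components, for brevity."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        inv_d = 1.0 / direction[axis]
        t0 = (box_min[axis] - origin[axis]) * inv_d
        t1 = (box_max[axis] - origin[axis]) * inv_d
        if inv_d < 0.0:
            t0, t1 = t1, t0
        t_near, t_far = max(t_near, t0), min(t_far, t1)
        if t_far < t_near:
            return False
    return True

def traverse(node, origin, direction, hits):
    """Depth-first walk: prune any subtree whose box the ray misses;
    leaves hand their triangles off for exact intersection tests."""
    if not slab_test(origin, direction, node["min"], node["max"]):
        return
    if node["children"]:
        for child in node["children"]:
            traverse(child, origin, direction, hits)
    else:
        hits.extend(node["tris"])
```

A GPU runs this loop for millions of rays per frame, which is why dedicating silicon to it, rather than burning shader cores on it, pays off.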

One pricing rumor suggests:
3090: $1400
3080: $800
3070: $600

Now the not-so-good news:
-The new gen is purportedly an energy hog
-And runs hot
-May require special power connector
-Is physically supersized. The 3090 is said to take up 3 slots and is massive in width, height and length.


#2

That really is a big boy… :open_mouth:


#3

Yes, CRAZY big. I would think about going open case if I get one.

The traversal co-processor might be just as ‘big’ from an impact standpoint.


#4

I’ve been holding off buying a motherboard/case for a new system, because I wanted to check what the new card requirements might be in terms of PCIe slots etc., and I’m very glad I did…


#5

If I get one I won’t even bother trying to fit it in any case; I’ll run it on an open mining frame at the end of a PCIe extension cable, with its own power supply, and migrate my other cards over with it.
That monstrosity would suffocate even a well-ventilated case, and I have no patience to deal with watercooling.


#6

If that is the real size, it’s a ridiculous monstrosity, and I won’t be buying it.


#7

Is that a real photo? It looks like it’s twice the size of the other model despite being made on a smaller process.

Also, if they need a heatsink and fan this size, then it must draw a lot of power too. I understand why Jensen took it out of his oven for the Ampere reveal… it CAN also bake sausages on its grill. It’s a BBQ too.

It looks like they “brute-forced” their way to stay on top, but even +50% performance doesn’t seem like such an achievement anymore when you see the size of this thing.

Let’s hope this isn’t a $2000 card either…


#8

The image is slightly off (someone posted an updated version with the PCIe slots aligned), but it isn’t far off. Mostly just a perspective/camera-distance thing.

For anyone wanting to do GPU rendering without spending a fortune, I would still recommend a 1080 Ti or two. Maybe the 30x0 will be something amazing, but until then the 1080 Ti remains a relatively great card. With the 3000-series release there should be plenty up on eBay.

Not sure why the size of the card is bothering so many people. Nearly any desktop case has an extra 6 inches of wasted space at the front, since 3.5" and 5.25" drives rarely get used these days, and even if they do you can often mount 2-3 of them in other case locations. If space is an issue because you want a small case, then the regular 2-slot 3080 instead of the 3-slot 3090 will still be fine. With that said, even Apple managed to learn its lesson about making pointlessly small and un-upgradeable systems.

Also not sure where the “no case can cool these cards” idea comes from. If your case can’t cool a single 350 W 3090, then a pair of 2080 Ti cards at north of 600 W must have been anathema. Some cases are just flat-out rubbish at cooling; others are much better. I have a 7-year-old Corsair Obsidian which keeps a pair of 2080 Ti cards and a 200 W overclocked i7 perfectly cool. I also have behind me a BeQuiet Dark Base case which has all the cooling capacity of a gentle fart; I’ve had to remove literally all the side, front and top panels to keep it from cooking itself.
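
Rough numbers behind that comparison (the wattages are the commonly quoted board-power figures, so treat them as approximate):

```python
# Back-of-envelope heat load per build. Wattages are the commonly
# quoted, approximate board-power figures; real draw varies with
# load and settings.
builds_w = {
    "single RTX 3090 (rumored 350 W)": 350,
    "pair of overclocked 2080 Ti": 2 * 300,
    "pair of 2080 Ti + 200 W i7": 2 * 300 + 200,
}
for build, watts in builds_w.items():
    print(f"{build}: ~{watts} W of heat for the case to exhaust")
```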


#9

I’m not sure how much shock/weight a PCIe slot is meant to be able to take; it looks like it would break something. IMO, the PCIe slot is now past its sell-by date if this is where things are going… those little plastic clips are not going to secure that sort of lump. It would make sense to make a kind of double-length XXL slot, adding more lanes and solving the power needs without cables. There is plenty of room for length, but the card just dangles there. I like how Apple solved a lot of the real problems in their “new new” Mac Pros. PCs could borrow some of that wisdom…


#10

I would certainly agree with you if the cards weren’t also secured by a plate and screw at the point of egress. I don’t find that they bounce around or seem insecure once properly screwed in.


#11

I still doubt it would survive being sent by post. Normally I build my own machines, but since my last brand-new MoBo was delivered defective, which caused a lot of grief in the build, I’m considering getting a built and tested machine. But imagining a graphics card like this, weighing probably two point seven tons, with no fixing point on the far side at all… no, not going to have that delivered pre-built.

Fortunately I’ve still got some time before needing to replace my current PC (on a GTX 1080 as of now), so let’s see whether someone finally comes up with a case solution…


#12

There’s no way PCIe slots are going anywhere; they’re simply too entrenched, and even the next two generations are prepped at this point. The best solution is likely going to be some additional supporting structure, like Apple has done. It would be funny to go all the way back to full-length PCI or Zorro slots with support brackets at the end.

Also keep in mind that it isn’t the weight on the PCIe slot that is the problem; it is the twisting torque of the card trying to rotate downwards. So long as the 1-2 screws on the back plate are attached, and so long as the card has enough stiffness to prevent twisting, all should be fine.
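
To put illustrative numbers on that (the card mass and lever arm below are pure guesses for a big triple-slot card, not measured figures):

```python
# The load on the slot is a moment, not just a weight. Mass and
# lever arm are guesses for a large triple-slot card, purely for
# illustration.
G = 9.81            # gravitational acceleration, m/s^2
card_mass_kg = 2.0  # guessed mass of a big triple-slot card
lever_arm_m = 0.15  # guessed distance from slot to center of mass

moment_nm = card_mass_kg * G * lever_arm_m
print(f"Moment trying to rotate the card: ~{moment_nm:.1f} N*m")
# The bracket screws and the card's own stiffness must resist this
# moment continuously -- plus any shock loads during shipping.
```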

Shipped pre-built computers mostly use inflated foam packs to keep everything in place; the user needs to remove the foam blocks before using the machine.


#13

I see no reason other than dogma to jam-pack everything into one box. The used market is awash with mining equipment that gives the video cards an upright position: no stress on the PCB and much better cooling. With PCIe 4 lane splitting I can have 6 video cards, maybe up to 8, on one machine.


#14

Agreed, but it is still interesting that Apple was able to go where the PC market cannot.
There is also some interesting stuff that never really went mainstream, so it’s not like nobody had thought a longer PCI card could be a good thing. I would like to see the power cable gone, and the slot extended to deliver the power requirements. But I won’t hold my breath :slight_smile:

Apple was able to make a board that is not totally bound to current PC conventions.

Anyhow, I am going to predict that this is not going to be a sweeping success, and that NV is going to have problems going forward.


#15

Depends on what you consider an acceptable solution. Most mining cards have been of the GeForce 1060 variety or rough AMD equivalents, and I wouldn’t recommend these lower-end cards for GPU rendering at all; the space, hassle and power requirements are just plain old annoying.

You also need to consider cooling; 8 cards in one office room will heat it up too much, and even on an open bench they will need directed airflow to avoid heat problems. Miners can get away with this because they tend to undervolt and underclock the GPUs to make them more efficient, as mining tends to be far more memory-speed dependent. Neither of those is an option with GPU rendering, so the cards will be putting out their full heat potential. In real terms you’re looking at a 1500 W - 2500 W electric heater.
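
That heater figure is just card count times full-load draw plus platform overhead; a quick sanity check with typical (approximate) wattages:

```python
# Sanity check on the "electric heater" estimate. Per-card wattages
# are typical full-render-load figures and approximate.
def rig_heat_watts(num_cards, watts_per_card, platform_overhead_w=200):
    """Total heat dumped into the room: cards plus CPU/board/PSU losses."""
    return num_cards * watts_per_card + platform_overhead_w

print(rig_heat_watts(8, 160))  # eight 1060-class cards  -> 1480 W
print(rig_heat_watts(8, 280))  # eight higher-end cards  -> 2440 W
```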

Keep in mind that although PCIe 4.0 is twice the speed of 3.0, that doesn’t actually mean you can split an x16 PCIe 4.0 slot into two x16 PCIe 3.0 slots for older-gen GPUs; the generations share the same physical connector, so an older card will negotiate the link down to PCIe 3.0 speed from the start.
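
For reference, usable per-lane bandwidth roughly doubles each generation; the figures below are the standard rates after encoding overhead:

```python
# Usable PCIe bandwidth per generation (GB/s per lane, after the
# 128b/130b encoding overhead used by gens 3.0 and 4.0).
GBPS_PER_LANE = {"3.0": 0.985, "4.0": 1.969}

for gen, per_lane in GBPS_PER_LANE.items():
    for lanes in (4, 8, 16):
        print(f"PCIe {gen} x{lanes}: ~{per_lane * lanes:.1f} GB/s")

# An older 3.0 card in a 4.0 slot links at 3.0 speed, so bifurcating
# a 4.0 x16 slot into two x8 links gives each old card 3.0 x8
# bandwidth (~7.9 GB/s), not two full-speed x16 links.
```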

If you plan to put this in another room and use it as a render node, you then have the transfer speeds over the network to consider on top of the chopped-down PCIe lane speeds. In real terms you will find a pair of 2080 Ti cards beats a mining machine with a dozen 1060 cards every time.


#16

Jules Urbach, founder/owner of Otoy (Octane renderer), says that the iPad has the best performance-per-watt he’s seen for 3d rendering.

Go figure. ARM processors may be more in our future than we understand today. I guess it’s no surprise that Nvidia is trying to buy its way into that market. What’s weird is that ARM designs can operationally function both as CPUs and GPUs.

As an aside…I did predict five years ago on this very forum that, in time, Nvidia would pass Intel in market cap. Back then Intel was 3x bigger. But yes, that happened this month: Nvidia is now bigger than Intel.

I’m not that smart…but I did observe that GPUs were emerging as a bigger mover in computing than CPUs.

However, blink, and things change. Perhaps ARM will reshuffle the deck in coming years.

For now…3090 looks awfully sweet.


#17

Market capitalization has nothing to do with how big a company is; it’s just a stock market metric. So no, Nvidia is not larger than Intel at all; in fact it is several times smaller if you look at income (what they earn from their sales).
The stock market could crash tomorrow for whatever reason, and both Intel’s and Nvidia’s market caps might be worth only a small percentage of what they are worth today, but that wouldn’t mean they suddenly stop earning tons of money.


#18

Intel and Nvidia are going in two different directions…and that’s reflected in their stock valuations. Stock valuation reflects emerging/anticipated reality.

I’d much rather be an Nvidia employee, exec or stockholder right now than an Intel one. The Street says the same.

To the larger point about GPUs: notice that Intel has entered that market and will be launching its own graphics cards. Whether it’s gaming, simulation, AI…or, for us, 3d rendering…GPUs are where the action is.


#19

Stock valuation doesn’t reflect reality in any meaningful way; there are tons of examples of that, e.g. a company like Theranos had an estimated value of several billion dollars. With that I’m not saying that Intel is doing great, but simply that you should always take market valuation with a large grain of salt. Sorry for the OT.


#20

I don’t think we should worry about straying off topic. There is very little activity on this forum. Civil chat is a good thing in today’s world.

Yes, Theranos suckered a lot of people in…quite a story that is worth studying, btw…but they never really even had any clients. Nvidia has hundreds of millions of clients…and some of them are the biggest companies in the world.