this post was submitted on 28 Nov 2023

Hardware

47 readers

A place for quality hardware news, reviews, and intelligent discussion.

founded 1 year ago
top 23 comments
[–] [email protected] 1 points 11 months ago (2 children)

If they were readily available they would be a tempting option! But no, Nvidia continues to be frugal with VRAM in the consumer space...

[–] [email protected] 1 points 11 months ago (2 children)

Unfortunately, we need to be careful what we wish for. As AI processing becomes more and more popular, if a card like the RTX 4070 had 20 GB of VRAM, they would probably all be snatched up by companies wanting to do AI processing on the cheap, and suddenly the 4070 and everything above it would go for $2000, like the mining craze all over again.
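For scale, here's a rough back-of-envelope for why 20 GB matters to AI users (a sketch; the 20% overhead factor for activations and KV cache is an assumption, not a measured figure):

```python
# Rough VRAM estimate for running a model: parameters * bytes per parameter,
# plus a fudge factor for activations/KV cache (assumed 20% here).
def vram_needed_gb(params_billion: float, bytes_per_param: float,
                   overhead: float = 0.2) -> float:
    weights_gb = params_billion * bytes_per_param  # 1e9 params * bytes ~= GB
    return weights_gb * (1 + overhead)

# A 13B model at fp16 (2 bytes/param) wants ~31 GB -- out of reach for a
# 12 GB 4070, but comfortable 4-bit quantized (~0.5 bytes/param) on 20 GB.
print(round(vram_needed_gb(13, 2.0), 1))  # ~31.2
print(round(vram_needed_gb(13, 0.5), 1))  # ~7.8
```

That gap between "fits" and "doesn't fit" is exactly what would make a hypothetical 20 GB 4070 so attractive to buy up in bulk.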

[–] [email protected] 1 points 11 months ago (3 children)

I doubt any big, mainstream western corporations would use GeForce cards for AI, even if they had the same amount of VRAM as workstation cards (and they'd never have the same memory as AI-specific cards such as the H100). Hobbyists will for sure, but they make up such a tiny portion of the market that I doubt it would change the overall demand. Chinese and Russian companies will, I guess, but I don't think that would have the same effect as crypto mining, which had immediate returns for individual users without any expertise, unlike AI applications.

[–] [email protected] 1 points 11 months ago (2 children)

Plenty of people using 4090s tho

[–] [email protected] 1 points 11 months ago (1 children)

Exactly. Even though Nvidia removed NVLink, it's still a popular card for deep learning because it's relatively cheap for small labs.

[–] [email protected] 1 points 11 months ago (1 children)

NVLink died because PCIe Gen 5 can handle the interconnect now; no external hardware needed.

[–] [email protected] 1 points 11 months ago (1 children)

PCIe 5.0 cannot do what NVLink does, and NVLink isn’t dead at all.
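For a rough sense of the gap both commenters are arguing about, a quick per-direction bandwidth comparison (a sketch; figures are the published spec numbers as I recall them, with the 3090's NVLink 3 bridge standing in for consumer NVLink since Ada dropped the connector):

```python
# Per-direction bandwidth in GB/s. PCIe 5.0 runs 32 GT/s per lane with
# 128b/130b encoding, so x16 lands around 63 GB/s each way.
def pcie5_gbps(lanes: int) -> float:
    return 32 * lanes * (128 / 130) / 8  # GT/s -> GB/s, per direction

pcie5_x16      = pcie5_gbps(16)  # ~63 GB/s each way
rtx3090_nvlink = 112.5 / 2       # NVLink 3 bridge (112.5 GB/s bidirectional)
h100_nvlink    = 900 / 2         # NVLink 4 (900 GB/s total)
```

So for two consumer cards, PCIe 5.0 is in the same ballpark as the old bridge, which is the "PCIe can handle it" argument; but datacenter NVLink is roughly an order of magnitude beyond it, which is why NVLink isn't dead up the stack.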

[–] [email protected] 1 points 11 months ago

NVLink is no longer supported on the Ada Lovelace GPU architecture that powers Nvidia's flagship RTX 4090 graphics card. Replacing NVLink is the PCIe Gen 5 standard. Nvidia will use the freed-up space from the removal of NVLink to cram in more AI processing capabilities.

https://www.windowscentral.com/hardware/computers-desktops/nvidia-kills-off-nvlink-on-rtx-4090#:~:text=NVLink%20is%20no%20longer%20supported,in%20more%20AI%20processing%20capabilities.

The RTX 6000 Ada also drops NVLink. No idea why the downvotes.

[–] [email protected] 1 points 11 months ago

Hobbyists, sure. Startups, maybe (from what I've seen, they're far more likely to rent a server or something). Mainstream corporations, definitely not.

[–] [email protected] 1 points 11 months ago (2 children)

I doubt any big, mainstream western corporations would use GeForce cards for AI

Why wouldn't they?

[–] [email protected] 1 points 11 months ago (1 children)

Nvidia has lines like the A series for this sort of thing, and at their scale of use they'll be buying through a distributor or direct from the manufacturer, depending on size.

Some small outfits will want 4090s and the like, and then yeah, maybe they'll buy from stores, but I doubt that'll be enough to kick demand up and cause prices like the crypto boom.

[–] [email protected] 1 points 11 months ago

There’s no advantage to the A series unless you plan to use the unique virtual GPU features. Nvidia added a toggle for the 4090 to enable ECC, which the memory can apparently already do, turning it into a professional card with less VRAM. It’s otherwise nearly identical to the A series cards, minus the NVLink connector for linking two A series cards together.

[–] [email protected] 1 points 11 months ago

Official support, drivers, bulk orders direct from Nvidia, staying on Nvidia's good side

[–] [email protected] 1 points 11 months ago

Doesn’t matter what western companies would do; what matters is that countries accounting for a billion-plus people and the second-largest economy would use them.

[–] [email protected] 1 points 11 months ago (1 children)

These companies pre-order them; they aren't like your average miners who go to an AIB and order in bulk like a black market.

[–] [email protected] 1 points 11 months ago

Well, there’s still only a finite number of them being fabbed and manufactured. Whether they custom-order them in bulk or not, it still reduces the supply available to other customers.

[–] [email protected] 1 points 11 months ago

The Frankenstein 3080M 16GB is interesting: a 3080 mobile, so roughly on par with a desktop 3070, but with 16 GB of VRAM... and they're pretty cheap.

https://www.alibaba.com/product-detail/Graphics-Card-RTX3080m-GPU-GeForce-RTX_1600880058343.html?spm=a2700.shop_plgr.41413.35.39f45071W40035

I'm tempted, the frankendriver project fixes the driver issue with these cards (at least for now) and it seems like most people have a good experience with them...

[–] [email protected] 1 points 11 months ago

The 3080s yearn for the mines!

[–] [email protected] 1 points 11 months ago

You can either die a bitcoin hero or live long enough to become an AI villain

[–] [email protected] 1 points 11 months ago (1 children)

I feel like we're going to have another "every GPU is out of stock" moment.

[–] [email protected] 1 points 11 months ago

Fortunately, AI isn't marketed as a get-rich-quick scheme for non-businesses.

[–] [email protected] 1 points 11 months ago

Imagine if they weren't so afraid of you putting one in a server and gave us the option to simply buy the damn thing with a 90-degree connector and a superior single-moving-part design that doesn't give a single shit what case you put it in.

No, no, it's the Chinese's fault.

[–] [email protected] 1 points 11 months ago

I long wondered why something like that wasn't happening in the West: pull old GPUs and put like 40 GB of cheap RAM on them, with multiple channels, to make cheap cards. Then I remembered the West is all about money and "intellectual property" claims.