this post was submitted on 19 Nov 2023
2 points (100.0% liked)

Hardware

47 readers
1 users here now

A place for quality hardware news, reviews, and intelligent discussion.

founded 1 year ago
[–] [email protected] 2 points 11 months ago (1 children)

I’d like to see a rumor they solidified the manufacturing process!

Prices coming down across the board. I can dream I guess…

[–] [email protected] 1 points 11 months ago (1 children)

Weird article. It treats kopite7kimi as a quality leaker, but at the end lumps GDDR7 in with "other rumors" that should be taken with a grain of salt, when it was hope who affirmed it.

Btw, would this be N3B or N3E? Did anyone follow up on Nvidia's orders?

[–] [email protected] 1 points 11 months ago

They take what they feel like and discard others. It’s typical for them

[–] [email protected] 1 points 11 months ago

You mean the DisplayPort 2.1 that will actually work? The only thing AMD can ever make work are space heaters.

[–] [email protected] 1 points 11 months ago (2 children)

Why not add it to the refresh cards?

[–] [email protected] 1 points 11 months ago

A "new" "major" feature to help sell the new cards. Say you've got a 4090, you upgrade your panel, and then realize you're gonna need DP 2.1 because you got a 20k super-HD OLED monitor... time to drop $ for that 5090.

There is "Display Stream Compression" (DSC), which I believe can help with the bandwidth constraints of 1.4a versus 2.1, but it's compression and may cost you latency or quality. In general you just lose future-proofing, since AMD already has DP 2.1, meaning they can fully support the 540Hz monitors, for example, or any high-quality panel.
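Rough back-of-envelope math on why the link matters (my own sketch, not from the article; the link rates are approximate usable payload figures after line coding, and blanking overhead is ignored, so real requirements run a few percent higher):

```python
# Estimate whether a display mode fits a link uncompressed (ignores blanking).

def uncompressed_gbps(width, height, hz, bits_per_channel=8):
    """Raw RGB bitrate in Gbit/s for the given mode and bit depth."""
    return width * height * hz * bits_per_channel * 3 / 1e9

# Assumed approximate usable payload rates:
DP_1_4A = 25.92          # HBR3 x4 lanes, after 8b/10b encoding
DP_2_1_UHBR13_5 = 52.22  # x4 lanes, after 128b/132b encoding

need = uncompressed_gbps(2560, 1440, 540)  # e.g. a 1440p 540 Hz panel
print(f"~{need:.1f} Gbit/s needed")                                  # ~47.8
print("fits DP 1.4a uncompressed:", need <= DP_1_4A)                 # no -> DSC
print("fits DP 2.1 UHBR13.5 uncompressed:", need <= DP_2_1_UHBR13_5) # yes
```

So a 540Hz 1440p stream needs DSC on DP 1.4a but fits a DP 2.1 UHBR13.5 link uncompressed, which is the future-proofing point above.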

[–] [email protected] 1 points 11 months ago (1 children)

The refresh cards still use the same silicon, just with a different amount of disabled functional units. Enabling DP 2.1 might require an expensive partial redesign, which Nvidia doesn't really have any incentive to do, since people will buy the GPUs anyway.

[–] [email protected] 1 points 11 months ago (1 children)

What is different about AMD’s W7000 such that they can offer higher DP standards support than the RX 7000 consumer cards?

[–] [email protected] 1 points 11 months ago

I don't know, but do they? Both support DP 2.1 ...

[–] [email protected] 1 points 11 months ago (5 children)

I feel like the number of people who care about driving a display faster than 4K 120Hz is rather small; it's a bit weird to dunk on NVIDIA for that.

[–] [email protected] 1 points 11 months ago

Nvidia always makes it a big deal when they do support the newest standards.

[–] [email protected] 1 points 11 months ago (1 children)

It might be niche for now, but Nvidia skimping on it in their 4000 series was just weird given the price of these products.

[–] [email protected] 1 points 11 months ago

Wait until you see the price of the next gen!

[–] [email protected] 1 points 11 months ago (1 children)

How can the internet pass on the ability to dunk on Nvidia?

I bet you’ll have comments going on about shit VRAM too

[–] [email protected] 1 points 11 months ago

Expensive cards will be connected to expensive monitors, and expensive monitors like the Samsung Neo G9 use this connector.

[–] [email protected] 1 points 11 months ago (2 children)

Yes, we know. A lot of people wouldn't shut up about the Nvidia GPUs not having it, as if it were that important.

[–] [email protected] 1 points 11 months ago (1 children)

There are barely even any monitors anyway.

it’s like nvidia and the consoles: AMD can do whatever they want but the market penetration isn’t there until nvidia is onboard. Monitors are a low-margin high-volume business and you can’t support an advanced product that tops out at 10% addressable market.

Let alone when that brand’s customers are notoriously “thrifty”…

[–] [email protected] 1 points 11 months ago (3 children)

It's not just about what you need today, it's also about what you need in a couple years. If I pay $1600+ for a video card you can rest assured I expect it to be used for more than a couple years. Skimping on the ports seems like a bizarre choice.

[–] [email protected] 1 points 11 months ago (3 children)

So you need more than 165Hz, 10-bit, HDR, and 4K in a couple of years? Because that's what HDMI 2.1 on a 4090 is running for me. I agree they could have done better on the ports, but for the majority of users HDMI 2.1 has enough bandwidth, tbh.
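Quick sanity check of that mode (my own arithmetic, with an assumed HDMI 2.1 usable rate after 16b/18b FRL coding, and blanking overhead ignored):

```python
# 4K, 165 Hz, 10-bit RGB, uncompressed (ignoring blanking).
need = 3840 * 2160 * 165 * 10 * 3 / 1e9   # Gbit/s
hdmi_2_1_usable = 48 * 16 / 18            # ~42.7 Gbit/s after FRL coding
print(f"need ~{need:.1f} Gbit/s, HDMI 2.1 carries ~{hdmi_2_1_usable:.1f}")
# ~41.1 vs ~42.7 -> it just barely fits uncompressed; anything faster needs DSC.
```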

[–] [email protected] 1 points 11 months ago (2 children)

Not just need, but be capable of driving, too. Even a 4090 wouldn't be able to run most games at the resolutions and refresh rates we're talking about, and I doubt someone buying an insanely expensive monitor and the most expensive consumer GPU on the planet would then play games on low/mid settings.

[–] [email protected] 1 points 11 months ago

Maybe there's someone who needs 8K text clarity for web browsing, or someone doing productivity work who just happens to need 240Hz? Competitive Excel pros? :P

[–] [email protected] 1 points 11 months ago (3 children)

But but but there's that one Samsung monitor everyone trots out every time the subject is brought up! Obviously every 7000 series owner has one, though I can't fathom what game even a 7900XTX could drive at 4K super ultrawide at 240Hz.

[–] [email protected] 1 points 11 months ago (5 children)

I wonder why everyone holds AMD's DisplayPort 2.1 in such high regard. It's barely more bandwidth than HDMI 2.1, since it's not the full UHBR20 at 80 Gbps.

It's UHBR13.5 at 54 Gbps on RDNA 3 vs 48 Gbps HDMI 2.1 on Ada GPUs.
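The raw 54 vs 48 Gbps numbers aren't quite apples-to-apples, though, because the two links use different line codings. A rough sketch, assuming the standard encoding efficiencies (128b/132b for DP 2.1, 16b/18b for HDMI 2.1 FRL):

```python
# Compare usable throughput after line-coding overhead (assumed figures).
DP_UHBR13_5_RAW = 54.0   # Gbit/s, 4 lanes x 13.5
HDMI_2_1_RAW = 48.0      # Gbit/s, FRL, 4 lanes x 12

dp_effective = DP_UHBR13_5_RAW * 128 / 132    # 128b/132b coding -> ~52.4
hdmi_effective = HDMI_2_1_RAW * 16 / 18       # 16b/18b coding   -> ~42.7

print(f"DP 2.1 UHBR13.5: ~{dp_effective:.1f} Gbit/s usable")
print(f"HDMI 2.1 FRL:    ~{hdmi_effective:.1f} Gbit/s usable")
print(f"advantage: {dp_effective / hdmi_effective - 1:.0%}")  # ~23%
```

So in usable payload terms the gap is closer to ~23% than the ~12.5% the raw rates suggest.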

[–] [email protected] 1 points 11 months ago (3 children)

There was some kind of recent issue with a big new monitor requiring 2.1, right? I remember people rubbing it in Nvidia's face over it. This article just reads funny, like dunking on Nvidia for not putting money into something that had literally zero tech available for it in the foreseeable future, and then laughing at them for adding it when there's finally tech to use it.

[–] [email protected] 1 points 11 months ago (1 children)

Probably because people expected a $1000+ GPU to have a year's worth of incredibly foreseeable future-proofing built into it.

[–] [email protected] 1 points 11 months ago (1 children)

That would be silly of Nvidia, giving you things like VRAM and connections to plug in your monitor. How else will they sell you an expensive card every 2 years?

[–] [email protected] 1 points 11 months ago (1 children)

Does no other GPU maker other than Nvidia make 8GB GPUs?

Does anyone offer more than 24GB for gamers?

[–] [email protected] 1 points 11 months ago (1 children)

RX 7600 8GB, Arc A750 8GB?

[–] [email protected] 1 points 11 months ago (2 children)

Samsung Neo G9 (G95NC) 57". It's a dual 4K width (7680x2160) at 240Hz, but existing Nvidia cards can only drive it at up to 120Hz. Radeon 7000 series can do the full 240Hz at native resolution.

https://www.displayninja.com/samsung-s57cg95-review/
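For scale, here's the raw math on that panel (my own estimate, ignoring blanking; the 3:1 DSC ratio is an assumed typical figure):

```python
def uncompressed_gbps(w, h, hz, bpc=10):
    """Raw RGB bitrate in Gbit/s, ignoring blanking overhead."""
    return w * h * hz * bpc * 3 / 1e9

need = uncompressed_gbps(7680, 2160, 240)   # dual-4K width, 240 Hz, 10-bit
print(f"~{need:.1f} Gbit/s uncompressed")   # ~119.4
# Even DP 2.1 UHBR20 (80 Gbit/s raw) can't carry that uncompressed, so the
# monitor relies on DSC; at a typical ~3:1 ratio the stream drops to roughly
# 40 Gbit/s, which a UHBR13.5 link can handle.
print(f"with ~3:1 DSC: ~{need / 3:.1f} Gbit/s")
```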

[–] [email protected] 1 points 11 months ago (5 children)

Because HDMI is a pain in the ass and everybody who is using a PC is expecting to use DisplayPort.

[–] [email protected] 1 points 11 months ago (2 children)

Which monitor are you running dp on right now?

[–] [email protected] 1 points 11 months ago (2 children)

I have 5 PCs and 8 monitors in my home and they are all using DP except one that's really old and uses DVI. DP is the standard connector for PC monitors.

[–] [email protected] 1 points 11 months ago (1 children)

What do you think most people globally use, DP or HDMI?

[–] [email protected] 1 points 11 months ago

On PC monitors? DP.

[–] [email protected] 1 points 11 months ago

Everything in my home uses HDMI.

I also don't use monitors. LG CX OLED as main monitor, sometimes hook a PC up to the regular LED bedroom tv.

No interest in using displayport, HDMI is fine for the vast majority of use cases. Not sure why we need competing connectors. Frankly, not sure why we don't all just switch to USB-C long term.

There's just no reason to not standardize with USB-C for every display.

[–] [email protected] 1 points 11 months ago

Most of them? What kind of question is this?

[–] [email protected] 1 points 11 months ago (2 children)

The only time you use HDMI is when you want to connect your PC to a projector or a surround system, because the standard there is HDMI.

[–] [email protected] 1 points 11 months ago

I'm using it to connect my PC to my OLED TV and AV receiver. Way better than a monitor to me.

[–] [email protected] 1 points 11 months ago

They don’t care about that detail

[–] [email protected] 1 points 11 months ago (1 children)

48 vs 54 is roughly 12.5% more bandwidth. In the PC hardware world, 12% isn't often considered very close.

That 12.5% increased bandwidth allows this 8K monitor to be run easily at 240Hz with 10-bit color and DSC.

[–] [email protected] 1 points 11 months ago (2 children)

I sure hope Blackwell has DP 2.1, anything else would be a bad joke. Ada should have already had it.

[–] [email protected] 1 points 11 months ago

I find it quite strange Nvidia didn't try to upsell it in the Ada GPUs, like only offering DP 2.1 on AD102 and AD103, with the lower end getting support next gen.

[–] [email protected] 1 points 11 months ago (1 children)

Great. Now add more VRAM across the stack.

[–] [email protected] 1 points 11 months ago

I think this is just about a given, tbh. Nvidia GPUs are going to benchmark like shit in 2025 with low VRAM totals, now that the games needing more VRAM are actually out.

Although that's mostly going to be a thing for the mid-range (and low-end if they return to that market). For 4070+ GPUs VRAM is weirdly low, but doesn't seem to cause any problems yet (and honestly I don't see it being a problem until the next console gen).

[–] [email protected] 1 points 11 months ago

Can NVIDIA start normalizing larger VRAM capacities (>12GB VRAM)?

I'd gladly pay for a card with 20GB of VRAM. The closest is a 4090 but it's like $3K AUD over here.

[–] [email protected] 1 points 11 months ago

In another year a GPU will use DP 2.1! Breaking News!

[–] [email protected] 1 points 11 months ago

Tech articles would have me believe this whole lack of DisplayPort 2.1 was a bigger problem with the 40 series than it was in reality.

[–] [email protected] 1 points 11 months ago (3 children)

These will go nicely with DisplayPort 2.1 monitors that should be available by 2030, or perhaps even earlier.

[–] [email protected] 1 points 11 months ago

There are already many available now, most notably the Samsung 57" ultrawide

[–] [email protected] 1 points 11 months ago (3 children)

All fine and good. As I work with video, I'm more interested in them adding HW acceleration of HEVC 10-bit 4:2:2, the codec a lot of cameras today shoot in: Sony, Canon, DJI and more. Intel Arc is the only GPU (and CPU with iGPU) that supports it as of now. Supported since Intel 10th Gen, so well before the 30-series cards.
