this post was submitted on 28 Nov 2023
1 points (100.0% liked)

Hardware


A place for quality hardware news, reviews, and intelligent discussion.

top 29 comments
[–] [email protected] 1 points 11 months ago (1 children)

I never thought Intel would get rid of Arc dGPUs, but now the rise of AI could be a big, solid reason for shareholders to keep Arc going anyway.

I don't see AMD beating Intel in AI

[–] [email protected] 1 points 11 months ago (1 children)

Do we have any clue when to expect Battlemage?

[–] [email protected] 1 points 11 months ago

The tentative prediction is an announcement at CES, with wider availability in Q2.

[–] [email protected] 1 points 11 months ago (2 children)

Praying for a three-way fight in the midrange.

[–] [email protected] 1 points 11 months ago

They need to beat the 4070 at the beginning of the cycle, or at least reach 4070 Ti level mid-cycle. And that's just the old x070 non-Ti performance tier to begin with.

[–] [email protected] 1 points 11 months ago

Praying for a two-way fight at the top end.

[–] [email protected] 1 points 11 months ago (2 children)

Arc is realistically a bigger threat to AMD than it is to Nvidia. The second half of the 2020s will be AMD and Intel competing over second place for desktop dGPUs.

For mobile, Arc iGPUs, while obviously not matching dedicated GPUs, can realistically offer good-enough performance for people who only want to do light gaming; stepping up to a low-end dGPU just to make sure Minecraft, Fortnite, etc. can at least run may not be worth the extra cost.

Either way, I think Intel's heavy focus on putting Arc in all of their Core Ultra CPUs as the iGPU can be a potentially bigger disruptor than their desktop dGPUs, at least in the near term.

[–] [email protected] 1 points 11 months ago

Not to mention that the inclusion of XMX cores in Arrow Lake, and presumably beyond, could provide XeSS upscaling similar to what DLSS does, all without a dGPU.

[–] [email protected] 1 points 11 months ago (2 children)

Arc is no threat whatsoever to Nvidia, not unless Intel manage to scale up the architecture to enterprise-grade levels and develop something akin to the CUDA API.

[–] [email protected] 1 points 11 months ago

They have a better chance of doing it than AMD do.

[–] [email protected] 1 points 11 months ago (2 children)

Intel's competitor to CUDA is oneAPI and SYCL. Intel poses no threat to Nvidia GPUs in datacenter in the near term, but that doesn't mean Intel won't still secure contracts.
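
For anyone unfamiliar with them, a minimal SYCL vector add looks roughly like this. It's just a generic illustration of the programming model, assuming a working oneAPI C++ toolchain; the default sycl::queue grabs whatever device is available, an Arc GPU included:

```cpp
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
    sycl::queue q;  // default selector: picks an available device (e.g. an Arc GPU)
    std::cout << "Running on: "
              << q.get_device().get_info<sycl::info::device::name>() << "\n";

    std::vector<float> a(1024, 1.0f), b(1024, 2.0f), c(1024, 0.0f);
    {
        // Buffers let the runtime manage host<->device data movement.
        sycl::buffer<float, 1> bufA(a.data(), sycl::range<1>(a.size()));
        sycl::buffer<float, 1> bufB(b.data(), sycl::range<1>(b.size()));
        sycl::buffer<float, 1> bufC(c.data(), sycl::range<1>(c.size()));
        q.submit([&](sycl::handler& h) {
            sycl::accessor A(bufA, h, sycl::read_only);
            sycl::accessor B(bufB, h, sycl::read_only);
            sycl::accessor C(bufC, h, sycl::write_only, sycl::no_init);
            h.parallel_for(sycl::range<1>(a.size()),
                           [=](sycl::id<1> i) { C[i] = A[i] + B[i]; });
        });
    }  // buffers go out of scope, so results are written back to the host vectors
    std::cout << "c[0] = " << c[0] << "\n";  // prints 3
}
```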

Intel's biggest threat to Nvidia is against Nvidia's laptop dGPU volume segment. Arc offers synergies with Intel CPUs, a single vendor for both CPU and GPU for OEMs, and likely bundled discounts for them as well. A renewed focus on improving iGPUs also threatens some of Nvidia's low-end dGPUs in laptops: customers no longer have to choose between a very poor iGPU and stepping up to a dGPU, and iGPUs will start to become good enough that some customers will simply opt not to buy a low-end mobile dGPU in coming years.

[–] [email protected] 1 points 11 months ago

Not to mention that Intel could have consumer AI tech in nearly -every- laptop sold in 5 years with just an Intel iGPU, plus mini-PCs etc., especially if LNL pans out well. That's a scale of deployability that Nvidia simply cannot compete with.

[–] [email protected] 1 points 11 months ago (1 children)

A renewed focus on improving iGPUs also threatens some of Nvidia's low-end dGPUs in laptops: customers no longer have to choose between a very poor iGPU and stepping up to a dGPU

AMD has had iGPUs in laptops for a long time now, and has had the better CPUs for several of the past few years, yet laptops are still sold with Nvidia dGPUs even when they have decent AMD iGPUs.

It might kill the lowest of the lowest end of laptop dGPUs, but I think Nvidia's pricing is doing that faster than Intel's success with Arc.

[–] [email protected] 1 points 11 months ago

The issue with AMD laptops is availability and the mixing of generations under similar SKU numbers. There's only a handful of Zen4 laptops in the wild, and they're mixed in with Zen2 and Zen3 parts, leading to a confusing experience for the average buyer. So, people will either go for an Intel laptop, or find an Nvidia dGPU laptop for the 'upgrade'.

[–] [email protected] 1 points 11 months ago (1 children)

Let's hope Intel continues. Nvidia seems to be concentrating on AI, which isn't good for gamers, and having a sole manufacturer left (AMD) as a monopoly wouldn't be good either.

[–] [email protected] 1 points 11 months ago (1 children)

Why isn't it good for games?

[–] [email protected] 1 points 11 months ago

"gamers"as s hole, just in case of misunderstanding.

Nvidia is earning a ton of money on AI hardware; they would be fools not to shift their manufacturing capacity more toward AI and away from consumer graphics cards. This will make graphics cards scarcer and more expensive. Just look at what the crypto boom did, and still does: graphics cards cost an arm and a leg, and it will get much, much worse with the AI boom.

[–] [email protected] 1 points 11 months ago (1 children)

People fighting in the comments about what Nvidia/AMD/Intel should or could do.

[–] [email protected] 1 points 11 months ago

Summed up, it's: Intel can't be Nvidia. AMD is screwed.

[–] [email protected] 1 points 11 months ago

So Smooth Sync is just not working in most games? Like, what if you tried it in something totally unexpected, like The Witcher 1 or 2? Something old, or something brand new? Is it a whitelist, where they select which games to enable it for, or a blacklist, where they disable it for certain games exhibiting problems?
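
For what it's worth, the difference between the two is just which way the default flips. A purely hypothetical sketch (none of these names come from Intel's actual driver):

```cpp
#include <string>
#include <unordered_set>

// Hypothetical illustration of the two policies in question.
// Whitelist: the feature is OFF unless the game was explicitly validated.
// Blacklist: the feature is ON unless the game is known to misbehave.
bool smooth_sync_enabled(const std::string& game_exe,
                         const std::unordered_set<std::string>& list,
                         bool list_is_whitelist) {
    const bool listed = list.count(game_exe) > 0;
    return list_is_whitelist ? listed : !listed;
}
```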

[–] [email protected] 1 points 11 months ago (2 children)

And still high idle power usage (can't get mine under 15 W); my Nvidia/AMD cards idle around 4-5 W even with a screen attached. (I'm using an A380 in a server for video encoding/decoding.)
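
If anyone wants to sanity-check their own idle numbers on Linux, the generic hwmon sysfs ABI exposes an energy counter in microjoules, and sampling it twice gives average watts. A rough sketch; the hwmon path below is a placeholder you'd have to locate for your own card:

```cpp
#include <chrono>
#include <fstream>
#include <iostream>
#include <thread>

// Read a hwmon energy counter (microjoules, per the sysfs hwmon ABI).
long long read_energy_uj(const char* path) {
    std::ifstream f(path);
    long long uj = 0;
    f >> uj;
    return uj;
}

int main() {
    // Placeholder path: find your GPU's hwmon node under /sys/class/hwmon/.
    const char* path = "/sys/class/hwmon/hwmon2/energy1_input";
    const long long e0 = read_energy_uj(path);
    std::this_thread::sleep_for(std::chrono::seconds(5));
    const long long e1 = read_energy_uj(path);
    // microjoules consumed over 5 seconds -> average watts
    std::cout << "avg power: " << (e1 - e0) / 5.0 / 1e6 << " W\n";
}
```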

[–] [email protected] 1 points 11 months ago (2 children)
[–] [email protected] 1 points 11 months ago (1 children)

iGPU just bypasses the dGPU. That shouldn’t be necessary.

[–] [email protected] 1 points 11 months ago

You can also do this with Nvidia and AMD GPUs; shutting off the dGPU when it's not in use is kinda the whole way laptops conserve battery.
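
On Linux you can watch this happen through runtime power management, which reports the device state in sysfs. A quick sketch; the PCI address below is a placeholder, so check lspci for your dGPU's:

```cpp
#include <fstream>
#include <iostream>
#include <string>

int main() {
    // Placeholder PCI address: substitute your dGPU's (see `lspci`).
    std::ifstream f("/sys/bus/pci/devices/0000:01:00.0/power/runtime_status");
    std::string status;
    f >> status;  // "suspended" means the kernel has powered the dGPU down
    std::cout << "dGPU runtime PM status: " << status << "\n";
}
```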

[–] [email protected] 1 points 11 months ago (1 children)

It's cool and all, but changing the whole platform just to help the GPU's idle power consumption sounds like a terrible solution.

[–] [email protected] 1 points 11 months ago

Especially when you can just buy an AMD or Nvidia GPU instead and get almost all the power savings that way.

[–] [email protected] 1 points 11 months ago

When Arc was first announced, I was pretty excited to get Intel's super-low-power iGPU and their GPU-splitting tech (GVT-g) in a discrete card. Then they canceled all future GVT-g development, and the cards crapped the bed on idle power consumption.

[–] [email protected] 1 points 11 months ago

This sub is just a site for this long-haired potato guy?

[–] [email protected] 1 points 11 months ago

Where is Intel's future, honestly, without GPUs? Second-rate CPUs? Manufacturing silicon for everyone else? I just don't see how Intel continues as an industry leader without expanding into other segments like GPUs. They have sold off a bunch of other businesses in the last few years. Maybe I just don't understand their business enough.