corruptboomerang

joined 1 year ago
[–] [email protected] 1 points 11 months ago

Yeah, even comparing the Nikon 50mm f/1.4D to the 50mm f/1.4G, the difference in image quality is night and day. The D is also tiny compared to the G. Unfortunately, it's a somewhat immutable fact of physics that good quality optics are big and heavy.

[–] [email protected] 1 points 11 months ago

Mate, I'm still riding (or dying) with my 4700K! I'm on my third motherboard for it, but they're super cheap now.

And yes, I'm probably going to be skipping LGA1700 (as well as LGA1200). Unless you count laptops, in which case I've got a 6th Gen and I'm looking at adding an 8th Gen.

[–] [email protected] 1 points 11 months ago

From memory, this project is somewhat abandoned, but it's a cool idea.

[–] [email protected] 1 points 11 months ago (1 children)

Varies greatly based on the encoder, NVENC vs CPU etc.

I'm not asserting this isn't the case (I've not noticed it myself), but I can't see why it would matter for the actual encoding. Decoding is where I've seen it make a difference, and that's mostly the pre-Skylake iGPUs using a poor implementation of QuickSync.
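
If it helps, here's a rough sketch of the two encode paths with ffmpeg driven from Python. This assumes an ffmpeg build with NVENC support; the filenames and quality settings are just placeholders:

```python
import subprocess

def encode(src: str, dst: str, use_nvenc: bool = False) -> None:
    """Re-encode src to H.264 either on the GPU (NVENC) or on the CPU (libx264)."""
    if use_nvenc:
        # Hardware path: fast and light on the CPU; quality depends on the NVENC generation.
        video_args = ["-c:v", "h264_nvenc", "-preset", "p5", "-cq", "23"]
    else:
        # Software path: slower, but the output is the same no matter which CPU runs it.
        video_args = ["-c:v", "libx264", "-preset", "medium", "-crf", "23"]
    subprocess.run(["ffmpeg", "-i", src, *video_args, "-c:a", "copy", dst], check=True)

encode("input.mkv", "out_cpu.mp4", use_nvenc=False)
encode("input.mkv", "out_gpu.mp4", use_nvenc=True)
```

The point being that any quality difference comes from which encoder you pick, not which CPU happens to be running the software path.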

[–] [email protected] 1 points 11 months ago

Honestly, for normal-people stuff, a hard drive is a hard drive. If it's a hardcore, high-performance database on an always-spinning ZFS pool, then MAYBE it'll matter. But for just storing data, a normal use case, heck even a heavy one like photo/video storage where you're caching on an SSD for editing but fairly intensely reading/writing back to the drive, it's fine.

The only times it'll probably matter: if it's someone else's money, just get the expensive drives so you don't get blamed (enterprise); if it's some super intense database or something; or for security systems, where there are some benefits to a drive designed to be CONSTANTLY written to.

[–] [email protected] 1 points 11 months ago

Not really. The only difference is the lack of an iGPU, in theory MAYBE some lower heat output, and perhaps some cache differences, but they're all more theoretical than real. I've never seen anyone even measure a difference between the iGPU and non-iGPU versions.

Really, an iGPU is a great power saver and can improve performance on low-intensity tasks like video playback, plus you can offload a secondary monitor and background tasks to it. All in all, an iGPU is an objective upgrade in the real world.

Like, MAYBE if you're extremely cache- and RAM-limited it MIGHT in theory make a difference, but far, far more factors matter than iGPU vs non-iGPU.

[–] [email protected] 1 points 11 months ago (4 children)

I mean, if you're creative enough, probably nothing.

This is kinda like asking what you can do on a lathe that you can't do on a mill. It's more about what's better suited to one or the other.

CPUs are more generalised; they have a deep and complex instruction set and feature list. GPUs are shallower and far more specialised, but they chew through tasks that parallelise more readily... like calculating a metric shitload of triangles.

You can see CPUs being used to 'push pixels' in older computers, since that's all they had.
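
To make that concrete, here's a minimal sketch of the sort of data-parallel job a GPU is built for, written with NumPy on the CPU just to show the shape of the problem (the array layout and function name are my own illustration):

```python
import numpy as np

def triangle_areas(verts: np.ndarray) -> np.ndarray:
    """Areas of N triangles given as an (N, 3, 3) array of xyz vertices."""
    # Every triangle needs the same few operations and none depends on another,
    # which is exactly the kind of work that spreads across thousands of GPU cores.
    a = verts[:, 1] - verts[:, 0]      # one edge vector per triangle
    b = verts[:, 2] - verts[:, 0]      # another edge vector per triangle
    cross = np.cross(a, b)             # one cross product per triangle
    return 0.5 * np.linalg.norm(cross, axis=1)

tris = np.random.rand(1_000_000, 3, 3)   # a metric shitload of triangles
print(triangle_areas(tris)[:5])
```

A CPU will happily do this too, it's just that a GPU can throw thousands of simple cores at it at once.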

[–] [email protected] 1 points 11 months ago (1 children)

Can I point out that 'Smart Devices' have stopped trying to be smart and are now just trying to harvest data?

[–] [email protected] 1 points 11 months ago

VPN, cloud storage, cloud hosting.