441
submitted 1 month ago by [email protected] to c/[email protected]
[-] [email protected] 134 points 1 month ago

ANTI UPGRADE?? WHAT THE FUCK

[-] [email protected] 189 points 1 month ago

Intel is well known for requiring a new board for each new CPU generation, even if it is the same socket. AMD on the other hand is known to push stuff to its physical limits before they break compatibility.

[-] [email protected] 27 points 1 month ago

But why? Did Intel make a deal with the board manufacturers? Is this a tradition from the days when they built boards themselves?

I thought they just didn't care and wanted as little restrictions for their chip design as possible, but if this actually works without drawbacks, that theory is out the window.

[-] [email protected] 46 points 1 month ago

Just another instance of common anti-consumer behavior from multi billion dollar companies who have no respect for the customers that line their pockets.

[-] [email protected] 20 points 1 month ago

They used to dominate the consumer market prior to Ryzen, so that might have something to do with it, but I got no evidence lol

[-] [email protected] 16 points 1 month ago

Intel also sells the chipset and the license to the chipset software; the more boards get sold, the more money they make (as well as their motherboard partners, who also get to sell more, which encourages more manufacturers to make Intel boards and not AMD)

[-] [email protected] 8 points 1 month ago* (last edited 1 month ago)

There are many motherboard manufacturers but only 2 CPU manufacturers (for PC desktop). Board makers don't "make deals" so much as have the terms dictated to them. Even graphics card manufacturers made them their bitch back when multi-GPU was a thing - the board makers were the ones who had to sell the CrossFire/SLI licensing on their motherboards.

[-] [email protected] 2 points 1 month ago

guess who sells the chipsets to the motherboard manufacturers

[-] [email protected] 42 points 1 month ago

They've been pulling this shit since the early days. Similar tricks were employed in the 486 days to swap out chips, and again in the Celeron days. I think they switched to the slot style intentionally to keep selling chips to a point lol

[-] [email protected] 17 points 1 month ago* (last edited 1 month ago)
[-] [email protected] 7 points 1 month ago

That's why we are in dire need of open source hardware.

[-] [email protected] 9 points 1 month ago

We have open source designs (RISC-V, and there are open GPU designs too), but we don't have open source manufacturing capacity yet

[-] [email protected] 5 points 1 month ago

Are there any projects to develop that capability that you know of?

[-] [email protected] 2 points 1 month ago

No, there isn't yet. That's the most I could find, but it's not machines

[-] [email protected] 3 points 1 month ago

I dream of a world where the process gets cheap enough, like PCB fabrication, where you can just submit the design you want and they'll fab it out for you.

With more players coming into the game because of sanctions, I hope we're now on that path.

[-] [email protected] 3 points 1 month ago

Yes, I hope so too. For now, semiconductor lithography at home is impossible due to how big and complex these machines are, so I share your opinion.

[-] [email protected] 2 points 1 month ago

https://www.cia.gov/readingroom/docs/DOC_0000498114.pdf

Soviet Computer Technology: Little Prospect for Catching Up

We believe that there are many reasons why the Soviets trail the United States in computer technology:

  • The Soviets' centrally-planned economy does not permit adequate flexibility to design or manufacturing changes frequently encountered in computer production; this situation has often resulted in a shortage of critical components, especially for new products.

[-] [email protected] 11 points 1 month ago* (last edited 1 month ago)

If your only response to criticism of capitalism is ((communism)), you may just be a cog in the corporate propaganda machine.

[-] ZombiFrancis 3 points 1 month ago

I mean, they went with a literal CIA link.

[-] [email protected] 0 points 1 month ago

Thanks for the link to the unbiased study by... the CIA? Huh. Yeah I trust them.

[-] [email protected] 1 points 1 month ago

The paper was from 1985. Was the CIA correct?

[-] ZombiFrancis 2 points 1 month ago

Marginally. The paper analyzes the capabilities as they existed in the 1980s, but doesn't draw strong conclusions as to why that may be. It does demonstrate how reliance on central planning results in inadequacies when said central planning is not operating well, though.

The paper doesn't really mention it but the central planning of the USSR was actively reeling from Brezhnev dying, Andropov dying, and Chernenko either dying or about to die at the time the CIA thing was written. So yeah, correct is an accurate if imprecise way to put it.

[-] [email protected] 1 points 1 month ago

Yeah it’s more a criticism of the ussr in the 80s. Central planning with more tech focus and more democracy would likely not face that specific issue.

But also there’s room for shit like kanban communism which definitely wouldn’t have these problems

[-] [email protected] 14 points 1 month ago

IIRC, the slot CPU thing was because they wanted to get the cache closer to the processor, but hadn't integrated it on-die yet. AMD did the same thing with the original Athlon.

On a related note, Intel's anticompetitive and anti-consumer tactics are why I've been buying AMD since the K6-2.

[-] [email protected] 5 points 1 month ago* (last edited 1 month ago)

They had integrated the L2 on-die before that already with the Pentium Pro on Socket 8. IIRC the problem was the yields were exceptionally low on those Pentium Pros and it was specifically the cache failing. So every chip that had bad cache they had to discard or bin it as a lower spec part. The slot and SECC form factor allowed them to use separate silicon on a larger node by having the cache still be on-package (the SECC board) instead of on-die.

[-] [email protected] 2 points 1 month ago

AMD followed suit for the memory bandwidth part from the K6-2 architecture. Intel had no reason to do so.

[-] [email protected] 4 points 1 month ago* (last edited 1 month ago)

It's been at least since the "big iron" days.

Technician comes out to upgrade your mainframe and it consists of installing a jumper to enable the extra features. For only a few million dollars.

[-] [email protected] 32 points 1 month ago

But otherwise upgrade parts would be too affordable!

[-] [email protected] 10 points 1 month ago
[-] [email protected] 5 points 1 month ago

No, "user security".

this post was submitted on 01 Jun 2024
441 points (99.6% liked)
