this post was submitted on 16 Oct 2024
This is a sign of ARM approaching the "enough" level. I remember the times when it was actually important to buy the latest PC at least every other year to have enough power to run a basic office suite or similar programs with acceptable speed.
Nowadays, you can staff offices with about any PC off the shelf - it is powerful and big enough to fulfill the needs of the majority of users. Of course there are servers, there are power users, engineers running simulations, and of course gamers who need more power, and who still fuel the cutting edge of PC building. But the masses don't need to be cutting edge anymore. A rather basic machine is enough.
Here comes ARM: for many years, ARM-based chips were used as SoCs, running anything from washing machines to mobile phones. But they have grown bigger and faster, and I can see them approaching the point where they can cover the basic needs of the average office and home user - which would be a damn big chunk of the market. They would be enough for those needs, but cheaper and in many aspects less troublesome than Intel and AMD. Take for example power consumption in relation to computational power, where ARM is way better than the old and crusty x86 architecture. And less power means lower cooling requirements, making the machines smaller, more energy efficient, and less noisy.
I can see ARM-based systems approaching this "enough" level, and I can see that Intel and AMD are deadly afraid of that scenario.
This!
I have wondered for a long time when we'll hit that ceiling (SSD size, CPU power, RAM, ...) and I think it's about now. The lack of exciting PC hardware news these days is another sign, IMO.
I also wondered for a long time why I shouldn't have a mobile-phone PC, or more like "where are they?". I have an old Xiaomi Redmi Note 9 Pro: 4+4 cores with 6+2 GB RAM (whatever that +2 means), 128 GB storage and, well, graphics. And not expensive.
It could be an OK home computer.
A little bit of interesting times ahead!
I think it will not take long until there is a cell phone/PC hybrid: you plug your cell phone into a base and can use it with a normal desktop interface on a screen with mouse and keyboard. A bit like the Nintendo switch.
Samsung DeX? My Galaxy Note 9 already has it.
The Steam Deck is a similar concept
But my prediction for the majority of users is that the device will just connect to a VDI infrastructure that you pay for monthly.
I mean, ARM chips have been at that level of performance for at least a decade by now. Normal people's most strenuous activity is watching YouTube, which every cellphone since, what, 2005 could do.
The thing is that's very much not the actual situation for most people.
Only Apple really has high performance, very low power ARM chips you can't really outclass.
Qualcomm's stuff is within single-digit percentage points of the current-gen AMD and Intel chips in power usage, performance, and battery life.
I mean, that's a FANTASTIC achievement for a 1st gen product, but like, it's not nearly as good as it should be.
The problem is that the current tradeoff is that huge amounts of the software you've been using just does not work, and a huge portion of it might NEVER work, because nobody is going to invest time in making it behave.
(Edit: assuming the software you need doesn't work in the emulation layer, of course.) You might get Photoshop, but you won't get that version of CS3 you actually own updated. You might get new games, but you probably won't get that 10 year old one you like playing twice a year. And so on.
The future might be ARM, but only Apple has a real hat in the ring, still.
(Please someone make better ARM chips than Apple, thanks.)
Back in June, the new Snapdragon X processors were a lot more efficient than their x86-based counterparts. I can personally attest to much lower levels of heat generation.
I agree with the sentiment, but IMO this is a PC and Windows problem. I would also extend this beyond pure compatibility. I say this for a few reasons
All that said, I've had zero issues with emulation so far. I never personally used an M1 Max when they launched, but from reports of that era the current Windows experience is at least as good as that.
Valve is currently working on some tech to let x86 games work on arm64.
Cannot wait to see their solution.
That is a long shot at best. Games are hungry for power and resources, and adding an emulation layer, even a transpilation system between x86 code and ARM processors, will not actually improve the situation.
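To make the "transpilation" idea concrete, here is a toy sketch of static instruction translation, the basic idea behind x86-to-arm64 layers like Apple's Rosetta 2 or Microsoft's Windows-on-ARM emulator. Real translators operate on machine code, handle CPU flags and memory-ordering differences, and JIT-compile hot paths; this illustrative example (all names and mappings are my own, not from any real tool) only maps a few simplified assembly mnemonics:

```python
# Toy x86 -> AArch64 instruction translator (illustration only).
# A real binary translator decodes machine code, tracks condition
# flags, and recompiles hot code paths at runtime; this just maps a
# handful of hand-picked textual instructions one-to-one.

X86_TO_A64 = {
    ("mov", "eax", "1"): "mov w0, #1",      # load immediate into a register
    ("add", "eax", "2"): "add w0, w0, #2",  # AArch64 needs explicit dest + sources
    ("ret",):            "ret",
}

def translate(block):
    """Translate a list of simplified x86 instructions to AArch64 text."""
    out = []
    for ins in block:
        key = tuple(ins.replace(",", "").split())
        if key not in X86_TO_A64:
            raise NotImplementedError(f"no mapping for {ins!r}")
        out.append(X86_TO_A64[key])
    return out

print(translate(["mov eax, 1", "add eax, 2", "ret"]))
# -> ['mov w0, #1', 'add w0, w0, #2', 'ret']
```

The point the comment above makes still stands: every layer like this adds overhead, which is why games (already pushing the hardware) are the hardest case for it.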
Windows* runs on ARM. Microsoft Surfaces use ARM processors. Even with Windows apps being emulated... it's basically enough already.
*Windows RT, but Microsoft dropped the differentiator this time around.
It's not just the Surface devices anymore. In June this year, a fresh wave of ARM-powered laptops launched from a variety of different OEMs. There are offerings from Dell, HP, Lenovo, Samsung, Asus, etc.
It always drives me insane when I have to spec out a $4k system for execs that use it mostly to browse Facebook and LinkedIn. At least the devs get the same systems.
I've seen worse. A group at the university was using the IBM mainframe for basically everything from their terminals. To reduce load on the mainframe, the university spent a load of money to buy a cluster of workstations with crazy specs and software, each one more expensive than a big new car back then.
I visited them shortly after they got those killer machines. For comparison: in our university department, we had green serial terminals connected to an old VAX 11/780. They had those shiny new workstations with GUI on high-resolution (for that time) color monitors. My friend there logged in - and his autostart just opened two terminal sessions on the IBM mainframe, where he did all his work just like before. He was happy that he had the terminals in a windowed environment, though, so he could easily open and handle several sessions on the machine at the same time.
Jira is the new reason we need to constantly upgrade
And EDR solutions
I own a Lenovo Yoga Slim 7x Gen 9, which is powered by a Snapdragon X. It certainly checks the "good enough" box. I use it primarily for photo culling/editing (I'm a holdout dedicated camera user). It is more than fit for purpose there, stays cool, is slim, and although I know the fan has come on a few times, I wouldn't have known if it wasn't on my lap. When I bought mine, it was also one of the better deals - you could upgrade to 32 GB of memory and a bigger SSD for under $125 in total. The SSD also isn't soldered, but the memory is. The 3K OLED display is amazing, but if you want the ultimate battery sipper it's probably not the best choice. I still get tons of runtime per charge, but am somewhat sad that I lose about 5% charge per day thanks to the laptop not really being off while asleep.
The biggest downside is that Linux support is very hit and miss depending on the laptop in question, which means you're tied to Windows 11. I don't have the time to tinker with it, so I haven't looked much further into it than this.