this post was submitted on 03 Jun 2024
1296 points (96.4% liked)

Technology

(page 3) 50 comments
[–] [email protected] 8 points 1 year ago (1 children)

Do you remember when you could put your Mac to sleep, and when you woke it up a few days later, the battery would barely have dropped? Not now, because your computer never really sleeps anymore.

I assume the Mac has some kind of hibernation function, and that it will reduce the battery drop to effectively zero.

[–] [email protected] 6 points 1 year ago (4 children)

Waking from hibernation is sooo much slower than waking from sleep. Apple silicon Macs are very efficient in their S0 standby, so they'll go days before entering hibernation. Kinda odd to bring that up now that Apple has fully transitioned to ARM machines, where this isn't really an issue. That said, S0 standby on this 2019 MacBook I have for work is dogshit.
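
For anyone curious how their own machine is set up, here's a minimal sketch (not from the thread; it assumes macOS's stock pmset utility, and the exact setting names vary by macOS version) that reads the current hibernation and standby settings:

```python
# Rough sketch: inspect macOS sleep/hibernation settings by shelling out to pmset.
# Assumes macOS with the standard pmset utility on PATH.
import subprocess

def power_settings() -> dict[str, str]:
    """Parse `pmset -g` output into a {setting: value} dict."""
    out = subprocess.run(["pmset", "-g"], capture_output=True, text=True, check=True)
    settings = {}
    for line in out.stdout.splitlines():
        parts = line.split()
        if len(parts) >= 2:
            settings[parts[0]] = " ".join(parts[1:])
    return settings

if __name__ == "__main__":
    s = power_settings()
    # hibernatemode 3 = RAM kept powered plus a disk image (fast wake, laptop default);
    # 25 = power off RAM entirely, i.e. a full hibernate.
    print("hibernatemode:", s.get("hibernatemode"))
    # The standby delay settings control how long the Mac sits in sleep before hibernating.
    print("standby settings:", {k: v for k, v in s.items() if k.startswith("standby")})
```

The standby delay is what decides how many days of sleep you get before the (slow) hibernate wake kicks in.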

[–] [email protected] 7 points 1 year ago

AI has people questioning Windows use.

Car systems ratting drivers out has people questioning car use.

Not the way I expected to reach some of my desired ends but I'll take it. 🤔

[–] [email protected] 6 points 1 year ago

Until big tech needs 2% more systems to squeeze out 2% more money...

[–] [email protected] 6 points 1 year ago (6 children)

What happens when I, a potential new Linux user, need to search for how to make something work on Linux and, thanks to SEO and AI-driven/created search results, I can't find the solution?

[–] [email protected] 8 points 1 year ago

Well, you already know how to find this place, so find a Linux-themed instance and either ask your question or, better yet, post a "guide" telling people to resolve your problem with some wrong method you've already tried, so that someone else calls you an idiot and posts the correct answer out of spite.

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago)

how to do <objective> lemmy

[–] [email protected] 5 points 1 year ago (12 children)

I would hope that Apple would aim their AI more at iOS and leave macOS alone :-|. If not, I would consider finally leaving it, if the AI features could not be turned off (which they likely could be... at first, for a while).

Oh man, the thought strikes me: how will crucial systems like DoD Windows machines maintain integrity, if people can exploit those gigantic loopholes to basically have the OS be a keylogger? It's not enough for me to use secure systems at home, if those in charge of our nation's defense (especially nuclear!?) do not.

[–] [email protected] 5 points 1 year ago (1 children)

Ehh, I have a different vision here - AI is useful; it's just going down the hypermonetisation path at the moment. It's not great because your data is being scraped and used to fuel paywalled content - that is largely why most folks object.

It's also badly implemented, and it drains a lot of system resources when plugged into an OS for little more than a showy web search.

Eventually, after a suitable lag, we'll see Linux AI as the AI we always wanted: a local, reasonably resource-intense option.

The real game changer will be a shift towards custom hardware for AIs (they're just huge probability models doing a lot of repetitive, similar calculations). At the moment we use GPUs because they're the best option for those calculations. As specialist hardware is developed and gets cheaper, we'll see more local models and thus more Linux AI goodness.
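
To make the "huge probability models" point concrete, here's a toy sketch (not from the comment; the sizes and random weights are invented and nothing like a real architecture) of the calculation that dominates inference, and that GPUs, and eventually cheaper specialist hardware, exist to speed up:

```python
# Toy illustration: LLM-style inference is mostly the same dense matrix multiply,
# repeated layer after layer, ending in a softmax over next-token probabilities.
# All shapes and weights here are made up purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
hidden_size, vocab_size, n_layers = 1024, 32000, 24

x = rng.standard_normal(hidden_size)                       # current hidden state
layers = [rng.standard_normal((hidden_size, hidden_size))  # stand-in weights per layer
          for _ in range(n_layers)]
unembed = rng.standard_normal((hidden_size, vocab_size))   # projection to vocabulary logits

for w in layers:            # the "repetitive similar calculations":
    x = np.tanh(x @ w)      # one big matrix multiply per layer, plus a cheap nonlinearity

logits = x @ unembed
probs = np.exp(logits - logits.max())
probs /= probs.sum()        # softmax: the model's next-token probability distribution
print("most likely token id:", int(probs.argmax()))
```

Every layer is essentially the same multiply-accumulate pattern over and over, which is exactly the sort of workload purpose-built accelerators can do with far less power than a general-purpose GPU.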
