this post was submitted on 22 Oct 2023
200 points (93.1% liked)

Google Pixel

[–] [email protected] 47 points 10 months ago (3 children)

ITT people who don't understand that generative ML models for imagery take up TB of active memory and TFLOPs of compute to process.

[–] [email protected] 21 points 10 months ago* (last edited 10 months ago) (3 children)

That's wrong. You can do it on your home PC with Stable Diffusion.

[–] [email protected] 20 points 10 months ago

And a lot of those require models that are multiple gigabytes in size, which then need to be loaded into memory and processed on a high-end video card. A card like that would generate enough heat to ruin your phone's battery if it could somehow be shrunk to fit inside a phone. This just isn't feasible on phones yet. Is it technically possible today? Yes, absolutely. Are the tradeoffs worth it? Not for the average person.
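
"Multiple gigabytes" checks out with some back-of-the-envelope arithmetic. A sketch, using approximate public parameter counts for the Stable Diffusion 1.x components (these figures are estimates, not measurements of any specific checkpoint):

```python
# Rough VRAM estimate for Stable Diffusion 1.x weights.
# Parameter counts below are approximate public figures, not measured values.

BYTES_FP32 = 4  # 32-bit float
BYTES_FP16 = 2  # 16-bit float

params = {
    "unet": 860_000_000,          # ~860M parameters
    "vae": 84_000_000,            # ~84M parameters
    "text_encoder": 123_000_000,  # ~123M parameters (CLIP ViT-L/14 text model)
}

total_params = sum(params.values())  # ~1.07B parameters

def to_gb(n_bytes: int) -> float:
    return n_bytes / 1024**3

weights_fp32_gb = to_gb(total_params * BYTES_FP32)  # ~4.0 GB
weights_fp16_gb = to_gb(total_params * BYTES_FP16)  # ~2.0 GB

print(f"fp32 weights: {weights_fp32_gb:.1f} GB")
print(f"fp16 weights: {weights_fp16_gb:.1f} GB")
```

Weights alone at full precision land around 4 GB, before counting activations during inference, which is why half-precision loading is the usual trick for fitting these models on consumer cards.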

[–] [email protected] 3 points 10 months ago* (last edited 10 months ago)

You can, for example, run some upscaling models on your phone just fine (I mentioned the SuperImage app in the photography tips megathread). Yes, the most powerful and memory-hungry models need more RAM than a phone can offer, but it's a bit misleading if Google doesn't say that those are being run in the cloud.

[–] mindbleach 2 points 10 months ago

Stable Diffusion runs in 4 GB of VRAM.