
Curated Tumblr


For preserving the least toxic and most culturally relevant Tumblr heritage posts.

The best transcribed post each week will be pinned and receive a random bitmap of a trophy superimposed with the author's username and a personalized message. Here are some OCR tools to assist you in your endeavors.

- web
- iOS
- Android

Don't be mean. I promise to do my best to judge that fairly.

[–] [email protected] 73 points 9 months ago (3 children)

Apple, afaik, used to be doing this on-device rather than in the cloud. Not quite sure about the situation today.
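
The comment doesn’t say which feature “this” is, but assuming it refers to the kind of on-device image-to-text recognition the community’s OCR tools do, here is a minimal sketch of what that local inference looks like with Apple’s Vision framework; the input path is made up, and the recognition request itself runs locally:

```swift
import Foundation
import ImageIO
import Vision

// Hypothetical input: any locally saved screenshot of a post.
let url = URL(fileURLWithPath: "/tmp/post.png")

guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
      let image = CGImageSourceCreateImageAtIndex(source, 0, nil) else {
    fatalError("couldn't load the image")
}

// Text recognition request; Vision evaluates it on the device.
let request = VNRecognizeTextRequest { request, _ in
    let observations = request.results as? [VNRecognizedTextObservation] ?? []
    for observation in observations {
        // Print the highest-confidence transcription of each detected line.
        if let best = observation.topCandidates(1).first {
            print(best.string)
        }
    }
}
request.recognitionLevel = .accurate

let handler = VNImageRequestHandler(cgImage: image, options: [:])
do {
    try handler.perform([request])
} catch {
    print("recognition failed: \(error)")
}
```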

[–] [email protected] 22 points 9 months ago (1 children)
[–] [email protected] 33 points 9 months ago (1 children)

I don’t. Corps gonna corp, if they can. But I’ve checked this using all the development, networking, and energy monitoring tools at my disposal, and Apple’s E2E and on-device guarantee does appear to hold. For now.

Still, those who can should audit periodically, even if they’re only doing it for the settlement.
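
A toy version of the kind of networking check being described: snapshot the per-interface byte counters before and after the supposedly on-device operation and compare. The totalNetworkBytes helper is invented for this sketch and is only a rough proxy; it assumes a quiet machine, and a real audit would use a packet capture or proxy instead.

```swift
import Darwin
import Foundation

// Crude check: sum the kernel's per-interface traffic counters.
// A genuinely on-device operation shouldn't move these much.
func totalNetworkBytes() -> (received: UInt64, sent: UInt64) {
    var received: UInt64 = 0
    var sent: UInt64 = 0

    var addrs: UnsafeMutablePointer<ifaddrs>?
    guard getifaddrs(&addrs) == 0, let first = addrs else { return (0, 0) }
    defer { freeifaddrs(addrs) }

    var cursor: UnsafeMutablePointer<ifaddrs>? = first
    while let entry = cursor {
        // Only link-layer entries carry the if_data statistics block.
        if let addr = entry.pointee.ifa_addr,
           addr.pointee.sa_family == UInt8(AF_LINK),
           let stats = entry.pointee.ifa_data?.assumingMemoryBound(to: if_data.self) {
            received += UInt64(stats.pointee.ifi_ibytes)
            sent += UInt64(stats.pointee.ifi_obytes)
        }
        cursor = entry.pointee.ifa_next
    }
    return (received, sent)
}

let before = totalNetworkBytes()
// ... run the supposedly on-device recognition here ...
let after = totalNetworkBytes()
print("received: \(after.received - before.received) B, sent: \(after.sent - before.sent) B")
```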

[–] brbposting 8 points 9 months ago (1 children)
[–] [email protected] 5 points 9 months ago* (last edited 9 months ago) (1 children)

Security is in my interest, but yw

[–] [email protected] 6 points 9 months ago (1 children)

They were running CNN inference on a mobile device? I have no clue, but that would be costly battery-wise, at least.

[–] didnt_readit 1 points 9 months ago* (last edited 9 months ago) (1 children)

They’ve been doing ML locally on devices for like a decade, since way before all the AI hype. They’ve had dedicated ML inference cores in their chips for a long time too, which helps the battery life situation.
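
The “dedicated ML inference cores” are what Apple markets as the Neural Engine, and with Core ML an app states a preference for them through MLModelConfiguration. A minimal sketch, assuming a hypothetical compiled model named Classifier is bundled with the app:

```swift
import CoreML
import Foundation

// "Classifier" is a stand-in name; any compiled Core ML model (.mlmodelc) works.
guard let modelURL = Bundle.main.url(forResource: "Classifier", withExtension: "mlmodelc") else {
    fatalError("model not found in the bundle")
}

let config = MLModelConfiguration()
// Allow Core ML to schedule work on the dedicated ML hardware (the Neural
// Engine) as well as the GPU/CPU, falling back when an op isn't supported.
config.computeUnits = .all

do {
    _ = try MLModel(contentsOf: modelURL, configuration: config)
    print("model loaded; Core ML will use the Neural Engine where it can")
} catch {
    print("failed to load model: \(error)")
}
```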

[–] [email protected] 1 points 9 months ago

It couldn’t quite be a decade; a decade ago we had only just gotten VGG. But sure, broad strokes, they’ve been doing local stuff, cool.