this post was submitted on 29 Mar 2024
1015 points (98.0% liked)
Curated Tumblr
They were running CNN inference on a mobile device? I have no idea how, but that would be costly, battery-wise at least.
They’ve been doing ML locally on devices for about a decade, since well before all the AI hype. They’ve also had dedicated ML inference cores in their chips for a long time, which helps with the battery-life situation.
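For a sense of what "local inference on the dedicated ML cores" looks like from the app side, here's a minimal Core ML sketch in Swift. The model name (SomeClassifier), the classifyLocally helper, and the assumption that the app bundles a compiled .mlmodelc are made up for illustration; the relevant bit is that Core ML just needs computeUnits set to allow the Neural Engine when the hardware has one.

```swift
import CoreML
import UIKit
import Vision

// Minimal sketch of on-device CNN inference with Core ML (hypothetical model/helper names).
func classifyLocally(_ image: UIImage) throws -> String? {
    guard let cgImage = image.cgImage else { return nil }

    // Let Core ML pick any available compute unit, including the dedicated
    // ML cores (Neural Engine) on hardware that has them.
    let config = MLModelConfiguration()
    config.computeUnits = .all

    // "SomeClassifier" stands in for whatever compiled .mlmodelc the app actually ships.
    guard let modelURL = Bundle.main.url(forResource: "SomeClassifier",
                                         withExtension: "mlmodelc") else { return nil }
    let coreMLModel = try MLModel(contentsOf: modelURL, configuration: config)
    let visionModel = try VNCoreMLModel(for: coreMLModel)

    // perform(_:) runs synchronously here; a real app would do this off the main thread.
    var topLabel: String?
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        topLabel = (request.results as? [VNClassificationObservation])?.first?.identifier
    }
    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
    return topLabel
}
```

The computeUnits line is where the battery argument comes in: letting the network run on the dedicated cores rather than the CPU or GPU is what keeps repeated CNN passes relatively cheap.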
It couldn't quite be a decade; a decade ago we only just had VGG. But sure, broad strokes, they've been doing local stuff, cool.