this post was submitted on 11 Jun 2024
218 points (99.1% liked)
Technology
Presumably to minimize exposure while they add the announced security band-aids?
So... while I have you guys here, how do we feel about iOS having just announced basically the same feature? We angy about that one too or nah?
I mean, joking aside, I'm genuinely curious about what the reaction is going to be. On paper it's a very similar concept, but it feels like routing it through Siri and not surfacing the stored data will legitimately kill some of the creepy factor even if what's happening behind the scenes is very similar.
people will probably hate the iOS integration just because it’s another AI product, but the two are fundamentally different. the problem with Recall isn’t the AI, it’s the trove of extra data that gets collected that you normally wouldn't save to disk, whereas the iOS features only access existing data that you give it access to.
from my perspective this is a pretty good use case for “AI” and about as good as you can do privacy wise, if their claims pan out. most features use existing data that is user controlled and local models, and it’s pretty explicit about when it’s reaching out to the cloud.
this data is already accessible by services on your phone or exists in iCloud. if you don’t trust that infrastructure already then of course you don’t want this feature. you know how you can search for pictures of people in Photos? that’s the terrifying cLoUD Ai looking through your pictures and classifying them. this feature actually moves a lot of that semantic search on device, which is inherently more private.
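As an aside, the mechanics of on-device semantic search are simple enough to sketch: a local model maps each photo to an embedding vector, and a text query is matched against those vectors by similarity, so nothing has to leave the device. A toy illustration in Python, where the vectors are made up stand-ins for what a real local embedding model would produce:

```python
import math

def cosine(a, b):
    # cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical embeddings a local model might assign to indexed photos.
photo_index = {
    "beach.jpg":   [0.9, 0.1, 0.0],
    "dog.jpg":     [0.1, 0.9, 0.1],
    "receipt.png": [0.0, 0.1, 0.9],
}

def search(query_vec, index, top_k=1):
    # rank photos by similarity to the query embedding; all of this
    # runs locally, which is the privacy point being made above
    ranked = sorted(index.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

# a query like "pictures of dogs" would embed near dog.jpg's vector
print(search([0.2, 0.95, 0.05], photo_index))  # ['dog.jpg']
```

None of this says anything about what Apple actually ships, of course; it's just the general shape of embedding-based search, which is why moving it on device is feasible at all.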
of course it does make access to that data easier, so if someone could unlock your device they could potentially get access to sensitive data with simple prompts like “nudes plz”, but you should have layers of security on more sensitive stuff like bank or social accounts that would keep Siri from reading it. likely Siri won’t be able to get access to app data unless it’s specified via their API.
Wait, no, that doesn't sound right. From the way Apple describes this they are accessing all your info, plus extracting context from it. So not only does it know people's faces, who sent you what when, the content of every image on your device and every message you sent or received, but it knows which people are related to you and how, where you are and a bunch of other stuff.
Plus there are other issues on the Apple side where it compares worse in terms of privacy. As far as I can tell this doesn't have an opt-out, right? And they do send the data to Apple servers for processing (but don't store it), which the MS version doesn't do at all. It seems like they each have ways in which they're worse than the other privacy-wise, although presumably the only actually secure option between the two would be Windows with Recall turned off, unless Apple do have an opt-out they're not talking about.
Ultimately, like I've been telling everyone, the interesting bit here is how the presentation, branding and positioning of each one alter the outcome. Both MS and Apple are arguing the same thing: that your data is secure because their system is secure and your data remains local or at least under your control. But one of them paid no mind to presenting security as a concern and will only ship some common-sense additional security in response to pushback, while the other will ship something very similar but reassure you in a calm voice that this is all very private, even as it's flying through the ether to an Apple server. So one is "a security and privacy nightmare" and the other one... well, if you have your nudes just sitting on your personal device you're really just asking for it, you know?
That is the kind of understanding of marketing that separates Apple from MS, if you ask me. A whole master class in branding right there. I'll go one further: based on what I'm reading about this, I suspect that if MS had announced their bad, unencrypted, leaky version today, after the Apple presentation, they would have seen less angry pushback, because Apple's good messaging would have smoothed things over for both.
Human brains are squishy and weird.
Apple's AI is mostly processed on device. That's why it takes an iPhone 15 Pro or an M-series processor. They also claim that what is processed in the cloud is neither identifiable nor stored, just processed. We will know if that's true (at least what is being sent) as soon as it gets out into the public and we can start picking apart the traffic.
There is no mention of opt-out or not yet, probably because we’re several months away from the actual release. I’m sure we’ll get more information before then.
MS's AI is entirely processed on device. That was their entire security pitch: the data never leaves your PC, why are you all getting so angry about it? Isn't your PC secure?
But you didn't remember that because you were already angry when you read the headlines and that was only two paragraphs down and also it's a terrible argument that doesn't resolve any of the valid concerns people had.
But Apple went out there and talked about sending the name and face of your auntie to their servers along with every email she's ever sent you for a computer to parse exactly how close you are to her like it's the best thing that's happened to your privacy this century. And they sounded like they meant it and were vague enough and they said they pinky promise to not keep any of that info for themselves. And you don't just remember, you believe it.
They are really, REALLY good at this, and that's only helped by how bad Microsoft is at it.