this post was submitted on 10 Jun 2023
14 points (100.0% liked)

Technology


Computers, phones, AI, whatever

founded 1 year ago
top 29 comments
[–] [email protected] 3 points 1 year ago (1 children)

So more scanning of arbitrary data for sanctimonious reasons, and definitely not for the sake of collecting data. I'm curious what is sent where regarding those scans. There has been a scandal regarding Amazon and those Ring cameras. That software might run on the device, but whatever detection it's using is bound to make mistakes, and who sees the results? Is everything fully automated, or human-verified? I don't know which one would concern me more. Not even talking about young people taking photos of their bodies for various reasons. And just because it runs on your device does not necessarily mean that whatever is scanned is never sent anywhere. It just means that the scanning happens on your device.

Quite frankly, if it weren't so horrible, I'd find the idea of some secret ring inside of Apple using that CSAM detection to collect material to sell on the dark net rather interesting. It might make an interesting plot for a thriller or novel...

[–] [email protected] 4 points 1 year ago (1 children)

I don’t believe there’s any actual data collection?

[–] [email protected] 2 points 1 year ago (2 children)

It's something that's not talked about, which, given our data-obsessed world, I interpret as "we just do it by default (because nobody will complain, it's normal, yada yada)".

Besides, it's stated that the scanning itself only happens on your device. If you scan locally for illegal material, it's not really far-fetched that someone gets informed about someone having, for example, CSAM on their device. Why else would you scan for it? So at the very least, that information is collected somewhere.

[–] [email protected] 5 points 1 year ago (1 children)

I think your threat model for this is wrong.

First of all, understand how it works: it's a local feature that uses image recognition to identify nudity. The idea is, if someone sends you a dick pic (or worse, CSAM), you don't have to view it to know what it is. That's been an option on the accounts of minors for some time now, and it is legitimately a useful feature.

Now they’re adding it as an option to adult accounts and letting third party developers add it to their apps.
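To make the "local feature" point concrete: the gating logic is simple, with a classifier running entirely on-device and the UI blurring the image until the user opts in to view it. The `classify` callback below is a hypothetical stand-in for Apple's actual (private) model, just for illustration:

```python
# Hypothetical sketch of on-device sensitive-content gating.
# `classify` stands in for the real on-device model (an assumption);
# note that nothing in this flow leaves the device.

def should_blur(image_bytes: bytes, classify, threshold: float = 0.8) -> bool:
    """Return True if the image should be hidden behind a warning overlay."""
    score = classify(image_bytes)  # estimated probability of nudity, 0.0-1.0
    return score >= threshold

# Illustration with fake classifiers:
print(should_blur(b"\x89PNG...", lambda img: 0.95))  # True  -> blur and warn
print(should_blur(b"\x89PNG...", lambda img: 0.05))  # False -> show normally
```

The decision of whether to tap through the warning stays with the user; reporting or blocking is a separate, explicit action.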

The threat that suddenly they’re going to send the scanning results to corporate without telling anyone seems unlikely. It would be a huge liability to do so and have no real benefits for them.

But the threat is this: with this technology available, there will be pressure to make it not optional (“Why does Apple let you disable the child porn filter — wtf?”). If they bend to that pressure then why not introduce filters for other illegal content. Why not filter for comments criticizing the CCP in China or content that infringes on copyright?

Having a "dick pic filter" is useful technology, and I know some people who would love to have it. That doesn't mean the technology couldn't be misused for nefarious purposes.

[–] [email protected] 2 points 1 year ago (2 children)

I am aware that it's local; I just assumed it would also call home.

My threat model here is based on cases like this: https://www.theverge.com/2022/8/21/23315513/google-photos-csam-scanning-account-deletion-investigation

And yes, I did see it as a privacy issue, not a censorship one. Inevitably, if there is pressure to expand it to other content, it could become a problem comparable to the "Article 13" one Europe was, or is, facing.

Generally, blocking specific types of content is a valid option to have, as long as it is an option and the user knows it is one. I just distrust it coming from the likes of Google or Apple.

[–] [email protected] 6 points 1 year ago (1 children)

Google explicitly says they scan images and report them to law enforcement. Apple explicitly says they do not phone home with scan results and so far there have been no such investigations.

I get not trusting big tech companies, I do, but I think you're not modeling their behavior correctly. Usually when a huge publicly traded company does something dodgy, they don't explicitly say they don't do it; they use weasel words.

[–] [email protected] 2 points 1 year ago

Well, thank you for clarifying. I was not aware of what exactly Apple or Google were communicating regarding their platforms.

[–] [email protected] 4 points 1 year ago (1 children)

I would honestly find it very difficult to believe that there wasn't going to be some telemetry, data, etc. sent back to the mothership. I know that in the marketing realm Apple caters toward "privacy", but who's really validating those claims?

Granted... I'm also very tin-foil-hatty about my data and retain it all locally with offsite backups. I tore down my Google Drive / cloud data about two years ago.

[–] [email protected] 2 points 1 year ago

There’s always some telemetry. But there’s a fair amount they do to truly make telemetry anonymous.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago) (2 children)

I find it very plausible that the scanning takes place on-device with current Bionic chips, and that Apple is against any server-side data collection: if one person gets flagged because of a false positive, Apple's privacy reputation in the general public goes down the drain; there would be too much damage to the brand. The ability to deny knowledge of anything to do with their customers' activities is also the biggest reason for their push to E2E encryption. Server-side data collection for CSAM would disrupt that plausible-deniability argument in all matters.

[–] [email protected] 1 points 1 year ago

Not just damage to the brand, but also, a lawsuit. They flatly say they aren't phoning home with detection results. If they are, that opens them up to legal remedies from people who were lied to.

Maybe Anker gets away with flat-out lying (about E2E encryption, for example), but a huge publicly traded company this side of the Pacific is another matter.

[–] [email protected] 1 points 1 year ago (2 children)

If you think about it, Apple's privacy reputation doesn't matter to begin with. First, it's just a reputation: it's what they claim, not necessarily what they do. They are a multi-trillion-dollar tech giant; you don't get there with honesty. But regardless, imagine their reputation goes down the drain. The consensus of "if you have nothing to hide, you have nothing to fear" and the ability to claim everything is to "protect the children" (or, relatedly, "protect the country from terrorism") all negate the necessity for that reputation. As sad as it is, most people don't think much about privacy, at least not in regard to modern technology. People will still buy their products and be part of the ecosystem. Apple is a luxury brand, their products are used as status symbols, their most loyal customers are essentially a cult, and for many it's all they know. That is, if such a case even gets big enough to "go viral".

[–] [email protected] 2 points 1 year ago (1 children)

I think you’re underestimating it. They just introduced e2e encryption for almost all iCloud content. That’s not something there was that much market pressure to implement.

[–] [email protected] 2 points 1 year ago (1 children)

That is good for apple users. Does that include meta-data? Locations, timestamps and the likes?

[–] [email protected] 1 points 1 year ago

I don’t know that likes apply at all.

To my understanding it's all the metadata, though. What's not included are contacts, calendar, and email, because there's no way to implement it with CardDAV, IMAP, etc.

[–] [email protected] 1 points 1 year ago (1 children)

It’s also what they do.

  • Private relay
  • Tracking protection
  • Full e2e encryption where feasible
  • Device encryption
  • App tracking protection

It’s a brand, yes, but it is absolutely reasonable to ask why they would want to flush that reputation away.

[–] [email protected] 2 points 1 year ago (1 children)

If they actually do that, great. Nothing against that. I just have an inherent distrust toward Fortune 500 companies.

[–] [email protected] 1 points 1 year ago

Source code to confirm it would be nice, but security researchers crawl all over this stuff.

Also, they have no real incentive to do otherwise. As product features, these don't just sell products, they actually reduce the administrative load on Apple because then Apple doesn't have to deal with as many data requests.

[–] [email protected] 3 points 1 year ago (1 children)

I don't think a lot of people appreciate just how bad the "unsolicited dick pic" situation is. Maybe you don't experience it, but if you're young and a woman and online, you'll 100% start getting dick pics from strangers.

Being able to block and report those without first having to view them is a huge win. And this is done in a very privacy-respecting way.

[–] [email protected] 3 points 1 year ago

I agree. For normies sick of online harassment, these filters are a huge win. Also for parents.

[–] [email protected] 2 points 1 year ago (1 children)

Apple users have been attacked by the Four Horsemen of the Infocalypse used to justify ending privacy: protecting children, fighting terrorism, protecting intellectual property (IP), and fighting organized crime.

[–] [email protected] 2 points 1 year ago (1 children)

I mean as far as privacy goes, Apple products are generally quite good. Certainly an iPhone has a lot more privacy than any mainstream Android device with Play Services enabled.

With Linux, you could get a lot more privacy and control (obviously), but at the expense of, perhaps, less actual personal security. iPhones and Macs are generally quite theft-resistant in terms of protecting your data.

[–] [email protected] 1 points 1 year ago (1 children)

I would not recommend a stock Apple or Android phone. I use GrapheneOS at the moment, which is Android without Google and with additional security options.

Apple does not publish their source so you have to trust that their privacy policy is more than just public relations without any proof. Trust, but verify.

[–] [email protected] 1 points 1 year ago (1 children)

GrapheneOS+Debian is probably your gold standard. But it's also not at all a mainstream choice.

Among the mainstream consumer products, I tend to agree with others here who say that Apple has the best features and track record. Safari has better privacy than Chrome, iPhone has better privacy than Samsung Galaxies or Google Pixels, macOS has better privacy than Windows, etc.

Now, sure, Firefox+Graphene+Linux is better than all of them, but then you're giving up a lot. You're giving up what most consumers are not willing to give up. Can you hail Uber? Can you use your banking apps? Are your favorite games available?

And besides, the biggest privacy problem with your cell phone is, frankly, your carrier, which is most likely selling all the data it can get from your tower pings and account details, if not also your actual internet traffic (DNS lookups, etc.).

And for that matter, even on Linux, there are privacy features I don't get that are available on my Apple stuff: most notably for me, a very usable Dropbox alternative that's end-to-end encrypted (iCloud Drive). Yes, there are end-to-end-encrypted options on Linux that work like Dropbox, but they aren't very well polished.

If you're able to go all-in on open-source and free software, you'll truly control your own destiny, and that's all the better. But for the average consumer, Apple is a big upgrade over the direct competition from Google and Microsoft.

[–] [email protected] 1 points 1 year ago (1 children)

I believe Apple provides a false sense of security, which is often worse than no security at all. Edward Snowden showed that privacy policies are useless documents, and that is much of what Apple has going for it. They make a big press splash while fighting with the FBI, but when the NSA asks Apple to provide backdoors to iCloud, iOS, and macOS, they don't amend their privacy policy. They comply.

When a user knows that they are being watched, they will self-censor. When a user thinks they are free on a non-free platform, they will make mistakes that cannot be erased.

You can take control of your DNS. You can encrypt your traffic and communication. You cannot hide your location from your carrier, but you can disable the hardware modem. With Pine64's PinePhone/PinePhone Pro, you can even take over the user space of the modem. Not all of these steps are necessary for everyone, but every little piece that is improved helps.
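Taking control of DNS can start with something as simple as sending lookups to an encrypted resolver instead of the carrier's default. A minimal sketch using Cloudflare's DNS-over-HTTPS JSON API (the resolver and endpoint here are just one example, not an endorsement):

```python
import urllib.parse

DOH_ENDPOINT = "https://cloudflare-dns.com/dns-query"  # example DoH resolver

def doh_query_url(name: str, record_type: str = "A") -> str:
    """Build a query URL for Cloudflare's DNS-over-HTTPS JSON API."""
    params = urllib.parse.urlencode({"name": name, "type": record_type})
    return f"{DOH_ENDPOINT}?{params}"

# Fetch this URL with an 'accept: application/dns-json' header and the
# carrier only sees an HTTPS connection to the resolver, not your lookups.
print(doh_query_url("example.com"))
```

In practice you would point the whole OS at a DoH or DoT resolver rather than query per-lookup, but the principle is the same: the names you resolve stop traveling in cleartext.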

[–] [email protected] 1 points 1 year ago

The NSA's ability to compel private entities to modify their products is dubious. The FBI famously bullied Lavabit out of existence, but they would have a much harder time with a company that has lawyers and throngs of fans. I'm sure you've heard of the time the FBI tried to have Apple develop an exploit, and Apple successfully refused. And that's the FBI: the NSA has broad surveillance power, but its ability to tell a private company to modify its products is basically non-existent.

But again, doing something like disabling the hardware modem is just not a realistic step most people would even consider.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

Yes, there is potential for a slippery slope. And any filtering technology could be used for nefarious purposes. But this strikes me as pretty far from the slope and the purpose is clearly a good one. Remember you can always just turn it off.

[–] [email protected] 2 points 1 year ago (1 children)

You can only turn it off until you can't. The road to hell is paved with good intentions.

[–] [email protected] 2 points 1 year ago

That's kind of the risk with any technology. And I admit, it is the most likely way we lose control: someone will ask, "why does Apple let you turn off the child porn filter?" and the answers may not be enough for lawmakers or an angry mob.

The same could be said of a great many tools that filter bad content, from spam filtering to DDoS filtering. Should a technology be unavailable to consumers based on a hypothetical? That's just as bad.

If a technology exists to filter content I don't want to see, who are you to tell me Apple shouldn't sell me a device with that technology I want?
