this post was submitted on 04 Dec 2024
629 points (99.5% liked)

Technology

[–] [email protected] 34 points 2 weeks ago (10 children)

End to end is end to end. It's either "the devices encrypt the messages with keys that never leave the device, so no third party can ever compromise them" or it's not.

Signal is a more trustworthy org, but google isn't going to fuck around with this service to make money. They make their money off you by keeping you in the google ecosystem and data harvesting elsewhere.

[–] [email protected] 50 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

google isn't going to fuck around with this service to make money

Your honor, I would like to submit Exhibit A, Google Chrome “Enhanced Privacy”.

https://www.eff.org/deeplinks/2023/09/how-turn-googles-privacy-sandbox-ad-tracking-and-why-you-should

Google will absolutely fuck with anything that makes them money.

[–] [email protected] 27 points 2 weeks ago

This. Distrust in corporations is healthy regardless of what they claim.

[–] [email protected] -1 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

That's a different tech. End-to-end is cut and dried in how it works. If you do anything to data mine it, it's not end to end anymore.

Only the users involved in the end-to-end chat can access the data in that chat. Everyone else sees encrypted data, i.e. noise. If there are any backdoors or any methods to pull data out, you can't bill it as end to end.

[–] [email protected] 12 points 2 weeks ago (1 children)

You are suggesting that "end-to-end" is some kind of legally codified phrase. It just isn't. If Google were to steal data from a system claiming to be end-to-end encrypted, no one would be surprised.

I think your point is: if that were the case, the messages wouldn't have been end-to-end encrypted, by definition. Which is fine. I'm saying we shouldn't trust a giant corporation making money off of selling personal data that it actually is end-to-end encrypted.

By the same token, don't trust Microsoft when they say Windows is secure.

[–] [email protected] 6 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

It's a specific, technical phrase that means one thing only, and yes, Google's RCS meets that standard:

https://support.google.com/messages/answer/10262381?hl=en

How end-to-end encryption works

When you use the Google Messages app to send end-to-end encrypted messages, all chats, including their text and any files or media, are encrypted as the data travels between devices. Encryption converts data into scrambled text. The unreadable text can only be decoded with a secret key.

The secret key is a number that’s:

Created on your device and the device you message. It exists only on these two devices.

Not shared with Google, anyone else, or other devices.

Generated again for each message.

Deleted from the sender's device when the encrypted message is created, and deleted from the receiver's device when the message is decrypted.

Neither Google or other third parties can read end-to-end encrypted messages because they don’t have the key.

They have more technical information here if you want to deep dive into the actual implementation.
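To make the quoted description concrete, here's a rough Python sketch of the general pattern it describes: a shared secret derived from device-local key pairs, plus a fresh key derived per message. It's only an illustration of the shape of E2EE, not Google's actual implementation (which is a far more involved Signal-protocol-style design); it assumes the `cryptography` package is installed.

```python
# Minimal sketch of the pattern quoted above: private keys never leave the
# two devices, a fresh message key is derived per message, and only
# ciphertext ever crosses the network. Illustration only, not Google's code.
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each device generates its own key pair; the private halves stay on-device.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Only the public halves are exchanged (this is all the server could see).
alice_pub = alice_priv.public_key()
bob_pub = bob_priv.public_key()

def message_key(shared_secret: bytes, counter: int) -> bytes:
    """Derive a fresh 256-bit key for each message from the shared secret."""
    return HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"msg-key-%d" % counter,
    ).derive(shared_secret)

# Both sides compute the same shared secret without it ever being sent.
secret_a = alice_priv.exchange(bob_pub)
secret_b = bob_priv.exchange(alice_pub)
assert secret_a == secret_b

# Sender encrypts with a per-message key; the relay only ever sees ciphertext.
nonce = os.urandom(12)
ciphertext = AESGCM(message_key(secret_a, 1)).encrypt(nonce, b"hi Bob", None)

# Receiver derives the same per-message key and decrypts.
plaintext = AESGCM(message_key(secret_b, 1)).decrypt(nonce, ciphertext, None)
assert plaintext == b"hi Bob"
```

The key property is in the last few lines: whatever sits in the middle only ever handles `ciphertext`, never the shared secret or the derived message keys.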

You shouldn't trust any corporation, but needless FUD detracts from their actual issues.

[–] [email protected] 9 points 2 weeks ago (1 children)

You are missing my point.

I don't deny the definition of E2EE. What I question is whether or not RCS does in fact meet the standard.

You provided a link from Google itself as verification. That is... not useful.

Has there been an independent audit on RCS? Why or why not?

[–] [email protected] -1 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

Not that I can find. Can you post Signal's most recent independent audit?

Many of these orgs don't post public audits like this. It's not common, even for the open source players like Signal.

What we do have is a megacorp stating its technical implementation extremely explicitly for a well defined security protocol, for a service meant to directly compete with iMessage. If they are violating that, it opens them up to huge legal liability and reputational harm. Neither of these is worth data mining this specific service.

[–] [email protected] 7 points 2 weeks ago

I'm not suggesting that Signal is any better. I'm supporting absolute distrust until such information is available.

[–] deranger 4 points 2 weeks ago* (last edited 2 weeks ago) (1 children)
[–] [email protected] 8 points 2 weeks ago* (last edited 2 weeks ago)

Thank you. I had trouble running down a list.

I do clearly consider Signal to be a more trustworthy org than Google, but I find this quibbling about them "maybe putting a super secret backdoor in the E2EE they use to compete with iMessage" to be pretty clear FUD.

[–] sugar_in_your_tea 2 points 1 week ago

Even if we assume they don't have a backdoor (which is probably accurate), they can still exfiltrate any data they want through Google Play services after it's decrypted.

They're an ad company, so they have a vested interest in doing that. So I don't trust them. If they made it FOSS and it didn't rely on Google Play services, I might trust them, but I'd probably use a fork instead.

[–] [email protected] 9 points 2 weeks ago (3 children)

They can just claim archived or deleted messages don't qualify for end-to-end encryption in their privacy policy, or something equally vague. If they invent their own program, they can invent the loophole for how the data is processed.

[–] [email protected] 11 points 2 weeks ago (2 children)

Or the content is encrypted, but the metadata isn't, so they can market to you based on who you talk to and what they buy, etc.

[–] [email protected] 2 points 2 weeks ago (1 children)

This part is likely, but it's not what we are talking about. Who you know and how you interact with them is separate from the fact that the content of the messages is not decryptable by anyone but the participants, by design. There is no "quasi" end to end. It's an either/or situation.

[–] sugar_in_your_tea 2 points 1 week ago

It doesn't matter if the content is encrypted in transit if Google can access the content in the app after decryption. That doesn't violate E2EE, and they could easily exfiltrate the data through Google Play Services, which is a hard requirement.

I don't trust them until the app is FOSS, doesn't rely on Google Play Services, and is independently verified to not send data or metadata to their servers. Until then, I won't use it.

[–] [email protected] 1 points 1 week ago

Provided they have an open API and don't ban alternative clients, one could build something kinda similar to Tor on top of this system, hiding the identities and the channels between them from the service provider.

Meaning messages routed through a few hops over different users.

Sadly, for all these services to have open APIs, force would need to be applied. And you can't force someone far stronger than you who has the state on their side.
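If anyone wants to picture what that hop-routing would look like, here's a toy Python sketch of the layered-encryption idea: each relaying user peels one layer and only learns the next hop. Purely hypothetical, not any real messenger's API, and it hand-waves key distribution by pretending we already share a symmetric key with each hop (it assumes the `cryptography` package).

```python
# Toy onion-routing sketch: wrap the payload in one encryption layer per hop,
# so no single hop learns the full sender-to-recipient path.
import json
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Hypothetical pre-shared keys with three relaying users ("hops").
hop_keys = {"hop1": AESGCM.generate_key(256),
            "hop2": AESGCM.generate_key(256),
            "hop3": AESGCM.generate_key(256)}

def wrap(payload: bytes, next_hop: str, key: bytes) -> bytes:
    """Add one onion layer: encrypt (next_hop, payload) under this hop's key."""
    nonce = os.urandom(12)
    inner = json.dumps({"next": next_hop, "data": payload.hex()}).encode()
    return nonce + AESGCM(key).encrypt(nonce, inner, None)

def unwrap(blob: bytes, key: bytes) -> tuple[str, bytes]:
    """Peel one layer: returns (next_hop, remaining payload)."""
    nonce, ciphertext = blob[:12], blob[12:]
    inner = json.loads(AESGCM(key).decrypt(nonce, ciphertext, None))
    return inner["next"], bytes.fromhex(inner["data"])

# Sender builds the onion innermost-first: recipient <- hop3 <- hop2 <- hop1.
msg = b"actual message (itself E2E-encrypted for the recipient)"
onion = wrap(msg, "recipient", hop_keys["hop3"])
onion = wrap(onion, "hop3", hop_keys["hop2"])
onion = wrap(onion, "hop2", hop_keys["hop1"])

# Each hop peels exactly one layer and only learns where to forward next.
for hop in ("hop1", "hop2", "hop3"):
    next_hop, onion = unwrap(onion, hop_keys[hop])
print(next_hop, onion)  # -> recipient, plus the still-encrypted inner message
```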

[–] [email protected] 2 points 2 weeks ago* (last edited 2 weeks ago)

The messages are encrypted with cryptographic keys on the users' phones that never leave the device. They are not decryptable in any way by Google or anyone else. That's the very nature of E2EE.

How end-to-end encryption works

When you use the Google Messages app to send end-to-end encrypted messages, all chats, including their text and any files or media, are encrypted as the data travels between devices. Encryption converts data into scrambled text. The unreadable text can only be decoded with a secret key.

The secret key is a number that’s:

Created on your device and the device you message. It exists only on these two devices.

Not shared with Google, anyone else, or other devices.

Generated again for each message.

Deleted from the sender's device when the encrypted message is created, and deleted from the receiver's device when the message is decrypted.

Neither Google or other third parties can read end-to-end encrypted messages because they don’t have the key.

They can't fuck with it, at all, by design. That's the whole point. Even if they created "archived" messages to datamine, all they would have is the noise.
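For anyone wondering what "the noise" actually amounts to, here's a tiny Python sketch (using the `cryptography` package; no claim about Google's real storage format): an archived E2EE message is just ciphertext, and anything other than the real device key gets an authentication failure, not data.

```python
# An "archived" E2EE message is only ciphertext; without the device key it is
# indistinguishable from random bytes. Illustration only.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.exceptions import InvalidTag

real_key = AESGCM.generate_key(256)   # exists only on the two devices
nonce = os.urandom(12)
archived_blob = AESGCM(real_key).encrypt(nonce, b"meet at 7", None)

print(archived_blob.hex())  # what a server-side "archive" would actually hold

try:
    AESGCM(AESGCM.generate_key(256)).decrypt(nonce, archived_blob, None)
except InvalidTag:
    print("wrong key: the archive is just noise to anyone without the device key")
```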

[–] [email protected] 0 points 2 weeks ago

Exactly. We know corporations regularly use marketing and doublespeak to obscure the fact that they operate for their interests and their interests alone. Again, the interests of corporations are not altruistic, regardless of the image they may want to project.

Why should we trust them to "innovate" without independent audit?

[–] [email protected] 1 points 1 week ago

End to end doesn't say anything about where the keys are stored; it can be end-to-end encrypted while someone else has access to the keys.

[–] [email protected] 18 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

Signal doesn't harvest, use, or sell metadata; Google may do that.
E2E encryption doesn't protect against that.
Signal is orders of magnitude more trustworthy than Google in that regard.

[–] [email protected] 9 points 2 weeks ago

There's also Session, a fork of Signal which claims that its decentralised protocol makes it impossible, or at least very difficult, for them to harvest metadata even if they wanted to. Though I personally can't vouch for how accurate their claims are.

[–] [email protected] 3 points 1 week ago (1 children)

Agreed. That still doesn't mean Google is not doing E2EE for its RCS service.

I'm not arguing Google is trustworthy or better than Signal. I'm arguing that E2EE has a specific meaning that most people in this thread do not appear to understand.

[–] [email protected] 1 points 1 week ago

Sure!
I was merely trying to raise awareness of the need to bring privacy protection to a level beyond E2EE, although E2EE is a very important and useful step.

[–] [email protected] 17 points 2 weeks ago (1 children)

It could be end-to-end encrypted and safe on the network, but if Google is in charge of the device, what's to say they're not reading the message after it's decrypted? To be fair, this would compromise Signal or any other app on Android as well.

[–] [email protected] -4 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

That's a different threat model, one that verges on the most astonishing corporate espionage in human history and the greatest possible threat to Google's corporate personhood. It would require thousands, if not tens of thousands, of Google employees coordinating in utter secrecy to commit an unheard-of crime that would be punishable by death in many circumstances.

If they have backdoored all Android phones and are actively exploiting them in nefarious ways not explained in their various TOS, then they are exposing themselves to ungodly amounts of legal and regulatory risk.

I expect no board of directors wants a trillion dollars of company worth to evaporate overnight, and would likely not be okay with backdooring literally billions of phones from a fiduciary standpoint alone.

[–] [email protected] 13 points 2 weeks ago (2 children)

It would require thousands, if not tens of thousands, of Google employees coordinating in utter secrecy

This is usually used for things like the Moon landing, where so many folks worked for NASA that it would have been entirely impossible for the landing to be faked.

But it doesn't really apply here. We know for example that NSA backdoors exist in Windows. Were those a concerted effort by MS employees? Does everyone working on the project have access to every part of the code?

It just isn't how development works at this scale.

[–] [email protected] 3 points 2 weeks ago (1 children)

Ok but no one is arguing Windows is encrypted. Google is specifically stating, in a way that could get them sued for shitloads of money, that their messaging protocol is E2EE. They have explicitly described how it is E2EE. Google can be a bad company while still doing this thing within the bounds we all understand. For example, just because the chat can't be backdoored doesn't mean the device can't be.

[–] [email protected] 1 points 1 week ago

Telegram has its supposedly E2EE protocol, which most Telegram users don't use, and a few questionable traits have been found in it as well.

Google is trusted a bit more than Pavel Durov, but it could well do a similar thing.

And yes, Android is a much larger haystack in which they can hide a needle.

[–] [email protected] 2 points 1 week ago

This is usually used for things like the Moon landing, where so many folks worked for NASA that it would have been entirely impossible for the landing to be faked.

I think it's also confirmed by radio transmissions from the Moon, received in real time by the USSR and other countries.

[–] [email protected] 0 points 1 week ago (1 children)

How do spyware services used by nation-state customers, like Pegasus, work?

They use backdoors in commonly used platforms on an industrial scale.

Maybe some of them are vulnerabilities due to honest mistakes. The problem is that the majority of vulnerabilities due to honest mistakes also carry denial-of-service risks in widespread usage, which means they get found quickly enough.

[–] [email protected] 1 points 1 week ago (1 children)

So your stance is that Google is deploying self-designed malware against its own services, in violation of its own policies, to harvest data, when that could bring intense legal, financial and reputational harm to it as an org if it were ever discovered?

Seems far fetched.

[–] [email protected] 0 points 1 week ago (1 children)

Legal and financial - doubt it. Reputational - counter-propaganda is a thing.

I think your worldview lags behind our current reality. I mean, even in the reality of 30 years ago it would have seemed a bit naive.

Also, you've ignored my mentioning things like Pegasus, which are part of our current, not hypothetical, reality.

[–] [email protected] 1 points 1 week ago* (last edited 1 week ago) (1 children)

So yes.

You think a nearly trillion dollar public company has an internal division that writes malware against flaws in its own software in order to harvest data from its own apps. It does this to gain just a bit more data about people it already has a lot of data on, because why not purposely leave active zero days in your own software, right?

That is wildly conspiratorial thinking, and honestly plain FUD. It undermines serious, actual privacy issues the company has when you make up wild cabals that are running double secret malware attacks against themselves inside Google.

[–] [email protected] 0 points 1 week ago

You think a nearly trillion dollar public company has an internal division that writes malware against flaws in its own software in order to harvest data from its own apps. It does this to gain just a bit more data about people it already has a lot of data on, because why not purposely leave active zero days in your own software, right?

You think you are being the smart one here?

No, that's not what I said. Also, cypherpunks and other hobbyists are not so much smarter than corporations and nation-states that they'd be the only ones to think about plausible deniability.

For example, the full Windows source code has been officially given to various three-letter agencies of various countries (Russia included) to study, and of course a codebase of that size has vulnerabilities. MS might not have left obvious backdoors and informed the FSB of them, but it has given interested parties the ability to find them on their own, which is only a matter of work, or perhaps made it easier to produce tampered versions of DLLs and the like.

Also they are legally obligated to silently comply with a lot of things.

That is wildly conspiratorial thinking, and honestly plain FUD.

WhatsApp and Facebook (before it bought WhatsApp) have both done this, Telegram has done this, MS has done this, even Apple has done this.

when you make up wild cabals that are running double secret malware attacks against themselves inside Google.

You made that up, not me. You should have tried to read what you were being told first.

[–] [email protected] 7 points 2 weeks ago* (last edited 2 weeks ago)

End to end could still, especially with a company like Google, include data collection on the device. They could even "end to end" encrypt the side channel that sends it to Google. If you want to be generous, they would perform the aggregation on-device and not track the content verbatim, but the point stands: E2E is no guarantee of privacy. You have to also trust that the app itself isn't recording metrics, and I absolutely do not trust Google not to do this.

They make so much of their money from profiling and ads. No way they're not going to collect analytics. Heck, if you use the stock keyboard, that's collecting analytics about the texts you're typing into Signal, never mind Google's RCS.
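Here's a little Python sketch of that concern, purely hypothetical and not a claim about what Google Messages actually does: once the app has decrypted a message, nothing cryptographic stops it from shipping metrics (or worse) out a side channel.

```python
# Hypothetical untrustworthy client: E2EE protects the message in transit,
# but the receiving app holds the plaintext after decryption and can report
# anything it likes through a side channel. Not a claim about any real app.
from dataclasses import dataclass

@dataclass
class IncomingMessage:
    sender: str
    ciphertext: bytes

def decrypt(ciphertext: bytes) -> str:
    """Stand-in for the real E2EE decryption step."""
    return ciphertext.decode()  # pretend this used the device-local key

def send_analytics(event: dict) -> None:
    """Stand-in for any side channel (telemetry endpoint, bundled services, ...)."""
    print("uploading:", event)

def handle(msg: IncomingMessage) -> str:
    text = decrypt(msg.ciphertext)      # E2EE ends here...
    send_analytics({                    # ...and nothing cryptographic stops
        "sender": msg.sender,           # the app from phoning home with
        "length": len(text),            # metadata or even derived content.
        "keywords": [w for w in text.split() if len(w) > 6],
    })
    return text

handle(IncomingMessage(sender="alice",
                       ciphertext=b"thinking about buying a treadmill"))
```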

[–] WhyJiffie 6 points 1 week ago (1 children)

End to end is meaningless when the app scans your content and does whatever it wants with it.

[–] ayyy 5 points 1 week ago

For example, WhatsApp and their almost-mandatory “backup” feature.

[–] [email protected] 4 points 1 week ago* (last edited 1 week ago) (2 children)

With end-to-end, what matters is who has the key: you or the provider. And Google could still read your messages before they are encrypted.

[–] sugar_in_your_tea 2 points 1 week ago

Yup, they can read anything you can, and send whatever part they want through Google Play services. I don't trust them, so I don't use Messenger or Play services on my GrapheneOS device.

[–] [email protected] 2 points 1 week ago (2 children)

You have the key, not the provider. They are explicit about this in the implementation.

They can only read the messages before encryption if they are backdooring all Android phones in an act of global sabotage. Pretty high consequences for some low-stakes data.

[–] [email protected] 1 points 1 week ago

I'm pretty sure the key is stored on the device, which is backed up to Google. I can't say for sure whether they back up your keyring or not, but I feel better not using it.

[–] [email protected] 1 points 1 week ago

I mean, Google does, with Play Services.

[–] [email protected] 4 points 1 week ago

Note that this doesn't mean metadata is encrypted. They may not know what you sent, but they may very well know that you message your mum twice a day and who your close friends are, the ones you message often, that kind of stuff. There's a good bit you can do with metadata about messages combined with the data they gather through other services.
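Rough sketch of why that is, just an illustration and not Google's actual wire format: the relay has to see routing fields in the clear to deliver anything, so it can log who talks to whom, when, and how much, without ever touching a key.

```python
# Even with an E2EE body, the relay needs routing metadata in the clear,
# and that metadata alone yields a social graph and activity pattern.
import time
from dataclasses import dataclass

@dataclass
class Envelope:
    sender: str            # visible to the relay
    recipient: str         # visible to the relay
    timestamp: float       # visible to the relay
    size: int              # visible to the relay
    body: bytes            # E2EE ciphertext: opaque to the relay

def log_metadata(envelope: Envelope) -> dict:
    """Everything the relay can record without ever touching a key."""
    return {
        "who": (envelope.sender, envelope.recipient),
        "when": envelope.timestamp,
        "how_much": envelope.size,
    }

e = Envelope("you", "mum", time.time(), 1184, body=b"\x93\x1f...opaque...")
print(log_metadata(e))  # who, when, how often: no decryption needed
```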

[–] [email protected] 3 points 2 weeks ago

You may be right for that particular instance, but I'd still argue caution is safer.

[–] [email protected] 2 points 2 weeks ago (1 children)

Of course our app is end-to-end encrypted! The ends being your device and our server, that is.

[–] [email protected] 5 points 2 weeks ago

It’s end to end to end encrypted!