this post was submitted on 08 Oct 2024
Privacy
I know a certain bank that is doing the same thing. If you call customer service, they use ChatGPT to listen to the entire call, and they recently started capturing voice data so AI can verify they're talking to the customer. Creepy as fuck, and no, customers aren't told this is happening.
I've seen this advertised as a fraud detection and prevention service, even before ChatGPT. I'm assuming there's a standard disclosure that the call may be recorded for training purposes; it's only recently that "training" has included "training AI".
Training then: Making all the new hires sit down and listen to a recording of you getting increasingly frustrated with their dumbass coworker.
Training now: Burning through a neighborhood's worth of power for data processing to ensure that the phone tree understands you with absolute certainty when you say "speak to a representative", yet continues to ignore you anyway.
"You have selected an invalid option. Goodbye." Click
Yeah, but it still should be illegal. I mean, this AI is listening and gathering information about the customer while they discuss private banking matters.
It really depends on how it's being stored and used. Like the other commenter mentioned, it's standard practice in the banking/brokerage industry to record all calls for training/litigation/coaching purposes.
100%
And how long (as if it hasn't already) before the same systems transition to healthcare? AI wants to know all the salacious details, baby.
It doesn't prevent any fraud when anyone on the Internet can now easily recreate anyone's voice using AI. Banks should know better.