Ask Lemmy
A Fediverse community for open-ended, thought-provoking questions
The "Blockchain" technology is gonna become crucial in the future of AI and Deepfakes.
Since videos, and especially still images, can be faked, they would be treated just like witness testimony: evidence that can be falsified.
What I think will happen is that people will have to use a live internet connection to verify a video.
So whenever a video is recorded, there will be a "blockchain verify" feature in the camera settings. When it's enabled, either the video feed or the hashes of the chunks of video data are sent to a blockchain network, where they get timestamped and stored permanently.
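As a rough sketch of the camera-side step (the chunk size, record format, and file name here are my own assumptions, not any real camera API), it could boil down to hashing the recorded file and bundling the digest with a timestamp for submission:

```python
import hashlib
import time

CHUNK_SIZE = 1 << 20  # hash the file in 1 MiB chunks


def hash_video(path: str) -> str:
    """Compute a SHA-256 digest of the raw video file, chunk by chunk."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            digest.update(chunk)
    return digest.hexdigest()


def build_timestamp_record(path: str) -> dict:
    """Bundle the hash with a capture time; this record is what the camera
    would submit to the verification network (the actual submission and
    the on-chain format are out of scope here)."""
    return {"sha256": hash_video(path), "recorded_at": int(time.time())}


# Example use (the file name is hypothetical):
#   record = build_timestamp_record("dashcam_clip.mp4")
```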
The network would consist of various nodes that ideally aren't government-run. Think organizations like the ACLU or the EFF, journalists, or anyone who independently wants to join the network, each running their own node.
So any time there's suspicion that a video may be faked, the courts can ask the nodes to send their own copies of the chain. If there's consensus, the video can be proven to have existed at the timestamped moment, so there's no way to create fake video evidence after an incident: you won't have a matching timestamp on the blockchain.
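Here's a minimal sketch of what that court-side check could look like, assuming each node exposes its ledger as a plain mapping from video hash to timestamp (the function name, ledger format, and simple majority rule are my own simplifications, not a real network's API):

```python
from collections import Counter
from typing import Optional


def consensus_timestamp(
    video_sha256: str,
    node_ledgers: list[dict[str, int]],
    quorum: float = 0.5,
) -> Optional[int]:
    """Return the timestamp a majority of independent nodes recorded for
    this hash, or None if no consensus exists."""
    votes = Counter(
        ledger[video_sha256] for ledger in node_ledgers if video_sha256 in ledger
    )
    if not votes:
        return None  # no node has ever seen this hash
    timestamp, count = votes.most_common(1)[0]
    return timestamp if count > len(node_ledgers) * quorum else None


# Example: three independent nodes, two agree on when the clip was registered.
ledgers = [
    {"ab12...": 1700000000},
    {"ab12...": 1700000000},
    {"ab12...": 1700009999},  # one out-of-sync or tampered node
]
print(consensus_timestamp("ab12...", ledgers))  # -> 1700000000
```

If the disputed file's hash has a consensus timestamp from the time of the incident, it can't have been fabricated later.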
How does this protect the blockchain from someone just uploading hashes of AI-generated video, though?
It doesn't prevent every falsification. It makes it much harder.
Say the incident is a car crash, and assume dash cams also get this blockchain feature. Once the crash has happened, you can't fake a video afterwards that makes it look like the other person hit you first. And if you try to preemptively fake a video, you can't know every possible road, road sign, or nearby car, or what vehicle the other person will be driving; basically, you can't predict everything that will be at the scene before the incident occurs.
Imagine you hit a truck at 5 PM heading west on Road 27 at its intersection with Road 52. You'd have to know beforehand which road the incident would occur on, the position of the sun (it's 5 PM and you're heading west, remember), the road signs, how wide the road is and how many lanes it has, the fact that the other vehicle is a truck, what the truck looks like, etc.
You'd have to create so many fake collision videos that, when an incident actually happens, you could hope one of them matches the situation, then quickly find that video and send it to the blockchain.
Not to mention, the other person could have a dashcam video without any discrepancies, and any slight discrepancy in your faked video would make the court believe their recording over yours.
It's not impossible to fake something, but it's hard to do it before the thing happens.
It still doesn't prevent me from making a camera that generates AI video with real timestamps and a real blockchain record, and that, instead of capturing pixels from the lens, generates them from a prompt like "politician takes a bribe, etc., etc."