this post was submitted on 23 Mar 2025
you are viewing a single comment's thread
They are free to peer review the test and do it with all the stuff enabled.
That is how science works.
But I doubt they will, since this is an inherent problem with camera-only vision, not with the car's software. And they most likely know it.
I will point out that I don't think "peer review" means repeating the test; it means more generally pointing out issues with the science, right? By that definition, it sounds like that's exactly what they're doing. That doesn't make the criticisms inherently valid, but dismissing them with "they're free to do their own tests, that is how science works" seems dishonest.
Peer review usually means repeating the test and comparing results with the original paper. If peer review can't get the same results, it means that the first study was faulty or wasn't described accurately.
Humans also operate on "camera vision" only in that we see visible light and that's it. Adding lidar to the system should improve performance over human capability, but camera vision with good enough software (and this is way easier said than done) ought to be able to match human capability. Whether Tesla's is good enough in FSD mode I have no idea because I have no intention to ever buy one and testing this in a rental is uh... risky, given that they tend to have onboard cameras.
Of course, if Tesla's "FSD" branded driver assist suite is actually good enough to beat this test, I reckon Tesla would be quick to prove it to save their own reputation. It's not that hard to reproduce.
Not just good enough software. Also good enough cameras and good enough processing power, none of which currently match humans, so this is not a valid argument.
The camera only system is just worse at everything.
https://www.adafruit.com/product/4058?gQT=1
These are extremely, EXTREMELY reliable at detecting physical obstructions. There is no reason but stupidity or cheapness not to build redundancy into a safety system. This isn't about implementing "good enough" software. This is about a design choice forced on Tesla engineers by a complete idiot who doubles down on his stupidity when faced with criticism from actually intelligent people.
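The redundancy argument boils down to a simple fusion rule: if you have independent sensors, act on the most conservative reading, so that one fooled sensor (say, a camera looking at a painted backdrop) can't suppress a real detection. A minimal sketch of that idea, with all names and thresholds purely illustrative and not taken from any real vehicle stack:

```python
# Illustrative sketch of conservative sensor fusion for obstacle braking.
# All function names and thresholds are hypothetical, for explanation only.

BRAKE_DISTANCE_M = 5.0  # illustrative emergency-braking threshold


def obstacle_distance(camera_est_m, ultrasonic_m, lidar_m=None):
    """Fuse independent distance estimates conservatively.

    Trust the closest reported obstacle: a single fooled or failed
    sensor cannot override another sensor that sees something nearby.
    """
    readings = [r for r in (camera_est_m, ultrasonic_m, lidar_m) if r is not None]
    return min(readings)


def should_brake(camera_est_m, ultrasonic_m, lidar_m=None):
    """Brake if any sensor reports an obstacle inside the threshold."""
    return obstacle_distance(camera_est_m, ultrasonic_m, lidar_m) < BRAKE_DISTANCE_M


# Camera fooled by a painted wall reports 50 m, ultrasonic reports 2 m:
print(should_brake(camera_est_m=50.0, ultrasonic_m=2.0))  # brakes anyway
```

With a camera-only system there is nothing to take the minimum over, which is the whole point: the software can be arbitrarily good and still has a single point of failure.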