this post was submitted on 03 Jun 2024
136 points (100.0% liked)

TechTakes

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

VirtualOdour 1 points 6 months ago

Yeah, but only in one limited way of doing things. It's like how you can't raise water using geometry alone, but there are obviously endless things - lock gates, pumps, etc. - that can be added into a water transport system to raise it.

It is a hard one, though. Even people do the exact same thing LLMs do - the Mandela effect and the inaccuracy of witness testimony are clear examples. Sometimes we don't know that we don't know something, or we're sure we do - visual illusions where our mind fills in the blanks are a similar thing.

The human brain has a few little loops we take things through that are basically sanity checks. Not everyone applies the same level of thinking to what they're saying, though - Alex Jones, Trump, and certain people on Lemmy aren't interested in whether what they're saying is true, only in whether it serves their purpose. It's learnt behavior, and we can construct NNs that contain the same sort of sanity checking - or go a level beyond and have the system, behind the scenes, create a layer of axioms and information points associated with the answer and test them individually against a fact-checking network.
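Purely as an illustration of the shape I mean, here's a minimal Python sketch of that "decompose into claims and test against a fact-checking network" idea. Everything in it - `generate_claims`, `verify_claim`, the scores - is a made-up stand-in, not any real model or API:

```python
# Hypothetical sketch of the "axioms + fact-checking network" idea above.
# generate_claims and verify_claim are invented stubs: in a real system the
# first would prompt an LLM to split an answer into atomic claims, and the
# second would score each claim with a separate checker network.
from dataclasses import dataclass


@dataclass
class ClaimCheck:
    claim: str
    supported: bool
    confidence: float


def generate_claims(answer: str) -> list[str]:
    # Stub: naively treat each sentence as one independently checkable claim.
    return [s.strip() for s in answer.split(".") if s.strip()]


def verify_claim(claim: str) -> ClaimCheck:
    # Stub: a real verifier would query a fact-checking model or a
    # retrieval system; here every claim just "passes" at 0.9.
    return ClaimCheck(claim=claim, supported=True, confidence=0.9)


def sanity_check(answer: str, threshold: float = 0.8) -> bool:
    # Accept the answer only if every extracted claim clears the bar.
    checks = [verify_claim(c) for c in generate_claims(answer)]
    return all(c.supported and c.confidence >= threshold for c in checks)


print(sanity_check("Water flows downhill. Lock gates can raise boats."))
```

The point is just the structure: the answer gets broken into pieces and each piece is checked on its own, instead of trusting the generator's overall confidence.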

It's all stuff we're going to see tried in the upcoming GPT-5. Self-tasking is the next big step to get right: working out the process required to obtain an accurate answer and then working through the steps.
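Again, a rough sketch of the shape of that loop - plan the steps, run them in order - and not of anything GPT-5 actually does; `plan_steps` and `run_step` are invented placeholders:

```python
# Hedged sketch of a "self-tasking" loop: plan the steps needed for an
# accurate answer, then work through them. All names here are illustrative
# placeholders, not any real GPT API.
def plan_steps(question: str) -> list[str]:
    # Stub planner: a real system would ask a model to decompose the
    # question into subtasks.
    return [f"research: {question}", f"draft: {question}", "verify the draft"]


def run_step(step: str, context: list[str]) -> str:
    # Stub executor: a real system would dispatch each subtask to a model
    # or tool, passing earlier results along as context.
    return f"result of ({step}) given {len(context)} earlier results"


def self_task(question: str) -> str:
    context: list[str] = []
    for step in plan_steps(question):
        context.append(run_step(step, context))
    return context[-1]  # the final step's output is the answer


print(self_task("How do lock gates raise boats?"))
```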