this post was submitted on 25 Feb 2025
movies
Do you have any scientific sources for the claim that we think in binary and that we are deterministic?
I think you may be conflating your philosophical point of view with science.
All Turing-complete models of computation are equivalent, so binary or not is irrelevant. Both silicon computers and human brains are Turing-complete; both can compute all computable functions (given enough time and scratch paper).
If non-determinism even exists in the real world (it clashes with cause and effect in a rather fundamental manner), then the architecture of brains, nay of life as we know it in general, actively works towards minimising its impact. For example, copying the genome has a quite high error rate at first; then error correction is applied, which brings the error rate down to practically zero; then randomness is introduced in strategic places, influenced by environmental factors. When the finch genome sees that an individual does not get enough food, it throws dice at the beak shape, not at the mitochondrial DNA.
It's actually quite obvious in AI models: the reason we can quantise them (essentially rounding every weight of the model so it can run with lower-precision maths, faster and with less memory) is that the architecture is ludicrously resistant to noise, and rounding every number is, from the model's perspective, equivalent to adding noise.
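A toy illustration of that point (a hypothetical random 16×16 layer, not any real model's architecture): quantising the weights adds at most half a quantisation step of rounding noise per weight, and the layer's output barely moves.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small random "network": one linear layer with a tanh nonlinearity.
weights = rng.normal(size=(16, 16))
x = rng.normal(size=16)

def forward(w, x):
    return np.tanh(w @ x)

# Quantise weights to 255 levels (roughly int8-style rounding).
scale = np.abs(weights).max() / 127
quantised = np.round(weights / scale) * scale

full = forward(weights, x)
quant = forward(quantised, x)

# Per-weight rounding error is bounded by half a quantisation step,
# and the quantised network's output stays close to the original.
print(np.max(np.abs(weights - quantised)))   # at most scale / 2
print(np.max(np.abs(full - quant)))          # small output perturbation
```

The same bounded-noise argument is why int8 quantisation of large models usually costs little accuracy: each rounded weight perturbs the output only slightly, and the perturbations largely cancel.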
The deterministic universe is a theory as much as the big bang. We can't prove it, but all of the evidence is there. Thinking in binary is me making a point about how our minds interact with the world. If you break down any interaction to its smallest parts, it becomes a simple yes/no, or on/off, we just process it much faster than we think about it in that sense.
There are various independent, reproducible measurements that give weight to the hot big bang theory as opposed to other cosmological theories. Are there any for the deterministic nature of humans?
Quantum physics is not deterministic, for example. While quantum decoherence explains why macroscopic physical systems are deterministic, can we really say it couldn't play a role in our neurons?
On a slightly different point, quantum bits are not binary, they can represent a continuous superposition of multiple states. Why would our mind be closer to binary computing rather than quantum computing?
The comparison between human cognition and binary isn't meant to be taken literally as "humans think in 1s and 0s" but rather as an analogy for how deterministic processes work. Even quantum computing, which operates on superposition, ultimately collapses to definite states when observed—the underlying physics differs, but the principle remains: given identical initial conditions, identical outcomes follow.
Regarding empirical evidence for human determinism, we can look to neuroscience. Studies consistently show that neural activity precedes conscious awareness of decisions (Libet's experiments and their modern successors), suggesting our sense of "choosing" comes after the brain has already initiated action. While quantum effects theoretically could influence neural firing, there's no evidence these effects propagate meaningfully to macro-scale cognition—our neural architecture actively dampens random fluctuations through redundancy.
The question isn't whether humans operate on binary code but whether the system as a whole follows deterministic principles. Even if quantum indeterminacy exists at the micro level, emergence creates effectively deterministic systems at the macro level. This is why weather patterns, while chaotic, remain theoretically deterministic—we just lack perfect information about initial conditions.
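The "chaotic yet deterministic" point can be made concrete with the logistic map, a standard toy example (not a weather model): identical initial conditions always reproduce exactly the same trajectory, while a perturbation of one part in a trillion grows until the trajectories bear no resemblance to each other.

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate x -> r*x*(1-x), the logistic map in its chaotic regime."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2)            # identical initial condition
c = logistic_trajectory(0.2 + 1e-12)    # perturbed by one part in 10^12

divergence = max(abs(x - y) for x, y in zip(a, c))
print(a == b)        # deterministic: exactly equal, every time
print(divergence)    # chaotic: the tiny perturbation has grown large
```

That is the sense in which a system can be perfectly deterministic and still unpredictable in practice: the rule never changes, but any imprecision in the initial conditions is amplified exponentially.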
My position isn't merely philosophical—it's the most parsimonious explanation given current scientific understanding of causality, neuroscience, and complex systems. The alternative requires proposing special exemptions for human cognition that aren't supported by evidence.
I think this is incorrect: it does collapse to a definite state when observed, but the value of that state is probabilistic. We make it deterministic by taking a large number of measurements and applying a statistical test to the distribution of all those measurements to get a final value. Maybe our brain also runs a test on a statistic of probabilistic measurements, or maybe it doesn't and depends directly on probabilistic measurements, or a combination of both.
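That averaging idea can be sketched directly (a toy model of repeated measurement, not a claim about actual neural coding): each individual yes/no measurement is random, yet the aggregate over many of them is stable and repeatable.

```python
import random

def noisy_measurement(p):
    """One probabilistic yes/no measurement: returns 1 with probability p."""
    return 1 if random.random() < p else 0

def aggregate(p, n=100_000, seed=0):
    """Estimate the underlying value p by averaging many measurements."""
    random.seed(seed)
    return sum(noisy_measurement(p) for _ in range(n)) / n

est1 = aggregate(0.7)
est2 = aggregate(0.7)
print(est1 == est2)            # same seed: the aggregate is reproducible
print(abs(est1 - 0.7) < 0.01)  # and close to the true underlying value
```

With 100,000 samples the standard error is around 0.0014, so the estimate lands within 0.01 of the true value essentially every time; the randomness of single measurements washes out at the aggregate level.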
We also lack fully proven equations, or complete solutions to the equations we do have, in fluid dynamics.
I think parsimony is very much based on personal opinion at this point of knowledge.
You're right about quantum measurement—I oversimplified. Individual quantum measurements yield probabilistic outcomes, not deterministic ones. My argument isn't that quantum systems are deterministic (they're clearly not at the individual measurement level), but rather that these indeterminacies likely don't propagate meaningfully to macro-scale neural processing.
The brain operates primarily at scales where quantum effects tend to decohere rapidly. Neural firing involves millions of ions and molecules, creating redundancies that typically wash out quantum uncertainties through a process similar to environmental decoherence. This is why most neuroscientists believe classical physics adequately describes neural computation, despite the underlying quantum nature of reality.
Regarding fluid dynamics and weather systems, you're correct that our incomplete mathematical models add another layer of uncertainty beyond just initial conditions. Similarly with brain function, we lack complete models of neural dynamics.
I concede that parsimony is somewhat subjective. Different people might find different explanations more "simple" based on their background assumptions. My deterministic view stems from seeing no compelling evidence that neural processes harness quantum randomness in functionally significant ways, unlike systems specifically evolved to do so (like certain photosynthetic proteins or possibly magnetoreception in birds).
The question remains open, and I appreciate the thoughtful pushback. While I lean toward neural determinism based on current evidence, I acknowledge it's not definitively proven.