this post was submitted on 25 Mar 2025
15 points (100.0% liked)

Hacker News

[–] kata1yst 1 points 1 week ago* (last edited 1 week ago)

I don't disagree with the ambiguity here, but let's focus on LLMs.

I don't understand the argument that an LLM is more conscious than traditional software. Take, for example, a spreadsheet with a timed macro.

A spreadsheet with a timed macro stores state in both short-term memory (RAM) and long-term memory (the spreadsheet table), and it accepts many inputs. The macro timer gives it repeated, continuing execution, and a chance to process both past and present inputs with complexity.
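The stateful loop described above can be sketched in a few lines. This is a toy model, not any real spreadsheet API; all names are hypothetical, chosen only to illustrate persistent state plus a repeating tick.

```python
# Toy model of a spreadsheet with a timed macro: long-term state (the
# table), short-term state (a working value), and a repeating "tick"
# that processes past and present inputs together.

class MacroSpreadsheet:
    def __init__(self):
        self.table = []    # long-term memory: persists across ticks
        self.working = 0   # short-term memory: current running value

    def tick(self, new_input):
        # Each timer fire sees the new input AND everything remembered
        # from previous ticks -- continuing, stateful execution.
        self.table.append(new_input)
        self.working = sum(self.table)
        return self.working

sheet = MacroSpreadsheet()
results = [sheet.tick(x) for x in (1, 2, 3)]
# Each tick builds on remembered history: results == [1, 3, 6]
```

The point is that state written on tick N is still there on tick N+1, with no outside help.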

An LLM has limited short-term memory (its current position in the computation, held in VRAM) and no long-term memory. It accepts only a single input and cannot recall past experience to construct a chain of consciousness.
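The contrast can be sketched the same way. This is a stand-in function, not a real inference API; the names are hypothetical. It illustrates the claim that a bare LLM call is a function of its current input alone, and that any "chat memory" lives outside the model, in the client that re-sends the transcript.

```python
# Toy model of a bare LLM call: a pure function of its one input.
# Nothing persists between calls.

def llm_call(prompt):
    # Stand-in for inference: deterministic function of the prompt alone.
    return f"echo({len(prompt)})"

a = llm_call("hello")
b = llm_call("hello")
# Identical input gives identical output: call #1 left no trace.

# Apparent "memory" is maintained by the caller, which concatenates
# the whole transcript and feeds it back in as the single input.
history = []

def chat_turn(user_msg):
    history.append(user_msg)
    return llm_call(" ".join(history))  # model still sees one input
```

So whatever continuity a chat session appears to have is bookkeeping done by the wrapper, not recall by the model.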

The spreadsheet is objectively more likely to be conscious than the LLM. But no one argues for spreadsheet consciousness, because it's ridiculous. People argue for LLM consciousness simply because we sentient, messy meat bags are evolutionarily programmed to construct theories of mind for everything, and in doing so we mistakenly personify things that remind us of ourselves. LLMs simply feel "alive-ish" enough to pass our evolutionary sniff test.

But in reality, they're less conscious than the spreadsheet.