this post was submitted on 29 Nov 2023
1 points (66.7% liked)
Hardware
33 readers
1 user here now
A place for quality hardware news, reviews, and intelligent discussion.
founded 11 months ago
It wouldn’t. Training the neural nets for LLMs is all about brute force, and it’s only in the last few years that training these models has been possible without spending billions. Even going back to 2010, I think it’d be largely infeasible.
The good news is if we fast forward even just a few years, training will become relatively cheap compared to today.
So my collection of about 20 Commodore 64s isn't enough? Do I need another 20? /s
You would need at least 44 more
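For anyone counting along, the joke is just getting the collection up to a fitting round number; a quick sketch of the arithmetic (the variable names are mine):

```python
# Hypothetical tally: the punchline is owning 64 Commodore 64s.
owned = 20    # the ~20 machines from the comment above
target = 64   # matching the "64" in Commodore 64
print(target - owned)  # → 44
```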