Hardware · posted 29 Nov 2023

Assuming the training software could run on the hardware and that we could distribute the load as we do in 2023, would it be possible to train a modern LLM on hardware from 1985?

DannyBoy · 1 point · 11 months ago

The fastest computer in 1985 was the Cray-2 supercomputer, at 1.9 gigaflops. GPT-3 can reportedly be trained on 1024 A100 GPUs in 34 days*, and a single A100 peaks at 312 teraflops, so that cluster delivers roughly 320 petaflops, about 170 million times the Cray-2's throughput. At that ratio, the same training run would take on the order of 15 million years, so no, it couldn't be done in 1985 even given the entire year. There's also the storage problem: the training corpus didn't exist in digital form back then, and neither did the capacity to hold it.
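
For anyone who wants to check the arithmetic, here's the back-of-envelope calculation as a quick Python sketch. It uses the peak figures quoted above and assumes, unrealistically, 100% utilization on both machines; any real-world inefficiency only makes the 1985 case worse:

```python
# Back-of-envelope: how long would the GPT-3 training run take on a Cray-2?
# Peak figures as quoted above; 100% utilization assumed on both machines.

A100_FLOPS = 312e12      # peak FP16 tensor throughput of one A100
NUM_GPUS = 1024          # cluster size in the ~34-day GPT-3 estimate
TRAIN_DAYS_2023 = 34     # reported training time on that cluster
CRAY2_FLOPS = 1.9e9      # peak throughput of the 1985 Cray-2

cluster_flops = A100_FLOPS * NUM_GPUS          # total cluster throughput
speed_ratio = cluster_flops / CRAY2_FLOPS      # how many times faster than the Cray-2
cray2_days = TRAIN_DAYS_2023 * speed_ratio     # same FLOP budget at Cray-2 speed

print(f"A100 cluster: {cluster_flops:.2e} FLOPS")                 # ~3.2e17
print(f"Speedup over Cray-2: {speed_ratio:.2e}x")                 # ~1.7e8
print(f"Cray-2 training time: {cray2_days / 365.25:.2e} years")   # ~1.6e7 years
```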