I'm sure you've heard it before: "X game runs like shit and the devs are using [insert tool] to cover their laziness/bad code." While I don't always agree, I do understand that optimization is dying a slow death, and it's made me wonder: just how much electricity is wasted on bad code? I often notice things that should be fast, like Task Manager in Windows, running far slower than they have any right to. Even ultra-lightweight programs are slow to open. Seriously, why does the Windows Calculator take any more than a tenth of a second to open? It's a damn calculator. Calculators were literally the first computers ever, and despite modern chips being billions of times faster, the app can still struggle to start up from an SSD.
I have heard this from computer experts too (real computer experts like old-school OS engineers, not armchair YouTube coders). So much code is slow and unoptimized because devs just lean on raw compute power and say "eh, fast enough" about a thousand different things, and all those little shrugs add up to software that crawls.
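To put rough numbers on that idea (these are purely made-up illustrative figures, not measurements of any real program), here's a minimal sketch of how many individually "fast enough" costs stack into a delay you can actually feel:

```python
# Purely hypothetical numbers for illustration: many small "fast enough"
# costs compound into a delay the user actually notices.

steps = 1000          # hypothetical number of small operations during startup
cost_per_step_ms = 2  # hypothetical cost of each one, in milliseconds

total_ms = steps * cost_per_step_ms
print(f"{steps} steps x {cost_per_step_ms} ms each = {total_ms} ms (~{total_ms / 1000:.1f} s)")
# -> 1000 steps x 2 ms each = 2000 ms (~2.0 s)
# No single step looks worth optimizing, but together they're the gap
# between "instant" and "why is this taking so long?"
```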
With the rise of A.I., the problem has gotten dramatically worse. How many articles have been written with headlines like "every ChatGPT question burns X watt-hours of electricity"? That raw compute doesn't come free. Microsoft is reportedly turning to nuclear power plants just to run its A.I. systems, and I imagine Google, OpenAI, Apple, etc. are looking to do the same. Imagine if optimization sold software the way pretty colors and convenience do. I'd be willing to bet our objectively mega-powerful hardware would finally feel as fast as it really is.
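For a rough sense of scale (again, purely hypothetical placeholder numbers, not real measurements of any provider):

```python
# Back-of-envelope only, with made-up inputs: even a tiny energy cost per
# query becomes a continuous multi-megawatt draw at internet scale.

wh_per_query = 0.3               # hypothetical watt-hours per AI query
queries_per_day = 1_000_000_000  # hypothetical daily query volume

daily_kwh = wh_per_query * queries_per_day / 1000   # Wh -> kWh
average_mw = daily_kwh / 24 / 1000                  # kWh/day -> average MW
print(f"{daily_kwh:,.0f} kWh per day, a continuous draw of about {average_mw:.1f} MW")
# -> 300,000 kWh per day, a continuous draw of about 12.5 MW
# And that's serving alone, before training runs or the overhead of all the
# unoptimized software wrapped around the model.
```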