this post was submitted on 14 Apr 2024
61 points (91.8% liked)
The Lyrics Game
Anybody can post a challenge.
The rules of the game are simple: take any song you like from any genre, put the lyrics through any AI you want (you may weight lyrics as the AI allows), and post the resulting image here under the title Name That Song [GENRE]. After 48-72 hours, or sooner if the answer is guessed, edit the post's title to [Solved][Genre] and put the song title and artist (and optionally any highlighted lyrics) in a spoiler tag in the body of the post.
Other than that, enjoy.
Here is a list of a few AI image generators you can use.
you are viewing a single comment's thread
Oh no! 5 seconds of GPU time on consumer-grade hardware!
It’s not nearly as small as you think it is.
https://www.technologyreview.com/2023/12/01/1084189/making-an-image-with-generative-ai-uses-as-much-energy-as-charging-your-phone/
Even better when you take into account the scale these run at:
https://www.washingtonpost.com/business/2024/03/07/ai-data-centers-power/
Consider the average toaster: roughly 1100W, with toast taking 1-4 minutes to cook (for our purposes we'll split the difference and say 2 minutes).
With math, toasting 1 slice of bread equates to roughly 0.037kWh of electricity. (kWh = (watts × hours) ÷ 1000)
Now I'm running a 7900XTX (OC) whose peak power draw is 800W (300W less than a toaster), and it legit takes 5-10 secs to generate an image. Realistically I might do a couple of runs (some small, then one big one) and use 30 secs of peak compute time. That equates to 0.0067kWh of electricity usage.
Toasting bread quite literally draws way more electricity than it takes for me to generate one AI image.
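If you want to check the arithmetic yourself, here's a minimal sketch of the kWh formula quoted above, using the same figures (1100W toaster for 2 minutes, 800W GPU for 30 seconds of peak draw):

```python
# kWh = (watts × hours) ÷ 1000

def kwh(watts: float, seconds: float) -> float:
    """Energy in kWh for a device drawing `watts` for `seconds`."""
    return watts * (seconds / 3600) / 1000

toast = kwh(1100, 120)  # 1100W toaster, 2 minutes
image = kwh(800, 30)    # 800W GPU, 30 secs of peak compute

print(f"toast: {toast:.4f} kWh")  # ~0.0367 kWh
print(f"image: {image:.4f} kWh")  # ~0.0067 kWh
print(f"the toast costs ~{toast / image:.1f}x the image")
```

Worst case (peak draw for the full 30 seconds), the toast still comes out roughly five times more expensive than the image.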
So are you out there hassling people cooking their morning toast for their criminally high power usage?
Also, some further context for you: I don't use Stable Diffusion XL (the model listed in your article), as old-school 512x512 is more than enough for my needs (as demonstrated in this post^^). Your second article is paywalled (not great to share if ppl can't access it), but it appears to cover data centre use, which, as described above, is not what I'm doing here.
I know exactly how much goes into it, 5 seconds of GPU time, on my own computer. That's why I said it. How many phone charges do you think it would take to fully create a digital drawing on a laptop? It's not going to be much different IME.
You aren’t taking into account the resources required to train the model. You clearly have very little idea how much goes into it other than running software someone else wrote.
Of course I've taken into account model training costs, was that supposed to be your gotcha? You don't actually think the energy cost amortization from training still accounts for the bulk of energy expended per image generated do you?
We aren't training models here, this isn't the training-ai-with-massive-datacentres@lemmy
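To see why amortized training cost per image tends to vanish, here's a purely illustrative sketch. Both numbers are hypothetical placeholders I've picked for the example, not measurements: assume a training run on the order of 50 MWh and half a billion images generated over the model's lifetime.

```python
# Illustrative amortization only — both inputs are hypothetical
# placeholder figures, not measured values.
training_kwh = 50_000           # assumed ~50 MWh for one full training run
images_generated = 500_000_000  # assumed lifetime image count for the model

# Training energy amortized across every image the model ever produces
per_image_training = training_kwh / images_generated  # kWh per image

# The local inference figure from the comment above
per_image_inference = 0.0067  # kWh per image

print(per_image_training)   # 0.0001 kWh
print(per_image_inference)  # 0.0067 kWh
```

Under these placeholder assumptions the training share is a couple of orders of magnitude below the inference cost, and it only shrinks as more images are generated from the same weights.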