planish

joined 2 years ago
[–] planish 5 points 2 years ago

Yeah, stuff streams live to the client as it arrives on the server, and the client isn't putting it in the right place in the displayed list, just dumping it at the top.

I heard the Lemmy devs are abandoning their whole websocket-based way of having the client and server interact, so that would probably sort it out if it ever happens.

[–] planish 10 points 2 years ago* (last edited 2 years ago) (1 children)

Some mods are permanently or "indefinitely" closing, some are opening their subs back up tomorrow.

People here are on board with Reddit being over; that's why many of them have jumped ship to new best friend Lemmy.

But, like, digg.com still exists. Only time will tell whether anywhere will really become the next place to be, or whether Reddit will somehow recover its coolness.

[–] planish 2 points 2 years ago (1 children)

What happened to the drivers for the old cards to make them bad?

[–] planish 1 points 2 years ago (2 children)

That is one powerful book that dude is reading.

How inclined is the model to make people who are reading Black dudes, versus people doing other activities? I have noticed generated people being almost uniformly white unless specified to be otherwise.

[–] planish 1 points 2 years ago (1 children)

I love it, I want to acquire this item.

What does it do?

[–] planish 4 points 2 years ago (1 children)

The rotation thing is probably the metadata being stripped too. Instead of encoding the image the right way up, or re-encoding when someone rotates it, phones will often "rotate" an image very quickly by just adding some metadata saying which way up it should be drawn. If that metadata is dropped and the image isn't re-encoded the right way up at the same time, the image goes back to being drawn in whatever orientation it came off the sensor.
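The tag in question is the EXIF Orientation tag (0x0112). A minimal sketch of what a viewer is supposed to do with it (the mapping is standard EXIF; the function and variable names here are just for illustration):

```python
# EXIF Orientation (tag 0x0112) tells the viewer how to transform the raw
# sensor pixels before display. Values 1-8 are defined by the EXIF spec.
EXIF_ORIENTATION_TAG = 0x0112

# orientation value -> (mirror horizontally first?, then rotate clockwise by degrees)
ORIENTATION_TRANSFORMS = {
    1: (False, 0),    # normal
    2: (True, 0),     # mirrored
    3: (False, 180),
    4: (True, 180),
    5: (True, 270),
    6: (False, 90),   # rotated 90 CW: the common case for portrait phone shots
    7: (True, 90),
    8: (False, 270),
}

def display_transform(orientation: int) -> tuple[bool, int]:
    """Return the (mirror, clockwise rotation) a viewer must apply.

    If the tag has been stripped, viewers effectively fall back to 1
    (no transform), so the image shows up in raw sensor orientation --
    which is exactly the sideways-photo symptom described above.
    """
    return ORIENTATION_TRANSFORMS.get(orientation, (False, 0))
```

So a stripped tag doesn't "rotate" the image; it just loses the note telling the viewer to un-rotate it.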

[–] planish 3 points 2 years ago (3 children)

It's a portable, standard-C++, CPU-based implementation of the code to do inference (i.e. text generation) with the LLaMA language model. You get a command-line tool that takes text and the model and eventually outputs more text.

You could do the same thing and run Stable Diffusion off of the CPU at some relatively slow speed, but I don't know if anyone has code for it.

[–] planish 4 points 2 years ago (1 children)

If you have control of your instance's backend, that's definitely the better approach.

Over on AT Protocol they have a system of pluggable algorithms. You can publish info about an "algorithm" to your account's profile, and people can add your algorithm as a feed on their homepage. Then, when they request their homepage, their home server contacts your server, tells it a bit about them, and you send back a feed of posts.

So people can customize the post selection and share those customizations without the instance/home server admin needing to be in on it.
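The shape of the idea, as a toy sketch (this is NOT the actual AT Protocol lexicon or API; the function, field names, and selection rule are all made up for illustration):

```python
# Toy feed-server sketch: the home server asks your feed server for a page
# of posts on behalf of a user, and your server returns an ordered list of
# post references. The "algorithm" here is trivially "newest posts the
# requester didn't write" -- the point is that the selection logic lives
# on your server, not on the requester's home server.

def serve_feed(requester: str, all_posts: list[dict], limit: int = 3) -> list[str]:
    """Return post URIs for the requester, newest first."""
    chosen = [p for p in all_posts if p["author"] != requester]
    chosen.sort(key=lambda p: p["posted_at"], reverse=True)
    return [p["uri"] for p in chosen[:limit]]
```

The home server would then hydrate those references into full posts for display.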

[–] planish 3 points 2 years ago

Another vote for the AI Horde people. They seem to be a legit volunteer operation not about to turn startup, and the kudos economy they are running is pretty neat and well thought out.

[–] planish 1 points 2 years ago (5 children)

Any news of like a llama.cpp equivalent for SD? It would be handy to be able to slowly run it without a GPU, and maybe competitive with other free options in terms of images generated per day.

[–] planish 4 points 2 years ago (3 children)

You might have to do this client-side. Stash some data in local storage about which communities to downweight. Then look at the page, find all the posts, pull out each community name, and check whether it ought to be downweighted. Then do something like hash the title, turn the hash into a float from 0 to 1, compare it with the fraction of posts from that community you want to see, and if it fails, hide the post with CSS.

[–] planish 2 points 2 years ago (1 children)

But you still either have to have each person ship their posts directly to everyone who wants to see them, or else you have someone out there gasp operating a social media service without a license. And who knows who could be 12 and in Utah.
