planish

joined 2 years ago
[–] planish 7 points 2 years ago

It's only a bad idea if you think you could win concessions with an indefinite strike.

Reddit might get a bunch of subs back tomorrow, but the admins were always going to reopen the good names via reddit request anyway.

And the mods and users aren't likely to go back to happily posting and working for free on a platform that's turned on them. Communities will be planning organized migrations, and a lot of people who came here because of the strike will discover they actually like it better here.

[–] planish 3 points 2 years ago

I think it has a bit of trouble matching requests and responses. A few times I have opened a post and gotten a page rendered for a different post.

Maybe you are getting notifications for someone else's reports.

[–] planish 8 points 2 years ago (5 children)

I think you might be able to subscribe and unsubscribe? I'm not sure why it would federate over the post itself but not the comments.

If subscriptions are getting stuck pending, the community's home instance could be overloaded. Maybe the comments will federate over later.

[–] planish 1 points 2 years ago

Oooh that makes a lot more sense. Never heard of him.

[–] planish 5 points 2 years ago

Yeah, stuff streams live to the client as it arrives on the server, and the client isn't putting it in the right place in the displayed list, just dumping it at the top.

I heard the Lemmy devs are abandoning their whole websocket-based way of having the client and server interact, so that would probably sort it out if it ever happens.
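The fix on the client side isn't complicated in principle. Here's a minimal sketch (not Lemmy's actual code, just an illustration) of inserting a streamed-in comment at its sorted position instead of dumping it at the top:

```python
import bisect

# Displayed comment list, kept sorted by score, highest first.
comments = [{"id": 1, "score": 50}, {"id": 2, "score": 10}]

def insert_sorted(comments, new_comment):
    # Bisect on negated scores so higher-scored comments sort first;
    # this is what "put it in the right place" means, vs. insert(0, ...).
    keys = [-c["score"] for c in comments]
    pos = bisect.bisect_left(keys, -new_comment["score"])
    comments.insert(pos, new_comment)

insert_sorted(comments, {"id": 3, "score": 30})
print([c["id"] for c in comments])  # → [1, 3, 2]
```

The same idea works for any sort order the user has selected, as long as the client knows the sort key.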

[–] planish 10 points 2 years ago* (last edited 2 years ago) (1 children)

Some mods are permanently or "indefinitely" closing, some are opening their subs back up tomorrow.

People here are on board with Reddit being over; that's why many of them have jumped ship to new best friend Lemmy.

But, like, digg.com still exists. Only time will tell whether anywhere will really become the next place to be, or whether Reddit will somehow recover its coolness.

[–] planish 2 points 2 years ago (1 children)

What happened to the drivers for the old cards to make them bad?

[–] planish 1 points 2 years ago (2 children)

That is one powerful book that dude is reading.

How often does the model make people who are learning to read Black dudes, versus people doing other activities? I've noticed people being almost uniformly white unless specified otherwise.

[–] planish 1 points 2 years ago (1 children)

I love it, I want to acquire this item.

What does it do?

[–] planish 4 points 2 years ago (1 children)

The rotation thing is probably the metadata being stripped too. Instead of encoding the image the right way up, or re-encoding it if someone rotates it, phones will often "rotate" an image very quickly by just adding some metadata saying which way up it should be drawn. If that metadata is dropped and the image isn't re-encoded the right way up at the same time, the image goes back to being drawn in whatever orientation it came off the sensor.
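The metadata in question is the standard EXIF Orientation tag (0x0112). A quick sketch of what it tells a viewer to do, and what happens when it's missing:

```python
# EXIF Orientation tag (0x0112): the value tells the viewer which
# transform to apply before drawing the stored pixels.
ORIENTATION_TRANSFORMS = {
    1: "no rotation",
    3: "rotate 180 degrees",
    6: "rotate 90 degrees clockwise",
    8: "rotate 90 degrees counter-clockwise",
}

def display_transform(exif_tags):
    # If the tag was stripped, the viewer draws the pixels exactly as
    # stored -- i.e. however they came off the sensor.
    return ORIENTATION_TRANSFORMS.get(
        exif_tags.get(0x0112), "no rotation (tag missing)"
    )

print(display_transform({0x0112: 6}))  # rotate 90 degrees clockwise
print(display_transform({}))           # no rotation (tag missing)
```

So a re-upload pipeline that strips EXIF for privacy needs to apply this transform to the pixels first, then drop the tag.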

[–] planish 3 points 2 years ago (3 children)

It's a portable, standard-C++, CPU-based implementation of the code to do inference (i.e. text generation) with the LLaMA language model. You get a command line command that takes text and the model and eventually outputs more text.

You could do the same thing and run Stable Diffusion off of the CPU at some relatively slow speed, but I don't know if anyone has code for it.
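To make "takes text and eventually outputs more text" concrete, here's a toy version of the loop such an inference program runs. The "model" is just a lookup table standing in for the real transformer forward pass, so this is an illustration of the shape of the process, not llama.cpp's implementation:

```python
# Toy next-token "model": in reality this is a big neural network
# producing a probability distribution over the vocabulary.
FAKE_MODEL = {"the": "cat", "cat": "sat", "sat": "down"}

def generate(prompt_tokens, max_new=3):
    # Autoregressive loop: predict the next token, append it, repeat.
    tokens = list(prompt_tokens)
    for _ in range(max_new):
        nxt = FAKE_MODEL.get(tokens[-1])
        if nxt is None:  # nothing more to say
            break
        tokens.append(nxt)
    return tokens

print(generate(["the"]))  # → ['the', 'cat', 'sat', 'down']
```

The real thing does exactly this, just with a multi-gigabyte model and a tokenizer on either end, which is why CPU generation is slow but perfectly workable.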

[–] planish 4 points 2 years ago (1 children)

If you have control of your instance's backend, that's definitely the better approach.

Over on AT Protocol they have a system of pluggable algorithms. You publish info on an "algorithm" to your account's profile, and people can add your algorithm as a feed on their homepage. When they request their homepage, their home server contacts your server, tells it a bit about them, and you send back a feed of posts.

So people can customize the post selection and share those customizations without the instance/home server admin needing to be in on it.
