bitfucker

joined 1 year ago
[–] [email protected] 3 points 9 months ago

The same could be said for any other distro. I think his point is that when shit just works, there's no real difference between distros, be it Arch, Debian, Ubuntu, Mint, Fedora, or Gentoo.

[–] [email protected] 4 points 9 months ago

I am his distant cousin

[–] [email protected] 9 points 9 months ago

Anything not advertised as E2EE can be assumed to have some 3rd party able to look at the conversation, malicious or not.

[–] [email protected] 2 points 9 months ago

Man, and here some people are literally struggling due to the lack of dopamine just because their brains are built differently.

[–] [email protected] 3 points 9 months ago (2 children)

Where do you shit?

[–] [email protected] 10 points 9 months ago

Your taxes have been received. Have a great day!

[–] [email protected] 15 points 9 months ago (2 children)

Pay your due tax please

[–] [email protected] 11 points 9 months ago (1 children)

You mean interaction right? ...right?

[–] [email protected] 4 points 9 months ago

Maybe it's got something to do with his username

[–] [email protected] 3 points 9 months ago

Has anyone watched Babish Culinary Universe and seen him drunk on Gatorwine? That could be surprisingly good... or not.

[–] [email protected] 5 points 9 months ago

Research and development is tricky because you never know how much more progress you need before reaching a satisfying result.

[–] [email protected] 2 points 9 months ago

*Rant about the beginning of the article ahead

Why in the name of god did they try to bring an LLM into the picture? Plain AI/ML is good enough for predictive maintenance tasks, but noooo, it has to be an LLM. If they want to be specific, then don't be misleading: I think what they actually mean is the attention layer/operation commonly used in LLMs, applied to capture time series data. I understand that recurrent-style neural networks and LSTMs have their limitations, and I agree that exploring attention for time series data is interesting research, but an LLM? Just no.
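
To make it concrete, here's roughly what I mean by "the attention operation, without the LLM": a minimal PyTorch sketch (the tensor shapes, sensor count, and the remaining-useful-life head are all made up for illustration) that applies plain self-attention over a window of sensor readings. That's the only transformer piece a predictive maintenance model would plausibly need.

```python
# Minimal sketch, not from the article: self-attention over a sensor time
# series window, no language model involved. All dimensions are illustrative.
import torch
import torch.nn as nn

batch, window, n_sensors, d_model = 8, 128, 6, 64

# Fake batch of sensor readings: (batch, time steps, sensor channels)
x = torch.randn(batch, window, n_sensors)

# Project raw sensor channels into a model dimension
proj = nn.Linear(n_sensors, d_model)
h = proj(x)                          # (batch, window, d_model)

# Plain self-attention over the time axis -- the part actually borrowed
# from transformer/LLM architectures
attn = nn.MultiheadAttention(embed_dim=d_model, num_heads=4, batch_first=True)
context, _ = attn(h, h, h)           # (batch, window, d_model)

# Pool over time and predict a single score, e.g. remaining useful life
head = nn.Linear(d_model, 1)
rul_estimate = head(context.mean(dim=1))   # (batch, 1)
print(rul_estimate.shape)
```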
