ghostwolf

joined 1 year ago
[–] [email protected] 6 points 1 year ago* (last edited 1 year ago) (1 children)

Given that you completely ignore my arguments and replace my thesis with yours, I'm inclined to think that you don't understand all the implications and potential consequences. It's not like you've managed to disprove my point. You simply ignored it.

Moreover, the fact that you don't care about privacy doesn't mean that your data can't be used against you. It can be used, it is being used and it will be used in the future.

[–] [email protected] 7 points 1 year ago (3 children)

You're simplifying the issue down to a set of abstract photos that you claim not to care about, ignoring the broader implications. This tells me that you may not fully understand the complexity of our world, the ease with which you can be manipulated, and the potential consequences of such manipulation. The irony lies in the fact that you are essentially replacing my argument:

Knowing your traits and preferences allows one to tailor a persuasive message specifically to you. This strategy can be used to sell you anything, from a mobile phone to a politician. The implications of such tactics are significant, potentially affecting billions of people.

With your own:

Being afraid that Google had a rogue dick pic that might leak with thousands of others is absurd.

Then declaring it absurd. But in fact, it was your argument, not mine, that you characterized as such :)

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago) (1 children)

On one hand Facebook has never done anything to help anyone not named Zuckerberg.

Hm, React is also open source - it's under the MIT license. A lot of people have jobs and develop or use products made with it. There are probably other good examples that I'm not aware of.

However, here the license is more restrictive:

  1. Additional Commercial Terms. If, on the Llama 2 version release date, the monthly active users of the products or services made available by or for Licensee, or Licensee's affiliates, is greater than 700 million monthly active users in the preceding calendar month, you must request a license from Meta, which Meta may grant to you in its sole discretion, and you are not authorized to exercise any of the rights under this Agreement unless or until Meta otherwise expressly grants you such rights.

I wouldn't say that's a crazy requirement. A lot of businesses could still use it free of charge, because few have 700 million or more monthly active users. Besides, from the given text I'm not sure whether this applies to the current version of the LLM or not.

You can't fork it and change the license. You can't use it to develop another LLM either:

v. You will not use the Llama Materials or any output or results of the Llama Materials to improve any other large language model (excluding Llama 2 or derivative works thereof).

So yeah, while they want to protect their commercial interests and put some restrictions in place, we should discuss the actual license agreement instead of talking about trust and beliefs. To me, it doesn't look bad.

[–] [email protected] 12 points 1 year ago

Indeed, it's quite rare to find someone who isn't concerned about their photos, messages, and other sensitive information potentially being leaked online. Good for you, though I don't believe it's representative. Even so, there are potential risks to consider. With the right information, someone could manipulate, blackmail, or coerce you without you even realizing it. Our brains are subject to numerous biases, making us susceptible to subtle manipulations. Knowing your traits and preferences allows one to tailor a persuasive message specifically to you. This strategy can be used to sell you anything, from a mobile phone to a politician. The implications of such tactics are significant, potentially affecting billions of people.

[–] [email protected] 11 points 1 year ago* (last edited 1 year ago) (8 children)

But if your photos leak, your colleagues could see them. Someone could blackmail you - or do the same with any other sensitive information.

[–] [email protected] 6 points 1 year ago

When you realize that's the death sentence.

[–] [email protected] 159 points 1 year ago* (last edited 1 year ago) (10 children)

Ask them to unlock their phone and give it to you. Chances are, you'll quickly find out they have things they'd like to hide.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

Lemmy already has bots that post articles about JavaScript to the Java community. We're growing fast!

[–] [email protected] 48 points 1 year ago (1 children)

Thanks for sharing your feelings!

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

From my experience, they don't read articles either. There can be huge threads with hundreds of replies asking questions that have already been answered in the linked article. Though this is more common in politics and science. F1 media usually take one or two quotes and add 10 meaningless paragraphs that aren't worth reading - this is where top comments come in handy!

[–] [email protected] 5 points 1 year ago (3 children)

Lol, turns out there's text below this block. I'm so used to most websites putting those 'read more' blocks at the end of the post.

[–] [email protected] 5 points 1 year ago (5 children)

the Italian tyre manufacturer will offer only 11 sets of tyres for the three-day event at the Hungaroring

Down from...?

 

Hi,

I'm looking for something that could generate code and provide technical assistance on a level similar to ChatGPT4 or at least 3.5. I'm generally satisfied with it, but for privacy and security reasons I can't share some details and code listings with OpenAI. Hence, I'm looking for a self-hosted alternative.

Any recommendations? If nothing specific comes to mind, what parameters should I look at in my search? I haven't worked with LLMs before, and there are so many of them. I just know that I could use oobabooga/text-generation-webui to access a model in a friendly way.
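For what it's worth, text-generation-webui can expose an OpenAI-style HTTP API, so a locally hosted model can be queried much like ChatGPT. A minimal sketch of building such a request, assuming the API extension is enabled (the URL, port, and default parameter values below are assumptions - check your own setup):

```python
import json

# Assumed default endpoint of text-generation-webui's OpenAI-compatible API;
# adjust host/port to match your configuration.
API_URL = "http://localhost:5000/v1/chat/completions"

def build_request(prompt, max_tokens=512, temperature=0.7):
    """Build a chat-completion payload for a locally hosted model.

    Nothing leaves your machine until you actually POST this payload,
    which is the point of self-hosting for privacy.
    """
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

payload = build_request("Write a Python function that reverses a string.")
print(json.dumps(payload, indent=2))
```

Sending `payload` to `API_URL` with any HTTP client (e.g. `requests.post(API_URL, json=payload)`) should then return a completion, provided a model is loaded in the web UI.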

Thanks in advance.
