this post was submitted on 18 Aug 2024

Cybersecurity


Copilot Autofix, a new addition to the GitHub Advanced Security service, analyzes vulnerabilities in code and offers code suggestions to help developers fix them.
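To make the article blurb concrete: the class of problem such a tool targets can be illustrated with a classic SQL injection flaw and its standard remedy. This is a hand-written sketch of a vulnerable function and its parameterized fix, not actual Copilot Autofix output; the function names and schema are invented for illustration.

```python
import sqlite3

def find_user_unsafe(conn, name):
    # Vulnerable: string interpolation lets user input become SQL.
    return conn.execute(
        f"SELECT id FROM users WHERE name = '{name}'"
    ).fetchall()

def find_user_fixed(conn, name):
    # The typical suggested fix: a parameterized query keeps the
    # input as data, so it can never rewrite the query itself.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (name,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

# A classic injection payload dumps every row through the unsafe query...
payload = "x' OR '1'='1"
print(find_user_unsafe(conn, payload))  # [(1,)]
# ...but matches nothing once the query is parameterized.
print(find_user_fixed(conn, payload))   # []
```

The appeal of tooling like this is that it proposes the second form automatically when static analysis flags the first.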

top 4 comments
[–] [email protected] 22 points 4 months ago (3 children)
[–] [email protected] 7 points 4 months ago

Autofix has now corrected your sentence to:

"We're all going to die."

This is now a perfectly correct sentence in every way.

Thank you for using Autofix.

[–] [email protected] 4 points 4 months ago

True, but unrelated. LLMs aren't sentient; they're just a tool that's useful at times.

[–] [email protected] 0 points 4 months ago

Please point to where the language model hurt you