this post was submitted on 01 Aug 2023
18 points (90.9% liked)

LocalLLaMA


Welcome to LocalLLaMA! Here we discuss running and developing machine learning models at home. Let's explore cutting-edge open-source neural network technology together.

Get support from the community! Ask questions, share prompts, discuss benchmarks, and get hyped about the latest and greatest model releases! Enjoy talking about our awesome hobby.

As ambassadors of the self-hosting machine learning community, we strive to support each other and share our enthusiasm in a positive, constructive way.

submitted 2 years ago* (last edited 5 months ago) by [email protected] to c/localllama
 

(Deleted: no longer relevant)

[–] Saledovil 0 points 2 years ago

DRM on the chip doesn't seem feasible to me. In the end, the chip doesn't know what it's doing; it just does math. So how could DRM at that level recognize that it's running a forbidden model, or that a jailbreak prompt is being executed? Figuring out what a program does is already non-trivial when you have the source code, and the chip's DRM wouldn't even have that; it would only see a stream of low-level instructions.
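
To make the point concrete, here's a minimal sketch (NumPy, with made-up function names, purely illustrative): a simplified attention step from a hypothetical "forbidden" model and an arbitrary "permitted" computation both reduce to the same primitive the silicon actually executes, general matrix multiplies. Nothing at the instruction level distinguishes them:

```python
import numpy as np

def attention_layer(x, wq, wk, wv):
    """One simplified self-attention step of a 'forbidden' language model."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    # Softmax over the last axis, written out as plain array math.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def innocuous_workload(x, a, b, c):
    """Arbitrary 'permitted' number crunching with the same mix of operations."""
    return ((x @ a) @ (x @ b).T) @ (x @ c)

rng = np.random.default_rng(0)
d = 64
x = rng.standard_normal((8, d))
ws = [rng.standard_normal((d, d)) for _ in range(3)]

# From the chip's perspective, both calls dispatch the same primitive:
# GEMM (general matrix multiply) over float arrays. Nothing here says "LLM".
attention_layer(x, *ws)
innocuous_workload(x, *ws)
```

Any DRM scheme on the chip would have to infer intent from that undifferentiated stream of multiply-adds, which is exactly the hard program-analysis problem described above.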