But you can use it with the OpenAI or Google API.
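For instance, a rough sketch of what that configuration might look like, assuming the llm library's `llm-openai` and `llm-gemini` providers; the key and model names below are placeholders, not defaults:

```elisp
;; A sketch, not the canonical config: assumes the llm library's
;; OpenAI and Gemini providers; key and model names are placeholders.
(require 'llm-openai)
(setopt ellama-provider
        (make-llm-openai
         :key "YOUR-OPENAI-KEY"        ; placeholder: your own key
         :chat-model "gpt-4o-mini"))   ; placeholder: any model you can access

;; Or, for the Google API:
;; (require 'llm-gemini)
;; (setopt ellama-provider (make-llm-gemini :key "YOUR-GEMINI-KEY"))
```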
s-kostyaev
Most Chromebooks have very weak hardware. I don't think it will work fast enough to be useful.
Not now. And personally, I don't think it would be useful. But I can see how it could be done. In that case, the hardest part is collecting good context for completion. Without it, this would be a dumb T9.
Sure. You can use the functions without double dashes as the public API. If you want something specific, open an issue, or you can even use the llm library directly if you want more control.
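To illustrate the convention, and what going through the llm library directly might look like, here is a sketch assuming its `llm-chat` and `llm-make-chat-prompt` entry points (the model name is just an example):

```elisp
;; Public API: no double dash in the name, e.g. M-x ellama-chat.
;; Internal helpers follow the ellama--... convention and may change.

;; A sketch of calling the llm library directly for more control,
;; assuming an Ollama model named "zephyr" is available locally.
(require 'llm)
(require 'llm-ollama)

(let ((provider (make-llm-ollama :chat-model "zephyr")))
  ;; llm-chat sends the prompt synchronously and returns the reply.
  (llm-chat provider (llm-make-chat-prompt "Explain monads briefly.")))
```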
I like large language models like ChatGPT, but I prefer to run them locally on my own hardware. There is a cool project that helps with this: https://github.com/jmorganca/ollama. I prefer an Emacs interface for interacting with text, so I created this project. All components are open source. Feedback is welcome.
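A minimal setup sketch, assuming Ollama is running locally with its default settings and that you have already pulled the model named below (`zephyr` is just an example tag):

```elisp
;; A minimal sketch: assumes a local Ollama server and that
;; `ollama pull zephyr` (or another model of your choice) was run.
(use-package ellama
  :init
  (require 'llm-ollama)
  (setopt ellama-provider
          (make-llm-ollama :chat-model "zephyr")))
```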
See also [this reply](https://github.com/s-kostyaev/ellama/issues/13#issuecomment-1807954046) about using ellama on weak hardware:
You can try: