this post was submitted on 28 Nov 2023

Emacs

Imagine selecting a paragraph and having ChatGPT automatically correct grammar and spelling errors, then seamlessly replacing it in your buffer. Well, imagine no more - it's now a reality!

Here's what it does:

  • Selects the current paragraph in Emacs.
  • Sends it to ChatGPT for a grammar and spelling check.
  • Replaces the original text with the corrected version, all within Emacs.
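The flow above can be sketched in a few lines of Emacs Lisp. This is only an illustration, not the author's gist: `my/llm-proofread` is a hypothetical helper standing in for whatever chatgpt-shell (or other backend) call actually sends the prompt and returns the reply.

```elisp
;; Sketch of the paragraph-correction flow. `my/llm-proofread' is a
;; hypothetical function that takes a prompt string and returns the
;; model's reply as a string; wire it to your backend of choice.
(defun my/proofread-paragraph ()
  "Send the paragraph at point to an LLM and replace it with the result."
  (interactive)
  (save-excursion
    (let* ((beg (progn (backward-paragraph) (point)))
           (end (progn (forward-paragraph) (point)))
           (text (buffer-substring-no-properties beg end))
           (fixed (my/llm-proofread
                   (concat "Fix grammar and spelling; reply with only "
                           "the corrected text:\n" text))))
      (delete-region beg end)
      (goto-char beg)
      (insert fixed))))
```

Bind it to a key (e.g. `(keymap-global-set "C-c p" #'my/proofread-paragraph)`) and the whole select-send-replace cycle becomes one keystroke.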

Inception:

The other night I read a post on X saying that LLMs would be used to enhance word prediction for texting on phones. I thought another interesting application would be bringing easy spelling and grammar fixes to whatever I'm editing in Emacs.

It's not flawless, but in my experience, it's all I need.

Here's a video example: https://youtu.be/hrhoNE2M9Qw

Here's the gist: https://gist.github.com/ckopsa/c55bf8cc25df8a4a87c6993bdce3573e

It leverages chatgpt-shell, found here: https://github.com/xenodium/chatgpt-shell
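If you want to try chatgpt-shell yourself, a minimal setup looks roughly like the following (based on the variable `chatgpt-shell-openai-key` from its README; verify against the current docs). The key is read from `~/.authinfo` via auth-source rather than hard-coded:

```elisp
;; Minimal chatgpt-shell setup (assumes the package is installed,
;; e.g. from MELPA). The API key is fetched lazily from auth-source.
(use-package chatgpt-shell
  :ensure t
  :custom
  (chatgpt-shell-openai-key
   (lambda ()
     (auth-source-pick-first-password :host "api.openai.com"))))
```

With a line like `machine api.openai.com password sk-...` in `~/.authinfo`, `M-x chatgpt-shell` then drops you into an interactive session.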

[–] [email protected] 1 points 11 months ago (1 children)

This and other things are also possible with ellama. It also works with local models.
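For the local-model route, ellama's README points it at a model served by Ollama via the llm package. A sketch along those lines (the model name "zephyr" is just an example; pull it first with `ollama pull zephyr`, and check ellama's README for current options):

```elisp
;; Sketch: pointing ellama at a local model served by Ollama,
;; so no remote API is involved.
(use-package ellama
  :ensure t
  :init
  (require 'llm-ollama)
  (setopt ellama-provider
          (make-llm-ollama :chat-model "zephyr"
                           :embedding-model "zephyr")))
```

Once configured, ellama's commands (grammar improvement among them) run against the local model instead of a hosted API.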

[–] [email protected] 1 points 11 months ago (4 children)

Yeah, I'd be eager to try and see if it makes the response faster without sacrificing quality. Are there models right now that have decent output running on something like a Chromebook?

[–] [email protected] 1 points 11 months ago

But you can use it with the OpenAI or Google APIs.

[–] [email protected] 1 points 11 months ago

Most Chromebooks have very weak hardware. I don't think it will run fast enough to be useful.

[–] [email protected] 1 points 11 months ago

Orca Mini 3B, maybe?

[–] [email protected] 1 points 11 months ago

See also [this reply](https://github.com/s-kostyaev/ellama/issues/13#issuecomment-1807954046) about using ellama on weak hardware:
You can try: