Update: OpenAI has more or less specialized on the chat api now, so an interface like https://github.com/benjamin-asdf/chatgpt-shell may be a better fit. I am currently using my fork of chatgpt-shell almost exclusively.
I wanted openai-api stuff in Emacs. Consider this an alpha version where everything might still change. PRs welcome; feel free to suggest new commands.
- Get an api token at https://openai.com/api/ (after 3 months you’d have to pay)
- Load the file `openai-api.el`.
- Check `dev.el`.
- You need to set `openai-api-key`.
You need to adapt the following config to your setup:
```elisp
(use-package openai-api
  :straight nil
  :load-path "~/repos/openai-api.el/"
  :config
  (setq openai-api-key
        (let ((s))
          (lambda ()
            (or s (setf
                   s
                   (auth-source-pick-first-password
                    :host "openai-api"))))))
  (define-key openai-api-keymap (kbd "i")
    (defun mm/insert-todo ()
      (interactive)
      (insert "TODO: ")
      (comment-line 1)))
  (with-eval-after-load 'git-commit
    (define-key git-commit-mode-map (kbd "C-c i") #'openai-current-commit-msg))
  ;; or however you define your keys
  (meow-leader-define-key `("." . ,openai-api-keymap)))
```
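The key is looked up lazily via auth-source and cached in the closure. A minimal sketch of what that lookup expects (the host name "openai-api" is just the value chosen in the config above):

```elisp
;; `auth-source-pick-first-password' matches the :host field against your
;; auth-source entries, so ~/.authinfo (or ~/.authinfo.gpg) needs a line like:
;;
;;   machine openai-api password <your-openai-api-key>
;;
;; You can verify the lookup from M-: or *scratch*:
(auth-source-pick-first-password :host "openai-api") ; => your key, or nil
```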
- Optionally `narrow` your buffer, e.g. to your defun. You usually want to do this (see the sketch after this list).
- Use `openai-api-davinci-edit`.
- The response is put in a buffer called `*openai-edit-<your-buffer-name>*`.
- `ediff-buffers` works nicely for comparing the response and the target buffer. Use `openai-api-edit-ediff-buffers`, bound by default to `C-c C-e` in response buffers.
- Repeat with another `openai-api-davinci-edit` from inside the response buffer.
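Narrowing plus the edit command can be wrapped into a single command if you use this a lot. A minimal sketch, assuming `openai-api-davinci-edit` reads the (narrowed) buffer content when invoked interactively; `my/openai-edit-defun` is just an illustrative name, not part of the package:

```elisp
(defun my/openai-edit-defun ()
  "Narrow to the defun at point, then run `openai-api-davinci-edit' on it."
  (interactive)
  (save-restriction
    (narrow-to-defun)
    (call-interactively #'openai-api-davinci-edit)))
```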
Another usage pattern is to open an empty buffer, then ask it to produce something.

- `switch-to-buffer` "foo"
- Use `openai-api-davinci-edit` with the instruction "Shell command that lists all files with more than 1mb in the current dir"

```sh
#!/bin/bash
find . -type f -size +1M
```
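If the generated one-liner looks right, you can run it straight from the response buffer; a small convenience sketch using only built-in Emacs functions (`my/run-buffer-as-shell-command` is an illustrative name, not part of the package):

```elisp
(require 'subr-x) ; for `string-trim'

(defun my/run-buffer-as-shell-command ()
  "Run the current buffer's contents as a shell command and show the output."
  (interactive)
  (shell-command (string-trim (buffer-string))))
```

More instruction examples that work well with `openai-api-davinci-edit`: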
- “Add a docstring”
- “Remove redundant let”
- In a CSS file, say at a high level: "Make the dropdown text fill the whole dropdown".
- You can use quite high-level language. The model figures out the intent surprisingly well.

Consider trying empty instructions. This is similar to the completion api.
See the examples (for instance, the time complexity one).
I made a command `openai-api-explain-region`. You can add more custom prompts via `openai-api-explain-data-list`.
```elisp
(add-to-list
 'openai-api-explain-data-list
 '("What does this do?"
   .
   (((model . "text-davinci-003")
     (top_p . 1)
     (max_tokens . 64)
     (temperature . 0)))))
```
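You can register several prompts this way. A hedged second entry in the same shape, just with a different prompt string and token budget (the parameter set is copied from the example above, not from package documentation):

```elisp
;; Another entry in the same shape as above; purely illustrative.
(add-to-list
 'openai-api-explain-data-list
 '("What is the time complexity of this?"
   .
   (((model . "text-davinci-003")
     (top_p . 1)
     (max_tokens . 128)
     (temperature . 0)))))
```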
`openai-current-commit-msg` generates a commit message from your current diff. I recommend binding it in `git-commit-mode-map` (see the config example above).
Ask a question, get facts. See the example.
Describe a shell command, get a response buffer. This uses the edit endpoint.
This is meant to be a quick completion function. I first tried to put this into `capf`, but it is not fast enough for that. It is not fleshed out yet, so feel free to play around and let me know if you have a variant that works well.
gpt-4 is supported via the chat api.

```elisp
(setf openai-api-chat-model "gpt-4")
```

As an advanced user, you can get inspired by `openai-api-chat-complete`.
There is a recent Hacker News article with a thread of similar projects.
Other emacs gpt projects, in no particular order:
- https://xenodium.com/a-chatgpt-emacs-shell/
- https://github.com/rksm/org-ai
- https://github.com/CarlQLange/chatgpt-arcana.el
- https://github.com/karthink/gptel
- https://github.com/antonhibl/gptai
- https://github.com/emacs-openai/chatgpt
- https://github.com/emacs-openai/codegpt
- https://github.com/stuhlmueller/gpt.el
- The codex-completion package is not up to date with the api endpoints.
This software is (C) Benjamin Schwerdtner and licenced under the GNU GENERAL PUBLIC LICENSE V3. You can find a copy of the licence at LICENCE in the repo root.