add new readme

This commit is contained in:
Xavier Brinon 2024-01-22 07:22:46 +00:00
parent 2dc3efa3f6
commit f761a6b557
2 changed files with 128 additions and 6 deletions


@@ -1,7 +1,123 @@
#+title: Emacs config and other niceties
#+date: [2024-01-16 Tue]
#+startup: indent
* Org mode shortcuts
** Dates:
- active timestamps :: C-c .
creates an entry in the agenda
- inactive timestamps :: C-c !
doesn't create an entry in the agenda
- link :: [[https://orgmode.org/orgguide.html#Timestamps][timestamps]]
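An active timestamp looks like ~<2024-01-22 Mon>~, an inactive one like ~[2024-01-22 Mon]~.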
** Links:
- create link :: C-c C-l
- link :: [[https://orgmode.org/orgguide.html#Hyperlinks][hyperlinks]]
** Tables
- display cell references :: C-c }
- display the reference for the current cell :: C-c ?
references are =@ROW$COLUMN=
- link :: [[https://orgmode.org/manual/References.html][table references]]
- formula :: =@I..II=
selects all the rows between the first and second horizontal lines (hlines)
=vmean(@I..II);EN= computes the mean of that column range, treating empty fields as 0
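A toy example (made-up values): the last row averages the column over the rows between the two hlines.
| item | qty |
|------+-----|
| foo  |   2 |
| bar  |   4 |
|------+-----|
| mean |   3 |
#+TBLFM: @>$2=vmean(@I..II);EN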
* Config
In your ~init~ file
** Listing buffers
- List buffer :: =C-x C-b=
Classic version
- List Buffer Better :: =M-x ibuffer=
Way better
To override the classic shortcut with the better version:
#+begin_src elisp
(global-set-key [remap list-buffers] 'ibuffer)
#+end_src
* Info Manual
- C-h i
** Shortcuts and Keys
- [, ] :: Previous and Next *node*
- l, r :: Back and Forward in *history*
- n, p :: Next and Previous *sibling node*
- u :: Up one level to a *parent node*
- SPC :: Scroll (browse) one screen at a time, natural reading
- TAB :: Cycles through cross-references and links
- RET :: Opens the active link
- m :: Menu, prompts for a menu item name and opens it
- q :: Closes the Info browser
** TODO Do the tutorial
- C-h i
- getting started
** TODO Do the EINTR
- C-h R :: Open a manual
- ~eintr~ An Introduction to Programming in Emacs Lisp
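You can also jump straight to a manual from Lisp, for example with ~M-:~:
#+begin_src elisp
;; open the Emacs Lisp Intro manual at its top node
(info "(eintr) Top")
#+end_src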
* File-local variables
** Format
Looks like this
#+name: file-local variables *header*
#+begin_src org
-*- mode: mode-name-here; my-variable: value -*-
#+end_src
#+name: file-local variables *footer*
#+begin_src org
Local Variables:
mode: mode-name
my-variable: value
End:
#+end_src
** Shortcut
Doesn't seem to have any shortcuts.
The commands are:
- ~M-x add-file-local-variable-prop-line~
- ~M-x add-file-local-variable~
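Emacs can prompt before applying file-local values it doesn't already trust; a small sketch for the ~init~ file that marks one variable/value pair as safe (the pair is just an example):
#+begin_src elisp
;; skip the prompt for this exact variable/value pair
(add-to-list 'safe-local-variable-values '(fill-column . 80))
#+end_src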
** Shebangs (Program loader)
Classic shell shebangs also set the major mode,
as long as the interpreter is listed in ~interpreter-mode-alist~.
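A sketch of adding an entry (the ~gawk~ to ~awk-mode~ mapping is only an illustration; recent Emacs versions may already cover it):
#+begin_src elisp
;; scripts whose shebang interpreter matches "gawk" open in awk-mode
(add-to-list 'interpreter-mode-alist '("gawk" . awk-mode))
#+end_src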
** Magic mode
The variable ~magic-mode-alist~ (set in your ~init~ file) maps a regexp, matched
against the beginning of a file's contents, to a major mode.
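A sketch of an entry (Emacs ships a similar default for PDFs):
#+begin_src elisp
;; any file whose contents start with "%PDF" opens in doc-view-mode
(add-to-list 'magic-mode-alist '("%PDF" . doc-view-mode))
#+end_src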
* Other Window command shortcuts
The *other* window is the next window that will get focus with ~C-x o~.
You can execute a command in that window instead of the one that currently has focus.
The shortcuts start with ~C-x 4~ and mirror the commands applied to the current window with ~C-x~.
- ~C-x 4 C-f~ :: Finds a file in the other window
- ~C-x 4 d~ :: Opens ~dired-mode~ in the other window
- ~C-x 4 C-o~ :: Displays a buffer in the other window
- ~C-x 4 b~ :: Switches the buffer in the other window
and makes it the active window
- ~C-x 4 0~ :: Kills the buffer *and* window
- ~C-x 4 p~ :: Runs a project command in the other window
The same commands work in another *frame* using ~5~ instead of ~4~
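Those bindings run ordinary commands that you can also call from Lisp; a small sketch (the path is only an example):
#+begin_src elisp
;; what C-x 4 C-f does: visit a file in the other window
(find-file-other-window "~/.emacs.d/init.el")
#+end_src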
* Bookmark tab
You can bookmark *window configurations*, the equivalent of a *workspace* in some
other editors.
** Shortcuts
- ~C-x t 2~ :: Create a new tab
- ~C-x t r~ :: Rename current tab
- ~C-<TAB>~ :: Cycle through the tabs
- ~C-x t RET~ :: Select tab by name
- ~C-x t 0~ :: Close current tab
- ~C-x t d~ :: Open dired mode in another tab
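An optional, related tweak for the ~init~ file (a sketch): hide the tab bar while only one tab is open.
#+begin_src elisp
;; show the tab bar only when more than one tab exists
(setq tab-bar-show 1)
#+end_src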

init.el

@@ -155,6 +155,13 @@
(require 'ocp-indent)
;; GPTel with Llamafile config
;; Register the hosted OpenAI backend
(gptel-make-openai "openai"
  :stream t
  :header (lambda () `(("Authorization" . ,(concat "Bearer " (gptel--get-api-key)))))
  :key 'gptel-api-key
  :models '("gpt-3.5-turbo"))
;; Llama.cpp offers an OpenAI compatible API
;; (gptel-make-openai "llama-cpp" ;Any name
;; :stream t ;Stream responses
@@ -162,9 +169,8 @@
;; :host "127.0.0.1:8000" ;Llama.cpp server location
;; :models '("local")) ;Any names, doesn't matter for Llama
;; And now set it as the default backend
;; OPTIONAL configuration
(setq gptel-default-mode 'org-mode)
(setq-default gptel-backend (gptel-make-openai "llama-cpp"
(setq-default gptel-backend (gptel-make-openai "Llamafile"
:stream t
:host "127.0.0.1:8080"
:protocol "http"