Ollama-Buddy 0.9.8: Transient Menu, Model Management, GGUF Import, fabric Prompts and History Editing

March 19, 2025

This week in ollama-buddy updates, I have been continuing on the busy bee side of things.

The headlines are:

  • Transient menu - yes, I know I said I would never do it, but, well, I did, and as it turns out I kinda like it; it works especially well when setting parameters.
  • Support for fabric prompt presets - mainly because I thought user-curated prompts were a pretty cool idea, and now that I have system prompts implemented it seemed like a perfect fit. All I needed to do was pull the patterns directory and then parse it accordingly; of course Emacs is good at this.
  • GGUF import - I don't always pull from ollama's command line; sometimes I download a GGUF file, and it is a bit of a process to import it into ollama (create a Modelfile, run a command, etc.), but now you can import from within dired!
  • More support for the ollama API - including model management: pulling, stopping, deleting and more!
  • Conversation history editing - as I store the history in a hash table, I can easily just display an alist, and editing can leverage the usual sexp keybindings before loading it back into the variable.
  • Parameter profiles - when implementing the transient menu I thought it might be fun to try parameter profiles, where a set of parameters can be applied as a block for each preset.
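The GGUF import mentioned above essentially boils down to writing a Modelfile whose FROM line points at the downloaded file and then running ollama create. A minimal sketch of that flow from dired - the function name is hypothetical, not the package's actual command:

```elisp
;; Hypothetical sketch of a dired-based GGUF import; not ollama-buddy's
;; actual implementation.
(defun my/ollama-import-gguf (model-name)
  "Import the GGUF file at point in dired into Ollama as MODEL-NAME."
  (interactive "sModel name: ")
  (let* ((gguf (dired-get-file-for-visit))
         (modelfile (expand-file-name "Modelfile"
                                      (file-name-directory gguf))))
    ;; Ollama needs a Modelfile whose FROM line points at the GGUF file.
    (with-temp-file modelfile
      (insert (format "FROM %s\n" gguf)))
    ;; `ollama create' then registers the model from that Modelfile.
    (async-shell-command
     (format "ollama create %s -f %s"
             (shell-quote-argument model-name)
             (shell-quote-argument modelfile)))))
```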

Ollama-Buddy 0.8.0 - Added System Prompts, Model Info and simpler menu model assignment

March 14, 2025

More improvements to ollama-buddy https://github.com/captainflasmr/ollama-buddy

The main addition is that of system prompts, which allow setting the general tone and guidance of the overall chat. Currently the system prompt can be set at any time and toggled on and off, but to enhance my model/command-per-menu-item concept, I could also add a :system property to the menu alist definition to allow even tighter control over each menu action's prompt response.
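For illustration, a :system property on a menu entry might look something like the following; the keyword layout here is a guess at the shape, not ollama-buddy's confirmed alist format:

```elisp
;; Hypothetical menu entry shape -- illustrative only.
(setq ollama-buddy-command-definitions
      '((proofread
         :key ?p
         :description "Proofread text"
         :model "llama3.2:1b"
         ;; A per-command system prompt could steer the tone of just
         ;; this action, independent of the global system prompt.
         :system "You are a meticulous proofreader. Return only corrected text."
         :prompt "Proofread the following:")))
```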

Installing Emacs 30.1 On Arch and SUSE

February 26, 2025

Seems to be a common post at the moment, so I thought I would quickly put out there how I updated to Emacs 30.1.

I use an Arch spin called Garuda, running SwayWM, so as it's on Wayland, this for me is simple: just update the system using pacman -Syu and emacs-wayland will pull in 30.1 automatically!

Ollama Buddy - Now On MELPA!

February 24, 2025

 ___ _ _      n _ n      ___       _   _ _ _
|   | | |__._|o(Y)o|__._| . |_ _ _| |_| | | |
| | | | | .  |     | .  | . | | | . | . |__ |
|___|_|_|__/_|_|_|_|__/_|___|___|___|___|___|

https://github.com/captainflasmr/ollama-buddy

A friendly Emacs interface for interacting with Ollama models. This package provides a convenient way to integrate Ollama’s local LLM capabilities directly into your Emacs workflow with little or no configuration required.

Latest improvements:

  • Chat buffer is now more prompt-based rather than ad hoc, using C-c C-c to send and C-c C-k to cancel
  • Connection monitor is now optional; ollama status visibility is maintained by strategic status checks, simplifying setup.
  • Can now change models from the chat buffer using C-c C-m
  • Updated intro message with ASCII logo
  • Suggested default "C-c o" for ollama-buddy-menu
  • defcustom ollama-buddy-command-definitions now works in the customization interface.
  • The presets directory on github contains elisp files that can be evaluated to generate a role-based menu.
  • Added to MELPA, install using the following:
(use-package ollama-buddy
  :bind ("C-c o" . ollama-buddy-menu))
  • and to add initial model:
(use-package ollama-buddy
  :bind ("C-c o" . ollama-buddy-menu)
  :custom
  (ollama-buddy-default-model "llama3.2:1b"))

Why I Switched from Magit to VC-Mode (and How It Works for Me)

February 21, 2025

I am currently using vc-mode for my version-control needs. I wouldn't call myself a die-hard vc-mode user (at least not yet!). To earn that title, I think I would need years of experience with it, scoffing at this newfangled thing they call magit while my muscle memory recoils at the thought of reading and interacting with a transient menu!

Ollama Buddy - Local LLM Integration for Emacs

February 7, 2025

I have been playing around with local LLMs recently through ollama and decided to create the basis for an Emacs package focused on interfacing with ollama specifically. My idea here is to implement something very minimal and as lightweight as possible, something that could be run out-of-the-ollamabox with no configuration (obviously the ollama server just needs to be running). I have a deeper dive into my overall design thoughts and decisions in the github README, and there are some simple demos:

https://github.com/captainflasmr/ollama-buddy

Copying completion candidate to the clipboard

January 30, 2025

In my continuing quest to replace all my external packages with my own elisp versions, so I can still follow my current workflow on an air-gapped system, I would like to replace a single function I use often from embark.

embark offers context-sensitive actions, and there are a lot to choose from, but I found I was generally using very few, so to remove my reliance on embark I think I can implement these actions myself.
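As a starting point, a tiny command that copies the current minibuffer contents to the kill ring covers the most common case. This is a sketch under an assumption: with a completion UI such as vertico, the highlighted candidate can differ from the raw minibuffer input, so a full replacement would need to ask the completion UI for its candidate.

```elisp
;; Minimal stand-in for embark's "copy candidate" action.
(defun my/copy-minibuffer-candidate ()
  "Copy the current minibuffer contents to the kill ring, then exit."
  (interactive)
  (let ((candidate (minibuffer-contents-no-properties)))
    (kill-new candidate)
    (message "Copied: %s" candidate))
  (abort-recursive-edit))

;; Bind it in the minibuffer (note this shadows the default M-w there).
(define-key minibuffer-local-map (kbd "M-w") #'my/copy-minibuffer-candidate)
```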

Creating a small local elisp rainbow-mode solution

January 25, 2025

In this post, as part of my ongoing mission to replace all (or as many as possible) external packages with pure elisp, I’ll demonstrate how to implement a lightweight alternative to the popular rainbow-mode package, which highlights hex colour codes in all their vibrant glory. I use this quite often, especially when "ricing" my tiling window setup.
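The core of such a stand-in is a single font-lock rule that uses the matched hex string itself as an anonymous background face. A minimal sketch of the approach, not necessarily the exact implementation described here:

```elisp
;; Lightweight rainbow-mode stand-in: paint #rrggbb codes with their
;; own colour as the background.
(defun my/rainbow-hex ()
  "Highlight #rrggbb hex colour codes in the current buffer."
  (interactive)
  (font-lock-add-keywords
   nil
   ;; The facespec form is evaluated per match, so each hex code
   ;; produces its own anonymous face plist.
   '(("#[[:xdigit:]]\\{6\\}\\>"
      (0 (list :background (match-string-no-properties 0))))))
  (font-lock-flush))
```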