author    Ben Sima <ben@bsima.me>  2024-05-14 09:35:45 -0400
committer Ben Sima <ben@bsima.me>  2024-05-20 22:15:49 -0400
commit    2d33aa547ff6a516c90ca2b47b13e2add200583a
tree      8d4941699982c59c6430f4b9a629b8ea91245bb1 /Network/Wai
parent    cceefa62d147594d43478e398bbaa9c630670935
Add simonw/llm as cli/library client for running LLMs
This is essentially the client library I would have written myself. Some parts
of it are still beta quality, but it's the sort of project I would contribute to
anyway.
Unfortunately, I couldn't get the llm-llama-cpp plugin to work: it depends on
llama-cpp-python, which is not packaged for nix and is hard to package because
the upstream project vendors a patched version of llama.cpp. So I'm stuck with
ollama for now, but that's fine because it actually works.