Age | Commit message | Author
---|---|---
2024-05-20 | Add simonw/llm as cli/library client for running LLMs | Ben Sima
This is basically exactly the client library that I would write myself. Some parts of it are still beta quality, but it's the sort of thing I would contribute to anyway. Unfortunately, I couldn't get the llm-llama-cpp plugin to work: it depends on llama-cpp-python, which is not packaged for nix and is hard to package because the upstream project vendors a patched version of llama.cpp. So I'm stuck with ollama for now, but that's fine because it actually works.