Age | Commit message | Author |
|
I need a way to reliably get a NixOS VM provisioned in the cloud, and the
easiest way to do this is to create a qcow2 image, upload it to DigitalOcean,
and use that to start a droplet. This is very much a manual process, but that's
fine; I shouldn't need to do it very often (for now).
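This isn't in the commit, but here is a minimal sketch of how that manual flow could be scripted against DigitalOcean's v2 API, assuming the qcow2 is already built (e.g. with nixos-generators) and hosted at a publicly fetchable URL; the token variable, IMAGE_URL, and the region/size slugs are placeholders:

```python
import os
import requests

API = "https://api.digitalocean.com/v2"
HEADERS = {"Authorization": f"Bearer {os.environ['DIGITALOCEAN_TOKEN']}"}
IMAGE_URL = "https://example.com/nixos.qcow2"  # placeholder: where the qcow2 is hosted

# Register the qcow2 as a custom image; DigitalOcean downloads it from IMAGE_URL.
resp = requests.post(f"{API}/images", headers=HEADERS, json={
    "name": "nixos-custom",
    "url": IMAGE_URL,
    "region": "nyc3",
    "distribution": "Unknown OS",  # custom images take a generic distribution label
})
resp.raise_for_status()
image_id = resp.json()["image"]["id"]

# Once the image finishes processing, start a droplet from it.
resp = requests.post(f"{API}/droplets", headers=HEADERS, json={
    "name": "nixos-vm",
    "region": "nyc3",
    "size": "s-1vcpu-1gb",
    "image": image_id,
})
resp.raise_for_status()
```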
|
|
This adds the Images endpoint and related functions for loading and saving
images to the filesystem.
In the view layer, it also loads the images asynchronously using HTMX, so
images are lazy-loaded only once they have finished generating.
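The endpoint details aren't shown here, but the pattern is roughly the one below (a sketch, not the actual view code: the storage path and helper names are made up, and the HTML is written out literally rather than with Ludic components):

```python
from pathlib import Path

IMAGES_DIR = Path("images")  # hypothetical storage location

def save_image(image_id: str, data: bytes) -> Path:
    """Persist a generated image to the filesystem."""
    IMAGES_DIR.mkdir(parents=True, exist_ok=True)
    path = IMAGES_DIR / f"{image_id}.png"
    path.write_bytes(data)
    return path

def image_or_placeholder(image_id: str) -> str:
    """Return the <img> if the file exists, otherwise an HTMX placeholder
    that re-requests the Images endpoint until generation has finished."""
    path = IMAGES_DIR / f"{image_id}.png"
    if path.exists():
        return f'<img src="/images/{image_id}">'
    return (
        f'<div hx-get="/images/{image_id}" '
        'hx-trigger="load delay:2s" hx-swap="outerHTML">'
        "Generating...</div>"
    )
```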
|
|
This was all dead weight; just delete it and move on.
|
|
This is basically a full rewrite. I ripped out Flask and rearchitected the whole
thing around fully RESTful resources and endpoints built with Ludic. The UI was
completely redone to use Ludic's components. I added tests for everything I
reasonably could.
This is almost ready for an alpha launch. Before shipping it I still need to:
1. generate images by passing image n-1 to `openai.images.create_variation()`
   (see the sketch at the end of this note)
2. write a nix service, get it on a VM somewhere, I'll probably provision a new
VM for this
3. replace the `db` thing with a real sqlite database
I only need the first one done to show it to Lia and see if she likes it; that
should be completed in a day or two. Then the nix service and deployment won't
take long at all. Setting up a sqlite database will be annoying, but I can't
see that actually taking more than 2 days. So at most 5 days out from launching
this to friends and family.
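For item 1, the variations call itself is part of the OpenAI Python SDK; the rest of this sketch (file layout, chaining loop) is just an assumption about how the generator might string pages together:

```python
import base64

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def next_page(prev_path: str, out_path: str) -> None:
    """Generate page n as a variation of page n-1."""
    with open(prev_path, "rb") as prev:
        resp = client.images.create_variation(
            image=prev,
            n=1,
            size="1024x1024",
            response_format="b64_json",
        )
    with open(out_path, "wb") as out:
        out.write(base64.b64decode(resp.data[0].b64_json))

# e.g. chain 10 pages, each derived from the previous one
for i in range(1, 10):
    next_page(f"page_{i}.png", f"page_{i + 1}.png")
```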
|
|
This required upgrading to Python 3.12 because of an f-string syntax feature
that Ludic uses. It's kind of annoying, but the upgrade was easy enough, so I
just did it.
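For reference, the 3.12 requirement is presumably the PEP 701 f-string grammar (an assumption; the exact feature isn't named above), which lets f-strings reuse the enclosing quote character and nest freely:

```python
# Valid only on Python 3.12+ (PEP 701): the inner subscript may reuse the
# same quote character as the enclosing f-string.
attrs = {"class": "card"}
print(f"<div class={attrs["class"]!r}>")  # SyntaxError before 3.12

# Nested f-strings are also allowed to arbitrary depth in 3.12.
name = "Lia"
print(f"{f"{name}'s storybook"}")
```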
|
|
It's good to do this often.
|
|
This partially used gptme to create a storybook generator. The problem I ran
into is that gptme doesn't do any architecting or give any consideration to
maintainable, or even readable, code, so it just wrote one long script that I
couldn't test. Also, it didn't actually generate a 10-page story; it generated
10 separate stories. So, I ended up writing it myself and using gptme to fix up
TODOs that I wrote along the way.
|
|
I had forgotten to add this feature, apparently, so `bild --test` just didn't
do the test part.
|
|
This is handy for looking at llm chat history.
|
|
I forgot to add llm to this; instead I just added the extra libraries, which
meant I had the libraries present but not the binary for running them! And llm
is important in the base dev environment because I need to experiment with the
various llms independently of my application code.
|
|
I was getting confused about what is a product and what is internal
infrastructure; I think it is good to keep those things separate. So I moved a
bunch of stuff to an Omni namespace; actually, most stuff went there. Only
things that are explicitly external products are still in the Biz namespace.
|