Aichat Config for OpenAI and Anthropic APIs
There is a fascinating LLM CLI tool called aichat that lets you use all kinds of LLM models from the command line, including a REPL mode, a RAG mode (feeding it documents and files of your choice to use as a knowledge base), and much more.
I was a little confused about how to configure it to give me a choice of OpenAI and Anthropic models, though. I kept breaking the config and generating error messages. Finally I stumbled across a post in the aichat repo’s closed issues explaining that for OpenAI and Anthropic, you can configure just the API and type information and skip configuring any model information. That way aichat will pull the list of available models from each API for you.
Here’s what my config looks like now:
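Something along these lines, where the API keys are placeholders and the path in the comment is the typical Linux location (it differs by platform). The important part is that each client entry carries only its type and key, with no per-client list of models:

```yaml
# ~/.config/aichat/config.yaml
model: claude:claude-3-5-sonnet-latest   # default model, in type:model-name form

clients:
  # With only type and api_key configured, aichat fetches each
  # provider's list of available models from its API.
  - type: openai
    api_key: sk-...        # placeholder
  - type: claude
    api_key: sk-ant-...    # placeholder
```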
You can see I default to the Claude 3.5 Sonnet (latest) model. The nomenclature for the model value is type:model-name. Then there is a clients section that lists two client types: openai and claude (Anthropic).
Here’s the default Claude 3.5 Sonnet (which is notoriously shy about answering questions about model information):
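A quick identity check against the default model might look like the following; the exact reply varies from run to run, and the prompt wording here is just an illustration:

```shell
# Ask whatever the default model is to identify itself.
# Claude tends to answer vaguely about its own model name.
aichat "What model are you, exactly?"
```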
But I could also specify a model manually, or set it in an environment variable:
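A sketch of both approaches. The model name below is an assumption (check what your APIs actually offer), and the AICHAT_MODEL override reflects aichat's convention of mapping config keys to AICHAT_* environment variables; if your version doesn't honor it, the -m flag works regardless:

```shell
# Pass the model explicitly for a single invocation:
aichat -m openai:gpt-4o "Summarize this file." < notes.txt

# Or set it for the whole shell session via an environment variable:
export AICHAT_MODEL="openai:gpt-4o"
aichat "Summarize this file." < notes.txt
```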
And in aichat REPL mode, you can get a list of models to choose from with the .model ⇥ command (that’s .model followed by a space and a tab).
Arrow up and down to select the model you want, then hit return. This does the same thing as typing .model type:model-name yourself:
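In the REPL that looks roughly like this; the model names below are illustrative, since the real list is whatever each API returns at that moment:

```
> .model          (press space, then Tab)
openai:gpt-4o
openai:gpt-4o-mini
claude:claude-3-5-sonnet-latest
claude:claude-3-5-haiku-latest
> .model openai:gpt-4o-mini
```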
There’s a lot that aichat can do that I haven’t even poked at yet. Honestly, I don’t have the time, and probably not the use cases, to dig into it much further in the near term, but it is definitely a very comprehensive CLI tool for LLM use.