Experimental config-driven LLM client library in Rust
I couldn't find any good libraries that let me create a runtime-configurable LLM client for use in agents, so I built this one. It is configurable at runtime and supports discovery of the models and services that are available.
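The post doesn't show the library's actual API, but the idea of a runtime-configurable client with provider/model discovery can be sketched roughly like this. Everything below is hypothetical: the `Provider` enum, the `key=value` config format (a stand-in for TOML/JSON), and the hard-coded model lists are illustrative assumptions, not the library's real interface.

```rust
use std::collections::HashMap;

// Hypothetical provider registry selected at runtime from config.
#[derive(Debug, PartialEq)]
enum Provider {
    OpenAi,
    Anthropic,
    Local,
}

struct ClientConfig {
    provider: Provider,
    model: String,
}

// Parse a minimal "key=value" config at runtime (stand-in for a real
// TOML/JSON config file).
fn parse_config(raw: &str) -> Option<ClientConfig> {
    let map: HashMap<_, _> = raw
        .lines()
        .filter_map(|l| l.split_once('='))
        .map(|(k, v)| (k.trim(), v.trim()))
        .collect();
    let provider = match *map.get("provider")? {
        "openai" => Provider::OpenAi,
        "anthropic" => Provider::Anthropic,
        "local" => Provider::Local,
        _ => return None,
    };
    Some(ClientConfig {
        provider,
        model: map.get("model")?.to_string(),
    })
}

// Discovery stub: in a real client this would query the service's
// model-listing endpoint; here the lists are hard-coded placeholders.
fn discover_models(p: &Provider) -> Vec<&'static str> {
    match p {
        Provider::OpenAi => vec!["example-openai-model"],
        Provider::Anthropic => vec!["example-anthropic-model"],
        Provider::Local => vec!["example-local-model"],
    }
}

fn main() {
    let cfg = parse_config("provider=local\nmodel=example-local-model")
        .expect("valid config");
    assert_eq!(cfg.provider, Provider::Local);
    println!("available models: {:?}", discover_models(&cfg.provider));
}
```

The appeal of this shape is that swapping providers or models becomes a config change rather than a code change, which is what makes it useful inside agents that decide at runtime which backend to call.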
Nice! A configurable LLM client in Rust sounds like a solid tool for anyone building agents. Curious how you handled model discovery and plugging in different services at runtime; that part can get messy fast.