I Built a Terminal AI Chat App Using Docker — No API Keys, No Cloud
In this final part of the series, I build a complete terminal AI chat app in TypeScript using Docker Model Runner: a single binary called llm that anyone can run, with streaming responses, conversation memory, preset modes, automatic model detection, and history saved to disk. I walk through the full architecture and every design decision, share what I learned, and explain why I'm migrating the project to Go next.
