
MiniLLM Client

1 devlog
20m 10s

Interactive browser-based LLM chat with model selection, Markdown rendering, media pop-ups, chat history, copy & regenerate. Runs entirely in your browser using localStorage, so no backend is needed. Click and try instantly.


prokopsafranek

I’m building Mini-LLM, a lightweight web client for running and interacting with LLMs directly in the browser.
No backend, no accounts: everything runs locally using localStorage.

Live demo: https://mini-llm.pages.dev
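Since the whole app persists chat history client-side, the core idea can be sketched with a few lines of localStorage code. This is a hypothetical illustration, not the actual Mini-LLM source: the storage key `minillm.history` and the `loadHistory`/`saveMessage` helpers are made-up names, and an in-memory fallback is included so the snippet also runs outside a browser.

```javascript
// Minimal sketch of browser-local chat persistence.
// "minillm.history" is an assumed key name, not taken from the real app.
// Fall back to an in-memory store when localStorage is unavailable (e.g. Node).
const memory = new Map();
const store = (typeof localStorage !== "undefined") ? localStorage : {
  getItem: (k) => (memory.has(k) ? memory.get(k) : null),
  setItem: (k, v) => { memory.set(k, String(v)); },
};

const KEY = "minillm.history";

// Read the stored history, or start fresh with an empty list.
function loadHistory() {
  const raw = store.getItem(KEY);
  return raw ? JSON.parse(raw) : []; // array of {role, content} messages
}

// Append one message and write the whole history back.
function saveMessage(role, content) {
  const history = loadHistory();
  history.push({ role, content });
  store.setItem(KEY, JSON.stringify(history));
  return history;
}
```

Because localStorage only holds strings, the history is serialized as JSON on every write; for a small chat log this round-trip is cheap and keeps the app free of any server-side state.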


Comments

almartdev 14 days ago

this looks very promising but I can't access the site rn!

alistairtomori02 14 days ago

Love the UI