
MiniLLM Client

2 devlogs
2h 53m 20s
This project uses AI

This project was developed with significant assistance from AI tools, which generated a large portion of the code (frontend), while I was responsible for the overall idea, architecture, integration, testing, and final decisions.

Demo Repository


prokopsafranek

I added key features like persistent chat history, chat renaming and deletion, model selection, Markdown rendering, response regeneration, and media popups. The app now runs fully in the browser and connects to AI models through a proxy, so you can try it out without needing your own API key.
👉 Try the demo: https://mini-llm.pages.dev
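The proxy setup described above (so visitors don't need their own API key) could be sketched like this. The endpoint shape follows the common OpenAI-style chat format; the function names and model string are illustrative assumptions, not Mini-LLM's actual code:

```javascript
// Build an OpenAI-compatible chat request. The model name here is an
// illustrative placeholder, not Mini-LLM's actual default.
function buildChatRequest(model, history, userMessage) {
  return {
    model,
    messages: [...history, { role: "user", content: userMessage }],
  };
}

// Send the request to a proxy endpoint. The proxy injects the real API
// key server-side, so the browser never holds a secret.
async function sendChat(proxyUrl, request) {
  const res = await fetch(proxyUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(request),
  });
  if (!res.ok) throw new Error(`Proxy error: ${res.status}`);
  return res.json();
}
```

Keeping the key on the proxy is what makes a zero-setup public demo possible while the rest of the app stays fully client-side.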

prokopsafranek

Shipped this project!

Hours: 0.34
Cookies: šŸŖ 1
Multiplier: 2.59 cookies/hr

I built Mini-LLM, a fully client-side web interface for interacting with LLMs.
It runs entirely in the browser — no backend, no accounts, everything is stored locally.
I deployed a live demo and focused on keeping the architecture minimal and transparent.

prokopsafranek

I’m building Mini-LLM, a lightweight web client for running and interacting with LLMs directly in the browser.
No backend, no accounts — everything runs locally using LocalStorage.

Live demo: https://mini-llm.pages.dev
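The LocalStorage persistence mentioned above might look roughly like this. The storage key and chat object shape are assumptions for illustration; passing the storage object in lets the same functions run against `window.localStorage` in the browser or a stub in tests:

```javascript
// Hypothetical storage key; Mini-LLM's actual key may differ.
const CHATS_KEY = "mini-llm-chats";

// Load all chats from storage, falling back to an empty list.
function loadChats(storage) {
  const raw = storage.getItem(CHATS_KEY);
  return raw ? JSON.parse(raw) : [];
}

// Serialize and persist the full chat list.
function saveChats(storage, chats) {
  storage.setItem(CHATS_KEY, JSON.stringify(chats));
}

// Rename a chat by id, returning a new array so UI state stays immutable.
function renameChat(chats, id, newTitle) {
  return chats.map(c => (c.id === id ? { ...c, title: newTitle } : c));
}

// Delete a chat by id.
function deleteChat(chats, id) {
  return chats.filter(c => c.id !== id);
}
```

Because everything lives in the browser's storage, clearing site data resets the app, and no account or server-side database is needed.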


Comments

aloyak about 2 months ago

this looks very promising but i can't access the web rn!

alistairtomori02 about 2 months ago

Love the UI