Show HN: LocalGPT – A local-first AI assistant in Rust with persistent memory

I built LocalGPT over four nights as a Rust reimagining of the OpenClaw assistant pattern (markdown-based persistent memory, autonomous heartbeat tasks, skills system). It compiles to a single ~27MB binary with no Node.js, Docker, or Python required.

Key features:

- Persistent memory via markdown files (MEMORY, HEARTBEAT, SOUL), compatible with OpenClaw's format
- Full-text search (SQLite FTS5) plus semantic search (local embeddings, no API key needed)
- Autonomous heartbeat runner that checks tasks on a configurable interval
- CLI, web interface, and desktop GUI
- Multi-provider: Anthropic, OpenAI, Ollama, etc.
- Apache 2.0 licensed

Install: `cargo install localgpt`

I use it daily as a knowledge accumulator, research assistant, and autonomous task runner for my side projects. The memory compounds: every session makes the next one better.

GitHub: [https://github.com/localgpt-app/localgpt][1]
Website: [https://localgpt.app][2]

Would love feedback on the architecture or feature ideas.

Comments URL: [https://news.ycombinator.com/item?id=46930391][3]
Points: 5
# Comments: 0

[1]: https://github.com/localgpt-app/localgpt
[2]: https://localgpt.app
[3]: https://news.ycombinator.com/item?id=46930391
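The "heartbeat runner that checks tasks on a configurable interval" can be sketched in a few lines of Rust. This is a hypothetical illustration, not LocalGPT's actual implementation: the `Heartbeat` type and `due` method are made up here, and a real runner would sleep on a timer and dispatch tasks rather than just report whether an interval has elapsed.

```rust
use std::time::{Duration, Instant};

// Hypothetical sketch: tracks when a heartbeat last fired and whether
// the configured interval has elapsed since then.
struct Heartbeat {
    interval: Duration,
    last_tick: Option<Instant>,
}

impl Heartbeat {
    fn new(interval: Duration) -> Self {
        Heartbeat { interval, last_tick: None }
    }

    // Returns true when the heartbeat is due (first call, or interval
    // elapsed since the last tick), and records the new tick time.
    fn due(&mut self, now: Instant) -> bool {
        match self.last_tick {
            Some(last) if now.duration_since(last) < self.interval => false,
            _ => {
                self.last_tick = Some(now);
                true
            }
        }
    }
}

fn main() {
    let mut hb = Heartbeat::new(Duration::from_millis(50));
    let start = Instant::now();
    assert!(hb.due(start)); // first check always fires
    assert!(!hb.due(start + Duration::from_millis(10))); // too soon
    assert!(hb.due(start + Duration::from_millis(60))); // interval elapsed
    println!("heartbeat sketch ok");
}
```

In a daemon, a loop would call something like `due` once per wakeup and run any pending markdown-defined tasks when it returns true.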