A local-first, zero-latency voice assistant engine built with Rust and Tauri. Your audio never leaves localhost.

01. Zero Latency
Words appear character by character as you speak, powered by a Rust-native audio bus that bypasses the JS bridge entirely.
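
The shape of that bus can be sketched in plain Rust: a channel carries raw PCM frames from the capture thread to the recognizer thread, and partial text is pushed to the UI as it decodes. The frame type, the decoder, and the `on_partial` callback below are illustrative stand-ins, not the engine's real API.

```rust
use std::sync::mpsc;
use std::thread;

/// One chunk of PCM samples pulled straight from the capture device.
/// (Illustrative type; the real frame layout may differ.)
struct AudioFrame {
    samples: Vec<f32>,
}

/// Stand-in for the streaming ASR decoder.
fn fake_decode(_frame: &AudioFrame) -> String {
    String::new()
}

/// Stand-in for the UI update; the app would emit this to the webview instead.
fn on_partial(ch: char) {
    print!("{ch}");
}

fn main() {
    // The "audio bus": a plain Rust channel between capture and recognition.
    // No serialization and no JS bridge anywhere in the audio path.
    let (tx, rx) = mpsc::channel::<AudioFrame>();

    // Capture thread: in the real engine this is fed by the OS audio callback;
    // here we just synthesize silent frames.
    let capture = thread::spawn(move || {
        for _ in 0..10 {
            let frame = AudioFrame { samples: vec![0.0; 1600] }; // ~100 ms @ 16 kHz
            if tx.send(frame).is_err() {
                break;
            }
        }
    });

    // Recognizer thread: consumes frames as they arrive and streams partial
    // text to the UI character by character.
    let recognizer = thread::spawn(move || {
        for frame in rx {
            for ch in fake_decode(&frame).chars() {
                on_partial(ch);
            }
        }
    });

    capture.join().unwrap();
    recognizer.join().unwrap();
}
```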

02. Fully Offline
Runs offline. No API keys, no servers, no telemetry. Your voice data is processed in RAM and dropped.
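
A minimal sketch of what "processed in RAM and dropped" means in Rust terms: the sample buffer is owned by the handler and freed the moment the transcript exists. `transcribe` here is a stand-in for the real on-device model call.

```rust
/// Transcribe one utterance. The raw samples live only inside this scope;
/// nothing is ever written to disk.
fn handle_utterance(samples: Vec<f32>) -> String {
    let text = transcribe(&samples);
    // `samples` is dropped here as the function returns: the audio buffer is
    // freed as soon as the transcript has been produced.
    text
}

/// Stand-in for the on-device ASR model.
fn transcribe(_samples: &[f32]) -> String {
    String::new()
}

fn main() {
    let transcript = handle_utterance(vec![0.0; 16_000]);
    println!("{transcript}");
}
```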

03. Semantic Endpointing
It knows when you've finished a sentence using neural analysis, not just silence detection.
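
One way to picture the difference from plain silence gating: combine the trailing-silence timer with a score from a small model that judges whether the partial transcript already reads as a finished sentence. The thresholds and field names below are assumptions for illustration, not the engine's actual values.

```rust
/// State tracked for the current utterance.
struct EndpointState {
    /// Milliseconds of trailing silence after the last word.
    trailing_silence_ms: u32,
    /// Probability from a small neural model that the partial transcript is a
    /// complete sentence (e.g. "search wikipedia for rust" scores high).
    semantic_end_prob: f32,
}

fn utterance_finished(s: &EndpointState) -> bool {
    // Commit early when the sentence already reads as finished; otherwise fall
    // back to a longer pure-silence timeout.
    (s.semantic_end_prob > 0.9 && s.trailing_silence_ms > 200) || s.trailing_silence_ms > 1200
}

fn main() {
    let state = EndpointState { trailing_silence_ms: 250, semantic_end_prob: 0.95 };
    assert!(utterance_finished(&state));
}
```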

04. Multilingual
Native support for English and Catalan (NeMo CTC), plus Whisper Small integration for 99+ other languages.
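
Model routing can be as simple as a match on the language code: the NeMo CTC models cover English and Catalan natively, and anything else falls back to Whisper Small. The enum and function below are an illustrative sketch, not the real configuration API.

```rust
/// Which on-device model backs a given language.
#[derive(Debug)]
enum AsrBackend {
    NemoCtcEnglish,
    NemoCtcCatalan,
    /// Multilingual fallback.
    WhisperSmall,
}

fn backend_for(lang: &str) -> AsrBackend {
    match lang {
        "en" => AsrBackend::NemoCtcEnglish,
        "ca" => AsrBackend::NemoCtcCatalan,
        _ => AsrBackend::WhisperSmall,
    }
}

fn main() {
    println!("{:?}", backend_for("ca")); // NemoCtcCatalan
    println!("{:?}", backend_for("de")); // WhisperSmall
}
```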

05. Function Calling
Local LLM (FunctionGemma) analyzes your speech to trigger desktop tools. "Search Wikipedia for Rust" just works.
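
Once the model maps an utterance like "Search Wikipedia for Rust" to a structured call, dispatch is a plain match on the tool name. The `ToolCall` shape and the `search_wikipedia` tool name are assumptions for illustration; the real tool schema may differ.

```rust
/// A structured call produced by the local model for one utterance.
struct ToolCall {
    name: String,
    argument: String,
}

fn dispatch(call: &ToolCall) {
    match call.name.as_str() {
        "search_wikipedia" => {
            // In the app this would open the page in the user's browser.
            println!("opening https://en.wikipedia.org/wiki/{}", call.argument);
        }
        other => eprintln!("unknown tool: {other}"),
    }
}

fn main() {
    // What "Search Wikipedia for Rust" would parse into.
    let call = ToolCall { name: "search_wikipedia".into(), argument: "Rust".into() };
    dispatch(&call);
}
```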

06. Modular Monolith
A high-performance modular monolith, with a clean separation of concerns between domain logic and application glue.
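
The boundary can be pictured as a trait owned by the domain layer and glue code that only wires it to the window and the OS. Module and trait names below are illustrative, not the actual crate layout.

```rust
mod domain {
    /// Domain logic: knows nothing about Tauri, windows, or the OS.
    pub mod asr {
        pub trait Recognizer {
            fn feed(&mut self, samples: &[f32]);
            fn partial(&self) -> &str;
        }
    }
}

mod app {
    use crate::domain::asr::Recognizer;

    /// Application glue: feeds audio in and hands partial text to the UI.
    pub fn pump<R: Recognizer>(rec: &mut R, samples: &[f32]) -> String {
        rec.feed(samples);
        rec.partial().to_string()
    }
}

fn main() {
    // Tiny stub recognizer to show the wiring.
    struct Stub(String);
    impl domain::asr::Recognizer for Stub {
        fn feed(&mut self, _samples: &[f32]) {
            self.0.push_str("hola");
        }
        fn partial(&self) -> &str {
            &self.0
        }
    }
    let mut rec = Stub(String::new());
    println!("{}", app::pump(&mut rec, &[0.0; 160]));
}
```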