Moltis
Rust-native AI gateway with single binary deployment. Local-first AI agent framework featuring voice interaction, memory management, sandbox execution, and MCP tool integration.
Rust-Native AI Gateway - Single Binary, Local-First
Moltis is a Rust-native AI gateway and agent framework designed for local-first, privacy-preserving AI interactions. Distributed as a single binary with no external dependencies, Moltis combines voice interaction, memory management, sandbox execution, and MCP tool integration into a lightweight package.
Official Website: https://moltis.org/
GitHub: https://github.com/moltis-org/moltis
Developer: moltis-org (Community Open Source)
License: MIT
Key Innovation: Moltis brings together voice interaction, long-term memory, and safe tool execution in a single Rust binary that runs anywhere without dependencies.
Core Philosophy: "Local-first, privacy-preserving AI for everyone"
Why Moltis?
Single Binary: No Node.js, Python, or Docker required. Just download and run.
Voice Native: Built-in speech-to-text and text-to-speech capabilities for hands-free operation.
Memory System: Persistent vector-based memory that remembers your conversations and preferences.
Sandbox Security: Tools run in isolated sandboxes, protecting your system from malicious code.
Key Features
1. Rust-Native Performance
- Single binary executable (~15-25MB)
- No runtime dependencies required
- Memory usage under 50MB at idle
- Sub-second startup time
- Cross-platform compatibility
2. Voice Interaction
- Built-in speech recognition
- Text-to-speech output
- Wake word detection
- Multi-language support
- Edge TTS integration
3. Memory Management
- Vector-based long-term memory
- Conversation history persistence
- User preference learning
- Context-aware responses
- Memory search and retrieval
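The retrieval step behind a vector-based memory can be pictured as a cosine-similarity search over stored embeddings. The sketch below is illustrative only, not Moltis internals; the toy store and its embedding values are invented for the example (real systems embed text with a model).

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(memories: list[tuple[str, list[float]]],
           query_vec: list[float], top_k: int = 2) -> list[str]:
    """Return the top_k memory texts most similar to the query vector."""
    ranked = sorted(memories,
                    key=lambda m: cosine_similarity(m[1], query_vec),
                    reverse=True)
    return [text for text, _ in ranked[:top_k]]

# Toy store of (text, embedding) pairs; vectors are hand-picked for the demo.
store = [
    ("user prefers dark mode", [0.9, 0.1, 0.0]),
    ("user lives in Berlin",   [0.1, 0.9, 0.0]),
    ("meeting at 3pm Friday",  [0.0, 0.2, 0.9]),
]
print(search(store, [0.8, 0.2, 0.1], top_k=1))  # ['user prefers dark mode']
```

The same idea scales to thousands of memories with an approximate-nearest-neighbor index instead of a full sort.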
4. Sandbox Execution
- Isolated tool execution environment
- Resource limits enforcement
- Network access control
- File system restrictions
- Security policy configuration
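File system restriction of this kind boils down to an allowlist check before a tool touches any path. A minimal sketch of the idea, assuming a policy shaped like the `file_access` key in the configuration example below (the function is illustrative, not Moltis's actual enforcement code):

```python
from pathlib import Path

def is_path_allowed(path: str, allowed_roots: list[str]) -> bool:
    """Allow access only if the resolved path sits under an allowed root.

    Resolving first defeats `..` traversal tricks like ~/Documents/../.ssh.
    """
    target = Path(path).expanduser().resolve()
    for root in allowed_roots:
        root_path = Path(root).expanduser().resolve()
        if target == root_path or root_path in target.parents:
            return True
    return False

policy = ["~/Documents"]  # mirrors file_access in config.toml
print(is_path_allowed("~/Documents/notes.txt", policy))  # True
print(is_path_allowed("/etc/passwd", policy))            # False
```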
5. MCP Integration
- Model Context Protocol support
- Tool discovery and registration
- Standardized tool interfaces
- Community tool marketplace
- Custom tool development
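Under the hood, MCP is JSON-RPC 2.0, and tool discovery is a `tools/list` request. The sketch below shows the shape of that exchange; the server reply is hand-written for illustration, not captured from a live Moltis instance:

```python
import json

# JSON-RPC 2.0 request an MCP client sends to discover available tools.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# A hand-written example of the kind of reply a server might return.
response_text = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {"name": "read_file", "description": "Read a file from the sandbox"},
            {"name": "web_search", "description": "Search the web"},
        ]
    },
})

reply = json.loads(response_text)
tool_names = [t["name"] for t in reply["result"]["tools"]]
print(tool_names)  # ['read_file', 'web_search']
```

Because the protocol is standardized, any MCP-compliant tool server discovered this way can be registered with the gateway.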
Installation
Official Script (Recommended)
curl -fsSL https://www.moltis.org/install.sh | sh
Homebrew (macOS)
brew install moltis
Docker
docker pull moltis/moltis:latest
docker run -d --name moltis -p 8080:8080 moltis/moltis:latest
Manual Download
# Download from GitHub releases
wget https://github.com/moltis-org/moltis/releases/latest/download/moltis-linux-x86_64.tar.gz
tar -xzf moltis-linux-x86_64.tar.gz
sudo mv moltis /usr/local/bin/
Cargo Build
cargo install moltis
Configuration
Create ~/.config/moltis/config.toml:
[general]
name = "Moltis"
language = "en"
[voice]
enabled = true
wake_word = "hey moltis"
speech_model = "whisper-small"
[memory]
enabled = true
storage_path = "~/.local/share/moltis/memory"
max_memories = 10000
[sandbox]
enabled = true
network_access = false
file_access = ["~/Documents"]
[models]
default = "claude-sonnet-4-20250514"
fallback = "gpt-4o-mini"
Pricing
Free: Moltis is completely free and open source under MIT license. You only pay for the AI models you use through their respective providers.
System Requirements
| Component | Minimum | Recommended |
|---|---|---|
| OS | Linux, macOS 12+, Windows 10 | Linux, macOS 13+, Windows 11 |
| CPU | 2 cores | 4+ cores |
| Memory | 512MB RAM | 2GB+ RAM |
| Storage | 100MB | 500MB |
| Network | Optional (for AI models) | Stable internet |
Voice Features Requirements
- Microphone for speech input
- Speakers for speech output
- Additional 200MB for speech models
Use Cases
Personal AI Assistant
Use Moltis as your daily AI companion with natural voice interaction. Ask questions, get reminders, and manage tasks hands-free.
Local Knowledge Base
Build a personal knowledge base that Moltis can search and reference. Upload documents, notes, and references for AI-assisted retrieval.
Safe Tool Execution
Run AI-powered tools in a sandboxed environment. Execute code, access APIs, and automate tasks without risking your system.
Privacy-Focused Workflows
Keep all AI interactions local when possible. Use Moltis for sensitive tasks where cloud processing is not desirable.
Resource-Constrained Deployments
Deploy Moltis on low-power devices, embedded systems, or environments where full Node.js/Python runtimes are not feasible.
Community and Support
- Website: https://moltis.org/
- GitHub: https://github.com/moltis-org/moltis
- Documentation: https://docs.moltis.org/
- Discord: Available on website
- Issues: GitHub Issues tab
Quick Start Guide
Get Moltis running in under 2 minutes — it is a single binary with zero dependencies.
Step 1: Download the Binary
Download the latest release from the official GitHub repository.
Step 2: Run
chmod +x moltis
./moltis setup
Step 3: Configure
Set your AI model API key and preferred platform integration.
Time to first result: about 2 minutes. No runtime dependencies, single binary.
Full documentation: https://docs.moltis.org/
FAQ
Is Moltis free to use?
Yes, Moltis is free and open source (MIT license). You only pay for AI model API costs if using external models.
What are the system requirements for Moltis?
Moltis requires at least 512MB of RAM (it uses under 50MB at idle). Runtime: native binary (Rust), no interpreter required. It runs on Windows, macOS, and Linux.
Can I self-host Moltis?
Yes. Moltis is open source (MIT) and can be self-hosted on your own hardware. Clone the repository from GitHub and follow the installation guide.
How does Moltis compare to OpenClaw?
Moltis offers a different approach compared to OpenClaw. While OpenClaw provides the largest ecosystem with 13,729+ skills and maximum flexibility, Moltis focuses on being a lightweight, local-first personal assistant. Choose Moltis if you prioritize privacy, low resource use, and single-binary deployment; choose OpenClaw for the broadest compatibility and community support.
Is Moltis suitable for beginners?
Moltis requires some technical knowledge to set up, since it ships as a native Rust binary configured via the command line. If you are a beginner, consider starting with QClaw (one-click install) or MaxClaw (cloud-based, no setup) first, then graduate to Moltis as you gain experience.
License
MIT License - Free for personal and commercial use.
Tags
ai-agent, rust, local-first, voice-interaction, memory, sandbox, mcp, single-binary, privacy