Moly
Cross-Platform LLM Client

A privacy-first, zero-setup interface for all your local LLMs. Available on macOS, Windows, and Linux.

Visit the Moly website: moly-ai.ai

Cross-Platform

Built with Rust and Makepad, Moly runs natively on macOS, Linux, and Windows. Experience seamless AI interaction across all your devices.

Local & Remote LLMs

Connect to cloud providers such as OpenAI and Anthropic for cutting-edge hosted models, or run open-source models entirely on your own machine with Moly Server.
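
For developers curious what a local setup can look like, here is a minimal Rust sketch that sends one chat request to a hypothetical OpenAI-compatible endpoint on localhost. The URL, port, and model name are illustrative assumptions, not Moly's documented API.

```rust
// Cargo.toml (assumed):
//   reqwest = { version = "0.12", features = ["blocking", "json"] }
//   serde_json = "1"
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::blocking::Client::new();

    // Hypothetical local model name and endpoint, for illustration only.
    let body = json!({
        "model": "llama-3-8b-instruct",
        "messages": [{ "role": "user", "content": "Hello from Moly!" }]
    });

    let resp: serde_json::Value = client
        .post("http://localhost:8765/v1/chat/completions") // assumed local server address
        .json(&body)
        .send()?
        .json()?;

    // Print the assistant's reply from the first choice.
    println!("{}", resp["choices"][0]["message"]["content"]);
    Ok(())
}
```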

Lightning Fast

Built in Rust for speed and memory safety. Expect instant responses, a small memory footprint, and native performance on every platform.

Open Source

Fully open source with Apache 2.0 license. Join our community, contribute to the project, or build your own AI-powered applications.

Privacy-First

Complete data sovereignty with local models. End-to-end encryption for cloud conversations. Your conversations stay private.

AI Agent Integration

Connect to MoFa servers to interact with AI agents, deploy custom agents, and build automated workflows.