New in 0.3.0
AI Agents & DSL Console
AI agents that understand your session and emit commands in a built-in DSL. Cloud or local inference.

MAGDA ships with AI agents that understand your session and act on it by emitting commands in a built-in domain-specific language (DSL). You can chat with an agent, or type DSL commands directly in the REPL.
Cloud providers (OpenAI, Anthropic, Google) are supported, and so is fully local inference via llama.cpp — no API keys required.
Read more: DSL over structured output — when it makes sense and why (Medium)
- Agents operate on real project state, not a preview
- DSL REPL for direct, auditable commands
- Switch between cloud and local models from settings
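The DSL-over-structured-output idea can be sketched in a few lines: the model emits plain command lines, and only whitelisted verbs execute against real state, so every action is auditable. This is a minimal hypothetical sketch, not MAGDA's actual DSL; the verbs (`set`, `get`) and the `run` helper are invented for illustration.

```python
import shlex

# Hypothetical mini-DSL: each line is "verb arg ...". Only registered
# verbs may run, so model-emitted commands are auditable and bounded.
COMMANDS = {}

def command(name):
    """Register a function as a DSL verb."""
    def register(fn):
        COMMANDS[name] = fn
        return fn
    return register

@command("set")
def set_value(state, key, value):
    state[key] = value

@command("get")
def get_value(state, key):
    return state.get(key)

def run(script, state=None):
    """Execute DSL lines against real state; reject unknown verbs."""
    state = state if state is not None else {}
    results = []
    for line in script.strip().splitlines():
        verb, *args = shlex.split(line)
        if verb not in COMMANDS:
            raise ValueError(f"unknown command: {verb}")
        results.append(COMMANDS[verb](state, *args))
    return state, results

state, results = run("set mode local\nget mode")
# results[-1] == "local"
```

The same `run` function serves both the chat path (agent output) and the REPL path (typed commands), which is why the two surfaces stay consistent.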
