# Getting Started
Get FERAL running locally in under five minutes. By the end of this page you will have a brain server, a web dashboard, and your first conversation.

## Prerequisites
- Python 3.11+ — check with `python3 --version`
- An LLM API key (OpenAI, Anthropic, Gemini, Groq) or a local Ollama instance
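A quick sanity check for the interpreter requirement:

```shell
# Print the interpreter version; FERAL requires Python 3.11 or newer
python3 --version
```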
## Install

This installs the `feral` CLI, the FastAPI brain server, and the bundled web UI.
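The install command itself did not survive extraction; a sketch, assuming FERAL is published on PyPI under the package name `feral`:

```shell
# Assumption: the package is named "feral" on PyPI; adjust if the project ships under a different name
pip install feral
```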
## Setup

Run the interactive setup wizard:

| Step | What It Configures |
|---|---|
| LLM Provider | Choose OpenAI, Anthropic, Gemini, Groq, or Ollama (free/local). API key is validated live. |
| Agent Identity | Name, personality, voice settings, and behavioral rules. |
| Skills & Tools | Enable computer use, web search, vision, hardware control. Add keys for Tavily, Spotify, etc. |
| Features | Voice mode (realtime / whisper / disabled), streaming, proactive behavior, wake word. |
Everything is stored locally under `~/.feral/`. No cloud account needed.
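The wizard invocation itself is missing from this page; a hedged sketch, assuming the CLI exposes it as a subcommand (the name is a guess):

```shell
# Hypothetical subcommand: launches the interactive wizard described in the table,
# then writes its answers under ~/.feral/
feral setup
```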
## Start
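The start command is likewise not shown here; a minimal sketch, assuming a `start` subcommand that boots the brain server and web dashboard:

```shell
# Hypothetical subcommand: starts the brain server and serves the web UI
feral start
```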
## First Chat (CLI)

You can also talk to FERAL from the terminal:

## First Chat (SDK)
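The SDK snippet did not survive extraction; a minimal sketch, assuming the SDK exposes an `Agent` class with a `chat` method (both names are illustrative, not confirmed by this page):

```python
# Hypothetical SDK surface: class and method names below are assumptions
from feral import Agent

agent = Agent()                      # connect to the locally running brain server
reply = agent.chat("Hello, FERAL!")  # send one message, get the reply text back
print(reply)
```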
## Docker Alternative

If you prefer Docker:

| Service | URL |
|---|---|
| Brain + API | http://localhost:9090 |
| Web UI | http://localhost:3000 |
| Skill Registry | http://localhost:8080 |
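The Compose invocation is not shown on this page; a sketch, assuming the repository ships a standard `docker-compose.yml`:

```shell
# Assumes a docker-compose.yml at the repo root defining the brain, web UI, and skill-registry services
docker compose up -d
# The services should then answer at the URLs in the table above
```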
## What’s Next
- Architecture — understand how the brain, memory, and device mesh fit together
- Python SDK — build plugins and automate FERAL programmatically
- Write a Skill — add new capabilities to your agent
- Connect a Device — bring hardware into the mesh
