
General

What is FERAL?

FERAL is a Spatial Agentic OS — an open-source AI brain that connects to every device and app in your life. It runs locally on your machine and integrates with wearables (wristbands, smart glasses), smart-home devices, your computer, your phone, and software like calendar, email, and Slack.

Unlike chatbots, FERAL has persistent memory, proactive intelligence (it acts without being asked), three autonomy levels, a voice pipeline with sub-200ms latency, and a hardware mesh that controls physical devices.
How does FERAL compare to ChatGPT or Siri?

|             | ChatGPT / Siri           | FERAL                                             |
| ----------- | ------------------------ | ------------------------------------------------- |
| Runs where  | Their cloud servers      | Your machine                                      |
| Memory      | Forgets between sessions | 4-tier persistent memory + knowledge graph        |
| Devices     | None (or limited)        | Wristbands, glasses, smart home, robots, computer |
| Proactive   | No — waits for prompts   | Yes — watches context and acts                    |
| Voice       | 1-2s cloud latency       | Sub-200ms, wake word, interrupt-and-resume        |
| Privacy     | Data sent to cloud       | Everything stays local                            |
| Open source | No                       | Yes — Apache 2.0                                  |
How is FERAL different from Open Interpreter?

Open Interpreter and similar tools are terminal-based coding assistants — they execute code on your machine via an LLM. FERAL is a full operating system layer:
  • Hardware mesh with direct device control
  • Real-time biometric streaming from wearables
  • Proactive intelligence engine (rule + LLM hybrid)
  • Voice pipeline with wake word detection
  • Server-driven UI generation
  • Digital twin that models your decisions
  • Three enforced autonomy levels
  • Persistent 4-tier memory with knowledge graph
FERAL isn’t a better terminal agent — it’s a different category.

Privacy & Security

What data leaves my machine?

Only LLM API calls — and only if you choose a cloud provider (Anthropic, OpenAI, etc.). The text of your query is sent to the provider for inference. Everything else stays local:
  • Health data from wearables
  • Memory and knowledge graph
  • Smart-home commands
  • Screen captures
  • Voice audio (processed locally when using Whisper)
If you use Ollama (local models), nothing leaves your machine at all.
How are my API keys stored?

Keys are stored in ~/.feral/credentials.json, which is AES-256 encrypted at rest using a key derived from your system keychain. The vault is decrypted in memory only while FERAL is running. See Configuration for details.
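The at-rest pattern described above can be sketched as follows. This is an illustrative sketch, not FERAL's actual vault code: the keychain secret, salt handling, and credential field names are assumptions.

```python
# Illustrative sketch of an AES-256-GCM credentials vault, keyed by a
# secret derived from a machine-local source. NOT FERAL's real code:
# salt and iteration choices here are assumptions for demonstration.
import json
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(keychain_secret: bytes, salt: bytes) -> bytes:
    """Stretch a keychain-held secret into a 256-bit AES key."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=200_000)
    return kdf.derive(keychain_secret)


def encrypt_vault(creds: dict, key: bytes) -> bytes:
    """Serialize and seal the credentials; the nonce is stored with the blob."""
    nonce = os.urandom(12)  # GCM nonce must be unique per encryption
    return nonce + AESGCM(key).encrypt(nonce, json.dumps(creds).encode(), None)


def decrypt_vault(blob: bytes, key: bytes) -> dict:
    """Open the vault in memory only; any tampering raises InvalidTag."""
    nonce, ciphertext = blob[:12], blob[12:]
    return json.loads(AESGCM(key).decrypt(nonce, ciphertext, None))
```

Because the key is derived on demand and the plaintext exists only inside `decrypt_vault`, the pattern matches the "decrypted in memory only while running" behavior described above.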
Can FERAL control my computer?

Only if you enable it. Computer-use features (screen capture, browser automation, file operations) are controlled by the FERAL_COMPUTER_USE flag. When enabled, actions are gated by your autonomy level — in strict mode, every file or screen action requires explicit approval.
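As a sketch, opting in might look like the following. The flag name comes from this page, but the exact value syntax is an assumption; see Configuration for the authoritative form.

```shell
# Enable computer-use features (screen capture, browser automation, file ops).
# Actions remain gated by your autonomy level.
export FERAL_COMPUTER_USE=true
feral start
```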

Models & Providers

FERAL supports 9 LLM providers out of the box:
| Provider   | Models                                       | Local? |
| ---------- | -------------------------------------------- | ------ |
| Ollama     | Llama 3, Mistral, Phi-3, CodeLlama, any GGUF | Yes    |
| Anthropic  | Claude Opus, Sonnet, Haiku                   | No     |
| OpenAI     | GPT-4o, GPT-4, GPT-3.5                       | No     |
| OpenRouter | Any model on OpenRouter                      | No     |
| Google     | Gemini Pro, Gemini Flash                     | No     |
| Groq       | Llama 3, Mixtral (fast inference)            | No     |
| Together   | Various open models                          | No     |
| Fireworks  | Various open models                          | No     |
| Local GGUF | Any GGUF via llama.cpp                       | Yes    |
Can FERAL run fully offline?

Yes. Use Ollama as your LLM provider with a locally downloaded model:

```shell
ollama pull llama3:70b
export FERAL_LLM_PROVIDER=ollama
export FERAL_LLM_MODEL=llama3:70b
feral start
```
Combine with FERAL_VOICE_PROVIDER=whisper_local for offline voice. All hardware, memory, and smart-home features work offline by default.
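The combined offline setup described above, sketched as environment configuration (the variable names and values are the ones quoted on this page):

```shell
# Fully offline stack: local Ollama model for inference,
# local Whisper for speech recognition.
export FERAL_LLM_PROVIDER=ollama
export FERAL_LLM_MODEL=llama3:70b
export FERAL_VOICE_PROVIDER=whisper_local
feral start
```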
Can I switch models mid-conversation?

Yes. Use the dashboard settings or send a command:

```
> Switch to Claude Sonnet for this conversation
```

The brain swaps the active provider while preserving conversation context and memory.

Hardware

Which wearables does FERAL support?

FERAL ships with a BLE adapter for its reference wristband (PPG-based, streaming HR, SpO2, skin temp). It also integrates with Whoop and Oura Ring via their APIs. Any BLE wearable that exposes standard GATT health services can be adapted. See Wristband.
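To illustrate what "standard GATT health services" means in practice, here is a hedged sketch of subscribing to the Bluetooth SIG Heart Rate Measurement characteristic (0x2A37) with the third-party `bleak` library. The device address is a placeholder, and none of this is a FERAL API.

```python
# Sketch: streaming heart rate from any BLE wearable that exposes the
# standard GATT Heart Rate service, using the third-party `bleak` library.
import asyncio

# Standard Bluetooth SIG UUID for the Heart Rate Measurement characteristic
HR_MEASUREMENT = "00002a37-0000-1000-8000-00805f9b34fb"


def parse_hr_measurement(data: bytes) -> int:
    """Decode heart rate per the GATT spec: flag bit 0 selects uint8 vs uint16 BPM."""
    if data[0] & 0x01:
        return int.from_bytes(data[1:3], "little")  # 16-bit little-endian rate
    return data[1]                                  # 8-bit rate


async def stream_heart_rate(address: str, seconds: float = 30.0) -> None:
    from bleak import BleakClient  # third-party BLE client (pip install bleak)

    async with BleakClient(address) as client:
        def on_notify(_, data: bytearray) -> None:
            print(f"HR: {parse_hr_measurement(bytes(data))} bpm")

        await client.start_notify(HR_MEASUREMENT, on_notify)
        await asyncio.sleep(seconds)  # receive notifications for a while
        await client.stop_notify(HR_MEASUREMENT)


if __name__ == "__main__":
    asyncio.run(stream_heart_rate("AA:BB:CC:DD:EE:FF"))  # placeholder address
```

Any wearable that advertises this service can be adapted the same way; proprietary devices like Whoop or Oura go through their cloud APIs instead.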
Do I need the hardware to use FERAL?

No. Hardware integrations are optional. FERAL works as a powerful AI assistant on just your computer — with memory, voice, computer-use, and all software integrations. Hardware extends it into the physical world.

Development

Is FERAL production-ready?

FERAL is in active development. The core brain, memory system, voice pipeline, and hardware mesh are functional. It is suitable for personal use and development. We don’t recommend it for production deployments serving multiple users — yet.
How do I start contributing?

```shell
git clone https://github.com/FERAL-AI/FERAL-AI.git && cd FERAL-AI
cd feral-core && pip install -e ".[llm,dev]"
cd ../feral-client && npm install && npm run dev
```

Read CONTRIBUTING.md for coding standards, PR guidelines, and architecture docs. Issues labeled good first issue are a great starting point.
What license does FERAL use?

Apache 2.0. You can use, modify, and distribute FERAL freely, including in commercial products, as long as you include the license notice.
Can I use FERAL commercially?

Yes. The Apache 2.0 license explicitly allows commercial use. You can build products, services, or integrations on top of FERAL without restrictions beyond license attribution.