Fleet Overview
Live platform health — auto-refreshes every 30s
Total Users
—
Active Containers
—
WA Connected
—
Local (Admin VM)
—
Worker VMs
—
Overloaded VMs
—
Consultant AI
Check
VM Nodes
Worker VM health, capacity and container management
Users
Subscription and container management
| User | Plan | Status | User ID | Container Name | Server IP | Container | Joined | Actions |
|---|---|---|---|---|---|---|---|---|
Auto-Healer
Loading...
Automated recovery events and escalations
Total Events
—
Success Rate
—
Escalated
—
Critical
—
Live Activity
Real-time platform events via WebSocket
Waiting for events...
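The Live Activity feed consumes platform events over a WebSocket. A minimal sketch of the client-side rendering step, assuming a hypothetical JSON event shape (`type`, `user`, `message`, `ts`) since the actual schema is not shown:

```python
import json
from datetime import datetime, timezone

def format_event(raw: str) -> str:
    """Turn one raw WebSocket frame into a display row.

    The field names (type, user, message, ts) are assumptions;
    adapt them to the platform's actual event schema.
    """
    event = json.loads(raw)
    when = datetime.fromtimestamp(event["ts"], tz=timezone.utc).strftime("%H:%M:%S")
    return f'[{when}] {event["type"].upper()} - {event.get("user", "system")}: {event["message"]}'
```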
Revenue & Growth
Subscription metrics and MRR breakdown
MRR
—
—
VM Expense
—
—
Profit
—
—
Active Subs
—
—
Trial Users
—
Converting
Churn Rate
—
—
Plan Distribution
Subscription Status
VM Expense by Provider
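The headline revenue numbers are simple derived metrics. A sketch of the arithmetic, assuming monthly figures and that the dashboard defines profit as MRR minus VM expense and churn as lost subscribers over the period's starting base (the actual definitions are not shown):

```python
def revenue_summary(mrr: float, vm_expense: float,
                    churned: int, active_at_start: int) -> dict:
    """Derive headline metrics: profit, margin, churn rate.

    These formulas are assumptions about how the dashboard
    defines its terms, not a confirmed implementation.
    """
    profit = mrr - vm_expense
    churn_rate = churned / active_at_start if active_at_start else 0.0
    return {
        "profit": round(profit, 2),
        "margin_pct": round(100 * profit / mrr, 1) if mrr else 0.0,
        "churn_pct": round(100 * churn_rate, 1),
    }
```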
Migration Log
Container migration history
| User | From | To | Status | By | Time | Error |
|---|---|---|---|---|---|---|
Logs & Activity
Container logs and agent-start timeline
Click Refresh to load...
QR Diagnostics
QR code flow diagnostics — agent WS status, QR-related logs
Click Refresh to load...
VM Providers
API key status, fallback order, and provider management
Loading...
Fly.io Container Backend
Run agent containers as Fly Machines — no worker VMs required
Backend
—
Live Machines
—
DB Containers
—
Running
—
WA Connected
—
⚙️ FLY.IO CONFIGURATION
Controls which backend new /containers/start requests use. Existing containers keep their original backend.
🖥️ FLY MACHINES
| Machine ID | Name | State | Region | User | Private IP | CPU / Mem | Actions |
|---|---|---|---|---|---|---|---|
| Loading… | | | | | | | |
AI Request Logs
Every LLM request — see what was sent, which AI replied, and what it said
Click Refresh to load...
Pending Commits
Unmerged branches from GitHub & Claude Code sessions
Loading...
Consultant AI
Chat directly with the local Ollama instance on the admin server
Status: unknown
Send a message to start chatting with Consultant AI
Server Specs
Loading...
Loaded in Memory
Models currently consuming RAM/VRAM — unload to free resources
Loading...
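Unloading a model from RAM/VRAM maps to Ollama's documented behavior: a request with `keep_alive: 0` evicts the model after it is handled (the CLI equivalent is `ollama stop <model>`). A sketch that builds the unload request body; the admin-server URL is an assumption:

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # assumed admin-server address

def unload_payload(model: str) -> str:
    """JSON body asking Ollama to evict a loaded model.

    keep_alive=0 tells the server to drop the model from
    memory immediately after this (empty) request.
    """
    return json.dumps({"model": model, "keep_alive": 0})
```

POSTing `unload_payload("llama3")` to `OLLAMA_URL` would free that model's memory.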
Installed Models
Manage Ollama models on the admin server
Loading models...
Install Model
Pick from popular models or enter a custom name from ollama.com/library. Cards are cross-checked against the admin server; already-installed models show an INSTALLED badge.
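The INSTALLED badge is a membership check of each card's model name against the models the admin server reports (Ollama lists installed models via `GET /api/tags`). A sketch, with the `:latest` default-tag normalization as an assumption about how the cards are matched:

```python
def badge(card_model: str, installed: set[str]) -> str:
    """Return "INSTALLED" if the card's model already exists on the server.

    Ollama reports names like "llama3:latest"; a card may omit
    the tag, so ":latest" is assumed as the default here.
    """
    name = card_model if ":" in card_model else f"{card_model}:latest"
    return "INSTALLED" if name in installed else ""
```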