# Quick Start
This guide gets you from zero to a running PRX agent in under 5 minutes.
## Step 1: Install PRX
Install the latest release:
```bash
curl -fsSL https://openprx.dev/install.sh | bash
```

Verify the installation:

```bash
prx --version
```

> **Tip:** See the Installation Guide for alternative methods (Cargo, source build, Docker).
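If the version check fails, confirm the binary is actually discoverable by your shell. A minimal sketch using the standard POSIX `command -v` builtin (the "check your shell profile" advice is generic, not PRX-specific):

```shell
# Report whether the prx binary is discoverable on PATH.
if command -v prx >/dev/null 2>&1; then
  echo "prx found at: $(command -v prx)"
else
  echo "prx not found on PATH -- check your shell profile"
fi
```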
## Step 2: Run the Onboarding Wizard
The onboarding wizard configures your LLM provider, API key, and initial settings interactively:
```bash
prx onboard
```

The wizard walks you through:
- Selecting a provider -- Anthropic, OpenAI, Ollama, OpenRouter, and more
- Entering your API key -- stored securely in the config file
- Choosing a default model -- the wizard fetches available models from your provider
- Setting a memory backend -- Markdown (default), SQLite, or PostgreSQL
After the wizard completes, your configuration is saved to `~/.config/openprx/openprx.toml`.
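The resulting file covers the choices the wizard asked about. The sketch below is illustrative only — the key names are assumptions, not the actual schema; inspect your generated file for the real layout:

```toml
# Hypothetical sketch of a wizard-generated config -- actual key
# names may differ; see your ~/.config/openprx/openprx.toml.
[provider]
name = "anthropic"
api_key = "sk-ant-..."
model = "claude-sonnet-4-20250514"

[memory]
backend = "markdown"
```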
### Quick Setup
If you already know your provider and model, skip the interactive wizard:
```bash
prx onboard --provider anthropic --api-key sk-ant-... --model claude-sonnet-4-20250514
```

See Onboarding Wizard for all options.
## Step 3: Start the Daemon
Start the PRX daemon in the background. The daemon manages the agent runtime, gateway API, and all configured channels:
```bash
prx daemon
```

By default, the daemon listens on `127.0.0.1:3120`. You can customize the host and port:
```bash
prx daemon --host 0.0.0.0 --port 8080
```

### Running as a Service
For production deployments, install PRX as a system service so it starts automatically on boot:
```bash
prx service install
```

This creates a systemd unit (Linux) or launchd plist (macOS). See `prx service` for details.
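On Linux, the generated unit is conceptually similar to the sketch below. The paths and directives here are assumptions for illustration — `prx service install` writes the real file, which may differ:

```ini
# Hypothetical sketch of a systemd unit for PRX -- the actual file
# is generated by `prx service install` and may differ.
[Unit]
Description=PRX agent daemon
After=network-online.target

[Service]
ExecStart=%h/.local/bin/prx daemon
Restart=on-failure

[Install]
WantedBy=default.target
```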
## Step 4: Chat with PRX
Open an interactive chat session directly in your terminal:
```bash
prx chat
```

This connects to the running daemon and opens a REPL where you can talk to your configured LLM. Type your message and press Enter:
```text
You: What can you help me with?
PRX: I can help you with a wide range of tasks...
```

You can also specify a provider and model for a single session:
```bash
prx chat --provider ollama --model llama3.2
```

Press Ctrl+C or type `/quit` to exit the chat.
## Step 5: Connect a Channel
PRX supports 19 messaging channels. To connect one, add its configuration to your `~/.config/openprx/openprx.toml` file.
For example, to connect a Telegram bot:
```toml
[channels.telegram]
bot_token = "123456:ABC-DEF..."
allowed_users = ["your_telegram_username"]
```

Then restart the daemon to pick up the new channel:
```bash
prx daemon
```

Or use the channel management command:
```bash
prx channel add telegram
```

See the Channels Overview for the full list of supported platforms and their configuration.
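After restarting the daemon, you can verify the gateway came back up before testing the channel itself. A minimal sketch, assuming the default gateway address from Step 3 (any HTTP response means the process is listening; this does not validate the channel connection):

```shell
# Probe the default gateway address; any response (even an HTTP error)
# means the daemon is listening, while a connection failure means it is not.
if curl -s -o /dev/null --max-time 2 http://127.0.0.1:3120; then
  echo "gateway is listening"
else
  echo "gateway is not reachable"
fi
```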
## Step 6: Check Status
View the current state of your PRX instance:
```bash
prx status
```

This displays:
- Version and binary path
- Workspace directory
- Config file location
- Provider and model in use
- Active channels and their connection status
- Memory backend and statistics
- Uptime and resource usage
Example output:
```text
PRX Status
Version: 0.3.0
Workspace: /home/user/.local/share/openprx
Config: /home/user/.config/openprx/openprx.toml
Provider: anthropic (claude-sonnet-4-20250514)
Memory: markdown (/home/user/.local/share/openprx/memory)
Channels: telegram (connected), cli (active)
Gateway: http://127.0.0.1:3120
Uptime: 2h 15m
```

## What Next?
Now that PRX is running, explore the rest of the documentation:
| Topic | Description |
|---|---|
| Onboarding Wizard | Deep-dive into all onboarding options |
| Channels | Connect Telegram, Discord, Slack, and 16 more |
| Providers | Configure and switch between LLM providers |
| Tools | Explore 46+ built-in tools |
| Self-Evolution | Learn about the L1/L2/L3 evolution system |
| Configuration | Full config reference with all options |
| CLI Reference | Complete command reference |