Get Laddr installed and configured on your system.
Prerequisites
- Python 3.9+ - Required for Laddr
- pip - Python package manager
- Docker (optional but recommended) - For full stack with dashboard
- Node.js (optional) - For MCP servers
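Before proceeding, you can confirm the Python toolchain is available on your PATH:

```bash
python3 --version        # Laddr requires Python 3.9 or newer
python3 -m pip --version
```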
Python Virtual Environment
Create a virtual environment to isolate dependencies:
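Assuming the environment directory is named `venv` (to match the activation commands below):

```bash
python3 -m venv venv   # on Windows: python -m venv venv
```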
Activate the virtual environment:
```bash
# On macOS/Linux
source venv/bin/activate

# On Windows (CMD)
venv\Scripts\activate

# On Windows (Git Bash)
source venv/Scripts/activate
```
Always activate your virtual environment before installing or using Laddr.
Install Laddr
Standard Installation
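Install the published package with pip (the package name `laddr` matches the `pip show laddr` check used in the Troubleshooting section below):

```bash
pip install laddr
```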
This installs:
- Laddr CLI
- Core framework
- API server
- Worker runtime
Development Installation
To develop against the Laddr repository:
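A typical editable install from a source checkout looks like the following; the repository URL is a placeholder here — substitute the actual Laddr repository:

```bash
git clone <laddr-repository-url> laddr
cd laddr
pip install -e .
```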
Verify Installation
```bash
laddr --version
laddr --help
```
Docker Setup
For the full experience including the web UI and isolated environments, install Docker:
- Download from Docker Desktop
- Install and start Docker
- Verify installation:
```bash
docker --version
docker compose version
```
Create a New Project
Initialize a new Laddr project:
```bash
laddr init my_agent_system
cd my_agent_system
```
This creates:
```
my_agent_system/
├── agents/              # Agent definitions
├── workers/             # Worker scripts
├── tools/               # Custom tools
├── Dockerfile           # Docker configuration
├── docker-compose.yml
├── main.py              # Runner script
└── .env.example         # Environment template
```
Copy the example environment file:
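On macOS/Linux (on Windows, use `copy` instead of `cp`):

```bash
cp .env.example .env
```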
Edit .env with your configuration:
```bash
# .env

# LLM Configuration
GEMINI_API_KEY=your_gemini_api_key
# or
OPENAI_API_KEY=sk-...
# or
ANTHROPIC_API_KEY=sk-ant-...

# Optional: Tool API keys
SERPER_API_KEY=your_serper_api_key

# Queue Backend (default: memory)
QUEUE_BACKEND=memory        # or redis, kafka

# Database (default: sqlite)
DB_BACKEND=sqlite           # or postgresql
DATABASE_URL=sqlite:///./laddr.db

# Storage (default: local)
STORAGE_BACKEND=local       # or minio, s3
```
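These are ordinary environment variables, and the documented defaults apply whenever a variable is unset — which you can sketch with shell parameter expansion (an illustration, not Laddr's own loader):

```bash
# Print the effective backends, falling back to the documented defaults
echo "queue:   ${QUEUE_BACKEND:-memory}"
echo "db:      ${DB_BACKEND:-sqlite}"
echo "storage: ${STORAGE_BACKEND:-local}"
```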
Never commit your .env file to version control. It contains sensitive API keys.
Run the Stack
Option 1: Local Development (No Docker)
For quick testing without Docker:
```bash
# Set environment
export QUEUE_BACKEND=memory
export DB_BACKEND=sqlite

# Run agent locally
laddr run-local researcher --input '{"query": "test"}'
```
Option 2: Full Stack (Docker)
Start all services with Docker:
```bash
laddr run dev -d
# or
docker compose up -d
```
This starts:
- PostgreSQL - Database (port 5432)
- Redis - Message queue (port 6379)
- MinIO - Object storage (port 9000)
- API Server - REST API (port 8000)
- Dashboard - Web UI (port 5173)
- Workers - Agent workers
Access the dashboard at http://localhost:5173
Verify Installation
Check Services
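With the Docker stack running, standard Compose commands show service status (these are generic Docker commands, not Laddr-specific):

```bash
docker compose ps       # all services should be listed as running
docker compose logs -f  # tail logs if a service looks unhealthy
```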
Run Diagnostics
This validates:
- Database connectivity
- Queue backend connection
- Storage backend access
- LLM API connectivity
- Agent configuration
Test with a Simple Agent
```bash
# Add a test agent
laddr add agent tester \
  --role "Test Agent" \
  --goal "Test the system" \
  --llm-model gemini-2.5-flash

# Run it
laddr run-local tester --input '{"message": "Hello"}'
```
Troubleshooting
Command Not Found
If the `laddr` command is not found:

```bash
# Check if installed
pip show laddr

# Reinstall if needed
pip install --upgrade laddr

# Or use the Python module directly
python -m laddr --version
```
Docker Issues
If Docker commands fail:
```bash
# Check Docker is running
docker ps

# Check Docker Compose
docker compose version

# Restart Docker Desktop if needed
```
Port Conflicts
If ports are already in use:
```bash
# Check what's using the port
lsof -i :8000                  # macOS/Linux
netstat -ano | findstr :8000   # Windows

# Change ports in docker-compose.yml
```
API Key Errors
Ensure your API keys are valid:
```bash
# Test a Gemini API key (generateContent requires a POST with a JSON body)
curl -X POST \
  "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:generateContent?key=YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"contents": [{"parts": [{"text": "Hello"}]}]}'
```
Next Steps