# Self-Hosted Quick Start

Get Atlas running locally in under 5 minutes with the demo dataset.
> **Using the hosted platform?** If you're using app.useatlas.dev, see the Hosted Quick Start instead — no CLI, Docker, or configuration files needed.
## Prerequisites

Before you start, make sure you have:

- Bun v1.3.11+ — check with `bun --version`
- Docker — check with `docker --version` (Docker Desktop must be running)
- An LLM API key — Anthropic, OpenAI, or another supported provider
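If you want to script these checks, a minimal version comparison can be sketched in TypeScript (a generic sketch, not an Atlas-provided tool; 1.3.11 is the Bun minimum listed above):

```typescript
// Compare a reported version against a required minimum, e.g. the output
// of `bun --version` against 1.3.11. Generic sketch, not part of Atlas.
function atLeast(version: string, minimum: string): boolean {
  const parse = (v: string) => v.replace(/^v/, "").split(".").map(Number);
  const [a, b] = [parse(version), parse(minimum)];
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    const x = a[i] ?? 0;
    const y = b[i] ?? 0;
    if (x !== y) return x > y;
  }
  return true; // equal versions satisfy the minimum
}

console.log(atLeast("1.3.11", "1.3.11")); // true
console.log(atLeast("1.2.9", "1.3.11")); // false
```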
## Set up your environment

### Start the database

```bash
bun run db:up
```

This launches two Docker containers:

- Postgres with the simple demo dataset pre-seeded (3 tables, ~330 rows)
- Sandbox sidecar for isolated code execution

For a larger, production-like dataset, see Demo Datasets (`--demo cybersec` or `--demo ecommerce`).

You should see:

```
[+] Running 2/2
 ✔ Container atlas-postgres-1  Started
 ✔ Container atlas-sandbox-1   Started
```

Verify with `docker compose ps` — both containers should show status `Up` or `running`.
### Configure your environment

```bash
cp .env.example .env
```

Edit `.env` and set your LLM provider. The database URLs are pre-configured for the local Docker setup — you only need to add your API key:

```bash
# Set the provider and its API key — Atlas auto-selects the best available model
ATLAS_PROVIDER=anthropic
ANTHROPIC_API_KEY=sk-ant-...
```

```bash
# Set the provider and its API key — Atlas auto-selects the best available model
ATLAS_PROVIDER=openai
OPENAI_API_KEY=sk-...
```

```bash
# AWS Bedrock — uses IAM credentials from the environment
ATLAS_PROVIDER=bedrock
AWS_REGION=us-east-1
```

```bash
# Local Ollama — runs models on your machine (no API key needed)
ATLAS_PROVIDER=ollama
OLLAMA_BASE_URL=http://localhost:11434
```

```bash
# AI Gateway — route through LiteLLM, Portkey, or similar proxy
ATLAS_PROVIDER=gateway
AI_GATEWAY_API_KEY=...
```

See Environment Variables — LLM Provider for all supported providers and their required variables.
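As a quick sanity check before starting the servers, the provider-to-key pairing above can be validated with a small script. This is a generic sketch, not an Atlas command; the map below covers only the providers shown here, so consult the Environment Variables reference for the full list:

```typescript
// Map each provider from the examples above to the variable it needs.
// null means no API key is required (Bedrock uses IAM; Ollama is local).
const requiredVar: Record<string, string | null> = {
  anthropic: "ANTHROPIC_API_KEY",
  openai: "OPENAI_API_KEY",
  bedrock: null,
  gateway: "AI_GATEWAY_API_KEY",
  ollama: null,
};

function checkProvider(env: Record<string, string | undefined>): string {
  const provider = env.ATLAS_PROVIDER;
  if (!provider || !(provider in requiredVar)) {
    return `unknown or unset ATLAS_PROVIDER: ${provider}`;
  }
  const key = requiredVar[provider];
  if (key && !env[key]) return `missing ${key} for provider ${provider}`;
  return "ok";
}

console.log(checkProvider({ ATLAS_PROVIDER: "anthropic", ANTHROPIC_API_KEY: "sk-ant-..." })); // "ok"
console.log(checkProvider({ ATLAS_PROVIDER: "openai" })); // "missing OPENAI_API_KEY for provider openai"
```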
## Generate the semantic layer

The semantic layer (a set of YAML entity files in `semantic/`) tells the agent what tables exist and what they mean. Generate it from the demo database:

```bash
# Profile the database and generate YAML entity files in semantic/entities/
bun run atlas -- init
```

You should see:

```
Profiling postgres database...
[1/3] Profiling companies...
[2/3] Profiling people...
[3/3] Profiling accounts...
Done! Semantic layer for "default" is at ./semantic/
```

Check the files: `ls semantic/entities/` should list `companies.yml`, `people.yml`, and `accounts.yml`.
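For a sense of what these files contain, here is a hypothetical sketch of an entity file. The concepts (entities, dimensions, measures) come from the semantic layer docs, but the exact key names and layout below are illustrative, not the schema `atlas init` actually emits:

```yaml
# Hypothetical sketch of semantic/entities/companies.yml — key names are
# illustrative; the file generated by `bun run atlas -- init` may differ.
entity: companies
table: companies
description: Companies in the demo dataset
dimensions:
  - name: industry
    type: string
measures:
  - name: company_count
    sql: COUNT(*)
```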
## Start the dev servers

```bash
bun run dev
```

This starts:

- API server at http://localhost:3001
- Web UI at http://localhost:3000

Both URLs should respond. Open http://localhost:3000 in your browser — you should see the Atlas chat interface with suggested starter questions.

Dev admin account: `admin@useatlas.dev` / `atlas-dev` — seeded automatically with the local Docker setup.
## Ask your first question

The chat UI shows suggested questions based on the demo dataset. Try one:

- "How many companies are there by industry?"
- "What are the top 5 companies by revenue?"
- "Which department has the most people?"

The agent will:

1. Explore the semantic layer to understand the schema
2. Write a SQL query
3. Execute it against the demo database
4. Return an interpreted answer with a data table

The entire flow takes a few seconds. You'll see each step in the chat as the agent works.
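The four steps above can be sketched as a tiny pipeline. Every function here is a stub with hypothetical names, not Atlas internals; the real agent delegates exploration and SQL writing to the LLM and runs the query against Postgres:

```typescript
// Illustrative sketch of the agent's answer flow — stubbed, not Atlas code.
type Row = Record<string, string | number>;

// Step 1: explore the semantic layer (stubbed with the demo entities)
function exploreSemanticLayer(): string[] {
  return ["companies", "people", "accounts"];
}

// Step 2: write a SQL query for the question (hard-coded for illustration;
// the real agent asks the LLM to generate this)
function writeSql(question: string, entities: string[]): string {
  if (question.includes("by industry") && entities.includes("companies")) {
    return "SELECT industry, COUNT(*) AS n FROM companies GROUP BY industry";
  }
  throw new Error("unhandled question");
}

// Step 3: execute against the database (stubbed with fake rows)
function execute(sql: string): Row[] {
  return [
    { industry: "Software", n: 12 },
    { industry: "Finance", n: 8 },
  ];
}

// Step 4: interpret the result into an answer
function interpret(rows: Row[]): string {
  const total = rows.reduce((sum, r) => sum + Number(r.n), 0);
  return `${total} companies across ${rows.length} industries`;
}

const entities = exploreSemanticLayer();
const sql = writeSql("How many companies are there by industry?", entities);
console.log(interpret(execute(sql))); // "20 companies across 2 industries"
```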
You're up and running. The agent can answer questions about the demo dataset using natural language.
## Useful commands

| Command | Description |
|---|---|
| `bun run dev` | Start containers + dev servers |
| `bun run db:up` | Start Postgres + sandbox sidecar |
| `bun run db:down` | Stop containers |
| `bun run db:reset` | Nuke volume and re-seed from scratch |
| `bun run atlas -- doctor` | Validate environment and connectivity |
| `bun run atlas -- validate` | Check semantic layer YAMLs for errors (offline, no DB needed) |
| `bun run atlas -- diff` | Compare DB schema against semantic layer |
See CLI Reference for all commands and flags.
## Next steps
- Understand the semantic layer — learn what entities, dimensions, measures, and metrics mean
- Connect your own database — replace the demo with your PostgreSQL, MySQL, or other datasource
- Deploy to production — Railway, Vercel, or Docker
- Set up authentication — API key, managed auth, or bring-your-own-token
- Choose an integration — not sure whether to use the widget, SDK, or API? Compare all options
- Embed in your app — add Atlas as a chat widget in any web app
- Integrate with Slack — query data from Slack channels
- Use as MCP server — connect Atlas to Claude Desktop or Cursor
- Troubleshooting — full error reference and debug logging