
Getting Started for New Contributors

This page walks you through setting up a complete AuroraSOC development environment on your machine. By the end, the API will be running, the dashboard will be live, and you will be able to run the full test suite.


Prerequisites

Install these before starting:

| Tool | Minimum Version | What It Does | Install |
| --- | --- | --- | --- |
| Python | 3.12+ | Runs the backend, agents, and training scripts | python.org or `sudo apt install python3.12 python3.12-venv` |
| Rust | 1.83+ | Compiles the optional Rust core fast path | `curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs \| sh` |
| Node.js | 18+ | Runs the Next.js dashboard and Docusaurus docs | nodejs.org or `nvm install 18` |
| Docker & Docker Compose | 24+ / v2 | Runs PostgreSQL, Redis, NATS, Mosquitto, Grafana, etc. | docs.docker.com |
| Ollama | Latest | Serves the Granite 4 LLM locally | `curl -fsSL https://ollama.com/install.sh \| sh` |
| Git | 2.30+ | Version control | `sudo apt install git` |
| Make | Any | Runs Makefile targets (the main developer interface) | `sudo apt install build-essential` |
**Tip:** Run `make help` at any time to see every available target with a short description.


Step 1: Clone and Enter the Repository

git clone https://github.com/ahmeddwalid/AuroraSOC
cd AuroraSOC

Step 2: Start Infrastructure Services

AuroraSOC depends on several services. make docker-up starts the default Compose stack:

make docker-up

That default stack does not enable the profile-gated agent fleet or the optional rust-core service. Use explicit Compose profiles when you need those paths during development.

This starts:

| Service | Port | Purpose |
| --- | --- | --- |
| PostgreSQL 16 | 5432 | Primary database |
| Redis 7 | 6379 | Cache, event streams, rate limiting |
| pgvector | n/a | PostgreSQL extension for vector embeddings (no separate service) |
| NATS JetStream | 4222 | Cross-site event federation |
| Mosquitto | 1883/8883 | MQTT broker for IoT devices |
| Prometheus | 9090 | Metrics collection |
| Grafana | 3001 | Monitoring dashboards |
| OTel Collector | 4317 | Distributed trace collection |

To check everything is running:

docker compose ps

To view logs:

make docker-logs

Step 3: Install Python Dependencies

Create a virtual environment and install all dependencies (including dev and test extras):

python3 -m venv .venv
source .venv/bin/activate
make dev

This runs pip install -e ".[dev,test]", which installs AuroraSOC in editable mode so your code changes take effect immediately.


Step 4: Pull the LLM Models

AuroraSOC uses IBM Granite 4 models served through Ollama:

make ollama-pull-granite

This pulls two models:

  • granite4:8b — used by all 16 specialist agents
  • granite4:dense — used by the Orchestrator for complex reasoning

Verify the models are available:

make ollama-list

If Ollama is not running, start it first:

make ollama-serve
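Under the hood, these targets talk to Ollama's HTTP API, which exposes the locally pulled models at `GET /api/tags`. If you want to check model availability from a script rather than the CLI, a minimal stdlib-only sketch (the helper names are illustrative, not part of the repo):

```python
import json
import urllib.request


def parse_model_names(payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response payload."""
    return [model["name"] for model in payload.get("models", [])]


def list_local_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Ask a running Ollama server which models it has pulled locally."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return parse_model_names(json.load(resp))
```

Once the pull has finished, `"granite4:8b"` and `"granite4:dense"` should appear in `list_local_models()`.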

Step 5: Run Database Migrations

Apply the schema to PostgreSQL:

make migrate

This runs alembic upgrade head, which creates all 13 tables (alerts, cases, agents, playbooks, etc.). To check the current migration state:

alembic current

Step 6: Start the Backend API

make api

This starts FastAPI on http://localhost:8000 with hot-reload enabled. You should see:

INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
INFO: Started reloader process

Verify it's working:

curl http://localhost:8000/health
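In scripted setups it is more robust to wait for the API than to curl once, since Uvicorn takes a moment to bind. A small retry helper (illustrative, not part of the repo) that polls any probe until it succeeds:

```python
import time
import urllib.request
from typing import Callable


def wait_for_healthy(probe: Callable[[], bool], attempts: int = 30, delay: float = 1.0) -> bool:
    """Call `probe` up to `attempts` times, sleeping `delay` seconds between failures."""
    for _ in range(attempts):
        try:
            if probe():
                return True
        except Exception:
            pass  # connection errors just mean "not ready yet"
        time.sleep(delay)
    return False


def api_is_up(url: str = "http://localhost:8000/health") -> bool:
    """Probe the FastAPI health endpoint from this guide."""
    return urllib.request.urlopen(url, timeout=2).status == 200
```

`wait_for_healthy(api_is_up)` returns `True` as soon as the backend answers, or `False` after roughly 30 seconds.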

API Documentation

FastAPI auto-generates interactive API docs; with the default settings they are served at http://localhost:8000/docs (Swagger UI) and http://localhost:8000/redoc (ReDoc).


Step 7: Start the Dashboard

In a second terminal:

make dashboard-install # First time only
make dashboard-dev

The Next.js dashboard starts on http://localhost:3000. It connects to the FastAPI backend at port 8000.

For a production-style dashboard smoke test, run `make dashboard-build` and then `npm run start` from `dashboard/`. AuroraSOC builds the dashboard with `output: standalone`, so the production start path uses the generated standalone server instead of `next start`: it stages `.next/static` and `public/` into the standalone directory and binds to `0.0.0.0` by default. Set `DASHBOARD_HOST` only if you need a different listen address.

Shortcut

You can start both the API and dashboard with a single command:

make dev-all

This runs both processes and stops both when you press Ctrl+C.
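Conceptually, `dev-all` is a tiny process supervisor: start both children, wait on them, and on Ctrl+C terminate whichever are still alive. A sketch of that pattern (not the actual Makefile recipe):

```python
import subprocess


def run_together(commands: list[list[str]]) -> list[int]:
    """Start every command; on Ctrl+C, terminate all children before returning.

    Returns the exit code of each command, in order.
    """
    procs = [subprocess.Popen(cmd) for cmd in commands]
    try:
        for p in procs:
            p.wait()
    except KeyboardInterrupt:
        for p in procs:
            p.terminate()
        for p in procs:
            p.wait()
    return [p.returncode for p in procs]


# What dev-all does, roughly:
# run_together([["make", "api"], ["make", "dashboard-dev"]])
```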


Step 8: Run the Test Suite

make test

This runs all Python tests with pytest. For a coverage report:

make test-cov

The HTML coverage report is written to htmlcov/index.html.

To run Rust tests as well:

make rust-test

To run all checks (lint + type-check + Python tests + Rust clippy + Rust tests + dashboard lint):

make check

Step 9: Verify Everything Works

Run through this checklist to confirm your setup is complete:

  • docker compose ps — default services are "Up"
  • curl http://localhost:8000/health — returns OK
  • make test — all tests pass
  • make lint — no linting errors
  • http://localhost:3000 — dashboard loads in browser
  • curl http://localhost:8000/api/v1/inference/status — backend and model status look healthy

Automated Local Setup

For a fully automated first-time setup, run:

make setup-local

This script (scripts/setup_local.sh) performs:

  1. Checks for all required tools (Python, Rust, Node.js, Docker, Ollama)
  2. Pulls the Granite 4 models into Ollama for local/offline use
  3. Creates a default .env file if missing
  4. Verifies all dependencies

Common Makefile Targets Reference

Development

| Target | Description |
| --- | --- |
| `make install` | Install Python dependencies (production only) |
| `make dev` | Install with dev + test extras |
| `make api` | Start FastAPI backend (port 8000, hot-reload) |
| `make dev-all` | Start API + dashboard together |
| `make dashboard-dev` | Start Next.js dashboard (port 3000) |
| `make mcp` | Start the MCP Tool Registry server |

Quality

| Target | Description |
| --- | --- |
| `make test` | Run Python test suite |
| `make test-cov` | Run tests with coverage |
| `make lint` | Run ruff linter |
| `make format` | Auto-format code with ruff |
| `make type-check` | Run mypy type checking |
| `make check` | Run ALL checks (lint + type + test + Rust + dashboard) |

Rust

| Target | Description |
| --- | --- |
| `make rust-build` | Build Rust core (release mode) |
| `make rust-test` | Run Rust tests |
| `make rust-clippy` | Run Rust linter |

Database

| Target | Description |
| --- | --- |
| `make migrate` | Apply all pending migrations |
| `make migrate-new MSG="description"` | Create a new migration |
| `make migrate-down` | Roll back one migration |

Docker

| Target | Description |
| --- | --- |
| `make docker-up` | Start default compose stack |
| `make docker-down` | Stop compose stack |
| `make docker-logs` | Tail container logs |
| `make docker-build` | Rebuild all Docker images |

LLM / Training

| Target | Description |
| --- | --- |
| `make ollama-pull-granite` | Pull base Granite 4 models |
| `make ollama-serve` | Start Ollama server |
| `make train-data` | Prepare SOC training datasets |
| `make train` | Fine-tune Granite 4 (requires GPU) |
| `make train-agent AGENT=name` | Fine-tune for a specific agent |
| `make train-eval` | Evaluate fine-tuned model |
| `make enable-finetuned` | Switch agents to fine-tuned models |
| `make disable-finetuned` | Switch agents back to base models |

Cleanup

| Target | Description |
| --- | --- |
| `make clean` | Remove all build artifacts and caches |

Environment Variables

AuroraSOC is configured via environment variables, managed through aurorasoc/config/settings.py using Pydantic Settings. Key variables:

| Variable | Default | Description |
| --- | --- | --- |
| `DATABASE_URL` | `postgresql+asyncpg://aurora:aurora@localhost:5432/aurorasoc` | PostgreSQL connection |
| `REDIS_URL` | `redis://localhost:6379` | Redis connection |
| `PG_POOL_SIZE` | `20` | PostgreSQL connection pool size |
| `NATS_URL` | `nats://localhost:4222` | NATS JetStream |
| `MQTT_BROKER` | `localhost` | MQTT broker host |
| `OLLAMA_BASE_URL` | `http://localhost:11434` | Ollama API |
| `JWT_SECRET_KEY` | (must set) | Secret for JWT token signing |
| `GRANITE_USE_FINETUNED` | `false` | Use fine-tuned vs base models |

You can set these in a .env file in the project root — the make setup-local script creates one for you.
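The real settings class lives in `aurorasoc/config/settings.py` and uses Pydantic Settings; the pattern it implements is roughly the following stdlib-only sketch, with variable names and defaults taken from the table above (everything else is illustrative):

```python
import os
from dataclasses import dataclass, field


def _env(name: str, default: str) -> str:
    """Read an environment variable, falling back to the documented default."""
    return os.environ.get(name, default)


@dataclass
class Settings:
    """Stdlib stand-in for the project's Pydantic Settings class."""
    database_url: str = field(default_factory=lambda: _env(
        "DATABASE_URL", "postgresql+asyncpg://aurora:aurora@localhost:5432/aurorasoc"))
    redis_url: str = field(default_factory=lambda: _env(
        "REDIS_URL", "redis://localhost:6379"))
    ollama_base_url: str = field(default_factory=lambda: _env(
        "OLLAMA_BASE_URL", "http://localhost:11434"))
    granite_use_finetuned: bool = field(default_factory=lambda: _env(
        "GRANITE_USE_FINETUNED", "false").lower() == "true")
```

For `JWT_SECRET_KEY`, which has no safe default, `python -c "import secrets; print(secrets.token_urlsafe(48))"` generates a suitable value to paste into `.env`.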


Troubleshooting

"Connection refused" on port 5432/6379/4222

Infrastructure services are not running. Run make docker-up and wait 10–20 seconds for startup.
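To pinpoint which backing service is missing, a quick stdlib port probe works on any machine (the helper is illustrative, not part of the repo):

```python
import socket


def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Check the three services this guide expects locally:
# for name, port in [("PostgreSQL", 5432), ("Redis", 6379), ("NATS", 4222)]:
#     print(name, "up" if port_open("localhost", port) else "DOWN")
```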

Ollama model not found

Run make ollama-pull-granite. If Ollama itself isn't installed, see the Prerequisites table above.

Import errors after pulling latest code

Re-install dependencies: make dev. If the database schema changed, also run make migrate.

Tests fail with database errors

Ensure PostgreSQL is running (docker compose ps) and migrations are applied (make migrate).

Dashboard shows "Network Error"

The FastAPI backend must be running (make api) on port 8000 before the dashboard can connect.

Rust build fails

Ensure Rust is installed (rustup --version). Then try make rust-build.


Next Steps

Your environment is ready. Now:

  1. Read the Contribution Guide to understand coding standards and the PR process.
  2. Pick a task — check the GitHub Issues board for issues tagged good first issue.
  3. Start coding — the make format and make lint commands will keep your code clean.