Scan websites from your terminal — fast, repeatable, and AI-aware
Request scans, capture session videos, and get AI-summarized results. Use the lightweight CLI locally or call the API from your CI/CD pipelines.
Upload your project for AI analysis
Use the CLI commands below to analyze code or request a server scan. The uploader respects a .subtoignore file (one pattern per line) and always ignores .env.
- `subto upload [dir]` — run a local AI analysis on sampled files (does not upload to the server).
- `subto scan upload [dir]` — upload sampled file snippets to the server and request a scan; returns `uploadId` and `scanId`.
`.subtoignore` format: one pattern per line; lines starting with `#` are comments; patterns can be folder names (e.g. `node_modules`), simple globs like `*.lock`, or specific files. The uploader uses simple matching (exact, prefix, contains, or endsWith).
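For illustration, a minimal `.subtoignore` could look like the following; the patterns shown are examples, not defaults shipped with the CLI:

```
# Comments start with #. One pattern per line.
node_modules
dist
*.lock
# A specific file can also be listed.
scripts/legacy-report.js
# .env is always ignored and does not need to be listed here.
```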
To enable richer AI responses, set OPENAI_API_KEY or OPENROUTER_API_KEY in your environment. You can choose a model with AI_MODEL (for example: openai/gpt-oss-120b:free).
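A typical shell setup (the key values below are placeholders):

```
# Use whichever provider key you have; one is enough.
export OPENROUTER_API_KEY="sk-or-..."   # or: export OPENAI_API_KEY="sk-..."
# Optional: pick a specific model.
export AI_MODEL="openai/gpt-oss-120b:free"
```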
Example:

```
subto upload . --max-files 200 --max-bytes 5242880
```
Server endpoints
Interactive links printed by the CLI point to two server endpoints:
- /video/:scanId — redirects to a short-lived signed GCS URL for the session video (typically valid for about 1 hour).
- /aichat/:scanId — opens the stored scan UI at /scanned/:scanId, where the AI assistant is available for interactive Q&A about the scan.
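As a rough sketch, you can fetch the session video from the command line by letting curl follow the redirect to the signed URL; the host, scan ID, and output file name below are placeholders, and the video format is assumed to be MP4:

```
# Follow the /video/:scanId redirect to the short-lived signed GCS URL
# and save the video locally. Replace the host and scan ID with your own.
curl -L -o session-video.mp4 "https://your-subto-server.example/video/<scanId>"
```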
Set OPENROUTER_API_KEY or OPENAI_API_KEY in your environment if you want server-side AI features. Both the CLI and the server use AI_MODEL, when present, to select a model.