Now live on orbitforge.dev

Deploy OrbitForge

A clean path from private repo to a public OrbitForge deployment on your own domain.

The hosted site is designed to deploy on Vercel with a lightweight setup, optional provider credentials, and a release checklist that verifies the public surface is healthy.

Step 01

Import the repo into Vercel

Use a Vercel Hobby project and connect the GitHub repository that contains the public OrbitForge web app.

Create a new Vercel project from GitHub.
Set the Root Directory to `/` or leave it blank for the repository root.
Keep the framework as Next.js and use Node 20 or newer.
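These settings can also be pinned in the repo with a minimal `vercel.json` (a sketch; the commands are assumptions about this repo's scripts, and configuring the same values in the dashboard works just as well):

```json
{
  "framework": "nextjs",
  "installCommand": "npm ci",
  "buildCommand": "npm run build"
}
```

The Node version is not set here; pin it under Project Settings -> Node.js Version or via the `engines.node` field in `package.json`.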

Step 02

Add environment variables

Set the public site URL plus whichever provider keys or base URLs you want enabled in production.

Set `NEXT_PUBLIC_SITE_URL=https://orbitforge.dev`.
Add hosted provider keys only if you want cloud models enabled.
Keep local-provider URLs optional so the public site still works without local infrastructure.
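The variables above can be sanity-checked before a build. A minimal sketch (the `SITE_ORIGIN` name, the fallback default, and the provider-key loop are illustrative, not part of the app):

```shell
# Pre-deploy guard (illustrative): the canonical site URL is required,
# provider keys stay optional so the public site runs without them.
NEXT_PUBLIC_SITE_URL="${NEXT_PUBLIC_SITE_URL:-https://orbitforge.dev}"

# Strip a trailing slash so metadata and sitemap links stay consistent.
SITE_ORIGIN="${NEXT_PUBLIC_SITE_URL%/}"
echo "Site origin: $SITE_ORIGIN"

# Report which optional hosted providers this deployment will enable.
for key in OPENAI_API_KEY ANTHROPIC_API_KEY OPENROUTER_API_KEY; do
  eval "val=\${$key:-}"
  if [ -n "$val" ]; then
    echo "enabled: $key"
  fi
done
```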

Step 03

Attach orbitforge.dev

Point the apex domain and optional `www` subdomain to Vercel so the hosted marketing site and workbench live on one address.

Add `orbitforge.dev` in Vercel Project Settings -> Domains.
Follow Vercel DNS prompts for the apex record and `www` CNAME.
Wait for SSL issuance, then set the production domain as primary.
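Vercel's dashboard shows the authoritative records for your project, but the commonly issued values look like this (current values may differ; always follow the in-dashboard prompts):

```
orbitforge.dev.      A      76.76.21.21
www.orbitforge.dev.  CNAME  cname.vercel-dns.com.
```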

Step 04

Run public launch checks

Before announcing the product, validate the key routes, health endpoint, and build pipeline from the same repo state you deploy.

Run `npm test` and `npm run build` from the repo root.
Open `/`, `/features`, `/docs`, `/download`, `/pricing`, and `/api/health`.
Verify the provider catalog and download/install guidance match the actual shipped surfaces.
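The route checks above can be smoke-tested in one pass. A sketch assuming the site is live at `https://orbitforge.dev` (the `BASE_URL` variable, `RUN_CHECKS` guard, and `check_route` helper are hypothetical, not repo tooling):

```shell
# Post-deploy smoke check (sketch): every public route should answer 200.
BASE_URL="${BASE_URL:-https://orbitforge.dev}"
ROUTES="/ /features /docs /download /pricing /api/health"

check_route() {
  # Follow redirects and report only the final HTTP status code.
  status=$(curl -sL -o /dev/null -w '%{http_code}' "$BASE_URL$1")
  [ "$status" = "200" ]
}

# Set RUN_CHECKS=1 to actually hit the deployed site.
if [ "${RUN_CHECKS:-0}" = "1" ]; then
  for path in $ROUTES; do
    check_route "$path" && echo "ok   $path" || echo "FAIL $path"
  done
fi
```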

Environment variables

Only configure what your deployment actually needs

| Variable | Required | Description | Example |
| --- | --- | --- | --- |
| `NEXT_PUBLIC_SITE_URL` | Required | Canonical public domain used for metadata, sitemap, and links. | `https://orbitforge.dev` |
| `OLLAMA_BASE_URL` | Optional | Ollama endpoint for local/private deployments. | `http://localhost:11434` |
| `LMSTUDIO_BASE_URL` | Optional | LM Studio OpenAI-compatible endpoint. | `http://localhost:1234/v1` |
| `OPENAI_API_KEY` | Optional | Enables OpenAI routes for hosted runs. | `sk-...` |
| `ANTHROPIC_API_KEY` | Optional | Enables Anthropic routes for hosted runs. | `sk-ant-...` |
| `OPENROUTER_API_KEY` | Optional | Enables OpenRouter brokered model access. | `sk-or-...` |

Launch proof

Checks worth running before you announce anything

Public routes

Home, features, docs, deploy, pricing, download, status, and evidence all render without private localhost dependencies.

A public product site should not collapse when it leaves the development laptop.

Health endpoint

The hosted app exposes `/api/health` for uptime checks and deployment validation.

Operations and trust need a machine-readable status signal, not only screenshots.

Release content parity

Pricing, downloads, docs, and the README describe the same shipped surfaces and deployment flow.

Public claims need to match reality across every surface a new user sees.

Cross-surface install story

The repo documents web hosting, CLI packaging, desktop targets, and VS Code packaging together.

A product feels unfinished when the installation story differs from one surface to the next.