Now live on orbitforge.dev

Downloads

OrbitForge ships as one product across hosted web, desktop, CLI, and VS Code.

The download story is designed to feel coherent. You do not get one polished web surface and then a weak handoff everywhere else.

Recommended entry point

OrbitForge Web

Browser workspace for planning, patching, and release coordination.


OrbitForge for macOS

Electron desktop target configured for DMG and ZIP packaging.


OrbitForge for Windows

Electron desktop target configured for NSIS installer and ZIP packaging.


OrbitForge for Linux

Electron desktop target configured for AppImage and tar.gz packaging.


OrbitForge CLI for macOS, Linux, and Windows

Cross-platform Node.js CLI that ships as the `orbitforge` binary.
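As a Node.js CLI, it would typically be installed globally through npm. A minimal sketch, assuming the package is published under the same name as its binary (the registry name `orbitforge` is an assumption, not a confirmed entry):

```shell
# Sketch: installing a Node-based CLI globally via npm.
# The package name `orbitforge` is assumed from the binary name.
npm install -g orbitforge

# The binary should then be on PATH on macOS, Linux, and Windows:
orbitforge --help
```

A global install keeps the same binary name across all three platforms, which matches the single-product framing above.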


OrbitForge VS Code Extension

Workspace package ready to build into a VSIX.
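A VSIX is conventionally produced with Microsoft's `@vscode/vsce` packaging tool. The commands below are a sketch of that standard flow, not a documented OrbitForge build step; they assume the workspace's package.json already carries a valid extension manifest (name, publisher, engines.vscode):

```shell
# Sketch: packaging a VS Code extension into a VSIX with the
# standard @vscode/vsce tool.
npm install -g @vscode/vsce

# Produces a .vsix archive in the current directory:
vsce package

# Install the result into a local VS Code for testing
# (the filename depends on the manifest's name and version):
code --install-extension orbitforge-*.vsix
```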


Launch Docs

Setup guides, provider recipes, and enterprise rollout notes.

Installation flow

1. Open the hosted workbench or choose the surface you want to package locally.

2. Connect a provider lane for local or cloud execution.

3. Use the same mission, proof, and release workflow across every surface.

4. Ship with docs, pricing, downloads, and status already aligned.

Platform packaging

macOS

DMG + ZIP for desktop, VSIX for editor, TGZ for CLI

Local-first builders using Ollama, LM Studio, and desktop packaging.

Hosted web app · Desktop app · CLI · VS Code extension

Windows

NSIS + ZIP for desktop, VSIX for editor, TGZ for CLI

Teams standardizing PowerShell, enterprise laptops, and shared rollout rules.

Hosted web app · Desktop app · CLI · VS Code extension

Linux

AppImage + tar.gz for desktop, TGZ for CLI

CI pipelines, self-hosted coding flows, and local model infrastructure.

Hosted web app · Desktop app · CLI

Provider support

Choose the model lane that matches your privacy, speed, and release constraints.

Read provider docs

OrbitForge Demo

embedded

Zero-config built-in execution lane for the hosted product

Always works on the public site
Great for feature evaluation
No credentials required

Ollama

ollama

Best for private local coding sessions

Local inference
Offline-friendly
Strong code editing loops

LM Studio

openai-compatible

Desktop-hosted OpenAI-compatible endpoint

Works with desktop models
OpenAI-compatible
Fast local iteration
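An OpenAI-compatible lane like LM Studio's is usually wired up with just a base URL and a placeholder key. A minimal sketch using environment variables; the `ORBITFORGE_*` names are assumptions for illustration, not documented settings, while `http://localhost:1234/v1` is LM Studio's default local server endpoint:

```shell
# Sketch: pointing an OpenAI-compatible provider lane at a local
# LM Studio server. Variable names are hypothetical; the endpoint
# is LM Studio's default local server address.
export ORBITFORGE_PROVIDER="openai-compatible"
export ORBITFORGE_BASE_URL="http://localhost:1234/v1"
export ORBITFORGE_API_KEY="lm-studio"   # local servers typically accept any key

echo "provider=$ORBITFORGE_PROVIDER base=$ORBITFORGE_BASE_URL"
```

The same shape applies to OpenAI and OpenRouter: only the base URL and key change, which is what makes the lanes interchangeable.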

OpenAI

openai-compatible

Cloud frontier models for coding and reasoning

Tool use
Strong coding quality
Large ecosystem

Anthropic

anthropic

Claude-family models for long-horizon coding tasks

Long context
Strong planning
Reliable refactors

OpenRouter

openai-compatible

Broker multiple hosted models behind one endpoint

Model brokering
Fallback routing
Provider diversity