VibeCode Documentation

Everything you need to build, run, and ship code from your browser.

Getting Started

What is VibeCode?
A browser-based code editor with AI superpowers

VibeCode is a full-featured, browser-based code editor powered by WebContainers. It provides a complete development environment with an integrated terminal, live preview, file explorer, and AI assistance, all without installing anything on your machine.

Whether you are prototyping a new idea, learning to code, or building production apps, VibeCode gives you the tools to go from zero to deployed in minutes.

Quick Start Guide
1. Sign In
   Create an account or sign in with your existing credentials.

2. Create a Playground
   Click "New Playground" from the dashboard to start a new project.

3. Choose a Template
   Pick from React, Next.js, Express, Vue, Angular, or Hono.

4. Start Coding
   Write code, use the integrated terminal, and see live preview instantly.

System Requirements
  • A modern browser (Chrome, Edge, or Brave recommended)
  • Internet connection for authentication and GitHub features
  • Ollama installed locally for AI features (optional)

Editor Features

Monaco Editor

Full syntax highlighting, IntelliSense, auto-completion, multi-cursor editing, and minimap. The same editor engine that powers VS Code.

File Explorer

Create, rename, and delete files and folders. Drag-and-drop support with a familiar tree view.

Integrated Terminal

Full terminal powered by WebContainers. Run commands, install packages, and start dev servers. Supports multiple tabs.

Live Preview

See your running application in real time. The preview updates automatically as your dev server detects changes.

File Tabs

Open multiple files simultaneously. Unsaved changes are indicated with a dot so you never lose track of edits.

Breadcrumbs

Navigate your file path with clickable breadcrumbs at the top of the editor.

Status Bar

Displays cursor position, language mode, encoding, indentation, and active theme at a glance.

AI Features

VibeCode integrates local AI models via Ollama to provide intelligent coding assistance without sending your code to external servers.

AI Chat

Four specialized modes for different workflows:

Chat

General-purpose conversation about your code, concepts, or architecture.

Code Review

Get feedback on code quality, best practices, and potential issues.

Bug Fix

Paste an error or describe a bug and get targeted debugging help.

Optimization

Improve performance, reduce bundle size, and refactor for clarity.
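All four modes talk to Ollama's local HTTP API, so prompts never leave your machine. As a rough sketch, this is what a raw request to that API looks like from a terminal (the endpoint and fields follow Ollama's /api/generate route; the prompt is just an example, and the final curl falls back to a notice if Ollama is not running):

```shell
# Sketch: send a prompt to Ollama's local generate endpoint.
# Assumes Ollama is running on its default port with codellama pulled.
request='{"model": "codellama:latest", "prompt": "Say hello in one word.", "stream": false}'
echo "$request"
# Prints the model's JSON response, or a notice if Ollama is unreachable:
curl -sf --max-time 5 http://localhost:11434/api/generate -d "$request" \
  || echo "Ollama is not running on localhost:11434"
```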

GitHub Integration

Push to GitHub

Create new repositories and push your playground code directly to GitHub without leaving the editor.

Import from GitHub

Clone any public or private repository into a new playground and start editing immediately.
Personal Access Token

To use GitHub features, generate a personal access token (PAT):

  1. Go to GitHub > Settings > Developer settings
  2. Click "Personal access tokens"
  3. Generate a new token with the "repo" scope
  4. Paste the token into VibeCode's settings
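For reference, the same PAT also authenticates a plain git push over HTTPS from any terminal. A minimal sketch, where YOUR_USER, YOUR_REPO, and YOUR_PAT are placeholders, and the final push is left commented out because it requires network access and a real token:

```shell
# Sketch: push a local copy of a playground to GitHub using a PAT.
# YOUR_USER / YOUR_REPO / YOUR_PAT are placeholders - substitute real values.
rm -rf /tmp/vibecode-demo
mkdir -p /tmp/vibecode-demo && cd /tmp/vibecode-demo
git init -q
git config user.name "Demo" && git config user.email "demo@example.com"
echo "# My Playground" > README.md
git add README.md
git commit -qm "Export from VibeCode"
# The token is embedded in the HTTPS remote URL:
git remote add origin "https://YOUR_PAT@github.com/YOUR_USER/YOUR_REPO.git"
git remote -v
# git push -u origin main   # requires network and a valid token
```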

Keyboard Shortcuts

  • Command Palette: Ctrl + K
  • Save File: Ctrl + S
  • Search in Files: Ctrl + Shift + F
  • Toggle Sidebar: Ctrl + B
  • Accept AI Suggestion: Tab
  • Dismiss AI Suggestion: Esc

Templates

Kickstart your project with one of our pre-configured templates. Each comes with the framework, bundler, and dependencies pre-installed.

  • React (Popular): Vite + React with fast HMR
  • Next.js (Full-stack): Full-stack React framework with SSR
  • Express (Backend): Minimal Node.js backend server
  • Vue (Frontend): Vite + Vue for reactive UIs
  • Angular (Enterprise): Full-featured TypeScript framework
  • Hono (Edge): Ultrafast edge-first web framework

Setting Up Ollama

Ollama runs AI models locally on your machine. Follow these steps to enable AI features in VibeCode.

1. Download Ollama
   Visit ollama.com and download the installer for your operating system.

2. Install and Launch
   Run the installer and launch Ollama. It runs as a background service on your machine.

3. Pull a Model
   Open a terminal and pull the CodeLlama model:

   $ ollama pull codellama:latest

4. Verify Installation
   Confirm the model is available:

   $ ollama list

5. Ready to Go
   VibeCode automatically detects Ollama running on localhost:11434. Open VibeCode and AI features will be available immediately.
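You can reproduce that detection check by hand. A small sketch that probes the same endpoint (/api/tags is Ollama's model-listing route, and 11434 is its default port):

```shell
# Probe the endpoint VibeCode checks for local AI support.
if curl -sf --max-time 2 http://localhost:11434/api/tags > /dev/null; then
  ollama_status="detected"
else
  ollama_status="missing"
fi
echo "Ollama status: $ollama_status"
```

If the status comes back "missing", make sure the Ollama service is running before reloading VibeCode.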

Built with Next.js, Shadcn UI, and WebContainers. Powered by local AI via Ollama.