Getting Started

Welcome to DgiDgi One! Get up and running in minutes — no installation required.

What is DgiDgi One?

DgiDgi One is a cloud-based multi-tenant AI agent platform. You build, deploy, and manage AI-powered applications entirely in the browser:

  • AI Agents — Autonomous agents that execute tasks, write code, manage files, and integrate with external services
  • Multi-Tenant — Complete workspace isolation with per-tenant data, billing, and configuration
  • 200+ Tools — File management, databases, Git, deployment, media processing, and more
  • MCP Integration — Connect agents to any external service via the Model Context Protocol
  • Bring Your Own Keys — Use platform-provided LLM access or connect your own API keys

Quick Start

1. Sign Up

Go to console.dgidgi.one and create your account:

  • GitHub — Sign in with your GitHub account
  • Google — Sign in with your Google account

You'll be placed in a workspace automatically. Enterprise customers can configure SSO.

2. Create a Project

From your dashboard:

  1. Click New Project
  2. Pick a template or start from scratch
  3. Give it a name and description

Your project comes pre-configured with a sandboxed environment, file system, and AI agent access.

3. Start a Conversation

Open your project and start chatting with the AI agent. The agent can:

  • Generate and edit code in your project
  • Create and manage files
  • Run commands in a sandboxed terminal
  • Search the web and fetch documentation
  • Connect to external services via MCP servers

That's it — you're building with AI agents. No local setup needed.


Optional: Install the CLI

For terminal-based workflows, install the DgiDgi CLI:

npm install -g @dgidgi-one/cli

Then authenticate:

# Opens your browser to sign in
dgidgi login

Once logged in, you can interact with your projects from the terminal:

# List your projects
dgidgi projects

# Start a chat session
dgidgi chat --project my-project

# Run an agent task
dgidgi run "Create a REST API for user management"

CLI vs Browser

The CLI connects to the same DgiDgi platform — your projects, agents, and data are shared between the browser and CLI.

Optional: Use the SDK

For programmatic access, install the TypeScript SDK:

npm install @dgidgi-one/sdk

Then create a client:

import { createClient } from "@dgidgi-one/sdk";

const client = createClient({
  baseURL: "https://api.dgidgi.one/api/v1",
  accessToken: process.env.DGIDGI_API_KEY,
});
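
As a quick illustration, you can then call the API through the client. The exact client surface isn't covered in this guide, so the client.projects.list() call below is an assumed, illustrative method name; see the SDK Documentation for the actual API.

// Illustrative sketch: list projects with the client created above.
// NOTE: client.projects.list() and the `name` field are assumed names, not a documented API.
const projects = await client.projects.list();
for (const project of projects) {
  console.log(project.name);
}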

Use it in React apps with built-in hooks:

import { useProjects, useChat } from "@dgidgi-one/sdk";

function Dashboard() {
  const { projects } = useProjects();
  // ...
}
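
Building on that hook, here is a small illustrative component that renders the list. The id and name fields on each project are assumptions made for the example.

import { useProjects } from "@dgidgi-one/sdk";

function ProjectList() {
  const { projects } = useProjects();
  // Render one entry per project; `id` and `name` are assumed field names.
  return (
    <ul>
      {projects?.map((project) => (
        <li key={project.id}>{project.name}</li>
      ))}
    </ul>
  );
}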

See the SDK Documentation for the full guide.

Available Clients

Client            | Package          | Description
Web Console       | —                | Full-featured browser app at console.dgidgi.one
CLI               | @dgidgi-one/cli  | Terminal-based agent interaction and project management
TypeScript SDK    | @dgidgi-one/sdk  | Programmatic API access with React hooks
Chrome Extension  | @dgidgi/chrome   | Browser integration for quick agent access
Discord Bot       | @dgidgi/discord  | AI agents in your Discord server
REST API          | —                | Direct HTTP access at api.dgidgi.one

Core Concepts

Workspaces & Tenants

Every organization gets an isolated workspace (tenant) with:

  • Isolated data storage and databases
  • Separate user management and permissions
  • Independent billing and usage tracking
  • Custom branding and configuration

AI Agents

Agents are the core of DgiDgi. They can:

  • Execute multi-step workflows autonomously
  • Use 200+ built-in tools (files, git, databases, deployments, etc.)
  • Connect to external services via MCP servers
  • Maintain conversation context and memory across sessions
  • Work within governance budgets and rate limits
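
As a rough sketch of what driving an agent programmatically could look like (the CLI equivalent is the dgidgi run command shown earlier), consider the following. The client.agents.run method and its options are assumed names for illustration, not a documented API.

import { createClient } from "@dgidgi-one/sdk";

const client = createClient({
  baseURL: "https://api.dgidgi.one/api/v1",
  accessToken: process.env.DGIDGI_API_KEY,
});

// Hypothetical sketch: ask an agent to carry out a multi-step task in a project.
// NOTE: client.agents.run and its option/result fields are assumed for illustration.
const run = await client.agents.run({
  project: "my-project",
  task: "Create a REST API for user management",
});
console.log(run.status);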

MCP Integration

The Model Context Protocol lets you extend agent capabilities by connecting external tool servers:

  • Browse the MCP Marketplace for pre-built integrations
  • Connect GitHub, Slack, Jira, databases, and more
  • Build custom MCP servers for your own tools
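
To give a feel for what connecting a custom MCP server could look like through the SDK, here is a sketch reusing the client from the SDK section above. The client.mcp.addServer method is an assumed name; the command/args/env shape follows the common MCP convention for stdio servers.

// Hypothetical sketch: register an external MCP tool server for your agents.
// NOTE: client.mcp.addServer is an assumed method name; "my-mcp-server" is a placeholder package.
await client.mcp.addServer({
  name: "my-tools",
  command: "npx",                // launch a stdio MCP server
  args: ["-y", "my-mcp-server"], // placeholder package name
  env: { MY_SERVICE_TOKEN: process.env.MY_SERVICE_TOKEN ?? "" },
});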

LLM Routing

DgiDgi supports multiple LLM providers with intelligent routing:

Mode                 | Description
Platform LLM         | Use DgiDgi-managed API keys (default)
Bring Your Own Keys  | Connect your own OpenAI, Anthropic, etc. keys
CLI Local            | Run local models for offline development

See LLM Routing for details.
