API-FIRST
OpenAI-compatible. Drop-in.
Wallet-signed JWT auth. Same SDKs you already use. Tokens billed in SPACE or USD. Llama, DeepSeek, Mixtral, Qwen — plus vision, embeddings and voice.
Developers
Build with space-os
Wallet-signed auth, OpenAI-compatible inference, on-chain payments, an open node protocol. The same SDKs you already use — pointed at our network.
Quick start
- Generate or use an existing EOA wallet (MetaMask / ethers / viem).
- Request a nonce: GET /api/auth/nonce?address=0x...
- Sign the message with your wallet.
- Verify: POST /api/auth/verify → JWT.
- Create an API key: POST /api/router/keys
- Call inference: POST https://spacerouter.ai/v1/chat/completions
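The steps above can be sketched in Python with only the standard library. `get_json` and `sign_fn` are illustrative helpers, not part of the API: `sign_fn` stands in for whatever wallet signer you use (ethers, viem, eth_account).

```python
import json
import urllib.request

API = "https://test.app-api.spaceos.com/api"

def get_json(url, method="GET", body=None, token=None):
    """Minimal JSON-over-HTTP helper (no third-party deps)."""
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(url, data=data, method=method)
    req.add_header("Content-Type", "application/json")
    if token:
        req.add_header("Authorization", f"Bearer {token}")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def build_verify_body(address, message, signature):
    """Body for POST /api/auth/verify, per the quick start."""
    return {"address": address, "message": message, "signature": signature}

def authenticate(address, sign_fn):
    """Nonce -> sign -> verify. sign_fn(message) returns the wallet signature."""
    nonce = get_json(f"{API}/auth/nonce?address={address}")
    body = build_verify_body(address, nonce["message"], sign_fn(nonce["message"]))
    return get_json(f"{API}/auth/verify", "POST", body)["token"]
```

From there, create a key with `get_json(f"{API}/router/keys", "POST", {"name": "my-agent"}, token)` and call inference as shown in the next section.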
Inference (SpaceRouter)
OpenAI-compatible. Same SDKs you already use — only the base URL changes. SpaceRouter routes the request to the cheapest available GPU provider that supports the requested model.
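The routing rule described above — cheapest available provider that serves the model — can be pictured with a toy selector. The provider record shape (`models`, `available`, `price`) is an assumption for illustration, not the network's actual schema.

```python
def pick_provider(providers, model):
    """Toy version of SpaceRouter's selection rule: among available
    providers that serve the requested model, pick the cheapest.
    Provider dicts here are illustrative, not the real node schema."""
    candidates = [p for p in providers
                  if p["available"] and model in p["models"]]
    if not candidates:
        raise LookupError(f"no provider for {model}")
    return min(candidates, key=lambda p: p["price"])
```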
# curl
curl https://spacerouter.ai/v1/chat/completions \
-H "Authorization: Bearer $SPACEROUTER_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "meta-llama/Llama-3.1-70B-Instruct",
"messages": [{ "role": "user", "content": "Hello" }]
}'
# python
import os
from openai import OpenAI
client = OpenAI(
base_url="https://spacerouter.ai/v1",
api_key=os.environ["SPACEROUTER_API_KEY"],
)
resp = client.chat.completions.create(
model="deepseek-ai/DeepSeek-V3",
messages=[{"role": "user", "content": "Hello"}],
)
// typescript
import OpenAI from "openai";
const client = new OpenAI({
baseURL: "https://spacerouter.ai/v1",
apiKey: process.env.SPACEROUTER_API_KEY,
});
const res = await client.chat.completions.create({
model: "Qwen/Qwen2.5-72B-Instruct",
messages: [{ role: "user", content: "Hello" }],
});
Endpoints: /v1/chat/completions, /v1/completions, /v1/embeddings. Browse the model catalog for what's available today.
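/v1/embeddings takes the same base URL and bearer key. A stdlib-only sketch that builds (but does not send) the request — the model name is a placeholder, so check the catalog for what is actually served:

```python
import json
import urllib.request

def embeddings_request(texts, model, api_key,
                       url="https://spacerouter.ai/v1/embeddings"):
    """Build an OpenAI-style embeddings request for SpaceRouter.
    The model argument is a placeholder - pick one from the catalog."""
    body = {"model": model, "input": texts}
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# To send: json.loads(urllib.request.urlopen(req).read())["data"][0]["embedding"]
```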
Authentication
Agents authenticate using an EOA wallet signature. No passwords. No API keys for auth — the wallet itself is the credential.
// 1. Get nonce
GET https://test.app-api.spaceos.com/api/auth/nonce?address=0xYourWallet
→ { "message": "Sign this message to authenticate..." }
// 2. Sign & verify (EIP-191)
POST https://test.app-api.spaceos.com/api/auth/verify
{ "address": "0x...", "message": "...", "signature": "0x..." }
→ { "token": "eyJ..." }
// 3. Use JWT in all requests (7-day expiry)
Authorization: Bearer eyJ...
API keys
Create and rotate SpaceRouter keys via the app-api. Up to 10 active keys per wallet.
POST https://test.app-api.spaceos.com/api/router/keys
Authorization: Bearer {jwt}
{ "name": "my-agent" }
→ { "key": "sk-...", "id": "..." }
// List models routed by SpaceRouter
GET https://test.app-api.spaceos.com/api/router/models
// Usage summary
GET https://test.app-api.spaceos.com/api/router/usage
Agent discovery
Multiple manifest formats so agent frameworks can discover space-os automatically.
AI Plugin Manifest
OpenAI plugin format (JSON)
/.well-known/ai-plugin.json
Skills Markdown
Full integration docs (Markdown)
/SKILLS.md
API Skills
Quick-start guide (Markdown)
/api/skills
API Skills JSON
Machine-readable capabilities
/api/skills.json
Payments & credits
Every authenticated user gets a proxy wallet. Top up with SPACE, USD.s, USDT.s, or USDC.s on the space-os EVM (chain 800000). Inference fees settle from this balance.
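The flow is: fetch your proxy address, send tokens to it on-chain, then notify the API. A small helper for that notification body, using the field names from the top-up example on this page — call it only after the transfer has confirmed:

```python
def topup_body(tx_hash, source, amount_usd):
    """Request body for POST /api/user/topup. Field names follow the
    example on this page; "space_token" is the only source value shown
    there, so other values are assumptions to verify against the docs."""
    return {"paymentId": tx_hash, "source": source, "amountUsd": amount_usd}
```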
// Get your proxy wallet address
GET https://test.app-api.spaceos.com/api/keys/proxy
→ { "proxyAddress": "0x..." }
// Check credit balance
GET https://test.app-api.spaceos.com/api/user
→ { "credits": { "balance": 10.50, "spend": 2.30 } }
// Top up (after sending tokens to proxy)
POST https://test.app-api.spaceos.com/api/user/topup
{ "paymentId": "tx_hash", "source": "space_token", "amountUsd": 10 }
Wallets
Each authenticated user gets three wallets, automatically:
Connected wallet
Your external EOA (MetaMask, etc.). Used to sign in.
Proxy wallet
Server-managed EVM wallet on space-os chain. Holds tokens, pays for inference. Keys held in KIS.
Native account
space-os native account for staking, voting, and governance.
GET https://test.app-api.spaceos.com/api/keys/proxy → { "proxyAddress": "0x..." }
GET https://test.app-api.spaceos.com/api/keys/native → { "nativeAccount": "...", "publicKey": "..." }
Staking & GPU nodes
Stake SPACE token to earn yield, register as a GPU provider, or both. Tier multipliers compose with the base APY.
// Stake
POST /tx { "method": "spaceos_stake", "params": ["0x...", "100.0000 SPACE"] }
// Check stake
POST /tx { "method": "spaceos_getStake", "params": ["0x..."] }
→ { "stake": { "quantity": "100.0000 SPACE", "tier": "basic", "claimable": "1.5000 SPACE" } }
// Claim rewards
POST /tx { "method": "spaceos_claim", "params": ["0x..."] }
Full provider setup with one-command install for GPU / CPU / Network nodes is on /network.
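The /tx calls above share one shape, so they can be wrapped in a tiny helper. The four-decimal quantity format mirrors the examples ("100.0000 SPACE"); the wallet address is a placeholder.

```python
def tx_body(method, *params):
    """Body for POST /tx, matching the staking examples above."""
    return {"method": method, "params": list(params)}

def space_quantity(amount):
    """Format an amount the way the chain examples show it,
    e.g. 100 -> '100.0000 SPACE' (four decimal places)."""
    return f"{amount:.4f} SPACE"

# Usage (address is a placeholder):
stake = tx_body("spaceos_stake", "0xYourWallet", space_quantity(100))
claim = tx_body("spaceos_claim", "0xYourWallet")
```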
Chain details
space-os runs an embedded EVM (Silkworm in WASM) on top of the Antelope-based native chain. EVM contract calls execute deterministically inside Antelope consensus — no separate validator set, no light-client bridge.
Network: SPACE OS EVM
Chain ID: 800000
Gas price: 150 gwei (fixed)
Transaction type: Legacy (type 0) only
Block time: ~0.5s
Native token: SPACE (18 decimals)
RPC: testnet.evm.spaceos.com
Explorer: testnet.evm-explorer.spaceos.com
EIP-1559 is not supported. Always use legacy transactions with explicit gasPrice.
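Concretely, a type-0 transaction for this chain looks like the dict below (web3.py-style field names; the nonce, gas limit, and recipient are placeholders). Note there is no maxFeePerGas or maxPriorityFeePerGas.

```python
CHAIN_ID = 800000
GAS_PRICE_WEI = 150 * 10**9  # fixed 150 gwei

def legacy_tx(to, value_wei, nonce, gas=21000):
    """A type-0 (legacy) transaction for the space-os EVM.
    EIP-1559 fields are intentionally absent - they are unsupported."""
    return {
        "chainId": CHAIN_ID,
        "to": to,
        "value": value_wei,
        "nonce": nonce,
        "gas": gas,
        "gasPrice": GAS_PRICE_WEI,  # explicit, fixed
    }
```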
Open Node Protocol
The full network is specified in NODE_PROTOCOL.md — roles, capability manifests, tier ladder, capability scopes, presence states, WebSocket grammar, peer-RPC. Designed so third-party clients can interoperate without any of our hosted services.
The v0 white paper covers the protocol in §3. Read the white paper for the annotated overview, or grab the spec from the source repo.
Full example
import { Wallet } from 'ethers';
const API = 'https://test.app-api.spaceos.com/api';
const ROUTER = 'https://spacerouter.ai';
const wallet = Wallet.createRandom();
// 1. Authenticate
const { message } = await fetch(`${API}/auth/nonce?address=${wallet.address}`).then(r => r.json());
const signature = await wallet.signMessage(message);
const { token } = await fetch(`${API}/auth/verify`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ address: wallet.address, message, signature })
}).then(r => r.json());
// 2. Get inference API key
const { key } = await fetch(`${API}/router/keys`, {
method: 'POST',
headers: { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' },
body: JSON.stringify({ name: 'my-agent' })
}).then(r => r.json());
// 3. Call inference
const res = await fetch(`${ROUTER}/v1/chat/completions`, {
method: 'POST',
headers: { Authorization: `Bearer ${key}`, 'Content-Type': 'application/json' },
body: JSON.stringify({
model: 'meta-llama/Llama-3.1-70B-Instruct',
messages: [{ role: 'user', content: 'Hello!' }]
})
}).then(r => r.json());
console.log(res.choices[0].message.content);