## 📋 Supported Models

### Claude Models

Endpoint: `/api/v1/messages`

| Model ID | Description | Status |
|---|---|---|
| claude-opus-4-5-20251101 | Opus 4.5 - Most powerful model | ✅ Available |
| claude-sonnet-4-5-20250929 | Sonnet 4.5 - Balanced performance | ✅ Available |
| claude-sonnet-4-20250514 | Sonnet 4 | ✅ Available |
| claude-opus-4-1-20250805 | Opus 4.1 | ✅ Available |
| claude-haiku-4-5-20251001 | Haiku 4.5 - Fastest model | ✅ Available |
### OpenAI / Codex Models

Endpoint: `/openai/v1/responses`

⚠️ **Note:** Uses the Responses API format with `stream: true` and an `input` array (not `messages`).

| Model ID | Description | Status |
|---|---|---|
| gpt-5.2-codex | Default model (auto-selected if not specified) | Default |
| gpt-5.1-codex | Codex 5.1 | ✅ Available |
| gpt-5.1-codex-max | Maximum context version | ✅ Available |
| gpt-5.1-codex-mini | Lightweight & fast | ✅ Available |
| gpt-5.2 | GPT-5.2 (→ gpt-5.2-2025-12-11) | ✅ Available |
### Gemini Models

Endpoint: `/gemini/v1beta/models/{model}:generateContent`

| Model ID | Description | Status |
|---|---|---|
| gemini-3-pro-preview | Default Gemini model | Default |
| gemini-3-flash-preview | Fast response model | ✅ Available |
| gemini-2.5-pro | Gemini 2.5 Pro | ✅ Available |
| gemini-2.5-flash-image | Image generation (1024px) - Nano Banana | ✅ Available |
| gemini-3-pro-image-preview | Image generation (4096px) - Nano Banana Pro | ✅ Available |
### Antigravity Proxy (Free Fallback)

Direct Endpoint: `http://23.95.207.162:8080/v1/messages`

ℹ️ **Info:** Free tier using Google Antigravity IDE OAuth authentication.

| Model ID | Description | Status |
|---|---|---|
| claude-opus-4-5-thinking | Free Claude Opus 4.5 | ✅ Available |
| claude-sonnet-4-5-thinking | Free Claude Sonnet 4.5 | ✅ Available |
| gemini-3-pro-high | Gemini 3 Pro | ✅ Available |
| gemini-2.5-pro | Gemini 2.5 Pro | ✅ Available |
## 🔄 Fallback Strategy

CRS automatically switches between providers when rate limits or quota issues occur:

### Claude Models Fallback

1. Claude API (Primary)
2. GitHub Copilot Claude (User Account)
3. Antigravity Proxy (Free Backup)

### OpenAI/Codex Models Fallback

1. OpenAI API (User Account)
2. GitHub Copilot GPT (User Account)

### Gemini Models Fallback

1. Gemini API (Primary)
2. GitHub Copilot Gemini (User Account)
3. Antigravity Proxy (Free Backup)

| Trigger Condition | Fallback Target | Reason |
|---|---|---|
| Claude API 429/529 | GitHub Copilot | User account priority |
| Copilot Claude exhausted | Antigravity | Free backup |
| OpenAI API 429 | GitHub Copilot GPT | User account |
| Gemini API 429 | GitHub Copilot | User account priority |
| Copilot Gemini exhausted | Antigravity | Free backup |
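The chains above amount to "try providers in priority order, fall through on rate limits." A minimal sketch of that idea (hypothetical helper names, not CRS's actual implementation):

```python
# Sketch of the fallback chain above (illustrative only, not CRS's code).
# Each provider is a callable that returns a response or raises RateLimited.

class RateLimited(Exception):
    """Raised when a provider returns HTTP 429 (quota) or 529 (overloaded)."""

def call_with_fallback(providers, request):
    """Try each (name, provider) in priority order; fall through on rate limits."""
    last_error = None
    for name, provider in providers:
        try:
            return name, provider(request)
        except RateLimited as exc:
            last_error = exc  # e.g. Claude API 429/529 -> try Copilot next
    raise RuntimeError("all providers exhausted") from last_error

# Example: the Claude chain, with the primary rate-limited.
def claude_api(req):
    raise RateLimited("429")

def copilot_claude(req):
    return {"text": "ok from Copilot"}

chain = [
    ("claude-api", claude_api),
    ("copilot-claude", copilot_claude),
    ("antigravity", lambda req: {"text": "ok from Antigravity"}),
]

used, resp = call_with_fallback(chain, {"prompt": "hi"})
```

Here the call lands on the second provider, mirroring the "Claude API 429/529 → GitHub Copilot" row in the table.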
## 💻 Usage Examples

### Claude API Request (cURL)

```bash
curl -X POST "http://23.95.207.162:13000/api/v1/messages" \
  -H "Content-Type: application/json" \
  -H "x-api-key: YOUR_CRS_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-sonnet-4-5-20250929",
    "max_tokens": 1024,
    "messages": [
      {
        "role": "user",
        "content": "Hello, Claude!"
      }
    ]
  }'
```
### Environment Setup (Linux/macOS)

```bash
export ANTHROPIC_BASE_URL="http://23.95.207.162:13000/api"
export ANTHROPIC_AUTH_TOKEN="YOUR_CRS_API_KEY"
export CRS_API_KEY="YOUR_CRS_API_KEY"
```
### OpenAI Codex Request (cURL)

```bash
curl -s -N "http://23.95.207.162:13000/openai/v1/responses" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_CRS_API_KEY" \
  -d '{
    "model": "gpt-5.1-codex",
    "stream": true,
    "input": [
      {
        "role": "user",
        "content": "Write a Python hello world"
      }
    ]
  }'
```
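With `stream: true`, the response arrives as Server-Sent Events. A minimal sketch of pulling the JSON payloads out of `data:` lines (assuming standard SSE framing; `parse_sse_data` is a hypothetical helper, and the exact event shapes depend on the relay):

```python
import json

def parse_sse_data(raw_lines):
    """Extract JSON payloads from SSE `data:` lines.

    Ignores comments and `event:` name lines, and stops at the
    [DONE] sentinel that some streams emit. Illustrative only.
    """
    events = []
    for line in raw_lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        events.append(json.loads(payload))
    return events

# Example with synthetic Responses-API-style events:
sample = [
    'event: response.output_text.delta',
    'data: {"type": "response.output_text.delta", "delta": "Hello"}',
    'data: {"type": "response.completed"}',
    'data: [DONE]',
]
events = parse_sse_data(sample)
text = "".join(e.get("delta", "") for e in events)
```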
### Environment Setup

```bash
export OPENAI_API_KEY="YOUR_CRS_API_KEY"
export OPENAI_BASE_URL="http://23.95.207.162:13000/openai/v1"
```
### Codex CLI Setup

```bash
# Install Codex CLI
npm install -g @openai/codex

# Create ~/.codex/config.toml
mkdir -p ~/.codex
cat > ~/.codex/config.toml <<'EOF'
personality = "pragmatic"
model_provider = "crs"

[model_providers.crs]
name = "CRS Claude Relay"
base_url = "http://23.95.207.162:13000/openai/v1"
wire_api = "responses"
env_key = "CRS_API_KEY"
EOF

# Use Codex
codex "your prompt here"
```
### Gemini API Request (cURL)

```bash
curl -X POST "http://23.95.207.162:13000/gemini/v1beta/models/gemini-3-pro-preview:generateContent" \
  -H "Content-Type: application/json" \
  -H "x-goog-api-key: YOUR_CRS_API_KEY" \
  -d '{
    "contents": [
      {
        "role": "user",
        "parts": [
          {
            "text": "Hello, Gemini!"
          }
        ]
      }
    ]
  }'
```
### Image Generation

```bash
curl -X POST "http://23.95.207.162:13000/gemini/v1beta/models/gemini-2.5-flash-image:generateContent" \
  -H "Content-Type: application/json" \
  -H "x-goog-api-key: YOUR_CRS_API_KEY" \
  -d '{
    "contents": [
      {
        "role": "user",
        "parts": [
          {
            "text": "A beautiful sunset over mountains"
          }
        ]
      }
    ],
    "responseModalities": ["TEXT", "IMAGE"]
  }'
```
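The generated image comes back base64-encoded in an `inlineData` part of the response JSON. A small helper to extract and save it (field names follow the Gemini REST `generateContent` format; `save_inline_images` is a hypothetical name, and fetching the response itself is up to you):

```python
import base64

def save_inline_images(response_json, prefix="output"):
    """Save base64-encoded images from a generateContent response.

    Walks candidates[0].content.parts and writes every `inlineData`
    part to disk; returns the list of filenames written.
    """
    written = []
    parts = response_json["candidates"][0]["content"]["parts"]
    for i, part in enumerate(parts):
        inline = part.get("inlineData")
        if not inline:
            continue  # skip plain-text parts
        ext = inline.get("mimeType", "image/png").split("/")[-1]
        name = f"{prefix}_{i}.{ext}"
        with open(name, "wb") as f:
            f.write(base64.b64decode(inline["data"]))
        written.append(name)
    return written
```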
### Python - Anthropic SDK

```python
from anthropic import Anthropic

client = Anthropic(
    api_key="YOUR_CRS_API_KEY",
    base_url="http://23.95.207.162:13000/api"
)

message = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, Claude!"}
    ]
)

# message.content is a list of content blocks; print the first text block
print(message.content[0].text)
```
### Python - OpenAI SDK (for Codex)

```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_CRS_API_KEY",
    base_url="http://23.95.207.162:13000/openai/v1"
)

# The relay exposes the Responses API (see the note above), so use
# client.responses.create with an `input` array rather than Chat Completions.
# If the relay requires streaming (the cURL example sends "stream": true),
# pass stream=True and iterate the returned events instead.
response = client.responses.create(
    model="gpt-5.1-codex",
    input=[
        {"role": "user", "content": "Write a hello world"}
    ]
)

print(response.output_text)
```
## 🔧 Server Management

### SSH Access

```bash
ssh root@23.95.207.162
# Contact administrator for SSH credentials
```

### Docker Commands

```bash
# Restart CRS service
cd /root/crs && docker-compose restart claude-relay

# View logs
docker logs --tail 100 crs-claude-relay-1

# Follow logs in real-time
docker logs -f crs-claude-relay-1

# Redis CLI access
docker exec -it crs-redis-1 redis-cli

# Check Antigravity Proxy health
curl -s "http://23.95.207.162:8080/health"

# View Antigravity logs
docker logs --tail 50 crs-antigravity-proxy-1
```

### Antigravity Proxy Management

```bash
# SSH tunnel to access management panel (from your machine)
ssh -L 18080:localhost:8080 -L 51121:localhost:51121 <your-user>@<jump-host> -t \
  "ssh -L 8080:localhost:8080 -L 51121:localhost:51121 root@23.95.207.162"

# Then open in browser: http://localhost:18080
# Contact administrator for SSH credentials

# Restart Antigravity Proxy
cd /root/crs && docker compose restart antigravity-proxy
```
## ⚠️ Important Notes

**Volume Mounting:** When adding new volume mounts in `docker-compose.yml`, you must recreate the container:

```bash
docker compose up -d --force-recreate --no-deps claude-relay
```

Using `docker-compose restart` will NOT apply new volume configurations.

**Code Modifications:** All code changes must be made directly on the VPS server, not locally:

```bash
# SSH to VPS (contact admin for credentials)
ssh root@23.95.207.162

# Edit files on VPS
nano /root/crs/custom/xxx.js

# Restart service
cd /root/crs && docker-compose restart claude-relay
```

**Auto-Fix 400 Errors:** CRS includes automatic request sanitization that fixes common Claude API 400 errors, including orphaned `tool_result` blocks, empty content, and thinking blocks.
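The sanitization idea can be pictured roughly like this (a simplified sketch, not CRS's actual code; `sanitize_messages` is a hypothetical name): drop `tool_result` blocks whose `tool_use_id` has no matching earlier `tool_use`, and drop empty text content.

```python
def sanitize_messages(messages):
    """Simplified sketch of request sanitization (illustrative only).

    - Drops tool_result blocks whose tool_use_id was never issued by a
      preceding tool_use block (the "orphaned tool_result" 400).
    - Drops text blocks and string messages that are empty/whitespace.
    """
    seen_tool_use_ids = set()
    cleaned = []
    for msg in messages:
        content = msg["content"]
        if isinstance(content, str):
            if content.strip():
                cleaned.append(msg)
            continue
        blocks = []
        for block in content:
            btype = block.get("type")
            if btype == "tool_use":
                seen_tool_use_ids.add(block["id"])
            elif btype == "tool_result" and block.get("tool_use_id") not in seen_tool_use_ids:
                continue  # orphaned tool_result -> would trigger a 400
            elif btype == "text" and not block.get("text", "").strip():
                continue  # empty content block
            blocks.append(block)
        if blocks:
            cleaned.append({**msg, "content": blocks})
    return cleaned
```

Real sanitization also has to preserve pairing invariants (e.g. a `tool_use` that lost its `tool_result`), so treat this only as the shape of the idea.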