
What is the OpenAI Compatible Interface

OpenAI’s Chat Completions API format has become the de facto industry standard. Many large language models offer OpenAI-compatible interfaces, letting developers access different models with the same code. Ace Data Cloud provides OpenAI-compatible Chat Completions endpoints for models such as Claude, Gemini, DeepSeek, Grok, and Kimi. You can switch models simply by changing the Base URL and model name.

Supported Models

| Service  | Endpoint                          | Example Model       |
|----------|-----------------------------------|---------------------|
| Claude   | POST /v1/chat/completions         | claude-sonnet-4-6   |
| OpenAI   | POST /openai/chat/completions     | gpt-4o              |
| Gemini   | POST /gemini/chat/completions     | gemini-2.5-flash    |
| DeepSeek | POST /deepseek/chat/completions   | deepseek-r1         |
| Grok     | POST /grok/chat/completions       | grok-3              |
| Kimi     | POST /kimi/chat/completions       | kimi-k2.5           |

Unified Invocation Method

All chat models use the same request format:
import requests

def chat(endpoint, model, message):
    return requests.post(
        f"https://api.acedata.cloud{endpoint}",
        headers={
            "Authorization": "Bearer YOUR_API_TOKEN",
            "Content-Type": "application/json",
        },
        json={
            "model": model,
            "messages": [{"role": "user", "content": message}],
            "max_tokens": 1024,
            "temperature": 0.7,
        },
    ).json()

# Calling different models - just change the endpoint and model name
claude = chat("/v1/chat/completions", "claude-sonnet-4-6", "Hello")
gpt = chat("/openai/chat/completions", "gpt-4o", "Hello")
gemini = chat("/gemini/chat/completions", "gemini-2.5-flash", "Hello")
deepseek = chat("/deepseek/chat/completions", "deepseek-r1", "Hello")
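Because every endpoint returns the same OpenAI-compatible response shape, one helper can read the reply text regardless of which model produced it. This is a minimal sketch assuming the standard `choices[0].message.content` layout; fields beyond that (e.g. `usage`) may differ slightly between providers.

```python
# Hypothetical helper: extract the assistant's reply from an
# OpenAI-compatible chat completion response body.
def get_reply(response: dict) -> str:
    """Return the assistant message text from a chat completion response."""
    return response["choices"][0]["message"]["content"]

# Example with a stubbed response in the OpenAI-compatible shape:
sample = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello! How can I help?"}}
    ]
}
print(get_reply(sample))
```

With this helper, `get_reply(claude)` and `get_reply(gemini)` read the answer from any of the calls above.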

Streaming Output

All chat APIs support streaming output (Server-Sent Events):
data = {
    "model": "claude-sonnet-4-6",
    "messages": [{"role": "user", "content": "Write a poem"}],
    "stream": True,
}

response = requests.post(
    "https://api.acedata.cloud/v1/chat/completions",
    headers={"Authorization": "Bearer YOUR_API_TOKEN"},
    json=data,
    stream=True,
)

for line in response.iter_lines():
    if line:
        print(line.decode())
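The raw lines printed above are Server-Sent Events. To reassemble the model's text, each `data:` line must be decoded and its content delta extracted. The sketch below assumes the standard OpenAI streaming chunk shape (`choices[0].delta.content`, with a final `data: [DONE]` sentinel); a given provider's chunks may carry extra fields.

```python
import json

# Minimal SSE chunk parser for OpenAI-compatible streaming responses.
# Each event line looks like:
#   data: {"choices":[{"delta":{"content":"..."}}]}
# and the stream ends with:
#   data: [DONE]
def parse_sse_line(line: str):
    """Return the text delta carried by one SSE line, or None if there is none."""
    if not line.startswith("data: "):
        return None
    payload = line[len("data: "):]
    if payload == "[DONE]":
        return None
    chunk = json.loads(payload)
    delta = chunk["choices"][0].get("delta", {})
    return delta.get("content")

# Accumulating the full reply from the earlier loop would look like:
# text = ""
# for raw in response.iter_lines():
#     if raw:
#         piece = parse_sse_line(raw.decode())
#         if piece is not None:
#             text += piece
```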

Compatibility with OpenAI SDK

Since the interface format is compatible, you can directly use the official OpenAI SDK by just modifying the base_url:
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_TOKEN",
    base_url="https://api.acedata.cloud/v1",  # Claude
    # base_url="https://api.acedata.cloud/openai",  # OpenAI
    # base_url="https://api.acedata.cloud/gemini",  # Gemini
)

response = client.chat.completions.create(
    model="claude-sonnet-4-6",
    messages=[{"role": "user", "content": "你好"}],
)
print(response.choices[0].message.content)
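To avoid editing the `base_url` by hand each time, the service-to-URL mapping can be kept in one place. This is a hypothetical convenience wrapper (not part of any SDK): the `BASE_URLS` keys and the `make_client` helper are illustrative names, and the URLs follow the endpoint table above.

```python
# Hypothetical mapping from each service to its OpenAI-compatible
# base_url on Ace Data Cloud (paths follow the endpoint table above).
BASE_URLS = {
    "claude": "https://api.acedata.cloud/v1",
    "openai": "https://api.acedata.cloud/openai",
    "gemini": "https://api.acedata.cloud/gemini",
    "deepseek": "https://api.acedata.cloud/deepseek",
    "grok": "https://api.acedata.cloud/grok",
    "kimi": "https://api.acedata.cloud/kimi",
}

def make_client(service: str, api_key: str):
    """Create an OpenAI SDK client pointed at the chosen service."""
    from openai import OpenAI  # requires `pip install openai`
    return OpenAI(api_key=api_key, base_url=BASE_URLS[service])

# client = make_client("gemini", "YOUR_API_TOKEN")
# response = client.chat.completions.create(
#     model="gemini-2.5-flash",
#     messages=[{"role": "user", "content": "Hello"}],
# )
```

Switching providers then comes down to changing one string and the model name.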

Summary

With Ace Data Cloud’s OpenAI-compatible interface, you can:
  • Access 6+ large language models with a single codebase
  • Flexibly switch models to compare performance and cost
  • Use the OpenAI SDK directly with zero migration cost
  • Get unified billing and monitoring
Visit platform.acedata.cloud for a free trial.