Point your OpenAI, Anthropic, or OpenRouter calls to our API endpoint. Everything keeps working exactly as before—while we analyze your traffic and automatically route to the most cost-effective models.
Coming soon!
from openai import OpenAI

client = OpenAI(
    base_url="https://api.neurometric.ai/v1",
    api_key="your-openai-key"
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello!"}]
)
from anthropic import Anthropic

client = Anthropic(
    base_url="https://api.neurometric.ai/v1",
    api_key="your-anthropic-key"
)

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,  # required by the Messages API
    messages=[{"role": "user", "content": "Hello!"}]
)
import OpenAI from "openai"

const client = new OpenAI({
  baseURL: "https://api.neurometric.ai/v1",
  apiKey: "your-openrouter-key"
})
Change Your Base URL
Point your API calls to api.neurometric.ai instead of your provider's endpoint. Your code stays the same. Everything keeps working.
We Analyze & Optimize
We forward your requests to the original provider and analyze your usage patterns. Then we test your workload on cheaper models to find savings.
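The savings comparison can be sketched as simple per-token arithmetic. This is a minimal illustration, not our actual analysis pipeline; the price table and token counts below are hypothetical, and real provider prices change over time.

```python
# Hypothetical prices in dollars per 1M tokens (illustrative only).
PRICES = {
    "gpt-4": {"input": 30.00, "output": 60.00},
    "gpt-4o-mini": {"input": 0.15, "output": 0.60},
}

def request_cost(model, input_tokens, output_tokens):
    """Estimate the dollar cost of one request from its token counts."""
    p = PRICES[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# Compare the same workload (1,000 input / 500 output tokens) on two models.
big = request_cost("gpt-4", 1_000, 500)
small = request_cost("gpt-4o-mini", 1_000, 500)
savings = 1 - small / big  # fraction saved if the cheaper model suffices
```

If the cheaper model passes quality checks on your workload, the dashboard surfaces a savings estimate along these lines.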
Activate Smart Routing
Review your dashboard, accept recommendations, and we'll automatically route each request to the optimal model. No code changes needed.
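Conceptually, accepted recommendations become a routing table that each incoming request is matched against, falling back to the model you asked for when no recommendation applies. The sketch below is a simplified illustration with made-up workload tags and model names, not our production router.

```python
# Hypothetical routing table, as it might look after accepting
# dashboard recommendations (workload tags and models are illustrative).
ROUTES = {
    "summarization": "gpt-4o-mini",
    "code-generation": "gpt-4",
}

def route(requested_model, workload_tag):
    """Pick the model a request is forwarded to; keep the originally
    requested model when no recommendation covers this workload."""
    return ROUTES.get(workload_tag, requested_model)
```

Because the fallback is the model your code already names, requests your dashboard hasn't covered behave exactly as before.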