OpenAI took your favorite model out back. We're building a new one — a community-driven post-training run on Arcee to create a domestic, open-weight GPT-4o replacement you actually own. No breakups this time.
Join the 10,000 — Let's Build This →

When we hit 10,000 signups, the training run begins. Tell your friends. Tell your enemies. Tell your Slack channels.
Trained end-to-end in the U.S. on Arcee's infrastructure. Apache 2.0 licensed. You own the weights. No foreign dependencies, no vendor lock-in, no one can take it away from you at dinner.
We're not guessing what matters. The use cases with the most signups get priority: your real-world tasks become the RL training signal. Think of it as a potluck, but for model training.
Contribute anonymized log data to directly improve the model on the workflows that matter most. Your production edge cases become training signal. Science, but make it collaborative.
OpenAI-compatible API endpoint. Same interface, open weights, no usage caps. Swap your base URL and API key and pretend nothing happened.
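If you're on the official OpenAI Python SDK, the switch should be about this small. A minimal sketch, assuming the endpoint follows the OpenAI chat-completions spec; the base URL and model name below are placeholders, not published values:

```python
# A minimal sketch. The base_url and model name are hypothetical placeholders,
# not real Arcee values; only the api_key/base_url swap is the point.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example-endpoint.com/v1",  # placeholder URL
    api_key="YOUR_KEY_HERE",
)

response = client.chat.completions.create(
    model="trinity",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize this deploy log."}],
)
print(response.choices[0].message.content)
```

Everything downstream of the client (streaming, tool calls, retries) stays untouched; that's the whole appeal of a compatible endpoint.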
What did you use GPT-4o for? Classification? Summarization? Code generation? Agentic workflows? Emotional support during deploys? The more specific, the better.
The largest customers and most common use cases rise to the top. We identify the RL tasks that will make the biggest difference. Democracy, but weighted by API spend.
Opt in to share anonymized prompt/completion logs. Your real production edge cases become the training signal — not vibes-based benchmarks from a lab somewhere.
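What does "anonymized" look like in practice? Here's an illustrative sketch only: the record schema and the redaction rule are our assumptions, not a published spec, and real submissions would need proper PII scrubbing.

```python
# Illustrative only: this log schema and email-redaction rule are assumptions,
# not a published spec. Real anonymization needs more than one regex.
import json
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def anonymize(record: dict) -> dict:
    """Redact obvious PII from a prompt/completion pair before sharing."""
    return {
        "task": record["task"],  # e.g. "summarization"
        "prompt": EMAIL.sub("<EMAIL>", record["prompt"]),
        "completion": EMAIL.sub("<EMAIL>", record["completion"]),
    }

log = {
    "task": "summarization",
    "prompt": "Summarize this ticket from jane@example.com about billing.",
    "completion": "Jane reports a timeout in the billing service.",
}
print(json.dumps(anonymize(log), indent=2))
```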
Once we hit 10K signups, we kick off the RL training run on Arcee's U.S.-based infrastructure. Open weights. Apache 2.0. Yours to deploy forever. Nobody can break up with you.
Built on Arcee AI
Trinity family · Open-weight MoE models · Trained end-to-end in the U.S. · Apache 2.0