Arcee AI releases Trinity, an Apache-licensed 400B open-weight foundation model
Arcee AI says it trained a 400B open-weight LLM in six months for 20 million dollars, positioning Trinity as a US-built alternative to Meta and China-based open models.

Key Takeaways
- Arcee AI released Trinity, a 400B-parameter Apache-licensed open-weight foundation model aimed at developers and academics.
- The company claims it trained Trinity and smaller variants in six months for 20 million dollars using 2,048 Nvidia Blackwell B300 GPUs.
- Trinity is text-only today; vision is in development and speech-to-text is on the roadmap, with a hosted API planned.
- Apache licensing is positioned as a commercial-friendly alternative to Meta’s Llama license, which has faced open-source compliance criticism.
A 30-person startup is trying to change the power dynamics of foundation models by betting that “permanently open” beats “open-ish.” Arcee’s new AI model, Trinity, is released under the Apache license and targets developers and researchers who want frontier-scale weights without usage caveats.
Trinity targets open-weight text AI for developers and academics
Arcee AI says Trinity is a 400B-parameter general-purpose foundation model, putting it in the same weight class as other large open releases such as Meta’s Llama 4 Maverick and Tsinghua-linked Z.ai’s GLM-4.5. Benchmark results shared by Arcee suggest Trinity’s base model is competitive on common evaluation buckets like coding, math, reasoning, and knowledge, with some scores slightly ahead of Llama’s.
For B2B teams, the immediate relevance is less about raw parameter counts and more about licensing and control. Apache licensing is broadly considered “clean” for commercial reuse, whereas Llama’s license has been criticized as failing to meet the open-source definition by groups including the Open Source Initiative (opensource.org/blog/metas-llama-license-is-still-not-open-source). That matters if you want to embed a model in a SaaS product, run it in regulated environments, or fine-tune without legal ambiguity.
Training cost, roadmap, and what marketers should watch
Arcee says it trained Trinity (plus 26B “Mini” and 6B “Nano” variants) in six months for 20 million dollars, using 2,048 Nvidia Blackwell B300 GPUs, funded from roughly 50 million dollars raised to date. Trinity Large will ship in multiple flavors: a base model, an instruct-tuned preview for chat use, and “TrueBase,” positioned as a version without instruct data or post-training so enterprises can customize without “unrolling” inherited behaviors.
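If the claimed figures hold, a back-of-envelope calculation gives a sense of the implied economics. The sketch below assumes full utilization of all 2,048 GPUs for roughly 182.5 days; the resulting per-GPU-hour figure is our rough estimate, not a number Arcee has published.

```python
# Rough implied training economics from Arcee's stated figures:
# 2,048 GPUs, six months, 20 million dollars.
# Assumes continuous full utilization (an idealization).
gpus = 2048
hours = 182.5 * 24              # ~six months of wall-clock time
gpu_hours = gpus * hours        # total GPU-hours consumed
cost_per_gpu_hour = 20_000_000 / gpu_hours
print(f"{gpu_hours:,.0f} GPU-hours at ~${cost_per_gpu_hour:.2f}/GPU-hour")
```

At roughly nine million GPU-hours, that works out to a blended rate in the low single digits of dollars per GPU-hour, which is the kind of number procurement teams can sanity-check against cloud pricing.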
Right now Trinity is text-only, so it’s primarily a play in text AI workflows like coding copilots, agentic automation (multi-step tool use), and internal knowledge assistants using RAG (retrieval-augmented generation, a technique that lets an LLM pull facts from external data). Arcee says vision is in development and speech-to-text is on its roadmap. It also plans a hosted API, listing Trinity Mini pricing at 0.045 dollars for input and 0.15 dollars for output.
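To make the RAG pattern mentioned above concrete, here is a minimal sketch: retrieve the document most relevant to a query, then prepend it to the prompt that would be sent to an LLM. The word-overlap scorer and function names are illustrative toys, not Arcee’s API; production systems typically use embedding-based retrieval.

```python
# Toy RAG sketch: keyword-overlap retrieval plus prompt assembly.
# In practice retrieval uses vector embeddings; this is illustrative only.

def retrieve(query: str, docs: list[str]) -> str:
    """Return the doc sharing the most words with the query (toy scorer)."""
    q = set(query.lower().split())
    return max(docs, key=lambda d: len(q & set(d.lower().split())))

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend the retrieved context to the user's question."""
    context = retrieve(query, docs)
    return f"Context: {context}\n\nQuestion: {query}"

docs = [
    "Trinity is a 400B-parameter open-weight model under the Apache license.",
    "Refund requests must be filed within 30 days of purchase.",
]
print(build_prompt("What license is Trinity released under?", docs))
```

The point for operators: the model only ever sees the retrieved snippet, so first-party data stays in your own store and can be swapped or updated without retraining.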
For marketing and e-commerce operators, the near-term signal is optionality: an Apache-licensed, US-built 400B open-weight model could expand the pool of models that can be deployed privately, tuned to first-party data, and shipped globally without vendor lock-in.
