Arcee AI releases Trinity-Large-Thinking, a 399B-parameter MoE AI model under an Apache 2.0 license, allowing full customization and commercial use (Carl Franzen/VentureBeat)

Why it matters: The Apache 2.0 license lets any company freely customize Arcee AI's 399B-parameter model and deploy it commercially, with no usage restrictions beyond the license's standard terms.
- Arcee AI released Trinity-Large-Thinking, a 399B-parameter MoE AI model (VentureBeat).
- Trinity-Large-Thinking is available under an Apache 2.0 license, permitting full customization and commercial use (VentureBeat).
- The release continues the open-source AI model trend that has accelerated since ChatGPT's debut (VentureBeat).

Arcee AI has launched Trinity-Large-Thinking, a 399-billion-parameter Mixture-of-Experts (MoE) AI model, under the permissive Apache 2.0 license. The move, reported by Carl Franzen for VentureBeat, permits full customization and commercial use, marking a notable step for open-source AI accessibility.
