Introducing DBRX, an open, general-purpose LLM created by Databricks. Across a range of standard benchmarks, DBRX sets a new state-of-the-art for established open LLMs.
- mixture-of-experts (MoE) architecture with 132B total parameters, of which only 36B are active on any given input
- trained on 12 trillion tokens, compared to the 2 trillion tokens used for Llama 2
- maximum context length of 32k tokens
- Llama-like license: commercial use requires a separate agreement above 700 million monthly active users, and model outputs cannot be used to train other models
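The parameter split above (36B active out of 132B total) is what a sparse MoE routing step produces: a router scores all experts per token, but only the top-k actually execute. The sketch below is a simplified illustration, not DBRX's implementation; the expert count, top-k value, dimensions, and function names are all illustrative, and real experts are feed-forward blocks rather than single matrices.

```python
import numpy as np

def moe_layer(x, expert_weights, gate_weights, k=4):
    """Sparse MoE forward pass for one token (illustrative sketch).

    x: (d,) token vector
    expert_weights: (n_experts, d, d) -- one matrix per expert (simplified)
    gate_weights: (n_experts, d) -- router projection

    Only k of n_experts run per token, so active parameters << total.
    """
    logits = gate_weights @ x                # router score for each expert
    top = np.argsort(logits)[-k:]            # indices of the k highest-scoring experts
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()                     # softmax over the selected experts only
    # Weighted sum of the chosen experts' outputs; the rest never execute.
    return sum(p * (expert_weights[i] @ x) for p, i in zip(probs, top))

rng = np.random.default_rng(0)
d, n_experts, k = 8, 16, 4
x = rng.standard_normal(d)
experts = rng.standard_normal((n_experts, d, d))
gates = rng.standard_normal((n_experts, d))

y = moe_layer(x, experts, gates, k=k)
# Fraction of expert parameters touched per token: k / n_experts
active_fraction = k / n_experts
```

With 4 of 16 experts selected, only a quarter of the expert parameters are exercised per token, which is the mechanism behind an active-parameter count far below the total.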