Snowflake, a Data Cloud company, announced Snowflake Arctic, a state-of-the-art large language model (LLM) uniquely designed to be the most open, enterprise-grade LLM on the market.
Sridhar Ramaswamy, CEO, Snowflake, said, “This is a watershed moment for Snowflake, with our AI research team innovating at the forefront of AI. By delivering industry-leading intelligence and efficiency in a truly open way to the AI community, we are furthering the frontiers of what open source AI can do. Our research with Arctic will significantly enhance our capability to deliver reliable, efficient AI to our customers.”
With its unique Mixture-of-Experts (MoE) architecture, Arctic delivers top-tier intelligence with unparalleled efficiency at scale. It is optimised for complex enterprise workloads, topping several industry benchmarks across SQL code generation, instruction following, and more.
In addition, Snowflake is releasing Arctic’s weights under an Apache 2.0 license, along with details of the research behind its training, setting a new standard of openness for enterprise AI technology. The Snowflake Arctic LLM is part of the Snowflake Arctic model family, a family of models built by Snowflake that also includes the best practical text-embedding models for retrieval use cases.