Stability AI's Stable LM 2 family now includes a 12 billion parameter model
Stable LM 2 12B is released as base and instruction-tuned variants, trained on multilingual data spanning seven languages: English, Spanish, German, Italian, French, Portuguese, and Dutch.
The 12 billion parameter model aims to balance output quality, memory requirements, and inference speed, making it suitable for a wide range of tasks; a brief usage sketch appears at the end of this summary.
Stable LM 2 1.6B has been updated with improved conversational skills and tool usage capabilities.
Stable LM 2 12B compares favorably to other popular language models on zero-shot and few-shot tasks, despite having fewer parameters than many of the models it is benchmarked against.
More details about the updates are available in Stability AI's announcement.
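As a rough illustration, not part of the announcement itself, the instruction-tuned model can typically be run with the Hugging Face transformers library. The model ID stabilityai/stablelm-2-12b-chat, the dtype, and the generation settings below are assumptions for the sketch; it also presumes a recent transformers release with built-in Stable LM support and the accelerate package for device placement.

```python
# Minimal sketch: loading the (assumed) instruction-tuned Stable LM 2 12B checkpoint
# and generating a short reply. Requires torch, transformers, and accelerate.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablelm-2-12b-chat"  # assumed Hugging Face model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps the 12B weights manageable
    device_map="auto",           # let accelerate place layers on available devices
)

# Chat-style prompt, formatted with the model's own chat template.
messages = [
    {"role": "user", "content": "Summarize the Stable LM 2 12B release in one sentence."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
# Strip the prompt tokens and print only the newly generated text.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```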