starcoder-2 starcoder2-15b hugging-face bigcode code-generation model-training dataset-release model-performance dylan-patel
HuggingFace/BigCode has released StarCoder v2, including the StarCoder2-15B model trained on over 600 programming languages from The Stack v2 dataset, with opt-out requests excluded from the training data. The release is reported as state-of-the-art among code models of this size, and a detailed technical report covers the model's capabilities and training methodology. Additionally, a live event featuring Dylan Patel discussing GPU economics is announced for San Francisco.
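For readers who want to try the checkpoint, here is a minimal sketch of loading it with the Hugging Face transformers library, assuming the model is published under the `bigcode/starcoder2-15b` Hub ID and that your installed transformers version includes StarCoder2 support:

```python
# Minimal sketch: generate a code completion with StarCoder2-15B.
# Assumes the "bigcode/starcoder2-15b" Hub ID and a transformers release
# that supports the StarCoder2 architecture; device_map="auto" also needs accelerate.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/starcoder2-15b"  # assumed Hub ID for the 15B checkpoint

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto")

# The base model is a completion model, so prompt it with a code prefix.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```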