MosaicML announced the availability of MPT-30B Base, Instruct, and Chat, the most advanced models in their MPT (MosaicML Pretrained Transformer) series of open-source large language models. These state-of-the-art models, trained with an 8k token context window, surpass the quality of the original GPT-3.
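For readers who want to try the released checkpoints, here is a minimal sketch of loading one with Hugging Face `transformers`, assuming the `mosaicml/mpt-30b` family of repo ids on the Hub (Base shown; Instruct and Chat follow the same pattern). The prompt, dtype, and generation settings are illustrative, not prescribed by the announcement.

```python
# Minimal sketch: load an MPT-30B checkpoint and generate text.
# Assumes the mosaicml/mpt-30b repo id on the Hugging Face Hub;
# swap in mosaicml/mpt-30b-instruct or mosaicml/mpt-30b-chat as needed.
import torch
import transformers

name = "mosaicml/mpt-30b"

# MPT ships custom modeling code, so trust_remote_code must be enabled.
model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    torch_dtype=torch.bfloat16,  # illustrative; pick a dtype your hardware supports
    trust_remote_code=True,
)
tokenizer = transformers.AutoTokenizer.from_pretrained(name)

inputs = tokenizer("MosaicML is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```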

MosaicML Releases Open-Source MPT-30B LLMs, Trained on H100s to Power Generative AI Applications