BrestStormTeam
Mission:
We aim to train large-scale State Space Models (SSMs) efficiently while significantly reducing infrastructure usage. Our goal is to minimize economic and environmental impact without substantially compromising linguistic performance.
Model:
Tempest-LLM – an efficient language model based on Mamba2, leveraging aggressive weight compression to reach roughly 1.58 bits per parameter (log2 3, i.e. ternary weights).
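The document does not specify the compression scheme, but 1.58 bits per parameter corresponds to ternary weights in {-1, 0, +1}, as popularized by BitNet b1.58. A minimal sketch of absmean ternary quantization, assuming that style of scheme (function names and the per-tensor scaling choice are illustrative, not Tempest-LLM's actual implementation):

```python
import numpy as np

def quantize_ternary(w: np.ndarray):
    """Quantize a weight tensor to ternary values {-1, 0, +1}.

    Absmean scaling: divide by the mean absolute weight, round,
    then clip to [-1, 1]. Returns the ternary tensor and the
    scale needed to dequantize.
    """
    scale = float(np.mean(np.abs(w))) + 1e-8   # per-tensor absmean scale
    w_q = np.clip(np.round(w / scale), -1, 1)
    return w_q.astype(np.int8), scale

def dequantize(w_q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float tensor from the ternary form."""
    return w_q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.02, size=(4, 4)).astype(np.float32)
w_q, scale = quantize_ternary(w)
print(np.unique(w_q))  # values drawn from {-1, 0, 1}
```

Since each weight takes one of three values, the information content is log2(3) ≈ 1.585 bits, which is where the headline figure comes from.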
Training Approach:
Our model benefits from a balanced multilingual training strategy, ensuring comparable proficiency across its training languages.
This multilingual training enhances linguistic versatility and cultural adaptability across different languages and contexts.
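Balanced multilingual training is commonly implemented by re-weighting how often each language's corpus is sampled, rather than following raw corpus sizes. A hypothetical sketch of temperature-based language sampling (the language codes and corpus sizes are illustrative only; the source does not state which languages or method are used):

```python
def language_sampling_weights(corpus_sizes: dict, temperature: float = 0.0) -> dict:
    """Compute per-language sampling probabilities.

    temperature = 1.0 reproduces the natural corpus distribution;
    temperature = 0.0 yields fully balanced (uniform) sampling,
    regardless of how much raw text each language has.
    """
    scaled = {lang: size ** temperature for lang, size in corpus_sizes.items()}
    total = sum(scaled.values())
    return {lang: s / total for lang, s in scaled.items()}

# Hypothetical corpora of very different sizes.
sizes = {"en": 1_000_000, "fr": 100_000, "br": 10_000}

balanced = language_sampling_weights(sizes, temperature=0.0)
print(balanced)  # uniform: each language gets probability 1/3
```

Intermediate temperatures (e.g. 0.3) are a common compromise: low-resource languages are upsampled without drowning out the high-resource ones.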
Impact:
Vision:
BrestStormTeam is committed to showing that linguistic AI technologies can be both powerful and sustainable, contributing responsibly to AI innovation.