MiMo
Version of 17 May 2025 at 19:48
under construction
Definition
MiMo-7B is a 7-billion-parameter large language model designed specifically for reasoning tasks, optimized in both its pre-training and post-training stages.
French
MiMo-7B
English
MiMo-7B
MiMo-7B is a large language model designed specifically for reasoning tasks. The model is optimized across both the pre-training and post-training stages to unlock its reasoning potential. Despite having only 7 billion parameters, MiMo-7B achieves superior performance on mathematics and code reasoning tasks, outperforming much larger models, including OpenAI's o1-mini.
Source
Contributors: Arianne Arel, wiki
