On behalf of our colleague, Laurent Besacier:
Hi everyone,
I thought this info could be useful for scientists working on NMT for low-resource languages.
We recently released the SMaLL-100 model, a Shallow
Multilingual MT Model for Low-Resource Languages.
It is a distilled version of the large 12B M2M-100
model released by Meta.
The reason I want to share this with the SIGUL &
EAMT communities is that we provide the models (which make a
good pre-trained starting point for developing MT for low-resource
language pairs) and also a demo platform to access MT for the
roughly 10,000 language pairs the model covers!
(It is still a bit slow, as the demo currently runs on 2 vCPUs with 16 GB of RAM.)
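
For anyone who wants to experiment with the checkpoints, below is a minimal sketch of loading an M2M-100-family model for inference with Hugging Face transformers. SMaLL-100 is distilled from M2M-100 and shares its architecture, but please check the model card for the exact checkpoint ID and tokenizer; the public 418M M2M-100 checkpoint and the French-to-Swahili direction used here are illustrative assumptions only.

# Minimal sketch: translation with an M2M-100-family checkpoint.
# The model ID below is a stand-in; substitute the SMaLL-100
# checkpoint named on its model card (its tokenizer may differ).
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model_id = "facebook/m2m100_418M"  # illustrative stand-in checkpoint
model = M2M100ForConditionalGeneration.from_pretrained(model_id)
tokenizer = M2M100Tokenizer.from_pretrained(model_id)

# Translate French -> Swahili, one of the many covered directions.
tokenizer.src_lang = "fr"
encoded = tokenizer("La vie est belle.", return_tensors="pt")
generated = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.get_lang_id("sw"),  # target language
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))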
Best regards,
Laurent Besacier
Naver Labs Europe
--
Claudia Soria
Researcher
Cnr-Istituto di Linguistica Computazionale “Antonio Zampolli”
Via Moruzzi 1
56124 Pisa
Italy
Tel. +39 050 3153166
Skype clausor