Dear colleague,
We are pleased to announce that nominations are now open for the 2024
Linguapax Award, an international prize recognizing individuals, groups,
and organizations that have made significant contributions to the
preservation and promotion of linguistic diversity, multilingualism, and
endangered language revitalization efforts worldwide. Since 2002, this
annual award has celebrated and supported initiatives that empower
linguistic communities and advance global linguistic diversity.
*Award Criteria and Submission*
Eligible nominees include both individuals and organizations actively
engaged in activities aligned with these goals, whether through education,
community work, or research. The selection process prioritizes projects and
individuals whose efforts have demonstrated impact and dedication to
safeguarding linguistic heritage. The deadline for nominations is February
21, 2025.
*Get Involved with Linguapax*
Linguapax offers numerous ways to support and participate in its mission.
We invite you to join our community of advocates by contributing to our
programs, attending events, or becoming a member. By joining Linguapax, you
play an essential role in fostering a global network that stands for
linguistic rights, intercultural dialogue, and peace.
For more details on the nomination process or ways to support Linguapax,
please visit our website:
- Nomination Details: Linguapax Award 2024
<https://www.linguapax.org/en/call-for-nominations-for-the-linguapax-award-2…>
- Get Involved: Join Linguapax <https://www.linguapax.org/en/get-involved/>
We look forward to celebrating the vital work of this year's nominees and
thank you for your continued commitment to linguistic diversity and
cultural heritage.
Warm regards,
Maite Melero
Secretary of the Board
[Due to many requests, we have extended the submission deadline]
Neural language models have revolutionised natural language processing (NLP) and have provided state-of-the-art results for many tasks. However, their effectiveness largely depends on the available pre-training resources, so language models (LMs) often struggle with low-resource languages in both training and evaluation. Recently, there has been a growing trend in developing and adopting LMs for low-resource languages. LoResLM aims to provide a forum for researchers to share and discuss their ongoing work on LMs for low-resource languages.
>> Topics
LoResLM 2025 invites submissions on a broad range of topics related to the development and evaluation of neural language models for low-resource languages, including but not limited to the following.
* Building language models for low-resource languages.
* Adapting/extending existing language models/large language models for low-resource languages.
* Corpora creation and curation technologies for training language models/large language models for low-resource languages.
* Benchmarks to evaluate language models/large language models in low-resource languages.
* Prompting/in-context learning strategies for low-resource languages with large language models.
* Review of available corpora to train/fine-tune language models/large language models for low-resource languages.
* Multilingual/cross-lingual language models/large language models for low-resource languages.
* Applications of language models/large language models for low-resource languages (e.g., machine translation, chatbots, content moderation).
>> Important Dates
* Paper submission due – 12th November 2024 (extended from 5th November 2024)
* Notification of acceptance – 25th November 2024
* Camera-ready due – 13th December 2024
* LoResLM 2025 workshop – 19th/20th January 2025, co-located with COLING 2025
>> Submission Guidelines
We follow the COLING 2025 standards for submission format and guidelines. LoResLM 2025 invites the submission of long papers of up to eight pages and short papers of up to four pages. These page limits only apply to the main body of the paper. At the end of the paper (after the conclusions but before the references), papers need to include a mandatory section discussing the limitations of the work and, optionally, a section discussing ethical considerations. Papers can include unlimited pages of references and an unlimited appendix.
To prepare your submission, please make sure to use the COLING 2025 style files available here:
* LaTeX - https://coling2025.org/downloads/coling-2025.zip
* Word - https://coling2025.org/downloads/coling-2025.docx
* Overleaf - https://www.overleaf.com/latex/templates/instructions-for-coling-2025-proce…
Papers should be submitted through Softconf/START using the following link: https://softconf.com/coling2025/LoResLM25/
>> Organising Committee
* Hansi Hettiarachchi, Lancaster University, UK
* Tharindu Ranasinghe, Lancaster University, UK
* Paul Rayson, Lancaster University, UK
* Ruslan Mitkov, Lancaster University, UK
* Mohamed Gaber, Birmingham City University, UK
* Damith Premasiri, Lancaster University, UK
* Fiona Anting Tan, National University of Singapore, Singapore
* Lasitha Uyangodage, University of Münster, Germany
>> Programme Committee
* Burcu Can - University of Stirling, UK
* Çağrı Çöltekin - University of Tübingen, Germany
* Debashish Das - Birmingham City University, UK
* Alphaeus Dmonte - George Mason University, USA
* Daan van Esch - Google
* Ignatius Ezeani - Lancaster University, UK
* Anna Furtado - University of Galway, Ireland
* Amal Htait - Aston University, UK
* Ali Hürriyetoğlu - Wageningen University & Research, Netherlands
* Danka Jokic - University of Belgrade, Serbia
* Diptesh Kanojia - University of Surrey, UK
* Jean Maillard - Meta
* Maite Melero - Barcelona Supercomputing Centre, Spain
* Muhidin Mohamed - Aston University, UK
* Nadeesha Pathirana - Aston University, UK
* Alistair Plum - University of Luxembourg, Luxembourg
* Sandaru Seneviratne - Australian National University, Australia
* Ravi Shekhar - University of Essex, UK
* Taro Watanabe - Nara Institute of Science and Technology, Japan
* Phil Weber - Aston University, UK
URL - https://loreslm.github.io/
Twitter - https://x.com/LoResLM2025
Best regards,
Tharindu Ranasinghe