Hello,
Could you please distribute the following job offer? Thanks.
Best,
Pascal
-------------------------------------------------------------------------------------
3-year PhD position in Computational Models of Semantic Memory and its Acquisition (Inria and University of Lille, France)
We invite applications for a 3-year PhD position at the University of Lille in the context of the recently funded research project "COMANCHE" (Computational Models of Lexical Meaning and Change). The position is funded by Inria, the French national research institute in Computer Science and Applied Mathematics.
COMANCHE proposes to transfer and adapt neural word embedding algorithms to model the acquisition and evolution of word meaning, comparing them with linguistic theories of language acquisition and language evolution. At the intersection of Natural Language Processing, psycholinguistics, and historical linguistics, the project aims to validate or revise some of these theories, while also developing computational models that are less data-hungry and less computationally intensive, as they exploit new inductive biases inspired by these disciplines.
The first strand of the project, on which the successful candidate will work, focuses on the development of computational models of semantic memory and its acquisition. Two main research directions will be pursued. On the one hand, we will compare the structural properties of the semantic spaces derived from different word embedding algorithms to those found in human semantic memory, as reflected in behavioral data (such as typicality norms) as well as brain imaging data. The latter data will then be used as additional supervision to inject more hierarchical structure into the learned semantic spaces. On the other hand, we intend to experiment with training regimes for word embedding algorithms that are closer to those of humans acquiring language, controlling the quantity as well as the linguistic complexity of the inputs fed to the learning algorithms through the use of longitudinal and child-directed speech corpora (e.g., CHILDES, Colaje). In both cases, both English and French data will be considered.
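As a minimal sketch of the first research direction, comparing the structure of a learned semantic space to behavioral norms can amount to correlating model-derived similarities with human typicality ratings. All vectors and ratings below are toy values invented for illustration, not project data:

```python
import math

def cosine(u, v):
    # Cosine similarity between two vectors.
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def spearman(xs, ys):
    # Rank-transform both lists, then compute Pearson correlation on the ranks.
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0.0] * len(vals)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    rx, ry = ranks(xs), ranks(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

# Hypothetical 3-d embedding for the category "bird" and three exemplars.
category = [1.0, 0.0, 0.2]
exemplars = {"robin": [0.9, 0.1, 0.2], "penguin": [0.4, 0.8, 0.1], "ostrich": [0.3, 0.7, 0.4]}
# Hypothetical human typicality ratings (higher = more typical of "bird").
typicality = {"robin": 6.9, "penguin": 2.4, "ostrich": 2.0}

words = list(exemplars)
model_sims = [cosine(category, exemplars[w]) for w in words]
human = [typicality[w] for w in words]
print(round(spearman(model_sims, human), 2))  # → 1.0 on this toy data
```

In practice the model similarities would come from trained embeddings and the human side from published norms; the rank correlation then quantifies how well the semantic space mirrors graded category structure.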
The successful candidate will hold a Master's degree in computational linguistics, computer science, or cognitive science, and will have prior experience with word embedding models. Furthermore, the candidate should have strong programming skills and expertise in machine learning, and be eager to work across languages.
The position is affiliated with the MAGNET team at Inria Lille [1] as well as with the SCALAB group at the University of Lille [2], in an effort to strengthen collaboration between these two groups and ultimately foster cross-fertilization between Natural Language Processing and Psycholinguistics.
Applications will be considered until the position is filled; however, you are encouraged to apply early, as applications will be processed as they are received. Applications, written in English or French, should include a brief cover letter outlining research interests and vision, a CV (including your contact address, work experience, and publications), and contact information for at least two referees. Applications (and questions) should be sent to Angèle Brunellière (angele.brunelliere@univ-lille.fr) and Pascal Denis (pascal.denis@inria.fr).
The starting date of the position is 1 October 2022 or soon thereafter, for a total of 3 full years.
Best regards,
Angèle Brunellière and Pascal Denis
[1] https://team.inria.fr/magnet/ [2] https://scalab.univ-lille.fr/
Hello,
Could you please distribute the following job offer? Thanks.
Best,
Pascal Denis
3-year PhD position in Computational Models of Semantic Memory (Inria and University of Lille, France)
We invite applications for a 3-year PhD position at the University of Lille, funded by Inria, the French national research institute in Computer Science and Applied Mathematics.
RESEARCH ENVIRONMENT
The PhD position will be hosted within the MAGNET team at Inria Lille [1], in partnership with the SCALAB group at the University of Lille [2], in an effort to strengthen collaboration between these two research teams and, specifically, to foster cross-fertilization between Natural Language Processing (NLP) and psycholinguistics. The MAGNET team is currently evolving into a new interdisciplinary research group focusing on cognitively grounded, neural computational models of language and reasoning.
RESEARCH PROJECT
This PhD project investigates semantic memory through complementary contrastive and integrative approaches, at the intersection of cognitive psychology and natural language processing. The overarching goal is to better understand the semantic capacities of large language models (LLMs) by comparing them to human cognition, and to improve these models using cognitively inspired learning biases.
The first research axis focuses on contrastive evaluation: we will design robust probing and prompting techniques to analyze how different families of LLMs (e.g., auto-regressive vs. masked models) encode and organize semantic knowledge. Models will be evaluated on datasets from experimental psychology, such as typicality norms (e.g., Rosch) and semantic feature norms (e.g., McRae, Buchanan), possibly including new data collection. The goal is to assess whether and how these models exhibit well-known properties of human semantic memory, such as taxonomic and prototypical organization, semantic feature sharing and inheritance, and polysemy, building upon preliminary work carried out in the team [3, 4, 5]. In addition, we intend to explore the structure of representations in vision-language models to investigate how multi-modal grounding shapes semantic memory, in light of findings from blind populations and developmental theories that challenge the necessity of visual input for acquiring rich word meanings.
The second axis focuses on integrative modeling, aiming to develop LLMs with inductive biases inspired by human cognitive development. Drawing from developmental psycholinguistics and findings in semantic memory acquisition, we will explore how representations evolve in humans and model this process in artificial learners. We will experiment with training regimes that control input volume, syntactic complexity, and curriculum structure. Longitudinal corpora and multimodal input (e.g., visual and symbolic data) will be used to simulate developmental conditions. This approach is directly inspired by recent initiatives such as the BabyLM benchmark campaigns, which promote the design of smaller, more data-efficient language models grounded in child language learning. Our goal is to integrate such developmental constraints into the architecture and training of LLMs in order to foster interpretability, efficiency, and cognitive plausibility. In both axes, both English and French data will be considered.
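The curriculum aspect of the second axis can be sketched in miniature: order training inputs by a complexity measure before feeding them to a learner, in the spirit of BabyLM-style experiments. The utterances and the token-count proxy below are illustrative stand-ins; real curricula could use syntactic depth, rare-word rate, or developmental ordering from longitudinal corpora:

```python
# Toy child-directed utterances (invented for the example).
corpus = [
    "the dog chased the ball across the garden",
    "look a dog",
    "the dog that you saw yesterday chased the ball",
    "dog",
]

def complexity(utterance):
    # Crude proxy for linguistic complexity: number of tokens.
    return len(utterance.split())

# A simple curriculum: present simpler utterances first.
curriculum = sorted(corpus, key=complexity)
for stage, utt in enumerate(curriculum, start=1):
    print(stage, utt)
```

Controlling input volume and complexity this way is what lets a training regime approximate developmental conditions rather than the shuffled web-scale text LLMs usually see.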
REQUIREMENTS
Applicants should hold a Master’s degree (or equivalent) in one or more of the following fields: Computational Linguistics, Natural Language Processing, Artificial Intelligence, Machine Learning, Cognitive Science. Strong programming skills (Python preferred), a solid foundation in empirical research methods, and an interest in interdisciplinary work combining formal, computational, and experimental approaches are highly desirable.
APPLICATION PROCESS
Applications will be considered until the position is filled; however, you are encouraged to apply early, as applications will be processed as they are received. Applications, written in English or French, should include a brief cover letter outlining research interests and vision, a CV (including your contact address, work experience, and publications), and contact information for at least two referees. Applications (and questions) should be sent to Pascal Denis (pascal.denis@inria.fr) and Angèle Brunellière (angele.brunelliere@univ-lille.fr).
The starting date of the position is 1 October 2025 or soon thereafter, for a total of 3 full years.
Best regards,
Angèle Brunellière and Pascal Denis
[1] https://team.inria.fr/magnet/ [2] https://scalab.univ-lille.fr/ [3] https://aclanthology.org/2023.eacl-main.167.pdf [4] https://aclanthology.org/2023.findings-emnlp.615.pdf [5] https://aclanthology.org/2024.emnlp-main.156.pdf