I am an assistant professor in the School of Computer and Communication Sciences at EPFL. I lead the EPFL NLP group where we conduct research on natural language processing (NLP) systems that can model, represent, and reason about human and world knowledge.
Previously, I was a postdoctoral researcher at Stanford University in the SNAP and NLP groups working with Jure Leskovec and Chris Manning. I completed a PhD in CS at the University of Washington, where I worked with Yejin Choi, and a BEng in EE at McGill University.
I will be taking on new PhD students this year (and next year).
For these and other inquiries about joining EPFL NLP, please read the Join Us page.
|The Swiss AI Initiative has launched!
|Talk at EMNLP BlackboxNLP Workshop 2023
|Neuro-Symbolic AI Panel at ISWC 2023
|Talk at Johns Hopkins University
|Talk at University of Maryland
|Panel at Infrarouge
|Talk at IBM Neuro-symbolic AI Workshop
|Talk at EPFL Center for Intelligent Systems
|Talk at IBM Research
|Panel at World Congress of Science & Factual Producers
|Talk at ETH Zurich
|Talk at CIKM Workshop: Knowledge Injection in Neural Networks (KINN)
|Talk at KR Workshop: Knowledge Representation for Hybrid and Compositional AI (KRHCAI)
|Talk at Stanford Graph Learning Workshop
|Talk at IJCAI Workshop: Is Neuro-symbolic SOTA still a myth for NLI? (NSNLI)
|Named to the Forbes 30 under 30 list in Science & Healthcare
|Talk at Microsoft Research
|Talk at AAAI Workshop in Hybrid Artificial Intelligence
|Tutorial on Commonsense Knowledge Acquisition and Representation at AAAI 2021
|Tutorial on Neural Language Generation at EMNLP 2020
|Talk at UCSD Health Informatics Seminar
|Talk at Stanford Cognitive Science Seminar
|Tutorial on Commonsense Knowledge at ACL 2020
|Talk at WeCNLP 2019
My research interests are broadly in natural language processing, deep learning, machine learning, and artificial intelligence, with some projects integrating computer vision and data science. Specifically, my research focuses on how we can design systems that understand the implicit human knowledge underlying language and communication.
Topics that I focus on include:
Neural and symbolic representations of knowledge: language models as knowledge bases, automatic knowledge graph construction, neural information retrieval
Neuro-symbolic reasoning methods: knowledge graph integration in NLP systems, large-scale pretraining of language and knowledge models, graph neural networks
Commonsense knowledge representation and reasoning: learning commonsense knowledge from language and vision, models for commonsense reasoning, applications of commonsense reasoning
Narrative understanding: entity representations, entity and state tracking, story generation
Language generation: models, decoding algorithms, evaluation metrics
Biomedical and social science applications of language and knowledge: understanding clinical notes, biomedical NLP, disinformation detection
Check out our lab website for more details!
Please see my Google Scholar for an up-to-date list of publications.
Le Temps. A Swiss supercomputer dedicated to AI (Dec 2023)
Corriere del Ticino. Is ChatGPT really acquiring traits ever more similar to ours? (Oct 2023)
Mirage News. Making AI work for everyone (Sept 2023)
RTS Forum. Can AIs understand humor? (May 2023)
RTS Infrarouge. Artificial intelligence: the great replacement? (Jan 2023)
Tribune de Genève. Artificial intelligence: Profession? Virtual sports journalist (Jan 2023)
Heidi.news. ChatGPT makes cheating easier: what if that were good news? (Jan 2023)
Communications of the ACM. The Best of NLP (April 2021)
GGB. Meditron, EPFL’s new Large Language Model for medical knowledge (Dec 2023)
ICT Journal. Born at EPFL: an open-source LLM specialized in the medical domain (Dec 2023)
RTS CQFD. EPFL: Meditron (Dec 2023)
Communications of the ACM. Seeking Artificial Common Sense (Nov 2020)
The Atlantic. The Easy Questions that Stump Computers (May 2020)
Quanta Magazine. Common Sense Comes Closer to Computers (April 2020)
New York Academy of Sciences. Can Researchers Create Commonsense Artificial Intelligence? (June 2019)
The Gradient. NLP’s generalization problem, and how researchers are tackling it (August 2018)
NLP Highlights Podcast. 54 - Simulating Action Dynamics with Neural Process Networks, with Antoine Bosselut (March 2018)
If you’re interested in joining the EPFL NLP group, please read the following:
|Looking for a postdoctoral position
|Feel free to contact me about potential postdoctoral positions. Also, check out these opportunities for fully funded postdoctoral positions for which I can serve as a co-advisor:
|Horizon Europe Swiss Postdoctoral Fellowships
|EPFLeaders4impact Postdoctoral Fellowships
|Applying to the EPFL EDIC PhD program
|I will be taking on new PhD students next year! Apply if you’re interested in joining EPFL to work with me. Before you can be considered for the NLP lab, however, you will have to be admitted to the EDIC program, which handles admissions centrally. Feel free to let me know if you apply, but I unfortunately can’t conduct pre-screenings until applications are in.
|An EDIC fellow
|I’m happy to supervise rotations provided our research interests align and there’s a good chance that the rotation will lead to a permanent position in the lab.
|An EPFL Master’s student
|I’m happy to supervise Master’s projects and theses every semester! If you’re interested in doing a project with EPFL NLP, send an e-mail to:
|Please attach your CV and transcript and include [Masters Project] or [Masters Thesis] in your subject line. If you want a sense of what a project in our lab would involve, check out my research interests above or those of my lab members! If you would like to complete an industry PDM, please follow the guidelines presented here.
|Looking for a summer internship
|If you are a Bachelor’s or Master’s student at another university, please apply through the Summer@EPFL program. If you are looking for a PhD internship, contact me directly.