Daniel Loureiro

Postdoctoral Research Associate

Natural Language Processing @ Cardiff University


I'm a Research Associate working with the NLP group at Cardiff University. My PhD focused on learning versatile sense representations from LLMs that can enable applications beyond disambiguation, with particular emphasis on model probing and commonsense reasoning. Currently, I'm also exploring diachronic modelling with LLMs and associated tasks from the perspective of social media platforms (e.g., Twitter, Reddit).


  • NLP for Social Media
  • Distributional Semantics
  • Commonsense Reasoning
  • Computational Linguistics


  • PhD in Computer Science, 2022

    University of Porto

  • BSc in Computer Science with a Minor in Biology, 2013

    University of Porto




03/2020 - 2nd place at SemEval-2020 Task 3, Subtask 2 (Graded Word Similarity in Context, English).


12/2019 - “MedLinker” accepted at ECIR 2020.
12/2019 - Presented “Language Modelling Makes Sense” at DSPT #69.
11/2019 - Nominated Outstanding Reviewer at EMNLP 2019.
08/2019 - “Incredibles” nomination from INESCTEC.
07/2019 - Invited talk at Unbabel on LMMS.
06/2019 - Nominated Outstanding Reviewer at NAACL 2019.
05/2019 - “Language Modelling Makes Sense” accepted at ACL 2019.
04/2019 - 2nd place in the WiC Challenge shared task (SemDeep-5 at IJCAI 2019).


10/2018 - Concluded pre-thesis on the topic “Natural Language Inference using Relational Commonsense Knowledge”.
08/2018 - “Affordance Extraction and Inference based on SRL” accepted at the FEVER workshop (EMNLP 2018).


09/2017 - Joined INESCTEC-LIAAD.
09/2017 - Started PhD program at FCUP-DCC, advised by Prof. Alípio Jorge.
04/2017 - Presented “Have we cracked semantics?” at DSPT #8.


MedLinker: Medical Entity Linking with Neural Representations and Dictionary Matching

Adapting and improving LMMS for Entity Linking in the Medical Domain.

LIAAD at SemDeep-5 Challenge: Word-in-Context (WiC)

Solution to the WiC challenge based on LMMS (ranked 2nd at the close of the challenge).

Language Modelling Makes Sense: Propagating Representations through WordNet for Full-Coverage Word Sense Disambiguation

Word Sense Disambiguation (WSD) based on BERT and sense embeddings with full coverage of WordNet.

Affordance Extraction and Inference based on Semantic Role Labeling

Relational Embeddings based on Semantic Role Labeling (SRL) that expose affordances.