Tehran Institute for Advanced Studies (TEIAS)


Talk

Probing for Meanings in Neural Word Representations

March 02, 2020

Venue

Khatam University, Building No. 2
Address: Mollasadra Blvd., North Shirazi St., East Daneshvar St., No. 17

+982189174612

Dr. Yadollah Yaghoobzadeh

Senior Researcher at Microsoft Research

Overview

Learning transferable representations from large amounts of raw text has been one of the key drivers of recent advances in natural language processing (NLP). Deeper representations can now be trained thanks to more powerful computing hardware and more parallelizable neural network architectures. As a general practice, when training a model for a target task, the field has moved from transferring only static pre-trained word embeddings (e.g., word2vec) to transferring deep layers of pre-trained neural networks interpreted as contextual word representations (e.g., BERT). Models like BERT represent the meanings of words and phrases in their embedding layer, in the activations of their hidden layers, or in both. In this talk, after a brief background on pre-trained word representations, I will focus on how different meanings are represented in word embeddings and introduce our method of probing for semantic classes. We use semantic classes as proxies for meanings and train classifiers that tell us which meanings are encoded in word representations. We also apply a similar approach to contextual word representations like BERT.
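To make the probing idea concrete, here is a minimal sketch in Python of the general technique described above: train a simple classifier on word vectors to predict membership in a semantic class, and take its held-out accuracy as evidence of whether that class is encoded. It assumes scikit-learn and numpy; the vocabulary, embeddings, and labels are random placeholders, not the speaker's actual data, classifier, or semantic-class inventory.

```python
# A minimal probing sketch, assuming scikit-learn and numpy.
# Embeddings and labels below are hypothetical stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder lookup table of static word embeddings
# (word2vec-style), one 300-dimensional vector per word.
vocab = [f"word_{i}" for i in range(1000)]
embeddings = {w: rng.standard_normal(300) for w in vocab}

# Hypothetical binary labels: 1 if the word belongs to a given
# semantic class (say, FOOD), 0 otherwise. In the probing setup,
# the semantic class acts as a proxy for one of the word's meanings.
labels = {w: int(rng.random() < 0.3) for w in vocab}

X = np.stack([embeddings[w] for w in vocab])
y = np.array([labels[w] for w in vocab])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# A simple linear probe: held-out accuracy well above chance would
# suggest the semantic class is linearly decodable from the vectors.
# (With the random placeholder data above, accuracy stays near chance.)
probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"probe accuracy: {probe.score(X_test, y_test):.3f}")
```

Linear probes are a common choice in this kind of analysis because a strong linear classifier indicates the information is readily accessible in the representation, whereas a more powerful probe might extract signal the representation does not make explicit.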

Biography

Yadollah Yaghoobzadeh is currently a senior researcher at Microsoft Research in Montreal, where he works on natural language processing (NLP) and deep learning methods, particularly on word, sentence, and sentence-pair semantics.
Before joining Microsoft in 2017, he graduated with a PhD in computer science from the University of Munich (LMU), supervised by Hinrich Schütze. He obtained his MS from Sharif University of Technology in 2012 and his BS from the University of Tehran in 2009. Among his professional activities, he co-organized two editions of the workshop on Subword and Character Level Models in NLP (SCLeM), at EMNLP 2017 and NAACL 2018. He has also served as a program committee member for various top NLP conferences: ACL, EMNLP, NAACL, COLING, EACL, CoNLL, and LREC.