This class introduces the basic elements of neural networks / deep learning / representation learning for natural language processing (NLP). We will cover the learning of representations of word meaning (word embeddings) and their use in other NLP tasks. After introducing multi-layer neural networks via that task, we will move on to recurrent neural networks and their use in language modelling and, more generally, sequence tagging.
As there will be a practical element, familiarity with Python is advisable.
We will mostly follow the Stanford class “Deep Learning for NLP” http://web.stanford.edu/class/cs224n. As additional literature we will use Goldberg 2017 (“Neural Network Methods for Natural Language Processing”), which is available as an e-book from the library.
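Since the practical element uses Python, here is a minimal, purely illustrative sketch of the kind of object the course starts from: word embeddings as vectors, compared with cosine similarity. The vocabulary, vector values, and function names are invented for illustration; in the course, embeddings would be learned from data rather than written by hand.

```python
import math

# Hypothetical toy vocabulary mapping words to row indices.
vocab = {"cat": 0, "dog": 1, "car": 2}

# Tiny hand-written 3-dimensional "embedding" vectors, purely illustrative:
# "cat" and "dog" are placed close together, "car" far from both.
E = [
    [0.9, 0.1, 0.0],  # cat
    [0.8, 0.2, 0.1],  # dog
    [0.0, 0.1, 0.9],  # car
]

def embed(word):
    """Look up the vector for a word."""
    return E[vocab[word]]

def cosine(u, v):
    """Cosine similarity, a standard measure of embedding closeness."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)
```

With these made-up vectors, `cosine(embed("cat"), embed("dog"))` comes out larger than `cosine(embed("cat"), embed("car"))`, mirroring the intuition that embeddings of semantically related words lie close together.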
Important: the course is taught in English.
Rhythm | Day | Time | Format / Location | Period |
---|---|---|---|---|
Degree programme/offering | Validity | Variant | Subdivision | Status | Sem. | CP | |
---|---|---|---|---|---|---|---|
Linguistik: Kommunikation, Kognition und Sprachtechnologie / Master | (enrolment until winter semester 2019/20) | | 23-LIN-MaCL1 | | | 3 | |
A learning space for this course exists in the e-learning system, where teaching staff can provide course materials: