The class will be taught in English by Özge Alaçam.
Do you have some basic knowledge of machine learning and neural networks? Are you curious about applying these techniques to modeling language? In this seminar, we explore applications and implementations of deep learning methods for language technology. We apply deep neural networks to selected problems in natural language processing, focusing on so-called sequence-to-sequence models (used, e.g., in machine translation, language generation, and dialogue modeling). In addition, we touch upon several multimodal NLP problems. The basics of these methods were taught in the class "Neuronale Netze in der Sprachverarbeitung" (SS 2021).
Recent advances in neural networks promise to learn data representations and relevant features from the data itself, as opposed to task-specific feature engineering. They have advanced the state of the art in several language technology tasks, in some cases significantly. The seminar includes hands-on sessions on relevant programming techniques (e.g., how to apply recurrent neural networks and state-of-the-art transformer models to sequential problems in PyTorch), a mini-project, and presentations of the results.
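To give a flavor of the hands-on sessions, here is a minimal sketch (assuming PyTorch is installed; the dimensions and random token ids are illustrative, not from the course materials) of running a recurrent network over a batch of token sequences:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

vocab_size, embed_dim, hidden_dim = 100, 16, 32
embed = nn.Embedding(vocab_size, embed_dim)        # map token ids to vectors
rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)

# A batch of 4 sequences, each 7 tokens long (random ids stand in for real text).
tokens = torch.randint(0, vocab_size, (4, 7))
outputs, last_hidden = rnn(embed(tokens))

print(outputs.shape)      # one hidden state per token: (4, 7, 32)
print(last_hidden.shape)  # final state per sequence:   (1, 4, 32)
```

The per-token outputs feed tasks such as tagging, while the final hidden state summarizes the whole sequence, e.g. for classification or as the starting point of a decoder.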
Learning objectives:
In this seminar, students learn
- aspects of machine learning, neural networks, and language technology
- to apply deep neural networks to selected natural language processing problems
- to implement sequence-to-sequence models and encoder-decoder architectures
- to use and fine-tune pre-trained transformer models
- presentation techniques
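The encoder-decoder objective above can be sketched as follows (a minimal GRU-based example assuming PyTorch; all class and variable names are illustrative, not from the course materials). The encoder compresses the source sequence into a final hidden state, which initializes the decoder:

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, embed_dim=16, hidden_dim=32):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, embed_dim)
        self.tgt_embed = nn.Embedding(tgt_vocab, embed_dim)
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.decoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, tgt_vocab)  # scores over target vocabulary

    def forward(self, src, tgt):
        # hidden: (1, batch, hidden_dim) -- summary of the source sequence
        _, hidden = self.encoder(self.src_embed(src))
        # decode the target prefix conditioned on the source summary
        dec_out, _ = self.decoder(self.tgt_embed(tgt), hidden)
        return self.out(dec_out)  # (batch, tgt_len, tgt_vocab)

torch.manual_seed(0)
model = Seq2Seq(src_vocab=50, tgt_vocab=60)
src = torch.randint(0, 50, (4, 9))  # 4 source sequences of length 9
tgt = torch.randint(0, 60, (4, 5))  # 4 target prefixes (teacher forcing)
logits = model(src, tgt)
print(logits.shape)  # (4, 5, 60)
```

Modern variants replace the GRUs with transformer layers and add attention, but the encode-then-decode structure stays the same.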
Didactic concept:
After a general introduction and a hands-on primer on programming deep neural networks in Python (e.g. with PyTorch) in the first few sessions, we will distribute topics. The participants form teams to work on them and present their results in the seminar.
Literature:
individually assigned per topic
Additional examination information:
To pass, we require the following:
- mini-project: applying or implementing a deep learning model for a language technology task
- seminar presentation of the implementation or the results
Basic knowledge of machine learning and neural networks is strongly recommended. (The course "Neuronale Netze in der Sprachverarbeitung" is ideal preparation.)
Frequency | Day | Time | Format / Location | Period |
---|---|---|---|---|
Module | Course | Requirements |
---|---|---|
23-LIN-MaCL-MethAngewCL Methoden der angewandten Computerlinguistik | Course 1 | Coursework (Studienleistung) |
| Course 2 | Coursework (Studienleistung) |
The binding module descriptions contain further information, including on the requirements ("Leistungen") and their criteria. If several forms of assessment are possible, the respective teachers decide which applies.
Degree program | Validity | Variant | Subdivision | Status | Sem. | CP |
---|---|---|---|---|---|---|
Linguistik: Kommunikation, Kognition und Sprachtechnologie / Master | (enrollment until winter semester 19/20) | 23-LIN-MaCL1; 23-LIN-MaCL2 | 3 | | | |
A learning space for this course exists in the e-learning system, where teachers can provide course materials.