This lecture offers a journey through recent approaches toward a deeper theoretical understanding of deep neural networks. It mainly follows the "theory backbone" provided by the recent book "The Principles of Deep Learning Theory" by Roberts, Yaida, and Hanin (2021), with excursions into selected papers to deepen additional aspects where apt. A major strategy is to develop analytical methods for certain limiting cases of neural network architectures (linear networks, vanishing depth-to-width ratio) in order to gain insight into properties of real networks that are "close" to such cases. The formalisms developed draw heavily on methods from theoretical physics that connect microscopic and macroscopic descriptions of many-particle systems, and they thereby expose interesting analogies between the behavior of deep networks and physical phenomena in such systems.
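To give a flavor of the limiting-case analysis mentioned above (a sketch in illustrative notation, not taken from the course materials): consider a fully connected layer of width $n$ with i.i.d. Gaussian weights,

$$
z_i^{(\ell+1)} = b_i^{(\ell)} + \sum_{j=1}^{n} W_{ij}^{(\ell)}\,\sigma\!\left(z_j^{(\ell)}\right),
\qquad
W_{ij}^{(\ell)} \sim \mathcal{N}\!\left(0,\tfrac{C_W}{n}\right).
$$

As $n \to \infty$, each preactivation $z_i^{(\ell+1)}$ is a sum of many weakly correlated terms and becomes Gaussian-distributed by central-limit reasoning; finite-width effects then enter as corrections controlled by the depth-to-width ratio $L/n$, which is the small parameter organizing the perturbative treatment in the Roberts–Yaida–Hanin framework.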
Prerequisites are a solid background in linear algebra, multivariate calculus and rudimentary probability theory.
| Rhythm | Day | Time | Format / Location | Period |
|---|---|---|---|---|
| Module | Course | Assessments |
|---|---|---|
| 39-M-Inf-VML Vertiefung Maschinelles Lernen | Vertiefung Maschinelles Lernen | Studieninformation |
The binding module descriptions contain further information, including on the "assessments" (Leistungen) and their requirements. If several forms of assessment are possible, the respective lecturers decide which applies.
A learning space (Lernraum) for this course exists in the e-learning system, where lecturers can provide materials for the course: