Short Description:
Facial expressions convey significant non-verbal information that is essential for communication and interaction. Current facial behavior analysis tools such as OpenFace [1] use machine learning methods to extract features such as facial landmarks and facial action units, but they often struggle with action units that have little or no representation in their training data. In this project, you will develop a facial motion tracking method based on optical flow. Building on this tracking algorithm, you will then develop a method to detect the activation of facial action units. To evaluate the effectiveness of your approach, you will also establish a methodology for comparing it against the facial action unit detection of state-of-the-art facial behavior analysis systems such as OpenFace [1].
[1] https://github.com/TadasBaltrusaitis/OpenFace
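The core idea of the project, tracking facial points from frame to frame with optical flow and deriving action unit activation from the resulting motion, could be prototyped roughly as follows. This is only a minimal sketch in Python/OpenCV, under assumptions of our own: the Lucas-Kanade tracker, the pixel-displacement threshold, and the comparison against a neutral reference frame are illustrative choices, not a prescribed solution.

```python
# Minimal sketch: sparse optical-flow tracking of facial points with OpenCV's
# Lucas-Kanade tracker, plus a naive displacement-based activation heuristic.
# The video path, landmark source, and threshold are illustrative assumptions.
import cv2
import numpy as np

LK_PARAMS = dict(winSize=(21, 21), maxLevel=3,
                 criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

def track_points(video_path, initial_points, activation_threshold=2.0):
    """Track facial points frame by frame and report per-frame displacements.

    initial_points: (N, 2) float32 array of facial landmark positions in the
    first frame (obtained from any landmark detector of your choice).
    """
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise IOError(f"Cannot read {video_path}")
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    prev_pts = initial_points.reshape(-1, 1, 2).astype(np.float32)
    reference = prev_pts.copy()  # neutral-face reference positions

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        next_pts, status, _ = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, prev_pts, None, **LK_PARAMS)
        # Displacement of each point relative to the neutral reference frame.
        displacement = np.linalg.norm(next_pts - reference, axis=2).ravel()
        # Naive "activation" heuristic: a point has moved further than the
        # threshold (in pixels) away from its neutral position.
        active = displacement > activation_threshold
        yield displacement, active, status.ravel()  # status == 0: lost track
        prev_gray, prev_pts = gray, next_pts
    cap.release()
```

In a full solution, the tracked points could be grouped per action unit (e.g. brow points for AU1/AU2, lip-corner points for AU12), displacements could be normalized by a face-size measure such as the inter-ocular distance, and the resulting activations compared against the action unit columns that OpenFace writes to its output CSV as part of the evaluation methodology.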
Required skills:
• Python and/or C++
• Working knowledge of Machine Learning Methods
In case this project does not attract enough interest for a team project, it will also be offered (in a reduced/modified form) as:
• an individual project
• a project for only 2 students
Module | Course | Requirements
---|---|---
39-M-Inf-GP Grundlagenprojekt Intelligente Systeme | weiteres Projekt (further project) | Ungraded examination
Student information
The binding module descriptions contain further information, including specifications on the "types of assignments" students need to complete. In cases where a module description mentions more than one kind of assignment, the respective member of the teaching staff will decide which task(s) to assign to the students.