Universität Bielefeld

Prof. Hanna Drimalla


Faculty of Technology / Multimodal Behavior Processing Group

Role

Head of the Multimodal Behavior Processing group

 
Email
drimalla@techfak.uni-bielefeld.de
Phone
+49 521 106-12043
Office
CITEC-3.108 (location/room plan)
Office hours

Thursday 13:00–14:00 (currently online only and with prior appointment booking)

Book an appointment in the eKVV >>

Curriculum Vitae

Since December 2020: Junior Professor for Multimodal Behavior Processing, Bielefeld University

2018 – 2020: Postdoctoral Researcher at the Digital Health Center of the Hasso Plattner Institute

2014 – 2019: PhD in Psychology, Humboldt University of Berlin. Research fellowship at the Social Cognition Group of the Berlin School of Mind and Brain. Thesis: "On Facial Mimicry and Empathy: Insights from Individuals with Autism Spectrum Conditions, Electromyography and Machine Learning"

2013 – 2017: Master of Computational Science, University of Potsdam. Computer science master's program focused on data analysis. Thesis: "Using Neural Networks to Detect Autism from Video Recordings of Virtual Interactions"

2008 – 2013: Bachelor & Master of Psychology (Cognitive Neuroscience), Ruhr University Bochum. Research-oriented study program with a focus on cognitive neuroscience. Thesis: "The Influence of Acute Stress on Trust Learning"

Current Research Topics

Automatic affect recognition: Existing affect recognition systems are trained on non-representative and unrealistic data sets. The first step towards better emotion recognition is therefore to collect a balanced video data set in as natural a situation as possible, using standardized test procedures. With this material we want to develop machine learning algorithms for affect recognition. Since the material is video data, innovative approaches to integrating different modalities (voice, facial expression, gaze behavior) can be used.
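One common way to integrate modalities is late fusion, where a separate model per modality outputs class probabilities that are then combined. The following is a minimal illustrative sketch, not the group's actual method; the emotion label set, modality names, and probability values are hypothetical.

```python
import numpy as np

# Hypothetical emotion label set for illustration.
EMOTIONS = ["neutral", "happy", "sad", "angry"]

def late_fusion(modality_probs, weights=None):
    """Combine per-modality class probabilities by weighted averaging.

    modality_probs: dict mapping modality name -> probability vector
                    over EMOTIONS (e.g. output of a per-modality model)
    weights:        optional dict of modality weights (default: uniform)
    Returns the fused label and the fused probability vector.
    """
    names = list(modality_probs)
    if weights is None:
        weights = {m: 1.0 for m in names}
    total = sum(weights[m] for m in names)
    fused = sum(weights[m] * np.asarray(modality_probs[m], dtype=float)
                for m in names) / total
    return EMOTIONS[int(np.argmax(fused))], fused

# Example: face strongly suggests "happy"; voice and gaze are ambiguous.
label, fused = late_fusion({
    "face":  [0.05, 0.80, 0.10, 0.05],
    "voice": [0.30, 0.40, 0.20, 0.10],
    "gaze":  [0.40, 0.30, 0.20, 0.10],
})
```

A design note: late fusion is robust when one modality is missing or noisy, whereas early fusion (concatenating raw features) can exploit cross-modal correlations but needs all modalities present.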

Computer-based stress measurement: Stress measurement has so far relied mainly on self-report or on individual parameters of the physiological response. In different stress paradigms we record the non-verbal behavior of participants together with physiological markers. Based on these multimodal data, we develop algorithms for automatic stress detection and validate them in natural environments.
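A simple instance of such a detector is a logistic model over multimodal stress markers. The sketch below is purely illustrative: the feature names and weights are invented placeholders, not fitted values from any study.

```python
import math

# Hypothetical feature weights (placeholders, not fitted values).
WEIGHTS = {
    "heart_rate_delta": 0.04,   # physiological: bpm above resting baseline
    "skin_conductance": 0.8,    # physiological: normalized electrodermal level
    "self_touch_rate": 0.5,     # non-verbal: self-touch gestures per minute
    "speech_pause_ratio": 1.2,  # non-verbal: fraction of silence while speaking
}
BIAS = -2.0  # shifts the default (no-signal) score below 0.5

def stress_score(features):
    """Logistic stress score in [0, 1] from multimodal markers.

    features: dict of feature name -> value; missing features count as 0.
    """
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# Example: elevated heart rate and frequent self-touch yield a high score.
score = stress_score({
    "heart_rate_delta": 25,
    "skin_conductance": 1.5,
    "self_touch_rate": 3,
    "speech_pause_ratio": 0.4,
})
```

In practice the weights would be learned from labeled recordings (e.g. from a standardized stress paradigm) rather than set by hand.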

Analysis of social embedding: To assess a person's social integration, clinicians and researchers often rely on questionnaires. To capture social integration more sensitively and objectively, we want to develop an automatic analysis of online social interaction data. In a large online study, we compare the sensitivity of this approach to classical questionnaires. Furthermore, we use machine learning to identify characteristic interaction patterns that predict a respondent's social embedding and resilience.
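Before any prediction step, interaction logs must be turned into features describing a person's social network. The following sketch, with a hypothetical feature set chosen for illustration, extracts simple embedding indicators for one focal user from a list of (sender, receiver) message pairs.

```python
from collections import Counter

def interaction_features(messages, ego="ego"):
    """Extract simple social-embedding features for one focal user.

    messages: list of (sender, receiver) pairs from an interaction log
    ego:      identifier of the focal user
    Returns a dict of illustrative features: number of distinct contacts,
    share of reciprocal ties, and the ratio of sent to received messages.
    """
    sent = Counter(r for s, r in messages if s == ego)      # ego -> contact
    received = Counter(s for s, r in messages if r == ego)  # contact -> ego
    contacts = set(sent) | set(received)
    reciprocal = {c for c in contacts if sent[c] and received[c]}
    return {
        "n_contacts": len(contacts),
        "reciprocity": len(reciprocal) / len(contacts) if contacts else 0.0,
        "out_in_ratio": sum(sent.values()) / max(1, sum(received.values())),
    }

# Example log: ego exchanges messages with "a", writes to "b", hears from "c".
feats = interaction_features([
    ("ego", "a"), ("a", "ego"), ("ego", "b"), ("c", "ego"),
])
```

Feature vectors of this kind could then serve as input to a classifier or regressor predicting questionnaire-based embedding scores.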
