
Prof. Dr. Hanna Drimalla


Contact

Faculty of Technology / Research Groups / Multimodal Behavior Processing Group

Function

Head of the "Multimodal Behavior Processing" Group

E-Mail
drimalla@techfak.uni-bielefeld.de  
Phone
+49 521 106-12043  
Phone No. of Secretary
+49 521 106-6891
Office
CITEC-3.108
Office Hours

Tuesday 13:00–14:00 (by prior appointment only)


Curriculum Vitae

Since December 2020: Junior Professor for Multimodal Behavior Processing at Bielefeld University

2018 – 2020: Postdoctoral Researcher at the Digital Health Center of the Hasso Plattner Institute

2014 – 2019: PhD in Psychology, Humboldt University of Berlin. Research fellowship at the Social Cognition Group of the Berlin School of Mind and Brain. Thesis: "On Facial Mimicry and Empathy: Insights from individuals with autism spectrum conditions, electromyography and machine learning"

2013 – 2017: Master of Computational Science, University of Potsdam. Computer science Master's program with a focus on data analysis. Thesis: "Using Neural Networks to Detect Autism from Video Recordings of Virtual Interactions"

2008 – 2013: Bachelor and Master of Psychology (Cognitive Neuroscience), Ruhr University Bochum. Research-oriented study program with a focus on cognitive neuroscience. Thesis: "The Influence of Acute Stress on Trust Learning"

Current research topics

Automatic affect recognition: Existing approaches to affect recognition rely on non-representative and unrealistic data sets. The first step towards better emotion recognition is therefore to collect a balanced video data set under conditions as natural as possible, using standardized test procedures. With this material we want to develop machine-learning algorithms for affect recognition. Since the material is video data, innovative approaches to integrating different modalities (voice, facial expression, gaze behavior) can be applied.
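As a loose illustration of one way such modalities can be integrated, the sketch below combines per-modality class probabilities by weighted averaging (so-called late fusion). All scores, class labels, and modality names here are hypothetical, not outputs of any real model:

```python
import numpy as np

# Illustrative emotion classes; a real system would use its own label set.
CLASSES = ["neutral", "happy", "stressed"]

def late_fusion(modality_probs, weights=None):
    """Combine per-modality class probabilities by a weighted average."""
    probs = np.asarray(modality_probs, dtype=float)
    if weights is None:
        weights = np.ones(len(probs)) / len(probs)  # equal weight per modality
    fused = np.average(probs, axis=0, weights=weights)
    return fused / fused.sum()  # renormalize for safety

# Hypothetical softmax outputs for one video clip:
voice = [0.2, 0.6, 0.2]   # speech prosody model
face  = [0.1, 0.7, 0.2]   # facial expression model
gaze  = [0.3, 0.4, 0.3]   # gaze behavior model

fused = late_fusion([voice, face, gaze])
print(CLASSES[int(np.argmax(fused))])  # prints "happy"
```

Late fusion is only one option; intermediate or attention-based fusion would combine modality representations before classification instead.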

Computer-based stress measurement: Stress measurement has so far relied mainly on self-report or on single parameters of the physiological response. In various stress paradigms we want to record participants' non-verbal behavior together with physiological markers. Based on these multimodal data, we develop algorithms for automatic stress detection and validate them in natural environments.
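A minimal sketch of how multimodal features might feed such a detector: a logistic score over combined physiological and behavioral features. The feature names, weights, and bias below are invented for illustration; a real model would be trained on recorded data.

```python
import numpy as np

# Hypothetical feature vector per time window:
# [mean_heart_rate_bpm, skin_conductance_uS, speech_rate_wps, fidget_score]
def stress_score(features, weights, bias):
    """Logistic score: a probability-like stress estimate from multimodal features."""
    z = float(np.dot(features, weights) + bias)
    return 1.0 / (1.0 + np.exp(-z))

# Hand-set parameters purely for illustration (not trained values).
w = np.array([0.05, 0.8, 0.3, 0.5])
b = -7.0

calm     = np.array([65.0, 2.0, 2.5, 0.1])
stressed = np.array([95.0, 6.0, 4.0, 0.8])

print(stress_score(calm, w, b), stress_score(stressed, w, b))
```

The stressed window scores markedly higher than the calm one; validation in natural environments would then check whether such scores track ground-truth stress markers.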

Analysis of social embedding: To assess a person's social integration, clinicians and researchers often rely on questionnaires. To capture social integration more sensitively and objectively, we want to develop an automatic analysis of online social interaction data. In a large online study, we compare the sensitivity of this approach with that of classical questionnaires. Furthermore, we use machine learning to identify characteristic and informative interaction patterns for predicting a respondent's social embedding and resilience.
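One way interaction patterns of this kind could be quantified, sketched on a hypothetical message log (the names, log, and feature choices are illustrative only, not the study's actual features):

```python
# Hypothetical interaction log: (sender, receiver) message events.
log = [("A", "B"), ("B", "A"), ("A", "C"), ("C", "A"), ("A", "D"), ("E", "A")]

def interaction_features(person, events):
    """Simple interaction-pattern features for one person: out-degree,
    in-degree, and reciprocity (share of contacts messaged in both directions)."""
    sent = {r for s, r in events if s == person}
    received = {s for s, r in events if r == person}
    contacts = sent | received
    reciprocal = sent & received
    return {
        "out_degree": len(sent),
        "in_degree": len(received),
        "reciprocity": len(reciprocal) / len(contacts) if contacts else 0.0,
    }

print(interaction_features("A", log))
# prints {'out_degree': 3, 'in_degree': 3, 'reciprocity': 0.5}
```

Features like these could then serve as inputs to a model predicting social embedding, to be compared against questionnaire scores.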
