Eye-tracking offers a window into the subtle social attention patterns that differentiate Autism Spectrum Condition (ASC) from neurotypical behavior. While many research and clinical teams rely on open-source, webcam-based tools (e.g., OpenFace, L2CS-Net), these methods have never been rigorously compared against ground-truth EyeLink recordings in controlled settings. This project will collect parallel eye-tracking data and evaluate how well popular open-source pipelines estimate gaze position, fixations, and saccades under varying conditions of lighting, accessories (e.g., glasses), and motion. By quantifying accuracy, precision, and robustness, the student's work will inform best practices for scalable, low-cost gaze assessment in ASC research and beyond.
Objectives & Tasks:
- Develop and benchmark two open-source gaze-tracking pipelines (e.g., OpenFace, MediaPipe) against EyeLink ground truth by extracting and comparing gaze coordinates, fixations, and saccades under varied lighting and motion conditions.
- Quantify spatial and temporal accuracy metrics (illustrated in the sketch after this list).
- Produce visualizations (error heatmaps, time-series overlays) that clearly highlight each method's strengths and limitations.
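To make the metrics step concrete, here is a minimal sketch of how spatial accuracy and RMS sample-to-sample precision could be computed against EyeLink ground truth. It assumes both recordings are exported to CSV with time-sorted columns t (seconds) and x, y (degrees of visual angle); the file names, column names, and helper functions are hypothetical illustrations, not part of the project brief.

```python
"""Sketch: spatial accuracy / precision of a webcam gaze pipeline vs. EyeLink.
Assumes both exports are CSVs with time-sorted columns t (s) and x, y (deg);
file names, column names, and helpers are hypothetical."""
import numpy as np
import pandas as pd

def load_gaze(path: str) -> pd.DataFrame:
    # Expected columns: t (seconds), x, y (degrees of visual angle)
    return pd.read_csv(path).dropna(subset=["t", "x", "y"])

def align(est: pd.DataFrame, ref: pd.DataFrame) -> pd.DataFrame:
    # Linearly resample the lower-rate webcam estimate onto EyeLink timestamps
    t = ref["t"].to_numpy()
    return pd.DataFrame({
        "t": t,
        "x_est": np.interp(t, est["t"], est["x"]),
        "y_est": np.interp(t, est["t"], est["y"]),
        "x_ref": ref["x"].to_numpy(),
        "y_ref": ref["y"].to_numpy(),
    })

if __name__ == "__main__":
    aligned = align(load_gaze("openface_gaze.csv"),  # hypothetical export
                    load_gaze("eyelink_gaze.csv"))   # hypothetical export
    # Spatial accuracy: mean Euclidean offset from ground truth, in degrees
    err = np.hypot(aligned["x_est"] - aligned["x_ref"],
                   aligned["y_est"] - aligned["y_ref"])
    # Precision: RMS of sample-to-sample dispersion in the estimated gaze
    rms_s2s = np.sqrt(np.mean(np.diff(aligned["x_est"]) ** 2
                              + np.diff(aligned["y_est"]) ** 2))
    print(f"accuracy {err.mean():.2f} deg, precision (RMS-S2S) {rms_s2s:.2f} deg")
```

Resampling onto the EyeLink timeline is one way to handle the sampling-rate mismatch (roughly 30 Hz webcam vs. up to 1000 Hz EyeLink); assessing temporal accuracy would additionally require a shared synchronization event or cross-correlation of the two streams.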
Skills & Requirements (concise):
Proficiency in Python for data processing and plotting; familiarity with eye-tracking concepts (fixations, saccades); statistical-analysis skills; and strong documentation and code-organization practices to deliver reproducible evaluation scripts.
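As background on the eye-tracking concepts named above, fixation and saccade segmentation is often bootstrapped with a simple velocity-threshold (I-VT) rule. The sketch below labels each gaze sample accordingly; the 30 deg/s threshold is a commonly used default, and the function name and synthetic data are illustrative assumptions rather than a prescribed method.

```python
"""Minimal I-VT (velocity-threshold) classifier sketch; the 30 deg/s
threshold is a common default, not a project requirement."""
import numpy as np

def classify_ivt(t, x, y, threshold_deg_s=30.0):
    """Label each sample 'saccade' or 'fixation' by angular velocity (I-VT)."""
    # Instantaneous velocity between successive samples (deg/s)
    vel = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)
    labels = np.where(vel > threshold_deg_s, "saccade", "fixation")
    # Repeat the last label so output aligns one-to-one with the input samples
    return np.append(labels, labels[-1])

# Example on synthetic data: fixation, a 10-degree jump, fixation again
rng = np.random.default_rng(0)
t = np.arange(0.0, 0.3, 0.01)                    # 30 samples at 100 Hz
x = np.where(t < 0.15, 0.0, 10.0) + rng.normal(0.0, 0.05, t.size)
y = np.zeros_like(x)
print(classify_ivt(t, x, y))                     # one 'saccade' at the jump
```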
| Frequency | Day | Time | Format / Location | Period |
|---|---|---|---|---|
| by arrangement | by arrangement (n.V.) | | | 13.10.2025-06.02.2026 |
| Module | Course | Assessment |
|---|---|---|
| 39-M-Inf-P Projekt | Projekt | ungraded examination (unbenotete Prüfungsleistung) |
Study information

The binding module descriptions contain further information, including on the "assessments" and their requirements. If several forms of assessment are possible, the respective instructors decide which applies.