392174 Projekt "Gaze-based Interaction in Virtual Reality" (PjS) (SoSe 2015)

Content, commentary

In this project, the theoretical and practical knowledge acquired in the Virtual Reality lecture and tutorial is deepened and applied in a practical application.

The specific project title is:
Gaze-based Interaction in Virtual Reality

Language: The project may be held in German or English, depending on the participants.

General Goal of the Project
In this project we want to create a virtual environment in which we can prototype gaze-based human-computer interaction. A central idea is that of a mobile cognitive assistant that is able to recognize everyday situations in which a digital assistant may help us.

An Attentive Shopping Assistant for the Supermarket
The concrete application idea we want to realize in the project is that of a mobile assistant for shopping situations.
When shopping for complex products, such as muesli, it can be very time consuming to find the optimal product (unless you have already settled on a certain product and are "habituated" in your decision): there are several features along which a decision can be optimized (price, calories, sugar, brand, taste, ingredients, way of production, e.g. organic) and there may be further constraints, such as allergies to nuts. A typical supermarket shelf may hold 50 to 200 products of a kind, which leads to complex decision problems. In addition, some information is not directly accessible or visible, so packages need to be picked up and turned around…

The idea is to use visual attention to detect which products the user is currently attending to, and to build an assistant system on this information that provides relevant information to the user, e.g. by augmenting the shelves with small information bubbles. There are many ways to realize such visualizations, and it is difficult to test them under real conditions in a real supermarket.

To this end, we will model a supermarket shelf with a rich selection of products of a chosen category (muesli is convenient because many packages are similar in shape and thus easy to model). The user will be able to walk through the virtual supermarket in the CAVE and will be equipped with an eye-tracking system that monitors her gaze. Based on the gaze information and the knowledge about the product placements, we will then be able to identify the product currently in focus in real time. The technology for this is already in place and has been tested in other scenarios; we will thus merely need to adapt it to our application example.
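As an illustration of this mapping, the product in focus can be determined by intersecting the gaze ray with the shelf front and looking up the hit position in the known product layout. The following is a minimal sketch; the planar shelf model, the dimensions, and the product names are assumptions for illustration, not the actual lab pipeline:

```javascript
// Sketch: map a gaze ray to the product currently in focus.
// The shelf front is modeled as the plane z = 0, divided into a grid
// of cells; layout and product names are illustrative only.

const shelf = {
  origin: { x: 0, y: 1.0 },   // lower-left corner of the shelf front (metres)
  cellWidth: 0.25,            // width of one product slot
  cellHeight: 0.35,           // height of one shelf board
  products: [                 // products[row][column], row 0 = bottom
    ["muesli-A", "muesli-B", "muesli-C"],
    ["muesli-D", "muesli-E", "muesli-F"],
  ],
};

// Intersect a gaze ray (eye position + gaze direction) with the shelf
// plane and return the product id of the hit cell, or null on a miss.
function productAtGaze(eye, dir) {
  if (dir.z >= 0) return null;          // looking away from the shelf
  const t = -eye.z / dir.z;             // ray parameter at the plane z = 0
  const hitX = eye.x + t * dir.x;
  const hitY = eye.y + t * dir.y;
  const col = Math.floor((hitX - shelf.origin.x) / shelf.cellWidth);
  const row = Math.floor((hitY - shelf.origin.y) / shelf.cellHeight);
  const rowProducts = shelf.products[row];
  return (rowProducts && rowProducts[col]) || null;
}
```

In the real system the gaze direction would come from the binocular eye tracker and the head pose from the CAVE tracking; here both are plain vectors so the lookup logic can be tested in isolation.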

Based on the supermarket and the gaze analysis, we will then design different assistant functions and implement several ways of visualizing information for the user.

Typical tasks we could support are:
- Finding a particular product by, e.g.
○ highlighting it in the shelf, or
○ presenting arrows pointing in the direction of the product, or
○ drawing a line from the current point of regard towards the target, etc…
- Finding a selection of products according to a set of preferences, e.g.
○ showing all products that do or do not contain nuts
- Comparing two products
○ e.g. by depicting their relevant features on side-by-side information sheets decoupled from their position in the shelf
- Retrieving additional information about a product, e.g.
○ retrieving user comments from an online platform
○ finding and depicting test results
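The preference-based tasks above boil down to filtering the product catalogue with a predicate and highlighting the matching shelf positions. A minimal sketch, with made-up product data and attribute names:

```javascript
// Sketch: select the products to highlight for a preference-based task,
// e.g. "show all products that do not contain nuts". The catalogue and
// its attributes are invented for illustration.

const products = [
  { id: "muesli-A", price: 2.49, ingredients: ["oats", "raisins"] },
  { id: "muesli-B", price: 3.19, ingredients: ["oats", "hazelnuts"] },
  { id: "muesli-C", price: 1.99, ingredients: ["oats", "almonds", "honey"] },
];

// Return the ids of all products matching a predicate; the ids would
// then be handed to the renderer to highlight the shelf positions.
function selectProducts(catalogue, predicate) {
  return catalogue.filter(predicate).map((p) => p.id);
}

const nuts = ["hazelnuts", "almonds", "peanuts"];
const nutFree = selectProducts(
  products,
  (p) => !p.ingredients.some((i) => nuts.includes(i))
);
// nutFree === ["muesli-A"]
```

The same function covers the other tasks by swapping the predicate, e.g. `(p) => p.price < 3` for a price constraint.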

The real-time access to eye-gaze information will be used in several ways:
- To identify the product of relevance
- To detect whether the user is attending to the presented information (and, e.g., fade it in more strongly or make it persist)
- To organize the visual presentation, so that augmented information displays do not occlude relevant products or other information
- To let the user trigger actions, such as retrieval processes
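Gaze-triggered actions are commonly realized via dwell time: an action fires once a product has been fixated for a minimum duration. A small sketch of such a trigger, with an assumed sample format and threshold:

```javascript
// Sketch: trigger an action (e.g. start retrieving extra product
// information) once a product has been fixated long enough.
// Threshold and sample format are illustrative assumptions.

const DWELL_THRESHOLD_MS = 800; // how long a product must be fixated

// Feed in a stream of { t, product } gaze samples (t in ms, product id
// or null) and collect the products whose uninterrupted dwell time
// crosses the threshold.
function detectDwellTriggers(samples) {
  const triggered = [];
  let current = null;   // product currently looked at
  let since = 0;        // timestamp when the current fixation started
  let fired = false;    // avoid firing twice for one fixation
  for (const { t, product } of samples) {
    if (product !== current) {
      current = product; // gaze moved: restart the dwell timer
      since = t;
      fired = false;
    } else if (!fired && product && t - since >= DWELL_THRESHOLD_MS) {
      triggered.push(product);
      fired = true;
    }
  }
  return triggered;
}
```

Brief glances stay below the threshold and fire nothing, which keeps accidental activations ("Midas touch") in check; tuning the threshold would be one subject of the evaluation phase.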

Procedure
The project follows the process of scientific research.

Phase 1 (April): State of the Art and Planning
In the beginning, all participants will discuss the current state of the art regarding gaze-based interaction and augmented information visualization. Relevant scientific questions will then be formulated and the application design will be developed.

Phase 2 (May-June): Implementation
Depending on the number of participants, groups will be formed which work on a dedicated aspect of the implementation of the design. The planned interaction methods will be implemented.

Phase 3 (June-July): Evaluation
The implemented system will be evaluated in a small user study.

Remarks:
The project makes it easy for students of different disciplines and of different backgrounds to complement each other. While participants of the lecture on Virtual Reality may focus on the design and implementation of the interaction methods, participants of the lecture "Mensch-Maschine-Interaktion" (Human Machine Interaction) may develop the interaction concept and evaluate the implementation.

As a conclusion of the project, a report will be written. This report will take the shape of a scientific paper, which might even be submitted to a real scientific workshop, such as the German Workshop for Virtual and Augmented Reality. Participants of the project, in particular those in a master's program, will learn how to structure and write a scientific article. If the article is accepted, participants will also have the chance to present their work at a real scientific workshop (at their own expense).

This project may lay the foundation for many interesting questions or application problems which could be addressed in a bachelor or master thesis.

Prerequisites, required prior knowledge

  • Lecture on Virtual Reality OR Human-Machine-Interaction
  • Participation in the tutorials on Virtual Reality is recommended
  • This project requires advanced programming skills in X3D/JavaScript beyond what has been covered in the VR tutorials, or the willingness to dedicate additional time to advancing one's skills.

Literature

  • Brutzman, D., & Daly, L. X3D: Extensible 3D Graphics for Web Authors.
  • Kloss, J. H. X3D: Programmierung interaktiver 3D-Anwendungen für das Internet.
  • Pfeiffer, T., & Mattar, N. (2009). Benefits of locating overt visual attention in space using binocular eye tracking for mixed reality applications, http://pub.uni-bielefeld.de/download/1894565/2487426
  • Pfeiffer, J., Meißner, M., Prosiegel, J., & Pfeiffer, T. (2014). Classification of Goal-Directed Search and Exploratory Search Using Mobile Eye-Tracking. Proceedings of the International Conference on Information Systems 2014 (ICIS 2014). http://aisel.aisnet.org/icis2014/proceedings/GeneralIS/7/
  • Losing, V., Rottkamp, L., Zeunert, M., & Pfeiffer, T. (2014). Guiding Visual Search Tasks Using Gaze-Contingent Auditory Feedback. UbiComp'14 Adjunct: The 2014 ACM Conference on Ubiquitous Computing Adjunct Publication, 1093–1102. http://dx.doi.org/10.1145/2638728.2641687


Module assignments

Module                                 Course                  Assessment
23-MeWi-HM5 Praxis-Umgang mit Medien   Lehrveranstaltung I     graded examination
                                       Lehrveranstaltung II    study achievement (Studienleistung)
                                       Lehrveranstaltung III   study achievement
                                       Lehrveranstaltung IV    study achievement
39-Inf-VR Virtuelle Realität           Virtuelle Realität      graded examination
                                       Virtuelle Realität      ungraded examination

The binding module descriptions contain further information, including on the assessments and their requirements. If several forms of assessment are possible, the respective lecturers decide which apply.

Degree programs / offerings

  • Bioinformatik und Genomforschung / Bachelor (enrollment until SoSe 2011): Virtuelle Realität; Mensch-Maschine-Interaktion (compulsory elective, 6th semester, 4 CP, ungraded)
  • Informatik / Bachelor (enrollment until SoSe 2011), minor subject: Virtuelle Realität; Mensch-Maschine-Interaktion (compulsory elective, 6th semester, 4 CP, ungraded)
  • Kognitive Informatik / Bachelor (enrollment until SoSe 2011): Mensch-Maschine-Interaktion; Virtuelle Realität (compulsory elective, 6th semester, 4 CP, ungraded)
  • Medieninformatik und Gestaltung / Bachelor (enrollment until SoSe 2011): Mensch-Maschine-Interaktion; Virtuelle Realität (compulsory elective, 6th semester, 4 CP, ungraded)
  • Medienwissenschaft, interdisziplinäre / Master (enrollment until SoSe 2014): Hauptmodul 4; Hauptmodul 6 (compulsory elective, 4 CP, ungraded)
  • Naturwissenschaftliche Informatik / Bachelor (enrollment until SoSe 2011): Mensch-Maschine-Interaktion; Virtuelle Realität (compulsory elective, 6th semester, 4 CP, ungraded)
  • Studieren ab 50

  • practical work in the project (design, implementation and evaluation of an immersive interface in virtual reality)
  • significant contributions to the team
  • each team writes a report on their work (10 to 15 pages, not graded)
Last change to basic data/lecturers: Friday, 11 December 2015
Last change to times: Friday, 24 April 2015
Last change to rooms: Friday, 24 April 2015
Type(s) / SWS: PjS / 4
Institution: Technische Fakultät
Link to this course
The following link uses the course ID and is always unique:
https://ekvv.uni-bielefeld.de/kvv_publ/publ/vd?id=53962916
ID: 53962916