The idea is to create a virtual environment in which a player performs tasks and receives achievements corresponding to the effort of their movements. The target group of this project could be people whose movement is limited by a disability.
For this scenario the player needs a head-mounted display (HMD) to interact with the virtual environment. The player acts in a first-person view (FPV) and can see their own movements, such as those of the arms and feet. Objects appear in relation to the body, and the challenge is to touch them with different body parts.
The arm length needs to be measured before a game session starts; this value is used in a script to calculate each object's position relative to the body. Further parameters increase or decrease the difficulty of the session: a value of 80 %, for example, magnifies the view by 20 %, which makes the session easier. An input file facilitates the creation of different game scenarios and difficulty levels. When a session is finished, a protocol logs precision and speed and generates a report.
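A minimal sketch of how such an input file could be read, assuming a JSON file; the field names and the interpretation of the difficulty value as a scale factor on the spawn distance are assumptions, not part of the proposal:

```python
import json
from dataclasses import dataclass

@dataclass
class SessionConfig:
    arm_length_cm: float  # measured before the session starts
    difficulty: float     # hypothetical: 0.8 = easier, 1.05 = harder

def load_config(path: str) -> SessionConfig:
    """Read session parameters from a JSON input file (assumed format)."""
    with open(path) as f:
        data = json.load(f)
    return SessionConfig(
        arm_length_cm=float(data["arm_length_cm"]),
        difficulty=float(data["difficulty"]),
    )

def reach_distance(cfg: SessionConfig) -> float:
    """Distance at which objects are spawned, relative to the body.

    With difficulty = 0.8 an object sits at 80 % of the arm length,
    so the session is easier; 1.05 pushes it 5 % beyond full reach.
    """
    return cfg.arm_length_cm * cfg.difficulty
```

Keeping these parameters in a file rather than in the script itself makes it easy to prepare several scenarios and difficulty levels in advance.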
This report can be used to adjust the script: if the player manages to reach all objects, the same objects can be reused at a higher difficulty such as 105 %.
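One possible adjustment rule, sketched under the assumption that the report records how many objects were reached; the field names and the 5 % step are hypothetical:

```python
def next_difficulty(report: dict, current: float, step: float = 0.05) -> float:
    """Raise or lower the difficulty based on the last session report."""
    if report["reached"] == report["total"]:
        return current + step  # all objects reached: reuse them at e.g. 105 %
    if report["reached"] < report["total"] / 2:
        return current - step  # fewer than half reached: make it easier
    return current
```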
The implementation challenges are, on the one hand, the initialization of the possible range of movement: independent of arm length, some players cannot reach certain positions, and objects may also be reachable but not visible because of limited head movement. On the other hand, if the player cannot reach an object, the game engine should detect this and recalculate the object positions.
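A sketch of such a detection, assuming the engine exposes the tracked hand position as a polling function; the function name, timeout, and touch radius are all assumptions:

```python
import math
import time

def distance(a, b):
    """Euclidean distance between two (x, y, z) points in metres."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def wait_for_touch(get_hand_position, target, timeout_s=10.0, radius_m=0.05):
    """Return True if the hand touches the target within the timeout.

    If the timeout expires, the caller should treat the target as
    unreachable and recalculate its position (e.g. spawn it closer).
    """
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if distance(get_hand_position(), target) < radius_m:
            return True
        time.sleep(0.05)
    return False
```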
The prototype will include a virtual environment with a character in FPV. The real movement is mapped onto a virtual body, objects are created in relation to that body, and they play an animation when touched. A report will be generated, and a script allows the object positions to be changed.
Course of action:
• Create a scene with some 3D objects and a character. The character performs a scripted animation, and the scene can be viewed with an Oculus Rift DK2
• Map the tracked movement onto the character in the scene. A Kinect 2 is used to achieve this
• Change the camera to the character's FPV
• Spawn objects in relation to the body and its movement
• Integrate the arm-length parameter to fit the distance from body to hand or fingers
• Generate a protocol with information about the session (see the logging sketch after this list)
• Calculate object distances with a Python script that also respects the arm length and the level of difficulty (a placement sketch follows this list)
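The following sketch shows one way such a Python script could place a target on a sphere around the body, with the radius scaled by arm length and difficulty; the angle ranges are assumptions and would need to be tuned to the player's actual range of movement:

```python
import math
import random

def object_position(arm_length_cm, difficulty, rng=None):
    """Pick a target position relative to the body origin (e.g. the shoulder).

    The target lies on a sphere whose radius is the arm length scaled
    by the difficulty factor (0.8 = easier, 1.05 = harder).
    """
    rng = rng or random.Random()
    r = arm_length_cm * difficulty
    azimuth = rng.uniform(-math.pi / 3, math.pi / 3)    # left/right, assumed range
    elevation = rng.uniform(-math.pi / 6, math.pi / 3)  # down/up, assumed range
    x = r * math.cos(elevation) * math.sin(azimuth)
    y = r * math.sin(elevation)
    z = r * math.cos(elevation) * math.cos(azimuth)
    return (x, y, z)
```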
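For the protocol, a minimal logging sketch; interpreting "precision" as the closest distance the hand came to a target and "speed" as the time until it was touched is an assumption, as is the CSV format:

```python
import csv

class SessionProtocol:
    """Collects one row per target and writes a simple CSV report."""

    def __init__(self):
        self.rows = []

    def log(self, target_id, reached, best_distance_m, seconds):
        self.rows.append({
            "target": target_id,
            "reached": reached,
            "precision_m": round(best_distance_m, 3),
            "time_s": round(seconds, 2),
        })

    def write(self, path):
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(
                f, fieldnames=["target", "reached", "precision_m", "time_s"])
            writer.writeheader()
            writer.writerows(self.rows)
```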
Module | Course | Assessment
---|---|---
39-M-Inf-GP Grundlagenprojekt Intelligente Systeme | Gruppenprojekt (group project) | ungraded examination (unbenotete Prüfungsleistung)
 | weiteres Projekt (further project) | ungraded examination (unbenotete Prüfungsleistung)
The binding module descriptions contain further information, including on the assessments ("Leistungen") and their requirements. If several forms of assessment are possible, the respective lecturers decide which applies.