Unpredictable user behavior is one of the main obstacles to free exploration of large immersive environments. A key challenge in VR is that both the real and the virtual world must be considered jointly to ensure safe navigation and redirection. If the physical environment contains multiple obstacles besides walls, manipulating the user's attention, and thereby their actions, can be especially beneficial.
An accurate prediction of the user's future moves or actions could drastically improve the quality of the experience, reduce the probability of collisions with obstacles, and enable better haptic interaction.
Several subtopics are available:
- prediction of the locomotion direction: where the user intends to walk,
- prediction of whether the user intends to make large upper body movements such as torso rotation, bowing or leaning forward,
- fast prediction of hand motions in the proximity of a robotic arm, based on RGB-D data,
- prediction of the touch interaction with elements of the virtual environment, e.g., if a user intends to touch a wall or a chair.
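To make the first subtopic concrete, the simplest baseline is dead reckoning: assume the user keeps their current walking velocity. The sketch below is illustrative only (written in Python for brevity, although the project itself targets C# in Unity; the function name and the 2-D floor-plane representation are assumptions, not part of the project specification). It extrapolates recent head-tracking positions a short horizon into the future; a real implementation would additionally filter sensor noise, e.g. with a Kalman filter or a learned model.

```python
def predict_heading(positions, timestamps, horizon):
    """Constant-velocity (dead-reckoning) baseline.

    positions  -- list of (x, z) floor-plane head positions, oldest first
    timestamps -- matching sample times in seconds
    horizon    -- how far ahead (in seconds) to extrapolate
    Returns the predicted (x, z) position after `horizon` seconds.
    """
    (x0, z0), (x1, z1) = positions[-2], positions[-1]
    dt = timestamps[-1] - timestamps[-2]
    # Finite-difference velocity estimate from the two newest samples.
    vx, vz = (x1 - x0) / dt, (z1 - z0) / dt
    # Linear extrapolation: assume the velocity stays constant.
    return (x1 + vx * horizon, z1 + vz * horizon)
```

For a user walking at 1 m/s along x, sampled 0.1 s apart, `predict_heading([(0.0, 0.0), (0.1, 0.0)], [0.0, 0.1], 0.5)` predicts a position of roughly (0.6, 0.0). Such a baseline is a useful reference point when evaluating the more sophisticated predictors from the literature.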
Please confirm your specific topic with the supervisor.
Relying on the most relevant and widely cited sources, such as books and international research publications, you will need to make a well-argued suggestion of a good practical or optimal solution for our specific setup and use case. The chosen algorithm(s) will then be implemented and tested in Unity (C#).
- Knowledge of the English language (source code comments and the final report must be in English)
- Familiarity with Unity3D is advantageous
- Programming languages: C#, C++
- Game engine: Unity3D