Interactive Analysis of Virtual Jaw Movement Trajectories

Student Project
Master Thesis


In dentistry, prosthetic treatments are widely applied to patients missing their original teeth. Prosthetic devices such as dental implants or dentures can be fixed permanently or temporarily in the patient’s mouth. For dental technicians, it is important to confirm that the geometry of these dental prosthetic devices fits well to the geometry of the original antagonist teeth. To achieve this, dental technicians use a virtual articulator that simulates the natural movement of the patient’s jaws. Analyzing the trajectory of this movement over time can reveal potential problems, such as teeth colliding unintentionally and unnaturally. As the movement of the jaws repeats itself, this problem can be reformulated as finding certain patterns in a 4D trajectory. These patterns correspond to patient-specific movements. Selecting them for further inspection requires a tool that allows the user to perform this task on 3D trajectories over time while simultaneously viewing the virtual articulator side by side.


Your task is to develop a tool for selecting the movement trajectories of the virtual articulator. The trajectories are first converted into 4D space by adding a time component to their spatial coordinates. A single trajectory should then be projected into 2D space while preserving its spatial and temporal meaning. Different projection strategies from 4D to 2D should be investigated during the course of the project.
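To illustrate the two steps above, the following is a minimal C++ sketch of lifting a sampled 3D trajectory into 4D and applying one possible projection strategy (a linear map defined by two 4D basis vectors). All type and function names here are illustrative assumptions, not part of the existing code base; uniform time sampling is also assumed.

```cpp
#include <array>
#include <cstddef>
#include <vector>

using Point3 = std::array<double, 3>;
using Point4 = std::array<double, 4>;
using Point2 = std::array<double, 2>;

// Lift a sampled 3D trajectory into 4D by appending the sample time
// as the fourth coordinate (uniform sampling interval dt is assumed).
std::vector<Point4> liftTo4D(const std::vector<Point3>& traj, double dt) {
    std::vector<Point4> out;
    out.reserve(traj.size());
    for (std::size_t i = 0; i < traj.size(); ++i) {
        const Point3& p = traj[i];
        out.push_back({p[0], p[1], p[2], static_cast<double>(i) * dt});
    }
    return out;
}

// One possible projection strategy: a linear map from 4D to 2D given by
// two 4D basis vectors u and v (the rows of a 2x4 projection matrix).
std::vector<Point2> project4DTo2D(const std::vector<Point4>& traj,
                                  const Point4& u, const Point4& v) {
    std::vector<Point2> out;
    out.reserve(traj.size());
    for (const Point4& p : traj) {
        double x = 0.0, y = 0.0;
        for (int k = 0; k < 4; ++k) {
            x += u[k] * p[k];  // projection onto the first basis vector
            y += v[k] * p[k];  // projection onto the second basis vector
        }
        out.push_back({x, y});
    }
    return out;
}
```

For example, choosing u = {1, 0, 0, 0} and v = {0, 0, 0, 1} yields a plot of the x-coordinate against time; other basis choices, or nonlinear strategies, could preserve different aspects of the spatial and temporal structure.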


  • Interest in the topic (the most important requirement)
  • Good programming skills
  • Good English (the final report must be written in English)


Currently, there are no strict requirements for the implementation, but preliminary C++ code is already available. All further details about the implementation and the environment will be discussed at the beginning of the project. If this project is successfully completed, it can be continued in the form of a master thesis with a possible payment of 2.250 €.


For more information, please contact Aleksandr Amirkhanov.