Modern human-computer interaction concepts draw on a variety of input modalities, including (but not limited to) body gestures, eye movements, and even physiological signals. Head and eye movements are of particular interest because they are closely coupled when we explore our visual environment. It is therefore not surprising that a growing number of research projects are taking steps to incorporate head and eye movements into fully integrated interaction concepts. The current project joins these efforts by connecting head- and eye-tracking frameworks. Project participants will have the opportunity to choose between various techniques and may draw on several open-source libraries to establish a solid interaction concept. They will investigate which tracking approach (IR, marker, IMU, or similar) is most suitable and can be realized with the eye tracker's SDK. The overall objective is to build a proof of concept for target selection and to test it in a pilot study.
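To illustrate the core idea of combining head and eye tracking for target selection, here is a minimal sketch. It assumes (hypothetically) that the head tracker delivers yaw and pitch angles (e.g., from an IMU) and the eye tracker delivers an eye-in-head gaze direction vector; neither API is specified by the project description, so all names and signatures below are placeholders. The sketch rotates the gaze vector into world coordinates and selects the target best aligned with the combined direction:

```python
import math

def rotate(v, yaw, pitch):
    """Rotate an eye-in-head direction vector into world coordinates,
    applying head pitch (around the x-axis), then yaw (around the y-axis).
    Angles are in radians; v is assumed to be a unit vector."""
    x, y, z = v
    y, z = y * math.cos(pitch) - z * math.sin(pitch), y * math.sin(pitch) + z * math.cos(pitch)
    x, z = x * math.cos(yaw) + z * math.sin(yaw), -x * math.sin(yaw) + z * math.cos(yaw)
    return (x, y, z)

def select_target(head_yaw, head_pitch, eye_dir, targets):
    """Return the index of the target (3D point) best aligned with the
    combined head-plus-eye gaze direction, i.e., maximum cosine similarity."""
    gx, gy, gz = rotate(eye_dir, head_yaw, head_pitch)
    best, best_cos = None, -2.0
    for i, (tx, ty, tz) in enumerate(targets):
        norm = math.sqrt(tx * tx + ty * ty + tz * tz) or 1.0
        cos_sim = (gx * tx + gy * ty + gz * tz) / norm
        if cos_sim > best_cos:
            best, best_cos = i, cos_sim
    return best

# Head and eyes straight ahead: the target directly in front is selected.
print(select_target(0.0, 0.0, (0.0, 0.0, 1.0),
                    [(0.0, 0.0, 5.0), (3.0, 0.0, 5.0)]))  # → 0
```

In a real implementation, the rotation would typically come from a full head-pose quaternion rather than yaw/pitch angles, and a dwell-time or angular-threshold criterion would be added before a selection is confirmed; those refinements are among the design choices the project leaves open.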

The project is carried out in close cooperation with the HCI department.