Student Projects 2022
Next registration deadline in ZeUS: May 1, 2022
We had a digital Zoom meeting on Wednesday, April 13, at 11:45h.
If you are interested in one of the projects, contact us as soon as possible, and we will send you a link to the session or arrange a private session later on.

There is also an opportunity for students to engage in the form of a Student Assistant job (HiWi). 

Example of a past project 

Eye-Tracking just noticeable differences

! This is a finished master project/thesis and is no longer available. For the list of available projects, please scroll down.

The just noticeable difference (JND) is the point at which human observers begin to perceive changes made to visual media, e.g., through compression. Initially, however, the difference is noticeable only at a few salient locations. Identifying the locations where users observe this change can lead to better encoding algorithms. For this purpose, objective (machine learning) methods need to be developed, based on features of these change-salient regions. This project therefore aims to first locate these regions of the image and then derive features from them to guide the image processing pipeline.

  • Perform an eye-tracking study on existing just-noticeable-difference datasets
  • Alternative without a lab study: run it as a crowdsourcing experiment, e.g., by letting observers identify the image region with a just noticeable difference.
  • Extract features common to change-salient regions of the stimuli
  • Use machine learning to predict the change-salient regions in digital images (→ thesis)
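As an illustrative sketch only (not part of the project description), the first step could be bootstrapped from a simple block-wise distortion analysis between a reference image and its compressed version; the function name, block size, and top-k heuristic below are assumptions for illustration, with images taken as NumPy grayscale arrays:

```python
import numpy as np

def change_salient_mask(reference, compressed, block=16, top_k=10):
    """Locate the blocks where a compressed image differs most from its
    reference -- a crude stand-in for change-salient regions.

    reference, compressed: 2-D grayscale arrays of equal shape.
    Returns a boolean mask marking the top_k most-distorted blocks.
    """
    ref = reference.astype(np.float64)
    cmp_ = compressed.astype(np.float64)
    h, w = ref.shape
    hb, wb = h // block, w // block
    # Mean squared error per non-overlapping block
    mse = np.zeros((hb, wb))
    for i in range(hb):
        for j in range(wb):
            r = ref[i * block:(i + 1) * block, j * block:(j + 1) * block]
            c = cmp_[i * block:(i + 1) * block, j * block:(j + 1) * block]
            mse[i, j] = np.mean((r - c) ** 2)
    # Keep the top_k blocks with the largest distortion
    thresh = np.sort(mse, axis=None)[-top_k]
    mask = np.zeros((h, w), dtype=bool)
    for i in range(hb):
        for j in range(wb):
            if mse[i, j] >= thresh:
                mask[i * block:(i + 1) * block,
                     j * block:(j + 1) * block] = True
    return mask
```

In the actual project, such masks would be replaced or validated by eye-tracking (or crowdsourced) data, and the marked regions would then serve as training targets for a learned predictor.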

Solid practical knowledge of and experience with algorithms is expected, as well as programming in MATLAB, Python, C, and/or C++.

  • Holmqvist K, Nyström M, Andersson R, Dewhurst R, Jarodzka H, Van de Weijer J. Eye tracking: A comprehensive guide to methods and measures. OUP Oxford; 2011 Sep 22.
  • Jin L, Lin JY, Hu S, Wang H, Wang P, Katsavounidis I, Aaron A, Kuo CC. Statistical study on perceived JPEG image quality via MCL-JCI dataset construction and analysis. Electronic Imaging. 2016 Feb 14; 2016(13):1-9.
  • Hosu V, Hahn F, Zingman I, Saupe D. Reported attention as a promising alternative to gaze in IQA tasks. In PQS 2016: 5th ISCA/DEGA Workshop on Perceptual Quality of Systems 2016 (pp. 117-121).
  • Hosu V, Hahn F, Wiedemann O, Jung SH, Saupe D. Saliency-driven image coding improves overall perceived JPEG quality. In 2016 Picture Coding Symposium (PCS) 2016 Dec 4 (pp. 1-5). IEEE.

Project emphasis

  • Theoretical / analytical
  • Practice and implementation
  • Literature review


Hanhe Lin, Dietmar Saupe


Available projects in Visual Quality Assessment

Project 1 (NOTE: Project already taken)

Removing biases from subjective quality scores

Visual quality assessment (VQA) models rely on subjective opinions in the form of image quality scores, which are often aggregated into image-wise mean opinion scores (MOS). Human raters have personal biases that influence their opinions and create variations (noise) in the scores assigned to an image. Biases can relate to individual preferences, personality, emotional state, and many other factors.
It is often desirable to remove biases from subjective scores, which helps generate more precise (de-noised) mean opinion scores. In turn, this can improve the performance of visual quality models trained on the de-noised opinions.
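As a minimal sketch of one common modeling assumption (not the project's prescribed method), each subject's bias can be treated as a single additive offset, so that a score decomposes as quality of the image plus bias of the rater. Alternating between estimating the two terms then yields de-noised MOS; the function name and iteration count below are illustrative assumptions:

```python
import numpy as np

def debias_scores(scores, iterations=10):
    """Remove an additive per-rater bias from a subject-by-image score matrix.

    scores: 2-D array, scores[i, j] = rating given by subject i to image j.
    Assumes scores[i, j] ~ quality[j] + bias[i] + noise and alternates
    between estimating per-image quality (de-noised MOS) and per-subject
    bias. Returns (quality, bias) arrays.
    """
    bias = np.zeros(scores.shape[0])
    for _ in range(iterations):
        # Quality estimate: average of bias-corrected scores per image
        quality = (scores - bias[:, None]).mean(axis=0)
        # Bias estimate: average deviation from quality per subject
        bias = (scores - quality[None, :]).mean(axis=1)
    return quality, bias
```

Note that quality and bias are only identifiable up to a shared constant (adding c to all biases and subtracting it from all qualities leaves the scores unchanged), so practical formulations usually constrain the biases, e.g., to have zero mean.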