Publications
Below is a searchable list of publications by the projects of the Priority Program.
1.
Pfützenreuter, Niklas; Liebers, Carina; Goedicke, David; Degraen, Donald; Gruenefeld, Uwe; Schneegass, Stefan
Eye Want It All! Investigating Eye Tracking as Implicit Support for Generative Inpainting (Proceedings Article)
In: Proceedings of the Extended Abstracts of the 2026 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, New York, NY, USA, 2026, ISBN: 9798400722813.
@inproceedings{10.1145/3772363.3799314,
  title     = {Eye Want It All! Investigating Eye Tracking as Implicit Support for Generative Inpainting},
  author    = {Niklas Pfützenreuter and Carina Liebers and David Goedicke and Donald Degraen and Uwe Gruenefeld and Stefan Schneegass},
  url       = {https://doi.org/10.1145/3772363.3799314},
  doi       = {10.1145/3772363.3799314},
  isbn      = {9798400722813},
  year      = {2026},
  date      = {2026-01-01},
  booktitle = {Proceedings of the Extended Abstracts of the 2026 CHI Conference on Human Factors in Computing Systems},
  publisher = {Association for Computing Machinery},
  address   = {New York, NY, USA},
  series    = {CHI EA '26},
  abstract  = {Users often struggle to use Generative Artificial Intelligence (GenAI) models to generate a desired image, as controlling them solely with prompts is difficult. Current solutions to this problem, such as adding conditional controls, require users to provide explicit input, which can be tedious. To avoid depending on additional explicit input, this paper explores what implicit gaze behavior tells about user intentions when viewing generated images. In our user study (N = 16), we evaluated the correlation between gaze behavior and user annotations, showing that users looked longer at areas they wanted to regenerate. While our research is the first step, we believe our work can pave the way for incorporating implicit user input into interactions with GenAI systems.},
  keywords  = {eye tracking, Generative Artificial Intelligence, Image Generation},
  pubstate  = {published},
  tppubtype = {inproceedings}
}