Aini Putkonen
@ainiputkonen.bsky.social
AIML Resident @Apple
ainiputkonen.fi
Thank you 😊
October 13, 2025 at 5:22 PM
8/8
✨ Check out the paper on IJHCS: doi.org/10.1016/j.ij...

🔍 ...and the VSGUI10K dataset and accompanying code repository here: osf.io/hmg9b/

✏️ Thanks to my co-authors: Yue Jiang, Jingchun Zeng, Olli Tammilehto, Jussi P.P. Jokinen and Antti Oulasvirta @oulasvirta.bsky.social
March 17, 2025 at 7:55 AM
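[Editor's note: for readers who want to explore the released data, a minimal sketch of loading it with pandas. The file name and column names below are assumptions for illustration only; the actual schema is documented in the OSF repository (osf.io/hmg9b/).]

```python
# Minimal sketch: load VSGUI10K fixation data with pandas.
# NOTE: the file name and column names are hypothetical placeholders;
# consult the OSF repository (osf.io/hmg9b/) for the real layout.
import pandas as pd

fixations = pd.read_csv("vsgui10k_fixations.csv")  # hypothetical file name

# One row per fixation; group by trial to inspect individual search trials.
print(fixations.columns.tolist())
print(fixations["trial_id"].nunique(), "trials")  # hypothetical column name
```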
7/8 The paper also presents an analysis of search times; we find that GUI type (mobile, desktop or webpage), presence/absence of the target and whether the target is presented in text or image format affect search times more than visual complexity.
March 17, 2025 at 7:48 AM
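[Editor's note: a rough sketch of the kind of model that could relate search time to these factors, here an ordinary least-squares regression via statsmodels' formula API. The column names (search_time, gui_type, target_present, target_format, complexity) are placeholders, and the paper's own analysis may use a different model.]

```python
# Sketch: regress search time on GUI type, target presence/absence,
# target format (text vs. image) and visual complexity.
# Column names are hypothetical placeholders, not the dataset's schema.
import pandas as pd
import statsmodels.formula.api as smf

trials = pd.read_csv("vsgui10k_trials.csv")  # hypothetical file name

model = smf.ols(
    "search_time ~ C(gui_type) + C(target_present)"
    " + C(target_format) + complexity",
    data=trials,
).fit()
print(model.summary())
```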
6/8 Confirm: Finally, the user must confirm whether the target was found.
March 17, 2025 at 7:46 AM
5/8 Scan: Unlike in free-viewing, attention is then deployed more selectively, guided by the structure of the GUI and the features of the target.
March 17, 2025 at 7:45 AM
4/8 Guess: Regardless of target location, first fixations are directed towards the top-left of the GUI, similar to the persistent upper-left bias in free-viewing.
March 17, 2025 at 7:45 AM
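[Editor's note: one way to see this kind of bias in fixation data is to take the first fixation of each trial and look at where it lands in normalized screen coordinates. A sketch under assumed column names (trial_id, fixation_index, x, y, screen_w, screen_h), not the dataset's documented schema.]

```python
# Sketch: check the upper-left bias of first fixations.
# All column names here are assumptions for illustration only.
import pandas as pd

fixations = pd.read_csv("vsgui10k_fixations.csv")  # hypothetical file name

first = (
    fixations.sort_values("fixation_index")
    .groupby("trial_id")
    .first()
)
# Normalize to [0, 1]; means close to 0 indicate first fixations
# landing near the top-left corner of the screen.
print((first["x"] / first["screen_w"]).mean(),
      (first["y"] / first["screen_h"]).mean())
```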
3/8 We synthesize our results in a three-stage pattern of search: Guess, Scan and Confirm.
March 17, 2025 at 7:45 AM
2/8 Our new paper addresses this gap: we release an eye-tracking dataset of 10,000+ visual search trials on 900 GUIs, collected from 84 participants.
March 17, 2025 at 7:40 AM
Thanks to my co-authors Yue Jiang, Jingchun Zeng, Olli Tammilehto, Jussi P.P. Jokinen and @oulasvirta.bsky.social
March 11, 2025 at 6:05 AM