Authors: Batmaz, Anıl Ufuk; Voisard, Laurent; Hatira, Amal; Bashar, Mohammad Raihanul; Gemici, Mucahit; Sarac, Mine; Kersten-Oertel, Marta
Date available: 2024-06-23
Date issued: 2024
ISBN: 9798350374490; 9798350374506
DOI: https://doi.org/10.1109/VRW62533.2024.00008
ORCID: Gemici, Mucahit/0009-0004-4655-4743; Hatira, Amal/0009-0006-6452-0672; Bashar, Mohammad Raihanul/0000-0002-5271-457X

Abstract: In virtual hand interaction techniques, the opacity of the virtual hand avatar can obstruct users' visual feedback, harming accuracy and increasing cognitive load. Given that cognitive load is related to gaze movements, our study analyzes participants' gaze movements across opaque, transparent, and invisible hand visualizations in order to create a new interaction technique. For our experimental setup, we used a Purdue Pegboard Test with reaching, grasping, transporting, and inserting subtasks. We examined how long and where participants concentrated during these subtasks and, using the findings, introduced a new virtual hand visualization method to increase accuracy. We hope that our results can be used in future virtual reality applications where users must interact with virtual objects accurately.

Language: en
Rights: info:eu-repo/semantics/closedAccess
Keywords: Human-centered computing; Visualization; Visualization techniques; Visualization design and evaluation methods
Title: Subtask-Based Virtual Hand Visualization Method for Enhanced User Accuracy in Virtual Reality Environments
Type: Conference Object
Pages: 6-11
Web of Science ID: WOS:001239375400002
Scopus ID: 2-s2.0-85195555902