Subtask-Based Virtual Hand Visualization Method for Enhanced User Accuracy in Virtual Reality Environments

dc.authorscopusid58774909800
dc.authorscopusid58733075600
dc.authorscopusid57213289907
dc.authorscopusid57983665800
dc.authorscopusid55807561700
dc.authorscopusid59166497200
dc.contributor.authorVoisard,L.
dc.contributor.authorHatira,A.
dc.contributor.authorBashar,M.R.
dc.contributor.authorGemici,M.
dc.contributor.authorSarac,M.
dc.contributor.authorKersten-Oertel,M.
dc.contributor.authorBatmaz,A.U.
dc.date.accessioned2024-06-23T21:39:23Z
dc.date.available2024-06-23T21:39:23Z
dc.date.issued2024
dc.departmentKadir Has Universityen_US
dc.department-tempVoisard L., Concordia University, Canada; Hatira A., Kadir Has University, Turkey; Bashar M.R., Concordia University, Canada; Gemici M., Concordia University, Canada; Sarac M., Kadir Has University, Turkey; Kersten-Oertel M., Concordia University, Canada; Batmaz A.U., Concordia University, Canadaen_US
dc.description.abstractIn virtual hand interaction techniques, the opacity of the virtual hand avatar can obstruct users' visual feedback, with detrimental effects on accuracy and cognitive load. Given that cognitive load is related to gaze movements, our study analyzes participants' gaze movements across opaque, transparent, and invisible hand visualizations in order to create a new interaction technique. For our experimental setup, we used a Purdue Pegboard Test with reaching, grasping, transporting, and inserting subtasks. We examined how long and where participants concentrated during these subtasks and, using these findings, introduced a new virtual hand visualization method to increase accuracy. We hope that our results can inform future virtual reality applications in which users must interact with virtual objects accurately. © 2024 IEEE.en_US
dc.identifier.citation0
dc.identifier.doi10.1109/VRW62533.2024.00008
dc.identifier.endpage11en_US
dc.identifier.isbn979-835037449-0
dc.identifier.scopus2-s2.0-85195555902
dc.identifier.scopusqualityN/A
dc.identifier.startpage6en_US
dc.identifier.urihttps://doi.org/10.1109/VRW62533.2024.00008
dc.identifier.urihttps://hdl.handle.net/20.500.12469/5869
dc.identifier.wosqualityN/A
dc.language.isoenen_US
dc.publisherInstitute of Electrical and Electronics Engineers Inc.en_US
dc.relation.ispartofProceedings - 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2024 -- 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2024 -- 16 March 2024 through 21 March 2024 -- Orlando -- 199867en_US
dc.relation.publicationcategoryKonferans Öğesi - Uluslararası - Kurum Öğretim Elemanıen_US
dc.rightsinfo:eu-repo/semantics/closedAccessen_US
dc.subjectHuman-centered computingen_US
dc.subjectVisualizationen_US
dc.subjectVisualization design and evaluation methodsen_US
dc.subjectVisualization techniquesen_US
dc.titleSubtask-Based Virtual Hand Visualization Method for Enhanced User Accuracy in Virtual Reality Environmentsen_US
dc.typeConference Objecten_US
dspace.entity.typePublication