Title: Studying Children's Object Interaction in Virtual Reality: A Manipulative Gesture Taxonomy for VR Hand Tracking
Authors: Özer, Demet; Baykal, G.E.; Leylekoğlu, A.; Arslan, S.; Özer, D.
Date available: 2023-10-19
Year: 2023
ISBN: 9781450394222
DOI: https://doi.org/10.1145/3544549.3585865
Handle: https://hdl.handle.net/20.500.12469/4948
Sponsors: ACM SIGCHI; Apple; Bloomberg; Google; NSF; SIEMENS
Conference: 2023 CHI Conference on Human Factors in Computing Systems (CHI 2023), 23 April 2023 through 28 April 2023; conference code 188037
Type: Conference Object
Language: English
Access: info:eu-repo/semantics/closedAccess
Scopus EID: 2-s2.0-85158166993
Keywords: Gesture; hand tracking; HCI; spatial thinking; Virtual Reality; Human computer interaction; Palmprint recognition; Taxonomies; Gesture taxonomies; Goal-directed; Hand kinematics; Hand-tracking; Mental rotation; Object interactions; Quantitative result; Spatial thinking; Tracking system; Virtual reality

Abstract: In this paper, we propose a taxonomy for classifying children's gestural input elicited from spatial puzzle play in VR hand tracking. The taxonomy builds on the existing manipulative gesture taxonomy in human-computer interaction and offers two main analytical categories, Goal-directed actions and Hand kinematics, as complementary dimensions for analysing gestural input. Based on our study with eight children (aged 7-14), we report qualitative results describing the categories for analysis and quantitative results for their frequency of occurrence in children's interaction with the objects during the spatial task. This taxonomy is an initial step towards capturing the complexity of manipulative gestures in relation to mental rotation actions, and it helps designers and developers understand and study children's gestures both as input for object interaction and as an indicator of spatial thinking strategies in VR hand tracking systems. © 2023 Owner/Author.