GCRIS

Browsing by Author "Gelmez, Zeynep Ecem"

Now showing 1 - 2 of 2
  • Conference Object
    Citation - WoS: 3
    Citation - Scopus: 4
    Effect of Hand and Object Visibility in Navigational Tasks Based on Rotational and Translational Movements in Virtual Reality
    (IEEE Computer Soc, 2024) Hatira, Amal; Gelmez, Zeynep Ecem; Batmaz, Anil Ufuk; Sarac, Mine
    During object manipulation in Virtual Reality (VR) systems, realistically visualizing avatars and objects can hinder user performance and experience by complicating the task or distracting the user from the environment through possible occlusions. Users may compensate with biomechanical adjustments, such as repositioning the head to see the interaction area. In this paper, we investigate the effect of hand avatar and object visibility on navigational tasks using a VR headset. We performed two user studies in which participants grasped a small, cylindrical object and navigated it through virtual obstacles using rotational or translational movements. We used three visibility conditions for the hand avatar (opaque, transparent, and invisible) and two conditions for the object (opaque and transparent). Our results indicate that participants performed faster and collided less with the invisible and transparent hands than with the opaque hand, and collided less with the opaque object than with the transparent one. Furthermore, participants preferred the combination of the transparent hand avatar with the opaque object. These findings may help researchers and developers choose visibility/transparency conditions for hand avatars and virtual objects in tasks that require precise navigation.
  • Conference Object
    Citation - WoS: 5
    Citation - Scopus: 9
    EyeGuide & EyeConGuide: Gaze-based Visual Guides to Improve 3D Sketching Systems
    (Assoc Computing Machinery, 2024) Turkmen, Rumeysa; Gelmez, Zeynep Ecem; Batmaz, Anil Ufuk; Stuerzlinger, Wolfgang; Asente, Paul; Sarac, Mine
    Visual guides help align strokes and improve accuracy in Virtual Reality (VR) sketching tools. Automatic guides that appear at relevant sketching areas make sketching with a guide seamless. We explore guides that exploit eye-tracking to adapt to the user's visual attention. EYEGUIDE and EYECONGUIDE cause visual grid fragments to appear spatially close to the user's intended sketches, based on the user's eye-gaze direction and the 3D position of the hand. We evaluated the techniques in two user studies across simple and complex sketching objectives in VR. The results show that, in the tested tasks, gaze-based guides have a positive effect on sketching accuracy, perceived usability, and user preference over manual activation. Our research contributes to integrating gaze-contingent techniques for assistive guides and presents important insights into multimodal design applications in VR.
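
The abstract describes the placement mechanism only at a high level: a grid fragment appears near the intended sketch, driven by gaze direction and 3D hand position. As a rough illustrative sketch of that idea (not the authors' implementation: the helper name grid_fragment_anchor, the grid spacing cell, and the cone half-angle max_angle_deg are all assumptions, not values from the paper), a gaze-gated, grid-snapped placement rule could look like this in Python:

    import numpy as np

    def grid_fragment_anchor(gaze_origin, gaze_dir, hand_pos,
                             cell=0.05, max_angle_deg=10.0):
        """Return a grid-aligned anchor for a guide fragment, or None
        when the hand is outside the gaze cone (guide stays hidden).
        cell and max_angle_deg are assumed values, not from the paper."""
        gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
        to_hand = hand_pos - gaze_origin
        dist = np.linalg.norm(to_hand)
        if dist == 0.0:
            return None
        # Angle between the gaze ray and the direction toward the hand.
        cos_angle = float(np.dot(gaze_dir, to_hand / dist))
        if cos_angle < np.cos(np.radians(max_angle_deg)):
            return None  # hand is not near the user's visual attention
        # Snap the hand position to the guide grid so the fragment
        # appears spatially close to the intended stroke.
        return np.round(hand_pos / cell) * cell

    # Example: head at 1.6 m looking forward, hand slightly ahead and below.
    anchor = grid_fragment_anchor(np.array([0.0, 1.6, 0.0]),
                                  np.array([0.0, 0.0, 1.0]),
                                  np.array([0.02, 1.55, 0.5]))

Gating on a gaze cone rather than on hand position alone keeps the guide hidden when the user is not attending to the sketching area, which matches the papers' motivation of showing guides only where they are relevant.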