Browsing by Author "Batmaz, A.U."
Now showing 1 - 6 of 6
Conference Object (Citation - Scopus: 2)
Does Repeatedly Typing the Same Phrase Provide a Good Estimate of Expert Text Entry Performance? (Association for Computing Machinery, 2023)
Batmaz, Anıl Ufuk; Batmaz, A.U.; Hudhud Mughrabi, M.; Stuerzlinger, W.
To identify whether novel/unfamiliar keyboard layouts such as OPTI can outperform QWERTY, lengthy training through longitudinal studies is typically required. To reduce this logistical bottleneck, a popular approach in the literature asks participants to type the same phrase repeatedly. However, it is still unknown whether this approach provides a good estimate of expert performance. To validate the method, we set up a study in which participants typed the same phrase 96 times on both OPTI and QWERTY. Results showed that this approach has the potential to estimate expert performance for novel/unfamiliar keyboards faster than the traditional approach with different phrases. Yet, we also found that accurate estimates still require training over several days and therefore do not eliminate the need for a longitudinal study. Our findings thus show the need for research on faster, easier, and more reliable empirical approaches to evaluating text entry systems. © 2023 Owner/Author.

Conference Object (Citation - Scopus: 0)
Enhancing Eye-Hand Coordination in Volleyball Players: A Comparative Analysis of VR, AR, and 2D Display Technologies and Task Instructions (Institute of Electrical and Electronics Engineers Inc., 2024)
Batmaz, Anıl Ufuk; Aliza, A.; Batmaz, A.U.; Sarac, M.
Previous studies analyzed user motor performance with Virtual Reality (VR) and Augmented Reality (AR) Eye-Hand Coordination Training Systems (EHCTSs) while asking participants to follow specific task instructions. Although these studies suggested VR and AR EHCTSs as potential training systems for sports players, they recruited participants for their user studies from the general population. In this paper, we examined the training performance of 16 professional volleyball players over 8 days using EHCTSs with three display technologies (VR, AR, and a 2D touchscreen) and four distinct task instructions (prioritizing speed, error rate, accuracy, or none). Our results indicate that the volleyball players performed best with the 2D touchscreen in terms of time, error rate, accuracy, precision, and throughput. Moreover, their performance was better with VR than with AR. They also successfully followed the task instructions given to them and consistently improved their throughput. These findings underscore the potential of EHCTSs in volleyball training and highlight the need for further research to optimize VR and AR user experience and performance. © 2024 IEEE.
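The two abstracts above report text-entry speed and, for the coordination task, throughput, but do not spell out the formulas. As a hedged illustration only (not the papers' actual analysis), the sketch below computes the conventional metrics from the text-entry and pointing literature: words per minute and an ISO 9241-9 style effective throughput. The function names and toy numbers are hypothetical.

```python
# Hedged sketch of conventional metrics; not taken from the papers above.
import math
import statistics

def words_per_minute(transcribed: str, seconds: float) -> float:
    """Conventional text-entry speed: (|T| - 1) characters over the trial
    duration, scaled to minutes and divided by 5 characters per 'word'."""
    return ((len(transcribed) - 1) / seconds) * 60.0 / 5.0

def effective_throughput(amplitudes, deviations, movement_times) -> float:
    """ISO 9241-9 style effective throughput (bits/s) for a block of aimed
    movements: TP = IDe / MT, with IDe = log2(Ae / We + 1) and
    We = 4.133 * SD of the signed endpoint deviations along the task axis."""
    ae = statistics.mean(amplitudes)            # effective amplitude (m)
    we = 4.133 * statistics.stdev(deviations)   # effective target width (m)
    mt = statistics.mean(movement_times)        # mean movement time (s)
    return math.log2(ae / we + 1) / mt

# Toy usage: a short phrase typed in 9 s, and three pointing trials.
print(words_per_minute("the quick brown fox", 9.0))          # 24.0 WPM
print(effective_throughput([0.30, 0.31, 0.29],               # amplitudes
                           [0.004, -0.002, 0.006],           # deviations
                           [0.45, 0.50, 0.48]))              # times (s)
```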
Conference Object (Citation - Scopus: 10)
Exploring Discrete Drawing Guides to Assist Users in Accurate Mid-Air Sketching in VR (Association for Computing Machinery, 2022)
Türkmen, R.; Batmaz, Anıl Ufuk; Pfeuffer, K.; Barrera MacHuca, M.D.; Batmaz, A.U.; Gellersen, H.
Even though VR design applications that support sketching are popular, sketching accurately in mid-air is challenging for users. In this paper, we explore discrete visual guides that assist users' stroke accuracy and drawing experience inside the virtual environment. We also present an eye-tracking study that compares continuous guides, discrete guides, and no guide in a basic drawing task. Our experiment asks participants to draw a circle and a line using three guide types, three sizes, and two orientations. Results indicate that discrete guides are more user-friendly than continuous guides, as the majority of participants preferred them, while we found no difference in speed or accuracy compared to continuous guides. This can potentially be attributed to distinct eye-gaze strategies, as discrete guides led users to shift their gaze more frequently between guide points and the drawing cursor. Our insights are useful for practitioners and researchers in 3D sketching, as a first step toward informing future design applications about how visual guides inside the virtual environment affect visual behaviour and how eye gaze can become a tool to assist sketching. © 2022 ACM.

Editorial (Citation - Scopus: 0)
Preface (Association for Computing Machinery, Inc, 2021)
Ortega, F.; Batmaz, Anıl Ufuk; Teather, R.; Bruder, G.; Piumsomboon, T.; Weyers, B.; Batmaz, A.U.; Johnsen, K.
[No abstract available]

Editorial (Citation - Scopus: 0)
Preface (Association for Computing Machinery, Inc, 2022)
Sra, M.; Batmaz, Anıl Ufuk; Ortega, F.; Piumsomboon, T.; Weyers, B.; Peck, T.; Barrera, M.; Batmaz, A.U.
[No abstract available]

Conference Object (Citation - Scopus: 1)
When Anchoring Fails: Interactive Alignment of Large Virtual Objects in Occasionally Failing AR Systems (Springer Science and Business Media Deutschland GmbH, 2022)
Batmaz, Anıl Ufuk; Stuerzlinger, W.
Augmented reality systems show virtual object models overlaid on real ones, which is helpful in many contexts, e.g., during maintenance. Even when all geometry is known, misalignments in 3D poses will still occur without perfectly robust 3D tracking of the viewer and the object. Such misalignments can impact the user experience and reduce the potential benefits of AR systems. In this paper, we implemented several interaction algorithms to make manual virtual object alignment easier, based on previously presented methods such as HoverCam, SHOCam, and a Signed Distance Field. Our approach also simplifies the user interface for manual 3D pose alignment with 2D input systems. The results of our work indicate that our approach can reduce the time needed for interactive 3D pose alignment, which improves the user experience. © 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
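The last entry names a Signed Distance Field (SDF) among the methods it builds on; the paper's specific alignment techniques (HoverCam, SHOCam) are not reproduced here. As a minimal sketch of the underlying idea only, assuming an axis-aligned box stands in for the tracked real object, an SDF reports how far a point lies from the object's surface, which a manual alignment tool could use to measure and flag residual misalignment. All names and thresholds below are hypothetical.

```python
# Hedged sketch of a generic box SDF; not the paper's alignment algorithm.
import numpy as np

def box_sdf(point: np.ndarray, half_extents: np.ndarray) -> float:
    """Signed distance from 'point' to an axis-aligned box centred at the
    origin: negative inside, zero on the surface, positive outside."""
    q = np.abs(point) - half_extents
    outside = np.linalg.norm(np.maximum(q, 0.0))
    inside = min(max(q[0], q[1], q[2]), 0.0)
    return outside + inside

def alignment_check(model_points, half_extents, threshold=0.01):
    """Toy helper: if every sampled surface point of the virtual model lies
    within 'threshold' metres of the real object's surface (per the SDF),
    treat the pose as aligned; otherwise report the worst deviation."""
    worst = max(abs(box_sdf(p, half_extents)) for p in model_points)
    return worst <= threshold, worst

# Toy usage: three sampled points of a virtual model near a 0.2 m cube.
half = np.array([0.1, 0.1, 0.1])
samples = [np.array([0.1, 0.0, 0.0]),     # exactly on a face
           np.array([0.0, 0.105, 0.0]),   # 5 mm outside
           np.array([0.0, 0.0, 0.097])]   # 3 mm inside
print(alignment_check(samples, half))     # (True, ~0.005)
```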