Journal Articles
3 articles found
1. EasyGaze: Hybrid eye tracking approach for handheld mobile devices (Cited by: 2)
Authors: Shiwei CHENG, Qiufeng PING, Jialing WANG, Yijian CHEN. Virtual Reality & Intelligent Hardware, 2022, No. 2, pp. 173-188 (16 pages)
Background: Eye-tracking technology for mobile devices has made significant progress. However, owing to limited computing capacity and the complexity of context, the conventional image feature-based technology cannot extract features accurately, thus affecting the performance. Methods: This study proposes a novel approach by combining appearance- and feature-based eye-tracking methods. Face and eye region detections were conducted to obtain features that were used as inputs to the appearance model to detect the feature points. The feature points were used to generate feature vectors, such as corner center-pupil center, by which the gaze fixation coordinates were calculated. Results: To obtain feature vectors with the best performance, we compared different vectors under different image resolution and illumination conditions, and the results indicated that the average gaze fixation accuracy was achieved at a visual angle of 1.93° when the image resolution was 96×48 pixels, with light sources illuminating from the front of the eye. Conclusions: Compared with the current methods, our method improved the accuracy of gaze fixation and it was more usable.
Keywords: Eye movement, Gaze estimation, Fixation, Human-computer interaction, Eye tracking
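The abstract does not spell out how the corner center-pupil center vector is turned into on-screen fixation coordinates. The sketch below shows one common way such a feature vector can be mapped to gaze coordinates, assuming a second-order polynomial calibration fitted by least squares; the function names and calibration form are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: corner-center-to-pupil-center feature vector and an assumed
# polynomial calibration mapping it to screen fixation coordinates.
import numpy as np

def corner_pupil_vector(eye_corner_center: np.ndarray, pupil_center: np.ndarray) -> np.ndarray:
    """Feature vector from the eye-corner center to the pupil center (image pixels)."""
    return pupil_center - eye_corner_center

def fit_gaze_mapping(features: np.ndarray, screen_points: np.ndarray) -> np.ndarray:
    """Fit a 2D second-order polynomial from feature vectors (N, 2) to fixations (N, 2)."""
    vx, vy = features[:, 0], features[:, 1]
    # Design matrix: [1, vx, vy, vx*vy, vx^2, vy^2]
    A = np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])
    coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return coeffs  # shape (6, 2)

def predict_fixation(coeffs: np.ndarray, feature: np.ndarray) -> np.ndarray:
    """Map one feature vector to a predicted (x, y) fixation on the screen."""
    vx, vy = feature
    basis = np.array([1.0, vx, vy, vx * vy, vx**2, vy**2])
    return basis @ coeffs
```

In practice the calibration pairs (feature vector, known screen target) would come from a short calibration session on the device; the polynomial order here is only a plausible default.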
2. Adaptive navigation assistance based on eye movement features in virtual reality (Cited by: 1)
Authors: Song ZHAO, Shiwei CHENG. Virtual Reality & Intelligent Hardware, 2023, No. 3, pp. 232-248 (17 pages)
Background: Navigation assistance is essential for users when roaming virtual reality scenes; however, the traditional navigation method requires users to manually request a map for viewing, which leads to low immersion and poor user experience. Methods: To address this issue, we first collected data on who required navigation assistance in a virtual reality environment, including various eye movement features, such as gaze fixation, pupil size, and gaze angle. Subsequently, we used the boosting-based XGBoost algorithm to train a prediction model and finally used it to predict whether users require navigation assistance in a roaming task. Results: After evaluating the performance of the model, the accuracy, precision, recall, and F1-score of our model reached approximately 95%. In addition, by applying the model to a virtual reality scene, an adaptive navigation assistance system based on the real-time eye movement data of the user was implemented. Conclusions: Compared with traditional navigation assistance methods, our new adaptive navigation assistance method could enable the user to be more immersive and effective while roaming in a virtual reality (VR) environment.
Keywords: Eye movement, Navigation, Human-computer interaction, Virtual reality, Eye tracking
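As a rough illustration of the prediction step described in the abstract, the sketch below trains an XGBoost classifier on the three eye-movement features the paper names (gaze fixation, pupil size, gaze angle) and reports accuracy, precision, recall, and F1. The synthetic data, feature encoding, and hyperparameters are placeholders, not the authors' dataset or settings.

```python
# Hedged sketch: predict "needs navigation assistance" from eye-movement features
# with a boosting-based XGBoost classifier, as outlined in the abstract.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
# Columns: fixation duration, pupil size, gaze angle -- synthetic stand-ins for the
# features collected in the VR study.
X = rng.normal(size=(500, 3))
y = rng.integers(0, 2, size=500)  # 1 = user needs navigation assistance

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = XGBClassifier(n_estimators=100, max_depth=4, eval_metric="logloss")
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("accuracy ", accuracy_score(y_test, pred))
print("precision", precision_score(y_test, pred))
print("recall   ", recall_score(y_test, pred))
print("f1       ", f1_score(y_test, pred))
```

In the adaptive system described by the paper, such a model would be queried on a sliding window of real-time eye-movement data and the navigation map shown only when assistance is predicted to be needed.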
3. Collaborative eye tracking based code review through real-time shared gaze visualization (Cited by: 1)
Authors: Shiwei CHENG, Jialing WANG, Xiaoquan SHEN, Yijian CHEN, Anind DEY. Frontiers of Computer Science (SCIE, EI, CSCD), 2022, No. 3, pp. 163-173 (11 pages)
Code review is intended to find bugs in early development phases, improving code quality for later integration and testing. However, due to the lack of experience with algorithm design or software development, individual novice programmers face challenges while reviewing code. In this paper, we utilize collaborative eye tracking to record the gaze data from multiple reviewers and share the gaze visualization among them during the code review process. The visualizations, such as borders highlighting the currently reviewed code lines and transition lines connecting related reviewed code lines, reveal the visual attention on program functions, which can facilitate understanding and bug tracing. This can help novice reviewers confirm potential bugs or avoid repeated reviewing of code, and potentially even help to improve reviewing skills. We built a prototype system and conducted a user study with paired reviewers. The results showed that the shared real-time visualization allowed the reviewers to find bugs more efficiently.
Keywords: Computer supported collaborative learning, Computer supported cooperative work, Social computing, Fixation, Human-computer interaction
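The abstract does not say how gaze points are attached to code lines for the shared border highlights. The sketch below shows one plausible mapping from a reviewer's gaze y-coordinate to the code line under it, assuming a fixed-height editor layout; the layout values, names, and the highlight-collection step are hypothetical, not the prototype's actual design.

```python
# Illustrative sketch: map each reviewer's gaze point to a code line so that line
# can be border-highlighted for the partner in a shared view.
from dataclasses import dataclass

@dataclass
class EditorLayout:
    top_offset_px: float   # y pixel of the first rendered code line (assumed)
    line_height_px: float  # vertical size of one rendered line (assumed)
    num_lines: int

def gaze_to_code_line(gaze_y_px: float, layout: EditorLayout) -> int | None:
    """Return the 1-based code line under the gaze point, or None if outside the editor."""
    line = int((gaze_y_px - layout.top_offset_px) // layout.line_height_px) + 1
    return line if 1 <= line <= layout.num_lines else None

def shared_highlights(gaze_lines: dict[str, int | None]) -> dict[str, int]:
    """Keep only reviewers whose gaze currently falls on a code line, for rendering borders."""
    return {reviewer: line for reviewer, line in gaze_lines.items() if line is not None}

layout = EditorLayout(top_offset_px=120.0, line_height_px=18.0, num_lines=200)
print(shared_highlights({"reviewer_a": gaze_to_code_line(300.0, layout),
                         "reviewer_b": gaze_to_code_line(50.0, layout)}))
```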