Style transformed synthetic images for real world gaze estimation by using residual neural network with embedded personal identities
Wang, Quan1,2; Wang, Hui1,2,3; Dang, Ruo-Chen1,2; Zhu, Guang-Pu1,2,3; Pi, Hai-Feng1,2; Shic, Frederick4; Hu, Bing-liang1,2
Author Department: Spectral Imaging Technology Research Laboratory
Journal: Applied Intelligence
ISSN: 0924-669X; 1573-7497
Rights Ranking: 1
Abstract

Gaze interaction is essential for social communication in many scenarios; therefore, interpreting people’s gaze direction is helpful for natural human-robot and human-virtual-character interaction. In this study, we first adopt a residual neural network structure with an embedding layer of personal identity (ID-ResNet) that outperformed the current best result of 2.51 on MPIIGaze, a benchmark dataset for gaze estimation. To avoid using manually labelled data, we used UnityEye synthetic images, with and without style transformation, as the training data. We exceeded the previously reported best results on MPIIGaze (from 2.76 to 2.55) and UT-Multiview (from 4.01 to 3.40). In addition, the model only needs to be fine-tuned with a few "calibration" examples from a new person to yield significant performance gains. Finally, we present the KLBS-eye dataset, which contains 15,350 images collected from 12 participants looking in nine known directions, and on which we achieved a state-of-the-art result of 0.59 ± 1.69. © 2022, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
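The architecture described in the abstract can be illustrated with a short sketch. Below is a minimal, hypothetical PyTorch example of an "ID-ResNet"-style model, assuming a ResNet-18 backbone, a learned per-person identity embedding concatenated with the image features, and a two-value (yaw, pitch) gaze regression head; the names num_ids and embed_dim, the head sizes, and the output parameterization are illustrative assumptions, not the authors' released implementation.

    # Minimal sketch of a ResNet with an embedded personal-identity layer for
    # appearance-based gaze estimation (assumptions noted in the text above).
    import torch
    import torch.nn as nn
    import torchvision.models as models

    class IDResNet(nn.Module):
        def __init__(self, num_ids: int, embed_dim: int = 16):
            super().__init__()
            backbone = models.resnet18()               # ResNet feature extractor
            feat_dim = backbone.fc.in_features         # 512 for ResNet-18
            backbone.fc = nn.Identity()                # drop the classification head
            self.backbone = backbone
            self.id_embed = nn.Embedding(num_ids, embed_dim)  # personal-identity embedding
            self.head = nn.Sequential(                 # gaze regression head
                nn.Linear(feat_dim + embed_dim, 128),
                nn.ReLU(inplace=True),
                nn.Linear(128, 2),                     # (yaw, pitch)
            )

        def forward(self, eye_images: torch.Tensor, person_ids: torch.Tensor) -> torch.Tensor:
            feats = self.backbone(eye_images)                  # (B, 512)
            ids = self.id_embed(person_ids)                    # (B, embed_dim)
            return self.head(torch.cat([feats, ids], dim=1))   # (B, 2)

    # Usage example with random inputs standing in for eye patches and identities.
    model = IDResNet(num_ids=50)
    images = torch.randn(4, 3, 224, 224)
    pids = torch.randint(0, 50, (4,))
    gaze = model(images, pids)                         # predicted (yaw, pitch) per image

Under this kind of setup, fine-tuning for a new person with a few "calibration" samples would at minimum update the newly added identity embedding, while the backbone could remain largely frozen; this is one plausible reading of the calibration step mentioned in the abstract, not a description of the authors' exact procedure.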

Keywords: Appearance-based; ID-ResNet; Style transfer; Fine-tune; Learning by synthesis
DOI: 10.1007/s10489-022-03481-9
Indexed by: SCI; EI
Language: English
WOS Record Number: WOS:000790662800006
Publisher: Springer
EI Accession Number: 20221912072202
Citation Statistics
Times Cited (WOS): 4
Document Type: Journal article
Identifier: http://ir.opt.ac.cn/handle/181661/95869
Collection: Spectral Imaging Technology Research Laboratory
Corresponding Author: Wang, Quan
Affiliations:
1. Key Laboratory of Spectral Imaging Technology, Xi’an Institute of Optics and Precision Mechanics of the Chinese Academy of Sciences, Xi’an 710119, China
2. Key Laboratory of Biomedical Spectroscopy of Xi’an, Xi’an Institute of Optics and Precision Mechanics of the Chinese Academy of Sciences, Xi’an 710119, China
3. University of Chinese Academy of Sciences, Beijing 100049, China
4. Center for Child Health, Behavior and Development, Seattle Children’s Research Institute, Seattle, WA 98101, United States
Recommended Citation
GB/T 7714: Wang, Quan, Wang, Hui, Dang, Ruo-Chen, et al. Style transformed synthetic images for real world gaze estimation by using residual neural network with embedded personal identities[J]. Applied Intelligence.
APA: Wang, Quan., Wang, Hui., Dang, Ruo-Chen., Zhu, Guang-Pu., Pi, Hai-Feng., ... & Hu, Bing-liang. Style transformed synthetic images for real world gaze estimation by using residual neural network with embedded personal identities. Applied Intelligence.
MLA: Wang, Quan, et al. "Style transformed synthetic images for real world gaze estimation by using residual neural network with embedded personal identities." Applied Intelligence.
Files in This Item:
File Name/Size: Style transformed sy... (2492 KB) | Document Type: Journal article | Version: Published version | Access: Restricted | License: CC BY-NC-SA