Title | Style transformed synthetic images for real world gaze estimation by using residual neural network with embedded personal identities
Authors | Wang, Quan1,2; Wang, Hui1,2,3; Dang, Ruo-Chen1,2; Zhu, Guang-Pu1,2,3; Pi, Hai-Feng1,2; Shic, Frederick4; Hu, Bing-liang1,2
Author's Department | Laboratory of Spectral Imaging Technology
Journal | Applied Intelligence
ISSN | 0924-669X; 1573-7497
Affiliation Rank | 1
Abstract | Gaze interaction is essential for social communication in many scenarios; interpreting people's gaze direction therefore supports natural interaction between humans and robots or virtual characters. In this study, we first adopt a residual neural network (ResNet) structure with an embedding layer for personal identity (ID-ResNet), which outperformed the current best result of 2.51 on MPIIGaze, a benchmark dataset for gaze estimation. To avoid using manually labelled data, we trained on UnityEye synthetic images with and without style transformation. We exceeded the previously reported best results on MPIIGaze (from 2.76 to 2.55) and UT-Multiview (from 4.01 to 3.40). Moreover, the model needs only fine-tuning on a few "calibration" examples from a new person to yield significant performance gains. Finally, we present the KLBS-eye dataset, which contains 15,350 images collected from 12 participants looking in nine known directions, and on which we achieved a state-of-the-art result of (0.59 ± 1.69). © 2022, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
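The core idea described in the abstract is to fuse a learned per-person identity embedding with the image features from a ResNet backbone before regressing the gaze direction. The following is a minimal sketch of that fusion step, not the authors' implementation: the dimensions, the linear stand-in for the ResNet backbone, and the function names are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- the paper does not state these exact values.
N_IDS = 12              # participants, as in the KLBS-eye dataset
EMBED_DIM = 8           # assumed width of the personal-identity embedding
FEAT_DIM = 128          # assumed width of the ResNet feature vector
IMG_H, IMG_W = 36, 60   # a common normalized eye-patch size in gaze work

# Parameters that would be learned during training; randomly initialized here.
id_embeddings = rng.normal(size=(N_IDS, EMBED_DIM))
W_backbone = rng.normal(size=(IMG_H * IMG_W, FEAT_DIM)) / np.sqrt(IMG_H * IMG_W)
W_head = rng.normal(size=(FEAT_DIM + EMBED_DIM, 2)) * 0.01

def backbone_features(image):
    """Stand-in for the ResNet backbone: a single linear layer + tanh."""
    return np.tanh(image.reshape(-1) @ W_backbone)

def predict_gaze(image, person_id):
    """Concatenate image features with the person's ID embedding,
    then regress a 2-D gaze direction (e.g. yaw, pitch)."""
    fused = np.concatenate([backbone_features(image), id_embeddings[person_id]])
    return fused @ W_head

eye_patch = rng.uniform(size=(IMG_H, IMG_W))
gaze = predict_gaze(eye_patch, person_id=3)
print(gaze.shape)  # a 2-vector gaze estimate
```

Because the identity embedding row is the only part that differs between people, fine-tuning on a few "calibration" samples from a new person can plausibly adapt mostly that small embedding rather than the whole network, which is consistent with the calibration gains reported in the abstract.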
Keywords | Appearance-based; ID-ResNet; Style transfer; Fine-tune; Learning by synthesis
DOI | 10.1007/s10489-022-03481-9 |
Indexed By | SCI; EI
Language | English
WOS Record No. | WOS:000790662800006
Publisher | Springer
EI Accession No. | 20221912072202
Document Type | Journal article
Identifier | http://ir.opt.ac.cn/handle/181661/95869
Collection | Laboratory of Spectral Imaging Technology
Corresponding Author | Wang, Quan
Affiliations | 1. Key Laboratory of Spectral Imaging Technology, Xi'an Institute of Optics and Precision Mechanics of the Chinese Academy of Sciences, Xi'an 710119, China; 2. Key Laboratory of Biomedical Spectroscopy of Xi'an, Xi'an Institute of Optics and Precision Mechanics of the Chinese Academy of Sciences, Xi'an 710119, China; 3. University of Chinese Academy of Sciences, Beijing 100049, China; 4. Center for Child Health, Behavior and Development, Seattle Children's Research Institute, Seattle, WA 98101, United States
Recommended Citation (GB/T 7714) | Wang, Quan, Wang, Hui, Dang, Ruo-Chen, et al. Style transformed synthetic images for real world gaze estimation by using residual neural network with embedded personal identities[J]. Applied Intelligence.
APA | Wang, Quan, Wang, Hui, Dang, Ruo-Chen, Zhu, Guang-Pu, Pi, Hai-Feng, ... & Hu, Bing-liang.
MLA | Wang, Quan, et al. "Style transformed synthetic images for real world gaze estimation by using residual neural network with embedded personal identities." Applied Intelligence
Files in This Item:
File Name/Size | Document Type | Version | Access | License
Style transformed sy(2492KB) | Journal article | Published version | Restricted access | CC BY-NC-SA
Unless otherwise noted, all content in this repository is protected by copyright, with all rights reserved.