Research Output

Paper Title:
Robust orientation estimate via inertial guided visual sample consensus
First Author:
Zhang YL (张吟龙); Liang W (梁炜); Li Y (李杨); An HB (安海博); Tan JD (谈金东)
Co-authors:

Corresponding Author:

Journal:
Personal and Ubiquitous Computing
Publication Year:
2017
Volume, Issue, Pages:
,,1-16
Source:

First Author's Department:

Paper ID:

Abstract:
This paper presents a novel orientation estimation approach named inertial guided visual sample consensus (IGVSAC), designed to capture the orientation of human body joints in free-living environments. Unlike traditional vision-based orientation estimation methods, which remove outliers among putative image-pair correspondences with hypothesize-and-verify models such as the computationally costly RANSAC, our approach exploits prior motion information (i.e., rotation and translation) deduced from the quick-response inertial measurement unit (IMU) as the initial body pose to help the camera remove hidden outliers. In addition, the IGVSAC algorithm maintains estimation accuracy even in the presence of a large proportion of outliers, thanks to its ability to reject apparent mismatches. The orientation estimated by the visual sensor is, in turn, used to correct long-term IMU drift. We conducted extensive experiments to verify the effectiveness and robustness of the IGVSAC algorithm. Comparisons with the highly accurate VICON and OptiTrack motion tracking systems show that our orientation estimation system is well suited to capturing human body joints.
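The abstract's central idea — screening putative feature matches against an IMU-derived motion prior instead of running a full hypothesize-and-verify loop — can be illustrated with a minimal sketch. This is not the paper's actual algorithm: the function name, the pure-rotation (infinite-homography) approximation of the camera motion, and the pixel threshold are assumptions made for illustration only; the real IGVSAC pipeline also uses the IMU translation estimate.

```python
import numpy as np

def imu_guided_inlier_mask(pts1, pts2, R_imu, K, thresh_px=5.0):
    """Flag matches consistent with the IMU rotation prior.

    pts1, pts2 : (N, 2) arrays of matched pixel coordinates in frames 1 and 2.
    R_imu      : (3, 3) rotation predicted by the IMU between the two frames.
    K          : (3, 3) camera intrinsic matrix.

    Illustrative simplification: treat the inter-frame motion as a pure
    rotation, so frame-1 pixels map to frame 2 via the infinite homography
    K @ R_imu @ inv(K); matches with large reprojection error are outliers.
    """
    K_inv = np.linalg.inv(K)
    ones = np.ones((pts1.shape[0], 1))
    h1 = np.hstack([pts1, ones])                  # homogeneous pixels, frame 1
    pred = (K @ R_imu @ K_inv @ h1.T).T           # predicted frame-2 pixels
    pred = pred[:, :2] / pred[:, 2:3]             # dehomogenize
    err = np.linalg.norm(pred - pts2, axis=1)     # reprojection error (px)
    return err < thresh_px                        # True = kept as inlier
```

Because every match is checked directly against the prior, an apparent mismatch is rejected in a single pass, whereas RANSAC would need many random hypothesize-and-verify iterations to reach the same consensus set.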
Full Text:

Other Notes:
