JCSE, vol. 18, no. 1, pp. 47-56, 2024

DOI: http://dx.doi.org/10.5626/JCSE.2024.18.1.47

Real-Time Retargeting of Human Poses from Monocular Images and Videos to the NAO Robot

Oscar Burga, Jonathan Villegas, and Willy Ugarte
Department of Computer Science, Universidad Peruana de Ciencias Aplicadas, Lima, Peru

Abstract: Extensive research has investigated human-robot motion retargeting, but most existing methods rely on sensors or multiple cameras to detect human poses and movements, while many others are unsuitable for real-time scenarios. This paper presents an integrated solution for real-time human-to-robot pose retargeting that uses only ordinary monocular images and video as input. We use deep learning models to perform three-dimensional human pose estimation on the monocular images and video, and then compute the set of joint angles the robot must adopt to reproduce the detected human pose as accurately as possible. We evaluate our solution on SoftBank's NAO robot and show that it can produce promising approximations and imitations of human poses and motions, subject to the limitations imposed by the robot's degrees of freedom, joint constraints, and movement speed.
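
The full paper is not reproduced on this page, but as a rough illustration of the vector-based joint-angle computation described in the abstract, the following Python sketch (an assumption for illustration, not the authors' code) derives an elbow angle from three hypothetical 3D keypoints and clamps the resulting command to the NAO's approximate LElbowRoll limits:

import numpy as np

def joint_angle(a, b, c):
    """Angle in radians at keypoint b, formed by the vectors b->a and b->c."""
    v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Hypothetical 3D keypoints (in meters), as a monocular pose estimator might return them.
shoulder, elbow, wrist = [0.0, 1.40, 0.0], [0.25, 1.15, 0.05], [0.30, 0.90, 0.30]

angle = joint_angle(shoulder, elbow, wrist)   # interior elbow angle
elbow_roll = -(np.pi - angle)                 # 0 rad when the arm is straight
# Clamp to the NAO's LElbowRoll range (approximately -1.5446 to -0.0349 rad) before sending.
elbow_roll = float(np.clip(elbow_roll, -1.5446, -0.0349))
print(f"elbow angle: {angle:.3f} rad, NAO LElbowRoll target: {elbow_roll:.3f} rad")

The same angle-between-vectors pattern generalizes to the other joints; in practice the retargeted angles would also need to be rate-limited to respect the movement speed constraints mentioned in the abstract.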

Keywords: Human pose estimation; Humanoid robot; Motion retargeting; Geometry; Vectors

