Research

Motion Capture-based Robotic Imitation: A Keyframeless Implementation Method using Multivariate Empirical Mode Decomposition
Ran Dong, Qiong Chang, Meng Joo Er, Junpei Zhong, Soichiro Ikuno
IEEE/ASME Transactions on Mechatronics, pp. 1-12, 2024

Learning Multivariate Empirical Mode Decomposition for Spectral Motion Editing
Ran Dong, Soichiro Ikuno, Xiang Yang
SIGGRAPH Asia 2023 Technical Communications, pp. 1-4, 2023

Investigating the Effect of Jo-Ha-Kyū on Music Tempos and Kinematics across Cultures: Animation Design for 3D Characters Using Japanese Bunraku Theater
Ran Dong, Dongsheng Cai, Shingo Hayano, Shinobu Nakagawa, Soichiro Ikuno
Leonardo, 55(5), pp. 468-474, 2022

Electromagnetic Penetration and Reflection Analysis in Fractal Structures using Three-dimensional Empirical Mode Decomposition
Ran Dong, Yuki Fujita, Hiroshi Nakamura, Soichiro Ikuno
IEEE Transactions on Magnetics, 58(9), pp. 1-4, 2022

Nonlinear Frequency Analysis of COVID-19 Spread in Tokyo using Empirical Mode Decomposition
Ran Dong, Shaowen Ni, Soichiro Ikuno
Scientific Reports, 12(1), pp. 1-12, 2022

A Deep Learning Framework for Realistic Robot Motion Generation
Ran Dong, Qiong Chang, Soichiro Ikuno
Neural Computing and Applications, 35, pp. 23343-23356, 2021

Motion Capture Data Analysis in the Instantaneous Frequency-Domain Using Hilbert-Huang Transform
Ran Dong, Dongsheng Cai, Soichiro Ikuno
Sensors, 20, 6534, 2020

Robot Motion Design Using Bunraku Emotional Expressions – Focusing on Jo-ha-kyū in Sounds and Movements
Ran Dong, Yang Chen, Dongsheng Cai, Shinobu Nakagawa, Tomonari Higaki, Nobuyoshi Asai
Advanced Robotics, 34(5), pp. 299-312, 2020

Nonlinear Dance Motion Analysis and Motion Editing using Hilbert-Huang Transform
Ran Dong, Dongsheng Cai, Nobuyoshi Asai
Proceedings of CGI '17, Yokohama, Japan, June 27-30, 2017, 6 pages