
Menolotto, M.; Komaris, D.S.; Tedesco, S.; O'Flynn, B.; Walsh, M. Motion Capture Technology. Encyclopedia. Available online: (accessed on 19 June 2024).
Motion Capture Technology

Motion capture (MoCap) is the process of digitally tracking and recording the movements of objects or living beings in space.

Keywords: motion tracking; robot control; wearable sensors; gait analysis; IMU

1. Introduction

Different technologies and techniques have been developed to capture motion. Camera-based systems with infrared (IR) cameras, for example, can be used to triangulate the location of retroreflective rigid bodies attached to the targeted subject. Depth-sensitive cameras, projecting light towards an object, can estimate depth from the time delay between light emission and backscattered light detection[1]. Systems based on inertial sensors[2], electromagnetic fields[3] and potentiometers that track the relative movements of articulated structures[4] also exist. Hybrid systems combine different MoCap technologies to improve precision and reduce camera occlusions[5]. Research has also focused on the handling and processing of high-dimensional data sets with a wide range of analysis techniques, such as machine learning[6], Kalman filters[7], hierarchical clustering[8] and others.
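Of the analysis techniques above, the Kalman filter is the easiest to illustrate compactly. The sketch below is a minimal one-dimensional filter that fuses a gyroscope rate (prediction step) with an accelerometer-derived joint angle (correction step), a common pattern in inertial MoCap; the signal and noise values are illustrative assumptions, not parameters taken from the cited systems.

```python
# Minimal 1D Kalman filter for inertial motion tracking: integrate the gyro
# rate to predict the angle, then correct with the accelerometer-derived
# angle. Noise variances q (process) and r (measurement) are assumed values.

def kalman_step(angle, p, gyro_rate, accel_angle, dt, q=0.01, r=0.1):
    """One predict/update cycle; returns (new angle estimate, new variance)."""
    # Predict: integrate the gyro rate; uncertainty grows by process noise q.
    angle_pred = angle + gyro_rate * dt
    p_pred = p + q

    # Update: blend in the accelerometer angle, weighted by the Kalman gain.
    k = p_pred / (p_pred + r)                 # gain in [0, 1]
    angle_new = angle_pred + k * (accel_angle - angle_pred)
    p_new = (1.0 - k) * p_pred
    return angle_new, p_new

# Hypothetical example: a limb segment held at 30 degrees, stationary gyro.
angle, p = 0.0, 1.0
for _ in range(100):
    angle, p = kalman_step(angle, p, gyro_rate=0.0, accel_angle=30.0, dt=0.01)
```

After a few iterations the estimate converges to the accelerometer angle while the variance settles at a small steady-state value; in a real IMU pipeline the same predict/update cycle runs per sample, with the gyro compensating for accelerometer noise and the accelerometer compensating for gyro drift.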

2. Application

Thanks to their versatility, MoCap technologies are employed in a wide range of applications. In healthcare and clinical settings, they aid in the diagnosis and treatment of physical ailments, for example, by reviewing the motor function of a patient or by comparing past recordings to see if a rehabilitation approach had the desired effect[9]. Sports applications also benefit from MoCap by breaking down the athletes' motion to analyse the efficiency of the athletic posture and make performance-enhancing modifications[10]. In industry, MoCap is predominantly used in the entertainment[11] and gaming[12] sectors, followed by relatively few applications in robotics[13], automotive[14] and construction[15].

2.1. MoCap Industrial Applications

MoCap techniques for industrial applications were primarily used for the assessment of health and safety risks in the working environment (Table 1, 64.4%), with fatigue and proper posture the most frequently targeted issues[16][17][18]. Productivity evaluation was the second most widespread application (20.3%), with studies typically aiming to identify inefficiencies or alternative approaches to improve industrial processes. Similarly, MoCap techniques were also employed to directly improve workers' productivity (10.1%), whereas 8.5% of the studies focused on task monitoring[19] or on the quality control of industrial processes[20].

Table 1. Generic MoCap applications in industry.

Application: Percentage of Studies
Workers' Health and Safety: 64.4%
Improvement of Industrial Process or Product: 20.3%
Workers' Productivity Improvement: 10.1%
Machinery Monitoring and Quality Control: 8.5%

  1. Zhengyou Zhang; Microsoft Kinect Sensor and Its Effect. IEEE Multimedia 2012, 19, 4-10, 10.1109/mmul.2012.24.
  2. Roetenberg, D.; Luinge, H.; Slycke, P.; Xsens MVN: Full 6DOF human motion tracking using miniature inertial sensors. Xsens Motion Technologies BV. Tech. Rep. 2009, 1, null.
  3. Richard W Bohannon; Steven Harrison; Jeffrey Kinsella-Shaw; Reliability and validity of pendulum test measures of spasticity obtained with the Polhemus tracking system from patients with chronic stroke. Journal of NeuroEngineering and Rehabilitation 2009, 6, 30-30, 10.1186/1743-0003-6-30.
  4. Yeongyu Park; Jeongsoo Lee; Joonbum Bae; Development of a Wearable Sensing Glove for Measuring the Motion of Fingers Using Linear Potentiometers and Flexible Wires. IEEE Transactions on Industrial Informatics 2014, 11, 198-206, 10.1109/tii.2014.2381932.
  5. Bentley, M. Wireless and Visual Hybrid Motion Capture System. U.S. Patent 9,320,957, 26 April 2016.
  6. Dimitrios-Sokratis Komaris; Eduardo Perez-Valero; Luke Jordan; John Barton; Liam Hennessy; Brendan O'Flynn; Salvatore Tedesco; Predicting Three-Dimensional Ground Reaction Forces in Running by Using Artificial Neural Networks and Lower Body Kinematics. IEEE Access 2019, 7, 156779-156786, 10.1109/access.2019.2949699.
  7. Mei Jin; Jinge Zhao; Ju Jin; Guohui Yu; Wenchao Li; The adaptive Kalman filter based on fuzzy logic for inertial motion capture system. Measurement 2014, 49, 196-204, 10.1016/j.measurement.2013.11.022.
  8. Dimitrios-Sokratis Komaris; Cheral Govind; Jon Clarke; Alistair Ewen; Artaban Jeldi; Andrew Murphy; Philip L. Riches; Identifying car ingress movement strategies before and after total knee replacement. International Biomechanics 2020, 7, 9-18, 10.1080/23335432.2020.1716847.
  9. Kamiar Aminian; Bijan Najafi; Capturing human motion using body-fixed sensors: outdoor measurement and clinical applications. Computer Animation and Virtual Worlds 2004, 15, 79-94, 10.1002/cav.2.
  10. Tamir, M.; Oz, G. Real-Time Objects Tracking and Motion Capture in Sports Events. U.S. Patent Application No. 11/909,080, 14 August 2008.
  11. Chris Bregler; Motion Capture Technology for Entertainment [In the Spotlight]. IEEE Signal Processing Magazine 2007, 24, 160-158, 10.1109/MSP.2007.4317482.
  12. Geng, W.; Yu, G. Reuse of motion capture data in animation: A Review. In Proceedings of the Lecture Notes in Computer Science; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2003; pp. 620–629.
  13. Field, M.; Stirling, D.; Naghdy, F.; Pan, Z. Motion capture in robotics review. In Proceedings of the 2009 IEEE International Conference on Control and Automation; Institute of Electrical and Electronics Engineers (IEEE), Christchurch, New Zealand, 9–11 December 2009; pp. 1697–1702.
  14. Pierre Plantard; Hubert P.H. Shum; Anne-Sophie Le Pierres; Franck Multon; Validation of an ergonomic assessment method using Kinect data in real workplace conditions. Applied Ergonomics 2017, 65, 562-569, 10.1016/j.apergo.2016.10.015.
  15. Enrique Valero; Aparajithan Sivanathan; Frédéric Bosché; Mohamed Abdel-Wahab; Analysis of construction trade worker body motions using a wearable and wireless motion sensor network. Automation in Construction 2017, 83, 48-55, 10.1016/j.autcon.2017.08.001.
  16. Maman, Z.S.; Yazdi, M.A.A.; Cavuoto, L.A.; Megahed, F.M. A data-driven approach to modeling physical fatigue in the workplace using wearable sensors. Appl. Ergon. 2017, 65, 515–529.
  17. Merino, G.S.A.D.; Da Silva, L.; Mattos, D.; Guimarães, B.; Merino, E.A.D. Ergonomic evaluation of the musculoskeletal risks in a banana harvesting activity through qualitative and quantitative measures, with emphasis on motion capture (Xsens) and EMG. Int. J. Ind. Ergon. 2019, 69, 80–89.
  18. Romain Balaguier; Pascal Madeleine; Kevin Rose-Dulcina; Nicolas Vuillerme; Trunk kinematics and low back pain during pruning among vineyard workers—A field study at the Chateau Larose-Trintaudon. PLoS ONE 2017, 12, e0175126, 10.1371/journal.pone.0175126.
  19. Mingjie Dong; Jianfeng Li; Wusheng Chou; A new positioning method for remotely operated vehicle of the nuclear power plant. Industrial Robot: An International Journal 2019, 47, 177-186, 10.1108/ir-07-2019-0140.
  20. Han Jun Lin; Hee Seok Chang; In-process monitoring of micro series spot welding using dual accelerometer system. Welding in the World 2019, 63, 1641-1654, 10.1007/s40194-019-00799-w.
  21. Krüger, J.; Nguyen, T.D. Automated vision-based live ergonomics analysis in assembly operations. CIRP Ann. 2015, 64, 9–12.
  22. Austad, H.; Wiggen, Ø.; Færevik, H.; Seeberg, T.M. Towards a wearable sensor system for continuous occupational cold stress assessment. Ind. Health 2018, 56, 228–240.
  23. Brents, C.; Hischke, M.; Reiser, R.; Rosecrance, J.C. Low Back Biomechanics of Keg Handling Using Inertial Measurement Units. In Software Engineering in Intelligent Systems; Springer Science and Business Media: Berlin/Heidelberg, Germany, 2018; Volume 825, pp. 71–81.
  24. Caputo, F.; Greco, A.; D’Amato, E.; Notaro, I.; Sardo, M.L.; Spada, S.; Ghibaudo, L. A human postures inertial tracking system for ergonomic assessments. In Proceedings of the 20th Congress of the International Ergonomics Association (IEA 2018), Florence, Italy, 26–30 August 2018; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2018; Volume 825, pp. 173–184.
  25. Greco, A.; Muoio, M.; Lamberti, M.; Gerbino, S.; Caputo, F.; Miraglia, N. Integrated wearable devices for evaluating the biomechanical overload in manufacturing. In Proceedings of the 2019 II Workshop on Metrology for Industry 4.0 and IoT (MetroInd4.0&IoT), Naples, Italy, 4–6 June 2019; pp. 93–97.
  26. Tadele Belay Tuli; Martin Manns; Real-Time Motion Tracking for Humans and Robots in a Collaborative Assembly Task. Proceedings 2019, 42, 48, 10.3390/ecsa-6-06636.
  27. S. R. Fletcher; Teegan L. Johnson; John Thrower; A study to trial the use of inertial non-optical motion capture for ergonomic analysis of manufacturing work. Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture 2016, 232, 90-98, 10.1177/0954405416660997.
  28. Kinam Kim; Jingdao Chen; Yong K. Cho; Evaluation of Machine Learning Algorithms for Worker’s Motion Recognition Using Motion Sensors. Computing in Civil Engineering 2019 2019, null, 51–58, 10.1061/9780784482438.007.
  29. Nipun D. Nath; Reza Akhavian; Amir H. Behzadan; Ergonomic analysis of construction worker's body postures using wearable mobile sensors. Applied Ergonomics 2017, 62, 107-117, 10.1016/j.apergo.2017.02.007.
  30. Ragaglia, M.; Zanchettin, A.M.; Rocco, P. Trajectory generation algorithm for safe human-robot collaboration based on multiple depth sensor measurements. Mechatronics 2018, 55, 267–281.
  31. Scimmi, L.S.; Melchiorre, M.; Mauro, S.; Pastorelli, S.P. Implementing a Vision-Based Collision Avoidance Algorithm on a UR3 Robot. In Proceedings of the 2019 23rd International Conference on Mechatronics Technology (ICMT), Salerno, Italy, 23–26 October 2019; Institute of Electrical and Electronics Engineers (IEEE): New York City, NY, USA, 2019; pp. 1–6.
  32. Yang, K.; Ahn, C.; Vuran, M.C.; Kim, H. Sensing Workers gait abnormality for safety hazard identification. In Proceedings of the 33rd International Symposium on Automation and Robotics in Construction (ISARC), Auburn, AL, USA, 18–21 July 2016; pp. 957–965.
  33. Tarabini, M.; Marinoni, M.; Mascetti, M.; Marzaroli, P.; Corti, F.; Giberti, H.; Villa, A.; Mascagni, P. Monitoring the human posture in industrial environment: A feasibility study. In Proceedings of the 2018 IEEE Sensors Applications Symposium (SAS), Seoul, Korea, 12–14 March 2018; pp. 1–6.
  34. Lim, T.-K.; Park, S.-M.; Lee, H.-C.; Lee, D.-E. Artificial neural network-based slip-trip classifier using smart sensor for construction workplace. J. Constr. Eng. Manag. 2016, 142, 04015065.
  35. Monaco, M.G.L.; Fiori, L.; Marchesi, A.; Greco, A.; Ghibaudo, L.; Spada, S.; Caputo, F.; Miraglia, N.; Silvetti, A.; Draicchio, F. Biomechanical overload evaluation in manufacturing: A novel approach with sEMG and inertial motion capture integration. In Software Engineering in Intelligent Systems; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2018; Volume 818, pp. 719–726.
  36. Monaco, M.G.L.; Marchesi, A.; Greco, A.; Fiori, L.; Silvetti, A.; Caputo, F.; Miraglia, N.; Draicchio, F. Biomechanical load evaluation by means of wearable devices in industrial environments: An inertial motion capture system and sEMG based protocol. In Software Engineering in Intelligent Systems; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2018; Volume 795, pp. 233–242.
  37. Nahavandi, D.; Hossny, M. Skeleton-free RULA ergonomic assessment using Kinect sensors. Intell. Decis. Technol. 2017, 11, 275–284.
  38. Peppoloni, L.; Filippeschi, A.; Ruffaldi, E.; Avizzano, C.A. A novel wearable system for the online assessment of risk for biomechanical load in repetitive efforts. Int. J. Ind. Ergon. 2016, 52, 1–11.
  39. Seo, J.; Alwasel, A.A.; Lee, S.; Abdel-Rahman, E.M.; Haas, C. A comparative study of in-field motion capture approaches for body kinematics measurement in construction. Robotica 2017, 37, 928–946.
  40. Yang, K.; Jebelli, H.; Ahn, C.R.; Vuran, M.C. Threshold-Based Approach to Detect Near-Miss Falls of Iron Workers Using Inertial Measurement Units. Comput. Civ. Eng. 2015, 148–155.
  41. Yang, K.; Ahn, C.; Kim, H. Validating ambulatory gait assessment technique for hazard sensing in construction environments. Autom. Constr. 2019, 98, 302–309.
  42. Yang, K.; Ahn, C.; Vuran, M.C.; Kim, H. Collective sensing of workers’ gait patterns to identify fall hazards in construction. Autom. Constr. 2017, 82, 166–178.
  43. Jebelli, H.; Ahn, C.R.; Stentz, T.L. Comprehensive fall-risk assessment of construction workers using inertial measurement units: Validation of the gait-stability metric to assess the fall risk of iron workers. J. Comput. Civ. Eng. 2016, 30, 04015034.
  44. Kim, H.; Ahn, C.; Yang, K. Identifying safety hazards using collective bodily responses of workers. J. Constr. Eng. Manag. 2017, 143, 04016090.
  45. Yang, K.; Ahn, C.; Vuran, M.C.; Aria, S.S. Semi-supervised near-miss fall detection for ironworkers with a wearable inertial measurement unit. Autom. Constr. 2016, 68, 194–202.
  46. Zhong, H.; Kanhere, S.S.; Chou, C.T. WashInDepth: Lightweight hand wash monitor using depth sensor. In Proceedings of the 13th Annual International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services, Hiroshima, Japan, 28 November–1 December 2016; pp. 28–37.
  47. Baghdadi, A.; Megahed, F.M.; Esfahani, E.T.; Cavuoto, L.A. A machine learning approach to detect changes in gait parameters following a fatiguing occupational task. Ergonomics 2018, 61, 1116–1129.
  48. Balaguier, R.; Madeleine, P.; Rose-Dulcina, K.; Vuillerme, N. Trunk kinematics and low back pain during pruning among vineyard workers-A field study at the Chateau Larose-Trintaudon. PLoS ONE 2017, 12, e0175126.
  49. Faber, G.S.; Koopman, A.S.; Kingma, I.; Chang, C.; Dennerlein, J.T.; Van Dieën, J.H. Continuous ambulatory hand force monitoring during manual materials handling using instrumented force shoes and an inertial motion capture suit. J. Biomech. 2018, 70, 235–241.
  50. Hallman, D.M.; Jørgensen, M.B.; Holtermann, A. Objectively measured physical activity and 12-month trajectories of neck–shoulder pain in workers: A prospective study in DPHACTO. Scand. J. Public Health 2017, 45, 288–298.
  51. Jebelli, H.; Ahn, C.; Stentz, T.L. Fall risk analysis of construction workers using inertial measurement units: Validating the usefulness of the postural stability metrics in construction. Saf. Sci. 2016, 84, 161–170.
  52. Kim, H.; Ahn, C.; Stentz, T.L.; Jebelli, H. Assessing the effects of slippery steel beam coatings to ironworkers’ gait stability. Appl. Ergon. 2018, 68, 72–79.
  53. Mehrizi, R.; Peng, X.; Xu, X.; Zhang, S.; Metaxas, D.; Li, K. A computer vision based method for 3D posture estimation of symmetrical lifting. J. Biomech. 2018, 69, 40–46.
  54. Chen, H.; Luo, X.; Zheng, Z.; Ke, J. A proactive workers’ safety risk evaluation framework based on position and posture data fusion. Autom. Constr. 2019, 98, 275–288.
  55. Yan Zhang; Chen Zhang; Rico Nestler; Gunther Notni; Efficient 3D object tracking approach based on convolutional neural network and Monte Carlo algorithms used for a pick and place robot. Photonics and Education in Measurement Science 2019 2019, 11144, 1114414, 10.1117/12.2530333.
  56. Antonio G. Sestito; Tyler M. Frasca; Aidan O’Rourke; Lili Ma; Douglas E. Dow; Control for Camera of a Telerobotic Human Computer Interface. Volume 8B: Heat Transfer and Thermal Engineering 2015, 5, null, 10.1115/imece2015-53617.
  57. Jha, A.; Chiddarwar, S.S.; Bhute, R.Y.; Alakshendra, V.; Nikhade, G.; Khandekar, P.M. Imitation learning in industrial robots. In Proceedings of the Advances in Robotics on-AIR ’17, New Delhi, India, 28 June–2 July 2017; pp. 1–6.
  58. Fabian Mueller; Christian Deuerlein; Michael Koch; Intuitive Welding Robot Programming via Motion Capture and Augmented Reality. IFAC-PapersOnLine 2019, 52, 294-299, 10.1016/j.ifacol.2019.10.045.
  59. Tao, Q.; Kang, J.; Sun, W.; Li, Z.; Huo, X. Digital evaluation of sitting posture comfort in human-vehicle system under industry 4.0 framework. Chin. J. Mech. Eng. 2016, 29, 1096–1103.
  60. Wang, W.; Li, R.; Diekel, Z.M.; Chen, Y.; Zhang, Z.; Jia, Y. Controlling object hand-over in human-robot collaboration via natural wearable sensing. IEEE Trans. Hum. Mach. Syst. 2019, 49, 59–71.
  61. Albert, D.L.; Beeman, S.M.; Kemper, A.R. Occupant kinematics of the Hybrid III, THOR-M, and postmortem human surrogates under various restraint conditions in full-scale frontal sled tests. Traffic Inj. Prev. 2018, 19, S50–S58.
  62. Cardoso, M.; McKinnon, C.; Viggiani, D.; Johnson, M.J.; Callaghan, J.P.; Albert, W.J. Biomechanical investigation of prolonged driving in an ergonomically designed truck seat prototype. Ergonomics 2017, 61, 367–380.
  63. Oyekan, J.; Prabhu, V.; Tiwari, A.; Baskaran, V.; Burgess, M.; McNally, R. Remote real-time collaboration through synchronous exchange of digitised human–workpiece interactions. Futur. Gener. Comput. Syst. 2017, 67, 83–93.
  64. Prabhu, V.A.; Song, B.; Thrower, J.; Tiwari, A.; Webb, P. Digitisation of a moving assembly operation using multiple depth imaging sensors. Int. J. Adv. Manuf. Technol. 2015, 85, 163–184.
  65. Bortolini, M.; Faccio, M.; Gamberi, M.; Pilati, F. Motion Analysis System (MAS) for production and ergonomics assessment in the manufacturing processes. Comput. Ind. Eng. 2020, 139, 105485.
  66. Akhavian, R.; Behzadan, A.H. Productivity analysis of construction worker activities using smartphone sensors. In Proceedings of the 16th International Conference on Computing in Civil and Building Engineering (ICCCBE2016), Osaka, Japan, 6–8 July 2016.
  67. Malaisé, A.; Maurice, P.; Colas, F.; Charpillet, F.; Ivaldi, S. Activity Recognition with Multiple Wearable Sensors for Industrial Applications. In Proceedings of the ACHI 2018-Eleventh International Conference on Advances in Computer-Human Interactions, Rome, Italy, 25 March 2018.
  68. Philipp Agethen; Michael Otto; Stefan Mengel; Enrico Rukzio; Using Marker-less Motion Capture Systems for Walk Path Analysis in Paced Assembly Flow Lines. Procedia CIRP 2016, 54, 152-157, 10.1016/j.procir.2016.04.125.
  69. Bastian C. Müller; The Duy Nguyen; Quang-Vinh Dang; Bui Minh Duc; G. Seliger; Jörg Krüger; Holger Kohl; Motion Tracking Applied in Assembly for Worker Training in different Locations. Procedia CIRP 2016, 48, 460-465, 10.1016/j.procir.2016.04.117.
  70. Savvas Papaioannou; Andrew Markham; Niki Trigoni; Tracking People in Highly Dynamic Industrial Environments. IEEE Transactions on Mobile Computing 2016, 16, 2351-2365, 10.1109/TMC.2016.2613523.
  71. McGregor, A.; Dobie, G.; Pearson, N.; MacLeod, C.; Gachagan, A. Mobile robot positioning using accelerometers for pipe inspection. In Proceedings of the 14th International Conference on Concentrator Photovoltaic Systems, Puertollano, Spain, 16–18 April 2018; AIP Publishing: Melville, NY, USA, 2019; Volume 2102, p. 060004.
  72. Ham, Y.; Yoon, H. Motion and visual data-driven distant object localization for field reporting. J. Comput. Civ. Eng. 2018, 32, 04018020.
  73. Herwan, J.; Kano, S.; Ryabov, O.; Sawada, H.; Kasashima, N.; Misaka, T. Retrofitting old CNC turning with an accelerometer at a remote location towards Industry 4.0. Manuf. Lett. 2019, 21, 56–59.
Update Date: 14 Oct 2020