A Game-Theoretic and Multimodal Interaction Framework for Collaborative Robots in Smart Manufacturing
DOI: https://doi.org/10.31181/dmame7120241433

Keywords: Collaborative Robots, Human-Robot Interaction, Multimodal Perception, Game-Theoretic Optimization, Smart Manufacturing Systems

Abstract
Industrial assembly settings face novel challenges following the integration of collaborative robots (cobots), particularly in balancing effective human-robot interaction against operational efficiency. Interpreting and acting on complex human behaviours remains a significant difficulty, especially for operations conducted under dynamic and unpredictable conditions. A critical requirement has therefore emerged for intelligent interfaces capable of autonomously regulating systems while facilitating seamless collaboration between human operators and robotic agents. This study presents a multimodal perception approach enhanced by game-theoretic optimisation, which improves cobots' responsiveness and strategic flexibility. This method was chosen for its capacity to model interactive behaviours in variable environments. By integrating vision, auditory, and tactile sensing technologies, cobots can accurately interpret human communicative cues even when individual sensing channels are degraded. The proposed framework uses game theory to model the strategic and dynamic interactions between humans and robots, thereby addressing the dual challenges of task allocation and decision-making. The system supports optimal coordination and prompt conflict resolution while enabling real-time behavioural adjustments based on utility-driven outcomes. Its efficacy has been validated through both simulated and practical implementations in industrial assembly scenarios, confirming its potential to enhance collaborative efficiency and safety while maintaining adaptability. Overall, this research presents a strategic design framework for cobot interaction that emphasises perceptual processing, contributing to substantial advancements in the evolution of intelligent manufacturing systems.
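The abstract's game-theoretic task-allocation idea can be illustrated with a minimal sketch (not the authors' actual formulation): the human and the cobot each choose a task, each agent has a hypothetical utility matrix, and stable allocations are the pure-strategy Nash equilibria, i.e. cells where neither side gains by unilaterally switching tasks. All payoff values below are invented for illustration.

```python
from itertools import product

# Hypothetical payoffs. Rows index the human's task choice, columns the
# robot's task choice (0 = fine assembly, 1 = heavy lifting). Utilities
# reward complementary allocation: each agent prefers the task it suits.
HUMAN_PAYOFF = [[1, 4],
                [3, 1]]
ROBOT_PAYOFF = [[1, 3],
                [4, 1]]

def pure_nash_equilibria(u_h, u_r):
    """Return (human_task, robot_task) pairs where neither agent can
    improve its utility by unilaterally switching tasks."""
    n, m = len(u_h), len(u_h[0])
    eqs = []
    for i, j in product(range(n), range(m)):
        human_best = all(u_h[i][j] >= u_h[k][j] for k in range(n))
        robot_best = all(u_r[i][j] >= u_r[i][l] for l in range(m))
        if human_best and robot_best:
            eqs.append((i, j))
    return eqs

# Both complementary allocations are stable: (0, 1) and (1, 0).
print(pure_nash_equilibria(HUMAN_PAYOFF, ROBOT_PAYOFF))
```

In a real-time setting, the utility matrices would be updated from the multimodal perception layer (e.g. observed operator fatigue or gesture cues) and the equilibrium recomputed, which is the kind of utility-driven behavioural adjustment the abstract describes.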
License
Copyright (c) 2025 Decision Making: Applications in Management and Engineering

This work is licensed under a Creative Commons Attribution 4.0 International License.