Towards Physical Interaction Control of Collaborative Industrial Robots Augmented with Scientific Machine Learning for SME Productions

Xingyu Yang

Research output: Ph.D. thesis


The integration of collaborative industrial robots (cobots) into small and medium-sized enterprise (SME) production is driven by their inherent advantages, including enhanced safety and adaptability. However, deploying cobots for automation in SME production poses distinct challenges: workspaces are limited, human workers operate in close proximity, and the environments are semi-structured or dynamic. These characteristics prioritize safety and human-robot collaboration (HRC), necessitating enhanced precision, perception, and physical interaction control of cobots within the HRC paradigm. With the primary objective of facilitating physical human-robot interaction (pHRI) during cobot deployment in SMEs, this dissertation addresses these challenges by utilizing and integrating cutting-edge technologies such as digital twins, scientific machine learning, robot vision, and physical interaction control, further enhancing the capabilities of cobots for HRC in SMEs. The dissertation is structured to systematically confront and resolve these challenges, as outlined below.

Firstly, cobots benefit from strain wave gearing for enhanced flexibility and compactness. However, this joint reducer poses new challenges when investigating the robot's dynamics: the joint has a complex structure and composition, yet most research simplifies it even though the non-linearities are apparent. Moreover, the complex joint dynamics have hindered the application of the digital twin, an enabling technology of Industry 4.0 for process planning, control, and optimization, for a more accurate bilateral behavioral mapping. Therefore, a dynamics-enhanced digital twin for the cobot joints is proposed by integrating a comprehensive joint dynamic model, which accounts for the internal interactions and the transmission mechanism, into the digital twin framework, enabling bilateral data exchange and control in real time and reducing the sim-to-real deviation.
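The flavor of such a joint model can be conveyed with a minimal two-mass flexible-joint sketch (not the comprehensive model developed in the thesis): a motor inertia drives a link inertia through a strain-wave gear of ratio N whose finite torsional stiffness K makes the transmission elastic. All parameter values below are illustrative assumptions, not identified cobot parameters.

```python
def simulate_flexible_joint(tau_m, dt=1e-4, steps=20000,
                            J_m=1e-4, J_l=0.05, N=100.0,
                            K=5000.0, B_m=1e-4, B_l=0.01):
    """Two-mass flexible-joint model of a strain-wave-geared joint:
    motor inertia J_m drives link inertia J_l through a gear (ratio N)
    with finite torsional stiffness K and viscous damping B_m, B_l.
    Integrated with semi-implicit Euler; returns final motor and link angles."""
    th_m = dth_m = th_l = dth_l = 0.0
    for _ in range(steps):
        tau_s = K * (th_m / N - th_l)                  # gear torsion torque
        ddth_m = (tau_m - B_m * dth_m - tau_s / N) / J_m
        ddth_l = (tau_s - B_l * dth_l) / J_l
        dth_m += ddth_m * dt
        th_m += dth_m * dt
        dth_l += ddth_l * dt
        th_l += dth_l * dt
    return th_m, th_l
```

Under a constant motor torque, the link angle tracks the gear-scaled motor angle up to a small elastic deflection — the quantity a rigid-joint model discards and a dynamics-enhanced digital twin must capture.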

Secondly, precisely predicting the behavior of a dynamic system faces two obstacles: formulating the model and identifying its parameters. Since non-linear system identification remains an open challenge, utilizing machine learning to identify such systems is attractive due to its strong regression capabilities. However, parameter identification requires a more interpretable tool, and the challenge lies in lifting the veil of machine learning and making it transparent. Therefore, the concept of scientific machine learning is developed by embedding physical models into neural networks and assigning neurons direct physical meaning, following the paradigm of supervised learning. The results demonstrate that the proposed physics-informed neural network achieves superior identification performance compared to the conventional approach, especially concerning non-linear terms and dynamic couplings.
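As a minimal illustration of the idea (not the network developed in the thesis), the sketch below embeds a simple rigid-joint model with viscous and smoothed Coulomb friction into a one-layer "network" whose three weights are the physical parameters themselves, trained by supervised regression on synthetic torque data. The trajectory, noise level, and friction model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth joint parameters we pretend not to know:
# inertia J, viscous friction B, Coulomb friction tau_c (illustrative values).
J_true, B_true, tc_true = 0.12, 0.35, 0.8

# Synthetic excitation trajectory and noisy measured torque.
t = np.linspace(0, 10, 2000)
dq = np.sin(2 * t) + 0.5 * np.sin(5 * t)            # joint velocity
ddq = 2 * np.cos(2 * t) + 2.5 * np.cos(5 * t)       # joint acceleration
tau = J_true * ddq + B_true * dq + tc_true * np.tanh(50 * dq)
tau += 0.01 * rng.standard_normal(t.size)

# "Physics-informed" layer: each weight IS a physical parameter [J, B, tau_c],
# and the features are the physical basis functions of the dynamic model.
X = np.stack([ddq, dq, np.tanh(50 * dq)], axis=1)
w = np.zeros(3)
lr = 0.05
for _ in range(5000):
    grad = 2 * X.T @ (X @ w - tau) / t.size          # MSE gradient
    w -= lr * grad                                    # supervised update
```

After training, `w` can be read off directly as inertia, viscous friction, and Coulomb friction — the transparency that motivates embedding physics into the network instead of using a black-box regressor.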

Thirdly, cobots frequently engage in extensive physical interactions with both humans and their surroundings. Ensuring the safety and efficiency of these interactions requires a precise pHRI model to predict interaction behavior accurately. However, cobots have complex structural dynamics, and the pHRI process itself is inherently intricate, which calls for a generic pHRI model. Therefore, a comprehensive pHRI model is developed by systematically integrating the complete dynamics of the cobot with a quasi-static contact model for interactions, taking joint dynamics and inertial coupling into account. Furthermore, pHRI experiments are conducted to validate the proposed generic model, showing strong alignment with the model's predictions.
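A much-reduced sketch of the quasi-static contact ingredient (one Cartesian axis, constant stiffnesses; both stiffness values are hypothetical, and the thesis's model additionally includes the full robot dynamics): the robot's rendered Cartesian stiffness and the environment's stiffness act as springs in series, so a commanded position past the contact surface produces a predictable interaction force.

```python
def contact_force(x_cmd, x_surface, k_robot=3000.0, k_env=20000.0):
    """Quasi-static 1-D contact model: force appears only once the
    commanded position passes the surface; robot and environment
    stiffnesses combine as springs in series. Units: m, N/m, N."""
    overshoot = x_cmd - x_surface
    if overshoot <= 0.0:
        return 0.0                                   # no contact yet
    k_eff = 1.0 / (1.0 / k_robot + 1.0 / k_env)      # series stiffness
    return k_eff * overshoot
```

For example, commanding 5 mm past the surface with the stiffnesses above yields roughly 13 N — the kind of prediction a pHRI model must supply before contact actually occurs.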

Fourthly, cobots require a more adaptive control strategy that enhances their compliance while maintaining good task performance. Impedance/admittance control is a promising approach with broad applications in pHRI. However, classical impedance/admittance control operates in an open-loop manner with respect to impedance: no measurements are used to ensure reference impedance tracking, especially when the reference impedance varies over time for increased adaptability. Therefore, a framework for controlling the impedance of a cobot in a closed-loop manner is proposed, utilizing online parameter estimation techniques. Additionally, a mobility-based evaluation paradigm is proposed to comprehensively assess compliant behavior, taking the input frequency into account. The experimental results demonstrate significantly improved performance in tracking both constant and variable impedance.
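The open-loop baseline can be sketched as a discrete admittance controller: the measured external force is rendered through a target mass-spring-damper to generate the compliant reference motion for the inner position loop. The crude steady-state stiffness estimate at the end stands in for the online parameter estimation that closes the impedance loop in the thesis; all impedance parameters here are illustrative.

```python
def admittance_step(x, dx, f_ext, dt, M=2.0, D=20.0, K=200.0):
    """One step of a discrete admittance controller: render the external
    force through the target impedance M*ddx + D*dx + K*x = f_ext
    (semi-implicit Euler) to produce the compliant reference motion."""
    ddx = (f_ext - D * dx - K * x) / M
    dx += ddx * dt
    x += dx * dt
    return x, dx

# Push with a constant 10 N and estimate the rendered stiffness from the
# steady-state response -- a stand-in for proper online estimation.
x = dx = 0.0
for _ in range(5000):                    # 5 s at dt = 1 ms
    x, dx = admittance_step(x, dx, f_ext=10.0, dt=1e-3)
K_hat = 10.0 / x                         # estimated rendered stiffness
```

If `K_hat` deviates from the reference stiffness (here 200 N/m), a closed-loop scheme can correct the rendered impedance online — precisely what open-loop impedance/admittance control cannot do.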

Finally, SME production typically unfolds in semi-structured or cluttered environments with diverse task requirements. These settings pose generic challenges when deploying cobot systems in SMEs for production automation, such as enhancing the cobots' perception of their environment, accomplishing various tasks with a compact toolset, and rapidly fine-tuning the cobot system. Therefore, a generic framework for automating SME production with a cobot system, enabled by learning-based vision, a multifunctional gripper system, and digital twin technology, is proposed and implemented in collaboration with one of our industrial partners. Individual tests of the function modules and final onsite experiments demonstrate that the proposed framework can effectively and accurately automate the selected production process, indicating the feasibility of extending the framework to other applications.
Original language: English
Publisher: Aarhus University
Number of pages: 170
Publication status: Published - Mar 2024


