Open Access Mini-Review

AI-Driven Collaborative Robots: Enhancing Human-Robot Interaction for Emerging Industries

Loso Judijanto*

IPOSS Jakarta, Indonesia

Corresponding Author

Received Date: February 12, 2025; Published Date: February 20, 2025

Introduction

Collaborative robots (cobots) are redefining industrial automation by enabling safe and efficient human-robot collaboration. Unlike traditional industrial robots confined to cages, cobots leverage advanced sensors, artificial intelligence (AI), and adaptive programming to work alongside humans in shared workspaces [1]. Their growing adoption in industries such as manufacturing, healthcare, and logistics highlights the importance of human-robot interaction (HRI) in balancing safety, productivity, and adaptability [2-4]. However, challenges persist in achieving seamless collaboration, including limited contextual awareness in dynamic environments, communication barriers due to unstructured human inputs, and the need for real-time adaptation to workflow variations [5-8]. For instance, studies reveal that 34% of HRI inefficiencies stem from rigid task programming and insufficient sensor fusion for environmental perception [3,9]. Recent advancements in multimodal HRI systems - integrating vision, natural language processing (NLP), and tactile feedback - have demonstrated 25% faster task coordination in manufacturing assembly lines by enabling intuitive human-cobot communication [5,10]. Furthermore, AI-driven reinforcement learning frameworks allow cobots to optimize actions through continuous human feedback, reducing reprogramming requirements by 40% in precision tasks like surgical instrument handling [5,11]. This review explores recent AI-driven advancements in cobots, focusing on innovations that enhance HRI and their transformative potential for emerging industries.

Current State of Collaborative Robots

Modern cobots are distinguished by their adaptability, intrinsic safety mechanisms - such as force-limiting joints and collision detection - and user-friendly programming interfaces, which make them accessible to non-expert users [12-15]. For instance, cobots like Universal Robots’ UR series feature rounded edges and torque sensors to minimize injury risks during accidental contact, setting a benchmark for safety-focused design [16]. Despite these advancements, significant challenges persist in enhancing HRI. Key issues include cobots’ limited contextual understanding, rigid task adaptability, and communication barriers that hinder seamless collaboration [17-19]. Many cobots still require explicit programming for new tasks, reducing their flexibility in dynamic and unstructured environments [20-23].

This limitation is particularly pronounced in sectors like healthcare and logistics, where tasks often demand real-time adaptability [23]. Addressing these challenges necessitates integrating advanced artificial intelligence (AI) systems, such as reinforcement learning and natural language processing, to enable cobots to learn from human demonstrations and adapt to complex instructions autonomously [24-26]. Furthermore, the lack of standardized protocols for HRI exacerbates these challenges. While frameworks like ISO/TS 15066 provide guidelines for safety in collaborative robotics, they do not adequately address AI-specific risks such as algorithmic bias or decision-making transparency [15,27-29]. Future research must focus on developing interoperable systems that enhance cobot functionality while ensuring safety and reliability in diverse operational contexts.

AI-Driven Innovations in Cobots

Machine Learning (ML) for Adaptability

Machine learning (ML) plays a pivotal role in enhancing the adaptability of collaborative robots (cobots), enabling them to learn from human demonstrations and environmental data. Through reinforcement learning (RL), cobots can optimize their actions in real time, such as dynamically adjusting grip strength during intricate assembly tasks, thereby improving operational efficiency and precision [30,31]. For instance, RL-driven cobots have shown significant improvements in manufacturing by learning optimal force application strategies through trial-and-error processes, which are guided by reward functions tailored to specific tasks [31-33].
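
The trial-and-error loop described above can be sketched in miniature. The toy below - assuming a discretized set of candidate grip forces and a reward that penalizes distance from an unknown ideal force - uses an epsilon-greedy bandit update as a simplified stand-in for the task-specific reward functions the cited works describe:

```python
import random

def run_grip_bandit(target_force=12.0, forces=(5.0, 10.0, 15.0, 20.0),
                    episodes=500, epsilon=0.1, alpha=0.1, seed=0):
    """Epsilon-greedy bandit: learn which discrete grip force earns the
    highest reward. Reward peaks when the applied force is close to the
    (hidden) ideal force for the part, so the agent discovers that force
    by trial and error."""
    rng = random.Random(seed)
    q = {f: 0.0 for f in forces}              # value estimate per grip force
    for _ in range(episodes):
        if rng.random() < epsilon:            # explore occasionally
            f = rng.choice(forces)
        else:                                 # otherwise exploit the best
            f = max(q, key=q.get)
        # Reward: negative distance to the ideal force, plus sensor noise.
        reward = -abs(f - target_force) + rng.gauss(0, 0.5)
        q[f] += alpha * (reward - q[f])       # incremental value update
    return max(q, key=q.get)

best = run_grip_bandit()
```

In practice the state and action spaces are continuous and rewards come from human feedback or task sensors; the sketch only illustrates how running value estimates steer the cobot toward the best force without explicit reprogramming.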

In logistics, AI-powered cobots utilize predictive analytics and computer vision to adapt to variable package sizes and weights, allowing them to streamline palletizing and sorting operations. This adaptability has led to measurable improvements in throughput, with some warehouses reporting productivity gains of up to 30% [34-36]. Moreover, advanced ML techniques, such as imitation learning and one-shot learning, empower cobots to acquire new skills with minimal human intervention. For example, cobots can observe a human operator performing a task once and replicate it with high accuracy, significantly reducing the need for extensive programming [37-39]. These advancements underscore the transformative potential of ML in enabling cobots to operate in dynamic environments. By leveraging real-time sensor data and adaptive algorithms, cobots can seamlessly transition between tasks, even those they have not encountered before. This capability not only enhances their flexibility but also positions them as indispensable tools in industries requiring rapid adaptation to changing demands [3,31,40].
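
A minimal sketch of the one-shot idea, using a hypothetical one-dimensional trajectory: the cobot records a single demonstrated path and retargets it to new endpoints by rescaling, preserving the demonstration's shape instead of requiring new programming:

```python
def generalize_demo(demo, new_start, new_goal):
    """One-shot imitation sketch: retarget a single demonstrated 1-D
    trajectory to new endpoints by linear rescaling, so the replayed
    motion preserves the demonstration's shape."""
    d0, d1 = demo[0], demo[-1]
    if d1 == d0:
        raise ValueError("demonstration must move between distinct endpoints")
    # Normalize each demo point to [0, 1], then map into the new range.
    return [new_start + (p - d0) / (d1 - d0) * (new_goal - new_start)
            for p in demo]

# One human demonstration, then replay toward a different start and goal.
path = generalize_demo([0.0, 0.2, 0.7, 1.0], new_start=2.0, new_goal=4.0)
```

Real imitation-learning systems generalize over full joint-space trajectories and visual context; the rescaling step here stands in for that learned generalization.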

Natural Language Processing (NLP)

NLP bridges communication gaps in human-robot collaboration by enabling voice-based commands, contextual intent recognition, and bidirectional feedback loops. Advanced frameworks integrate speech recognition with semantic analysis, allowing cobots to interpret nuanced instructions and adapt workflows in dynamic environments. For instance, systems like CoboVox leverage transformer-based architectures to map spoken commands to robotic actions, reducing programming complexity for non-expert users while maintaining 92.3% accuracy in industrial noise conditions [41-43]. Recent advancements in large language models (LLMs), such as GPT-4 and Code-Llama, have enabled cobots to resolve ambiguities in multi-step assembly tasks through iterative dialogue. Studies demonstrate that LLM-driven cobots achieve 40% faster error resolution in automotive manufacturing by contextualizing troubleshooting queries against CAD schematics [44-47]. However, challenges persist in real-time processing of code-mixed language inputs (e.g., English-technical jargon combinations), with error rates increasing by 18% in multilingual factory settings [48-49]. Emerging solutions combine few-shot learning with domain-specific embeddings to improve cross-lingual HRI robustness [50-52].
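
The command-to-action mapping can be illustrated with a deliberately tiny keyword matcher. The action names and keyword sets below are invented for illustration; the systems cited above learn such mappings with transformer models rather than word overlap:

```python
# Toy intent matcher: map a spoken command to a robot action by keyword
# overlap with per-action keyword sets (all names here are illustrative).
ACTIONS = {
    "pick": {"pick", "grab", "take", "lift"},
    "place": {"place", "put", "drop", "set"},
    "stop": {"stop", "halt", "freeze", "pause"},
}

def interpret(command):
    """Return the best-matching action for a command, or None."""
    words = set(command.lower().split())
    scores = {action: len(words & kw) for action, kw in ACTIONS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

interpret("please grab the red housing")   # matches the "pick" action
```

Returning None when nothing matches is the safety-relevant design choice: an ambiguous utterance should prompt clarification dialogue rather than trigger a default motion.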

Computer Vision and Sensor Fusion

Modern AI-driven cobots integrate multi-modal perception systems combining 3D computer vision, LiDAR, and depth sensors to achieve robust spatial awareness in dynamic environments. Stereo vision systems, such as those employing parallel structured light technology, enable real-time 3D reconstruction of unstructured workspaces by capturing high-resolution depth maps in a single snapshot, overcoming motion-blur limitations of traditional sequential scanning methods [53-54]. For instance, cobots like Dobby leverage hybrid architectures where LiDAR-derived point clouds are fused with RGB-D camera data using deep learning algorithms, achieving sub-centimeter accuracy in obstacle detection while operating at speeds exceeding 1.5 m/s [55-60]. However, challenges persist in occluded environments, where heterogeneous sensor fusion frameworks - such as Kalman filtering combined with convolutional neural networks - are required to reconcile discrepancies between LiDAR’s sparse long-range data and vision-based dense depth predictions [59,61].
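
The reconciliation step can be reduced to its simplest form: a one-dimensional Kalman measurement update that fuses a LiDAR depth reading with a vision-based depth prediction according to their variances. The numbers below are illustrative:

```python
def fuse_depth(lidar_z, lidar_var, vision_z, vision_var):
    """One-dimensional Kalman measurement update: fuse a LiDAR depth
    reading with a vision-based depth prediction by inverse-variance
    weighting. The fused variance is smaller than either input's."""
    gain = vision_var / (lidar_var + vision_var)   # trust placed in LiDAR
    fused = vision_z + gain * (lidar_z - vision_z)
    fused_var = (lidar_var * vision_var) / (lidar_var + vision_var)
    return fused, fused_var

# Accurate-but-sparse LiDAR (var 0.01) vs dense-but-noisy vision (var 0.04):
# the fused estimate leans toward the LiDAR reading.
z, var = fuse_depth(lidar_z=2.00, lidar_var=0.01, vision_z=2.20, vision_var=0.04)
```

Full fusion pipelines run this update per pixel or per voxel and replace the fixed variances with learned, scene-dependent uncertainty estimates.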

Tactile sensors address critical gaps in purely visual perception by providing haptic feedback during precision tasks. In surgical applications, piezoelectric tactile arrays with 12-μm spatial resolution now enable real-time force modulation during instrument manipulation, reducing tissue deformation errors by 38% compared to vision-only systems [62-65]. Recent advancements in MEMS-based tactile sensors further allow simultaneous measurement of normal (0-30 N) and shear forces (±15 N) with 98 mN resolution, enabling autonomous suturing cobots to detect suture thread slippage within 50 ms [66-67]. Despite progress, miniaturization and sterilization compatibility remain barriers, prompting innovations in biocompatible graphene-polymer composites for sterile environments [63,68].
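
The slippage figure suggests a simple detection pattern: watch for an abrupt drop in shear force between consecutive tactile samples. A minimal sketch, with an invented 10 ms sample period and drop threshold:

```python
def detect_slip(shear_samples, sample_period_ms=10, drop_threshold=0.5):
    """Flag thread slippage as an abrupt drop in shear force between
    consecutive tactile samples. Returns the detection time in ms, or
    None if no drop occurs. Rates and threshold are illustrative."""
    for i in range(1, len(shear_samples)):
        if shear_samples[i - 1] - shear_samples[i] > drop_threshold:
            return i * sample_period_ms
    return None

# Shear force holds near 3 N, then collapses: detected at the 30 ms mark,
# within the 50 ms window cited above.
latency = detect_slip([3.0, 3.1, 3.0, 2.1, 0.8])
```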

Applications Across Emerging Industries

Manufacturing

Cobots are redefining production workflows by automating repetitive tasks (e.g., screwdriving, welding) while enabling close collaboration with human workers on complex processes like real-time quality inspection. For instance, BMW’s integration of AI-guided cobots in assembly lines has reduced ergonomic strain on workers by delegating repetitive motions, while machine learning algorithms optimize precision in tasks like circuit board soldering, achieving a 20% productivity boost [69,70]. Advanced cobots now incorporate adaptive workflows, such as Whirlpool’s AI-driven systems that dynamically adjust assembly processes for customized appliances, mirroring Industry 4.0’s demand for flexibility [70,71]. These systems also enhance defect detection through vision-based AI, as seen in Merck’s pharmaceutical plants, where cobots achieve near-perfect accuracy in pill inspection, minimizing waste [70,72].

Healthcare

Surgical cobots, such as Intuitive Surgical’s da Vinci system, enable millimeter-level precision in minimally invasive procedures, reducing average procedure times by 15% while improving patient recovery outcomes [73-75]. Beyond surgery, socially assistive cobots like Pepper and Paro employ emotion recognition AI to provide companionship for elderly patients, addressing caregiver shortages exacerbated by aging populations [73,76-78]. In rehabilitation, cobots like the Lokomat gait trainer leverage force-sensing technology to deliver personalized physical therapy, adapting support levels in real time based on patient progress [73,77]. These systems also mitigate occupational hazards, as evidenced by a 30% reduction in staff injuries reported in hospitals deploying cobots for hazardous tasks like sterilization [77,79].

Logistics

Amazon’s AI-powered cobots exemplify the shift toward adaptive automation, with autonomous mobile robots (AMRs) dynamically rerouting in response to real-time warehouse layout changes, improving inventory retrieval speeds by 40% [41,80,84]. Collaborative perception algorithms allow AMRs like DHL’s LocusBots to safely navigate shared workspaces, optimizing item-picking rates from 90 to 200 units/hour [82,85-86]. AI-driven predictive maintenance, as implemented by Siemens, further reduces downtime by analyzing cobot sensor data to preempt equipment failures [70,87]. These innovations align with IFR data showing a 45% YoY growth in logistics AMR deployments, driven by demands for scalable, error-free operations [85,87,88].

Challenges and Future Directions

Ethical and Economic Concerns

AI-driven automation risks displacing low-skilled jobs, exacerbating income inequality, with studies showing automation accounts for 50-70% of wage stagnation among routine-task workers since 1980 [89-91]. McKinsey estimates 85 million jobs globally could be disrupted by automation by 2025 [92], disproportionately affecting manufacturing and logistics sectors [93,94]. While high-skilled workers benefit from AI-augmented productivity gains [95], low-skilled workers face reduced labor share and wage erosion due to algorithmic task allocation [90,91]. Strategies like reskilling programs must address skill gaps in AI literacy, with hybrid workforce models showing 30% higher retention rates when paired with cobot-as-a-service (CaaS) adoption [96,97]. The CaaS model, projected to dominate 40% of cobot deployments by 2030 [96], reduces upfront costs for SMEs while enabling pay-per-use scalability [98,99]. However, ethical frameworks must govern AI’s distributional impacts to prevent concentrated corporate benefits at the expense of social equity [81,94,100-102].

Safety in Critical Environments

In healthcare, cobot reliability requires multi-layered failsafes, as surgical robots demand <0.1% error rates during invasive procedures [2,91,103]. Real-time anomaly detection systems combining Hidden Markov Models (HMM) and Support Vector Machines (SVM) achieve 98.66% accuracy in identifying abnormal physiological signals [104], critical for robotic-assisted surgeries. Hybrid architectures merging rule-based systems with machine learning improve decision transparency - PERML (Parallel Ensemble of Rules and ML) models reduce false positives by 41% compared to pure ML approaches while maintaining compliance with clinical protocols [105-107]. For instance, ISO 13482-compliant cobots in orthopedic surgeries now integrate force-limiting algorithms (<5N safety thresholds) with vision-guided path correction, reducing procedure times by 15% without compromising precision [5,103].
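
The hybrid idea - hard safety rules that always override a learned detector - can be sketched as follows. The force limit echoes the <5 N threshold mentioned above; the statistical layer is a plain z-score outlier test standing in for the learned component (this is not a reproduction of PERML or the cited HMM/SVM pipeline):

```python
from statistics import mean, stdev

def hybrid_anomaly(sample, history, force_limit=5.0, z_limit=3.0):
    """Hybrid safety check: a hard rule layer that always wins, plus a
    statistical layer standing in for the learned model.

    Rule layer: any contact force above force_limit is flagged at once.
    Statistical layer: otherwise flag samples more than z_limit standard
    deviations away from recent history."""
    if sample > force_limit:
        return "rule_violation"
    mu, sigma = mean(history), stdev(history)
    if sigma > 0 and abs(sample - mu) / sigma > z_limit:
        return "statistical_anomaly"
    return "normal"

recent = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95]   # recent contact forces in N
verdict = hybrid_anomaly(6.2, recent)        # rule layer fires first
```

Ordering the rule check before the statistical one is the transparency-preserving design choice: a clinical protocol violation is reported as such, never reinterpreted by the model.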

Standardization of HRI Protocols

Current HRI standardization gaps persist, with ISO/TS 15066 lacking provisions for dynamic risk assessment in AI-driven cobots [108-112]. While the framework establishes force/pressure limits (e.g., <140N for transient contact) [113], it does not address algorithmic bias in task allocation - a critical flaw when reinforcement learning agents prioritize productivity over worker ergonomics [114]. Recent proposals advocate expanding ISO/TS 15066 to include:
1. Explainable AI (XAI) requirements: Mandating SHAP (SHapley Additive exPlanations) values for cobot decision logs [115,116];
2. Real-time bias auditing: Implementing counterfactual fairness checks during human-robot collaboration [107,116];
3. Cybersecurity benchmarks: Adopting NIST AI RMF guidelines for adversarial attack resistance in shared workspaces [116,117].
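
For a decision log with only a handful of features, SHAP-style attributions can be computed exactly from the Shapley formula. The sketch below uses an invented additive decision score, for which Shapley values provably recover each feature's fixed contribution; all feature names and numbers are illustrative:

```python
from itertools import combinations
from math import factorial

def shapley_values(features, value_fn):
    """Exact Shapley values by enumerating every coalition - tractable
    when a decision log has only a few features. value_fn maps a
    frozenset of present features to the cobot's decision score."""
    n = len(features)
    phi = {}
    for f in features:
        others = [g for g in features if g != f]
        total = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                s = frozenset(coalition)
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (value_fn(s | {f}) - value_fn(s))
        phi[f] = total
    return phi

# Invented additive score: each feature contributes a fixed amount, so the
# Shapley attribution recovers exactly these contributions.
CONTRIB = {"force_ok": 0.2, "path_clear": 0.5, "operator_near": -0.3}

def score(s):
    return sum(CONTRIB[f] for f in s)

phi = shapley_values(list(CONTRIB), score)
```

Production SHAP tooling approximates these sums for high-dimensional models; exact enumeration, as here, is feasible only for small feature sets such as a per-decision audit log.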

Interoperability remains problematic, with 68% of manufacturers reporting integration challenges between cobot brands [102,114]. Unified communication protocols like OPC UA over 5G could bridge this gap, enabling cross-platform HRI latency below 10ms [118].

Conclusion

AI-driven cobots are transforming industries by merging human dexterity with robotic precision, enabling breakthroughs in dynamic task execution and real-time adaptability. Innovations in machine learning (e.g., reinforcement learning for grip optimization), natural language processing (NLP-driven systems like CoboVox), and multimodal sensor fusion (e.g., Dobby’s 3D vision) have revolutionized human-robot interaction (HRI), particularly in healthcare - where surgical cobots reduce procedure times by 15% - and logistics, where Amazon’s AI cobots adapt to warehouse layout changes. However, these advancements coexist with pressing ethical dilemmas, including the displacement of low-skilled labor and algorithmic biases in task allocation. Mitigating these risks requires not only governance frameworks but also proactive strategies like reskilling programs and cobot-as-a-service (CaaS) models to ensure equitable economic transitions.

Safety remains a critical challenge, particularly in high-stakes environments like surgery, where hybrid AI systems combining rule-based protocols and machine learning (ML) are essential for fail-safe reliability. Furthermore, the lack of standardized HRI protocols - evident in the limited scope of ISO/TS 15066 - hinders interoperability and raises concerns about accountability in shared workspaces. Future research must prioritize trust-building mechanisms, such as explainable AI (XAI) for transparent decision-making and adaptive learning systems that incorporate human feedback loops. Additionally, interdisciplinary efforts should address socio-technical gaps by developing participatory design frameworks that involve workers in cobot deployment processes. By balancing technological innovation with ethical foresight, cobots can sustainably augment human capabilities while safeguarding societal well-being.

Acknowledgement

None.

Conflict of Interest

No conflict of interest.

References

    1. S Proia, R Carli, G Cavone, M Dotoli (2021) A Literature Review on Control Techniques for Collaborative Robotics in Industrial Applications, in 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE), pp. 591-596.
    2. F Vicentini (2021) Collaborative Robotics: A Survey, J. Mech. Des 143(4).
    3. A Borboni, KVV Reddy, I Elamvazuthi, MS AL Quraishi, E Natarajan, et al. (2023) The Expanding Role of Artificial Intelligence in Collaborative Robots for Industrial Applications: A Systematic Review of Recent Works, Machines 11(1): 111.
    4. P Barattini, F Vicentini, GS Virk, T Haidegger, Eds., (2019) Human-Robot Interaction: Safety, Standardization, and Benchmarking. Routledge Taylor & Francis Group.
    5. C Taesi, F Aggogeri, N Pellegrini (2023) COBOT Applications-Recent Advances and Challenges, Robotics 12(3): 79.
    6. A Verma, SR Dhupam, S Bansod, RR (2024) Application of AI in Education. A Bibliometric Analysis, Int. J. Relig 5(8): 198-207.
    7. D Han, B Mulyana, V Stankovic, S Cheng (2023) A Survey on Deep Reinforcement Learning Algorithms for Robotic Manipulation, Sensors 23(7): 3762.
    8. H Chen (2025) Human-in-the-Loop Robot Learning for Smart Manufacturing: A Human-Centric Perspective, IEEE Trans. Autom. Sci. Eng, pp. 1.
    9. AC Simões, A Pinto, J Santos, S Pinheiro, D Romero (2022) Designing human-robot collaboration (HRC) workspaces in industrial settings: A systematic literature review, J. Manuf. Syst 62: 28-43.
    10. A Singh (2024) Cobots in Manufacturing: A New Era in Human-Robot Collaboration, AZO Robotics: Editorial Feature.
    11. Renaldi (2024) How Artificial Intelligence (AI) is Changing Robotics, BINUS University School of Information Systems.
    12. M Spezialetti, G Placidi, S Rossi (2020) Emotion Recognition for Human-Robot Interaction: Recent Advances and Future Perspectives, Front. Robot. AI 7: 532279.
    13. G de M Costa, MR Petry, AP Moreira (2022) Augmented Reality for Human–Robot Collaboration and Cooperation in Industrial Applications: A Systematic Literature Review, Sensors 22(7): 2725.
    14. SA Green, M Billinghurst, X Chen, JG Chase (2008) Human-Robot Collaboration: A Literature Review and Augmented Reality Approach in Design, Int. J. Adv. Robot. Syst 5(1).
    15. A Martinetti, PK Chemweno, K Nizamis, E Fosch-Villaronga (2021) Redefining Safety in Light of Human-Robot Interaction: A Critical Review of Current Standards and Regulations, Front. Chem. Eng, vol. 3.
    16. MotionAI (2023) Collaborative Robot Safety: 5 Strategies to Ensure Safe Operation Safety Standards for Collaborative Robots.
    17. F Vicentini (2020) Terminology in safety of collaborative robotics, Robot. Comput. Integr. Manuf 63: 101921.
    18. U Othman, E Yang (2023) Human–Robot Collaborations in Smart Manufacturing Environments: Review and Outlook, Sensors 23(12): 5663.
    19. E Matheson, R Minto, EGG Zampieri, M Faccio, G Rosati (2019) Human-Robot Collaboration in Manufacturing Applications: A Review, Robotics 8(4): 100.
    20. A Weiss, K Spiel (2022) Robots beyond Science Fiction: mutual learning in human-robot interaction on the way to participatory approaches, AI Soc 37(2): 501-515.
    21. TTM Tran, C Parker, M Tomitsch (2021) A Review of Virtual Reality Studies on Autonomous Vehicle-Pedestrian Interaction, IEEE Trans. Human-Machine Syst 51(6): 641-652.
    22. E Weiss, JC Gerdes (2023) High Speed Emulation in a Vehicle-in-the-Loop Driving Simulator, IEEE Trans. Intell. Veh 8(2): 1826-1836.
    23. M Lorenzini, M Lagomarsino, L Fortini, S Gholami, A Ajoudani (2023) Ergonomic human-robot collaboration in industry: A review, Front. Robot. AI 9: 813907.
    24. C Weidemann (2023) Literature Review on Recent Trends and Perspectives of Collaborative Robotics in Work 4.0, Robotics 12(3): 84.
    25. TB Sheridan (2016) Human-Robot Interaction, Hum. Factors J. Hum. Factors Ergon. Soc 58(4): 525-532.
    26. R Huang, K Shi, X Li, K Shi (2023) Editorial: Collaborative interaction and control for intelligent human-robot systems, Front. Neurorobot 17: 1147663.
    27. K Dautenhahn (2024) Human-Robot Interaction, The Encyclopedia of Human-Computer Interaction.
    28. S Hopko, J Wang, R Mehta (2022) Human Factors Considerations and Metrics in Shared Space Human-Robot Collaboration: A Systematic Review, Front. Robot. AI 9: 799522.
    29. A Giallanza, G La Scalia, R Micale, CM La Fata (2024) Occupational health and safety issues in human-robot collaboration: State of the art and open challenges, Saf. Sci., 169: 106313.
    30. J Kober, JA Bagnell, J Peters (2013) Reinforcement learning in robotics: A survey, Int. J. Rob. Res 32(11):1238-1274.
    31. T Zhang, H Mo (2021) Reinforcement learning for robot research: A comprehensive review and open issues, Int. J. Adv. Robot. Syst 18(3).
    32. J De Winter, A De Beir, I El Makrini, G Van de Perre, A Nowé, B Vanderborght (2019) Accelerating Interactive Reinforcement Learning by Human Advice for an Assembly Task by a Cobot, Robotics 8(4): 104.
    33. P Kormushev, S Calinon, D Caldwell (2013) Reinforcement Learning in Robotics: Applications and Real-World Challenges, Robotics 2(3):122-148.
    34. ZA Zafar, T Abbas, M Rafiq, N Khan, S Ahmed, NR Jadoon (2024) STEP-NC compliant 3-axis CNC machine controller based on Field-Programmable Gate Arrays, Int. J. Comput. Integr. Manuf, pp. 1-19.
    35. E Musaoglu (2024) Cobots in Warehousing: What They are and Their Impact on E-commerce Fulfillment, LOGIWA Warehous Management System.
    36. G Atzeni, G Vignali, L Tebaldi, E Bottani (2021) A bibliometric analysis on collaborative robots in Logistics 4.0 environments, Procedia Comput. Sci 180: 686-695.
    37. LAD Phan, HQT Ngo (2024) Systematic Review of Smart Robotic Manufacturing in the Context of Industry 4.0, in Context-Aware Systems and Applications, P. Cong Vinh and N. Thanh Tung, Eds., Cham: Springer Nature Switzerland, pp. 19-42.
    38. S El Zaatari, M Marei, W Li, Z Usman (2019) Cobot programming for collaborative industrial tasks: An overview, Rob. Auton. Syst 116: 162-180.
    39. M Faccio (2023) Human factors in cobot era: a review of modern production systems features, J. Intell. Manuf 34(1): 85-106.
    40. D Weller (2024) Artificial Intelligence and Collaborative Robots: The Workers of the Future? AMFG Autonomous Manufacturing.
    41. F Sanfilippo, MH Zafar (2024) Human-Robot Teaming: A Universal Controller with Multi-Modal Feedback for Emergency Response Robots, in 2024 9th International Conference on Robotics and Automation Engineering (ICRAE), IEEE, pp.181-187.
    42. G Siwach, C Li (2023) Enhancing Human Cobot Interaction using Natural Language Processing, in 2023 IEEE 4th International Multidisciplinary Conference on Engineering Technology (IMCET), pp. 21-26.
    43. O Mohan Banur, BK Patle, S Pawar (2024) Integration of robotics and automation in supply chain: a comprehensive review, Robot. Syst. Appl 4(1): 1-19.
    44. Y Ge, Y Dai, R Shan, K Li, Y Hu, X Sun (2024) Cocobo: Exploring Large Language Models as the Engine for End-User Robot Programming, Proc. IEEE Symp. Vis. Lang. Human-Centric Comput. VL/HCC, pp. 89-95.
    45. H Luo, J Wu, J Liu, MF Antwi Afari (2024) Large language model-based code generation for the control of construction assembly robots: A hierarchical generation approach, Dev. Built Environ 19: 100488.
    46. S Vinoth Kumar, RB Saroo Raj, J Praveenchandar, S Vidhya, S Karthick, R Madhubala (2024) Future Prospects of Large Language Models: Enabling Natural Language Processing in Educational Robotics, Int. J. Interact. Mob. Technol 18(23): 85-97.
    47. J Lim, S Patel, A Evans, J Pimley, Y Li, I Kovalenko (2024) Enhancing Human-Robot Collaborative Assembly in Manufacturing Systems Using Large Language Models, in 2024 IEEE 20th International Conference on Automation Science and Engineering (CASE), IEEE, pp. 2581-2587.
    48. X Wang (2024) Advancing Human-Robot Interaction: The Role of Natural Language Processing in Robotic Systems, in Proceedings of the 2024 International Conference on Artificial Intelligence and Communication (ICAIC 2024), Y Wang et al, Ed., 1st Atlantis Press International BV, pp. 195-209.
    49. Y Cohen, M Faccio, S Rozenes (2025) Vocal Communication Between Cobots and Humans to Enhance Productivity and Safety: Review and Discussion, Appl. Sci 15(2): 726.
    50. KB Mustapha (2025) A survey of emerging applications of large language models for problems in mechanics, product design, and manufacturing, Adv. Eng. Informatics 64: 103066.
    51. V Saini, N Joseph (2024) Artificial Intelligence in Robotics using NLP, IFAC Proc 18(16): 1-10.
    52. HA Younis, NIR Ruhaiyem, W Ghaban, NA Gazem, M Nasser (2023) A Systematic Literature Review on the Applications of Robots and Natural Language Processing in Education, Electron 12(13):1-26.
    53. T Kevan (2022) Vision and Motion Control Cobot Applications, Robotics 24/7 Industrial Automation-Cobots.
    54. R Berkmanas (2023) AI Empowers Cobots for Superior Performance, EasyODM Optical Defect Manager.
    55. A Buerkle, T Bamber, N Lohse, P Ferreira (2021) Feasibility of Detecting Potential Emergencies in Symbiotic Human-Robot Collaboration with a mobile EEG, Robot. Comput. Integr. Manuf 72: 102179.
    56. A Buerkle, H Matharu, A Al-Yacoub, N Lohse, T Bamber, et al. (2022) An adaptive human sensor framework for human–robot collaboration, Int. J. Adv. Manuf. Technol 119(1): 1233-1248.
    57. F Semeraro, A Griffiths, A Cangelosi (2023) Human-robot collaboration and machine learning: A systematic review of recent research, Robot. Comput. Integr. Manuf 79: 102432.
    58. E Newton (2023) What Unique Benefits Does AI Bring to Cobot Performance? MachineDesign.
    59. L Wijayathunga, A Rassau, D Chai (2023) Challenges and Solutions for Autonomous Ground Robot Scene Understanding and Navigation in Unstructured Outdoor Environments: A Review, Appl. Sci 13(17): 9877.
    60. L Hamad, MA Khan, A Mohamed (2024) Object Depth and Size Estimation Using Stereo-Vision and Integration With SLAM, IEEE Sensors Lett 8(4): 1-4.
    61. X Yang, Z Zhou, JH Sørensen, CB Christensen, M Ünalan, et al. (2023) Automation of SME production with a Cobot system powered by learning-based vision, Robot. Comput. Integr. Manuf 83: 102564.
    62. R Xiong (2022) A novel 3D-vision-based collaborative robot as a scope holding system for port surgery: a technical feasibility study, Neurosurg. Focus 52(1): E13.
    63. R Hashem, H El-Hussieny, S Umezu, AMR Fath El-Bab (2024) Soft Tissue Compliance Detection in Minimally Invasive Surgery: Dynamic Measurement with Piezoelectric Sensor Based on Vibration Absorber Concept, J. Robot. Control 5(5): 1399-1411.
    64. J Colan, A Davila, Y Hasegawa (2022) A Review on Tactile Displays for Conventional Laparoscopic Surgery, Surgeries 3(4): 334-346.
    65. W Yuan, S Dong, E Adelson (2017) GelSight: High-Resolution Robot Tactile Sensors for Estimating Geometry and Force, Sensors, 17(12): 2762.
    66. C Huang, Q Wang, M Zhao, C Chen, S Pan, M Yuan (2020) Tactile Perception Technologies and Their Applications in Minimally Invasive Surgery: A Review, Front. Physiol 11: 611596.
    67. C Hou, Huxin Gao, Xiaoxiao Yang, Guangming Xue, Xiuli Zuo, et al. (2024) A piezoresistive-based 3-axial MEMS tactile sensor and integrated surgical forceps for gastrointestinal endoscopic minimally invasive surgery, Microsystems Nanoeng 10(1): 141.
    68. W Othman, Zhi Han A Lai, Carlos Abril, Juan S Barajas Gamboa, et al. (2022) Tactile Sensing for Minimally Invasive Surgery: Conventional Methods and Potential Emerging Tactile Technologies, Front. Robot. AI 8: 705662.
    69. JP Johnson (2025) The Robotics Renaissance: How BMW is Leveraging AI for Smarter Factories, Car and Sound.
    70. D Chanda (2024) Top 10 Use Cases of Cobots and AI in Manufacturing, Idea Usher: Ushering in Innovation.
    71. D Pereira, A Bozzato, P Dario, G Ciuti (2022) Towards Foodservice Robotics: A Taxonomy of Actions of Foodservice Workers and a Critical Review of Supportive Technology, IEEE Trans. Autom. Sci. Eng 19(3): 1820-1858.
    72. A Villalonga, YJ Cruz, D Alfaro, RE Haber, JL Martínez Lastra, F Castaño (2024) Enhancing Quality Inspection in Zero-Defect Manufacturing Through Robotic-Machine Collaboration, in 2024 7th Iberian Robotics Conference (ROBOT), pp. 1-6.
    73. RS Bharadwaj (2024) Cobots - A Helping Hand to the Healthcare Industry, Reflections Science.
    74. H Editorial (2024) The Role of Cobots in Surgical Procedures: Enhancing Precision and Efficiency in Healthcare, Hospital and Health Management.
    75. A Ashwin (2024) Cobots have immense potential to enhance Minimally Invasive Surgery, Biospectrum: Opinion.
    76. M Salem, G Lakatos, F Amirabdollahian, K Dautenhahn (2015) Towards Safe and Trustworthy Social Robots: Ethical Challenges and Practical Issues BT - Social Robotics, in ICSR 2015, A Tapus, E André, JC Martin, F Ferland, and M Ammi, Eds., Cham: Springer International Publishing, pp. 584-593.
    77. R Editorial (2023) How collaborative robotics are influencing hospitals and healthcare, RoboticsBiz.
    78. A Ashwin (2024) Can Cobots Become Indispensable in Healthcare? Biospectrum: Opinion.
    79. AITechPark Staff (2024) Revolutionizing the Healthcare Industry with Collaborative Robots, AITechPark Connecting Insights with Individual.
    80. T Chapman (2024) Amazon’s bid to Revolutionise Warehouse Automation, Digital Supply Chain.
    81. L Pietrantoni, Marco Favilla, Federico Fraboni, Elvis Mazzoni, Sofia Morandini, et al. (2024) Integrating collaborative robots in manufacturing, logistics, and agriculture: Expert perspectives on technical, safety, and human factors, Front. Robot. AI 11: 1-15
    82. TechmanRobot (2024) How Cobots and Humans Are Working Together in Logistics, TechmanRobot Products.
    83. PLS Logistics Services (2024) What are Cobots and How They Impact Logistics, PLS Logistics Services Blog.
    84. Mecalux (2021) Cobots; at your service in the warehouse, Mecalux News.
    85. Robotnik (2024) Mobile robotics: Enhancing supply chain and logistics efficiency, Robotnik.
    86. Robotnik (2024) How does implementing mobile robots impact warehouse operations? Robotnik.
    87. Go4Robotics (2024) Logistics 4.0: Autonomous Mobile Robots Open Up New Paths, Go4Robotics Content Hub.
    88. D Romero, Gregor Von Cieminski, Thorsten Wuest, Paolo Gaiardelli, Ilkyeong Moon et al. (2021) Advances in Production Management Systems: Issues, Trends, and Vision Towards 2030, in Advancing Research in Information and Communication Technology. IFIP Advances in Information and Communication Technology, M. Goedicke, E. Neuhold, and K Rannenberg, Eds., Cham: Springer International Publishing, pp. 194-221.
    89. PA Hancock, DR Billings, KE Schaefer, JYC Chen, EJ de Visser, et al. (2011) A Meta-Analysis of Factors Affecting Trust in Human-Robot Interaction, Hum. Factors J. Hum. Factors Ergon. Soc 53(5): 517-527.
    90. P Dizikes (2022) Study: Automation drives income inequality, MIT News.
    91. B2EAutomation (2022) Ethics of Automation and Robotics: Top Challenges and Solutions, B2EAutomation.
    92. N Rubright (2023) The Ethics of Workplace Automation: Balancing Efficiency and Human Impact, Swiss Cognitive.
    93. B Walsh (2021) How automation led to stagnant wages and inequality, Axios Newsletter.
    94. G Cornelli, J Frost, S Mishra (2023) Artificial intelligence, services globalisation and income inequality, BIS Work. Pap vol. 1135.
    95. S Manning (2024) AI’s Impact on Income Inequality in the US: Interpreting recent evidence and looking to the future, Brookings.
    96. A Analytica (2024) Global Collaborative Robot Market to Surge Past USD 44.1 Billion by 2032 | Cobots as a Service (CaaS) Model is Making Buzz, GlobeNewswire.
    97. AM Tovar (2024) Empowering Low-Income Workers: AI-Driven Upskilling for a Brighter Future, Public Works Partners Insight.
    98. SMResearch (2022) Collaborative Robots Market Size, Share, Global Industry 2030.
    99. TG Karippacheril, K Pela, A Alassani (2024) What we’re reading about the age of AI, jobs, and inequality, World Bank Jobs and Development.
    100. K Ramachandran, A Srivastava, V Panjwani, D Kumar, NR Cheepurupalli, CR Mohan (2024) Developing AI-powered Training Programs for Employee Upskilling and Reskilling, J. Informatics Educ. Res 4(2): 1186-1193.
    101. R Thomas (2023) AI and Power: The Ethical Challenges of Automation, Centralization, and Scale, fast.ai.
    102. Femi Osasona, Olukunle Oladipupo Amoo, Akoh Atadoga, Temitayo Oluwaseun Abrahams, Oluwatoyin Ajoke Farayola, et al. (2024) Reviewing the Ethical Implications of AI in Decision Making Processes, Int. J. Manag. Entrep. Res 6(2):322-335.
    103. B Ghosh, R Prasad, G Pallail (2022) Ethical Issues & Automation, Porchlight.
    104. VV Raje, S Goel, SV Patil, MD Kokate, DA Mane, S Lavate (2023) Realtime Anomaly Detection in Healthcare IoT: A Machine Learning-Driven Security Framework, J. Electr. Syst 19(3): 192-202.
    105. S Kierner, J Kucharski, Z Kierner (2023) Taxonomy of hybrid architectures involving rule-based reasoning and machine learning in clinical decision systems: A scoping review, J. Biomed. Inform 144: 104428.
    106. V Pillai (2024) Enhancing Transparency and Understanding in AI Decision-Making Processes, Iconic Res. Eng. Journals 8(1): 168-172.
    107. J Looney (2024) The Power of Anomaly Detection in Healthcare: An Overview, Dataiku.
    108. F Vicentini, M Askarpour, MG Rossi, D Mandrioli (2020) Safety Assessment of Collaborative Robotics Through Automated Formal Verification, IEEE Trans. Robot 36(1): 42-61.
    109. R Bogue (2017) Robots that interact with humans: a review of safety technologies and standards, Ind. Robot an Int. J 44(4): 395-400.
    110. D Koczi, J Sarosi (2022) The Safety of Collaborative Robotics - A Review, Ann. Fac. Eng. Hunedoara - Int. J. Eng 20(2).
    111. I Cohen (2024) Real-Time Anomaly Detection: Solving Problems and Finding Opportunities, Anodot Blog Posts.
    112. S Bouchard (2016) Standardizing Collaborative Robots: What is ISO/TS 15066? Engineering.com.
    113. ISO (2016) ISO/TS 15066: Robots and robotic devices - Collaborative robots.
    114. EliteRobot (2024) What is a Cobot? An Introduction to Collaborative Robots, EliteRobot Product Highlights.
    115. CCOHS-CCHST (2024) Safety Hazards: Robots and Cobots.
    116. NIST (2024) AI Risks and Trustworthiness, NIST AI RMF Resources.
    117. AAA (2016) ISO/TS 15066 Explained, Robotiq Tech Papers.
    118. A Groß, C Schütze, M Brandt, B Wrede, B Richter (2023) RISE: an open-source architecture for interdisciplinary and reproducible human–robot interaction research, Front. Robot. AI 10: 1245501.