
ARTICLE: Industry 4.0 Awareness and Experience Workshop

These workshops were organised and run by Swinburne University of Technology’s Factory of the Future and were funded through the Victorian Government’s Digital Jobs for Manufacturing (DJFM) program. 

This article is written by Jagannatha Pyaraka, PhD researcher at Swinburne University of Technology.

In a series of enlightening workshops, Swinburne University of Technology has taken a significant step in bridging the gap between industry professionals and the transformative potential of Industry 4.0 technologies. Over the past few weeks, four workshops were organised at strategic locations to maximise outreach and impact: the VGBO office in Bundoora, Holiday Inn Dandenong, Rydges Geelong, and Mercure Ballarat. These sessions aimed to raise awareness and provide hands-on experience with collaborative robots (cobots), a cornerstone of modern industrial automation, as well as other Industry 4.0 technologies such as augmented reality (AR), virtual reality (VR) and wearable sensors.

The workshops attracted operations managers, CEOs, CFOs, and other key decision-makers eager to understand the practical applications and benefits of cobots in their respective fields. Together with my ACC colleague, Dr. Anushani Bibile, I used the easily portable and cost-effective UFactory xArm6 cobot to demonstrate cobot functionality.

The workshops commenced with an introduction to collaborative robots. Unlike traditional industrial robots, which often require extensive programming and are confined to specific tasks, cobots are designed to share a workspace with humans. Their ease of programming, adaptability to various tasks, and advanced safety features make them suitable for dynamic and evolving industrial environments.

To illustrate these points, we demonstrated a program involving the stacking of four objects. The objects were placed in predefined positions, and the xArm6 was tasked with picking up each object and stacking it. This exercise highlighted the cobot’s ability to perform repetitive tasks and its intuitive programming interface. Using Blockly, a visual programming language, participants observed how quickly and easily the cobot could be taught to execute tasks.

Following the demonstration, participants had the opportunity to interact with the xArm6. They used Blockly to program the cobot for a simple pick-and-place task. This exercise allowed them to experience the user-friendly interface and the cobot’s responsiveness. The feedback was positive, with many participants noting how quickly they could learn to program and operate the cobot.
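
For readers curious about what such a routine looks like beyond the Blockly interface used in the workshops, here is a minimal pick-and-place sketch using the UFACTORY xArm Python SDK (xarm-python-sdk). The controller IP address, coordinates, and gripper values below are illustrative assumptions, not the configuration used in the sessions.

```python
# Minimal pick-and-place sketch for the xArm6 (pip install xarm-python-sdk).
# IP address, coordinates, and gripper values are illustrative placeholders.
from xarm.wrapper import XArmAPI

arm = XArmAPI('192.168.1.200')   # hypothetical controller IP
arm.motion_enable(enable=True)
arm.set_mode(0)                  # position control mode
arm.set_state(0)                 # ready state
arm.set_gripper_enable(True)

PICK = (300, 0, 150)             # x, y, z in mm (assumed table layout)
PLACE = (300, 200, 150)

def move_to(x, y, z):
    # Tool pointing straight down (roll=180, pitch=0, yaw=0).
    arm.set_position(x=x, y=y, z=z, roll=180, pitch=0, yaw=0,
                     speed=100, wait=True)

move_to(*PICK)
arm.set_gripper_position(200, wait=True)   # close on the object (assumed width)
move_to(*PLACE)
arm.set_gripper_position(850, wait=True)   # fully open to release

arm.disconnect()
```

A Blockly program built in the workshops expresses essentially the same sequence of move and gripper blocks, which is what makes the teaching process so quick.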

The hands-on session helped dispel common misconceptions about the complexity and inflexibility of industrial automation. By the end of the workshop, participants had a better understanding of how cobots can be integrated into their operations to enhance productivity, safety, and cost-effectiveness.

The workshops also emphasized the cost-effectiveness of cobots. Unlike traditional robots that require significant investment in programming and setup, cobots like the xArm6 offer an affordable solution without compromising performance. Their advanced safety systems, which allow them to operate safely alongside human workers, make them a viable option for businesses of all sizes.

Specific feedback from participants highlighted the positive impact and value of these sessions. One attendee noted, “The workshop provided a great insight into how Industry 4.0 can better impact our business and automate our processes.” Another participant appreciated the practical demonstrations, stating, “It was great to see the practical applications during the demonstrations.” Many attendees emphasized that the hands-on experience was invaluable, with one remarking, “Cobots demo was very stimulating. Thoroughly enjoyed the workshop.”

Before the workshop, common reactions included uncertainty about the complexity and applicability of cobots in their operations. After the sessions, many participants expressed confidence in integrating these technologies into their workflows, recognizing the potential for improved efficiency and innovation.

Overall, these workshops effectively bridged the knowledge gap for attendees, providing them with the tools and understanding necessary to embrace Industry 4.0 technologies. As more companies recognize the benefits of automation, the demand for cobots is set to rise, paving the way for a more efficient and innovative industrial landscape.


ARTICLE: Enhancing Collaboration Between Humans and Robots: The Critical Role of Human Factors Research

This article is written by Jasper Vermeulen, PhD researcher at the Australian Cobotics Centre.


Integrating collaborative robots (cobots) in factory environments offers substantial benefits for businesses, including increased operational efficiency and greater product customisation. Compared to traditional industrial robots, cobots are often smaller in size, offering both versatility in various tasks and cost-efficiency. From a technological perspective, the use of cobots can lead to significant improvements in processes.

Cobots: a double-edged sword?

While the advantages of cobots are clear, a more nuanced conclusion is required from a human-centric perspective. In reality, cobots present both benefits and challenges for operators. They can help reduce physical strain and take over repetitive tasks. On the other hand, they may also increase mental effort, and working closely with them can cause stress. Furthermore, depending on the workspace and task, working with cobots can affect an operator’s posture for better or worse. This complexity highlights the need for studies into operators’ experiences of working alongside cobots.

The Discipline of Human Factors

Human Factors is a field dedicated to the study of interactions between humans, technologies, and their environments. This scientific discipline is crucial for enhancing the safety and efficiency of socio-technical systems through interdisciplinary research. Specifically, in the realm of human-cobot collaboration, the discipline of Human Factors plays a pivotal role. By integrating diverse research perspectives—from Robotics and Usability Engineering to Design and Psychology—this discipline enables researchers to dissect and understand complex interactions and complex systems. More importantly, it provides a framework for translating these insights into practical applications, offering concrete design recommendations and effective technology implementation strategies.

Beyond safety

While safety in Human-Robot Interaction has been a central focus of Human Factors research, studies specifically addressing human-cobot collaboration are relatively new. Traditionally, much research aimed at safeguarding the human operator, ensuring their physical safety. Nevertheless, if we aim to improve overall system performance and the well-being of operators, we need to consider additional factors besides safety. For instance, cobots typically operate at lower speeds as a safety measure; however, experienced operators might prefer a faster pace depending on the task and context. This suggests that speed adjustments could be made without compromising safety.

Looking Forward

As the adoption of cobots continues to grow in industrial settings, it is crucial to deepen our understanding of the factors influencing human-cobot collaboration. Researchers in Human Factors can offer valuable insights by examining the diverse experiences of human operators in cobot-assisted tasks, considering individual differences, different kinds of tasks, various workspaces and cobot capabilities.

Ultimately, while cobots offer the potential to streamline processes, enhance customisation, and reduce costs, their implementation should also focus on improving human operators’ physical safety and mental health. These considerations emphasise the importance of adopting new technologies in genuinely advantageous ways, ensuring a balanced approach to innovation and worker well-being.

Stay Informed on Human Factors in Human-Robot Collaboration

If you’re interested in the latest advancements in human factors research within the field of Human-Robot Collaboration, make sure to follow the activities of Program 3.1 at the Australian Cobotics Centre. We conduct human-centred research using real-world case studies in partnership with industry leaders, focusing on the impact of human factors on operators in practical cobot applications. Our current projects include exploring cobot integration in manufacturing tasks and investigating human factors in robot-assisted surgeries.

Follow our progress on the Australian Cobotics Centre’s LinkedIn page for the latest updates and insights.

ARTICLE: Robotic Blended Sonification: Consequential Robot Sound as Creative Material for Human-Robot Interaction

This article is written by Stine S. Johansen, Jared Donovan, and Markus Rittenbruch (Human-Robot-Interaction Program) at the Australian Cobotics Centre, and by Yanto Browning and Anthony Brumpton (QUT).

Abstract
Current research in robotic sounds generally focuses on either masking the consequential sound produced by the robot or on sonifying data about the robot to create a synthetic robot sound. We propose to capture, modify, and utilise rather than mask the sounds that robots are already producing. In short, this approach relies on capturing a robot’s sounds, processing them according to contextual information (e.g., collaborators’ proximity or particular work sequences), and playing back the modified sound. Previous research indicates the usefulness of non-semantic, and even mechanical, sounds as a communication tool for conveying robotic affect and function. Adding to this, this paper presents a novel approach which makes two key contributions: (1) a technique for real-time capture and processing of consequential robot sounds, and (2) an approach to explore these sounds through direct human-robot interaction. Drawing on methodologies from design, human-robot interaction, and creative practice, the resulting ‘Robotic Blended Sonification’ is a concept which transforms the consequential robot sounds into a creative material that can be explored artistically and within application-based studies.

Keywords
Robotics, Sound, Sonification, Human-Robot Collaboration, Participatory Art, Transdisciplinary

Introduction and Background
The use of sound as a communication technique for robots is an emerging topic of interest in the field of Human-Robot Interaction (HRI). Termed the “Robot Soundscape”, Robinson et al. mapped various contexts in which sound can play a role in HRI. This includes “sound uttered by robots, sound and music performed by robots, sound as background to HRI scenarios, sound associated with robot movement, and sound responsive to human actions” [7, p. 37]. As such, robot sound encompasses both semantic and non-semantic communication as well as the sounds that robots inherently produce through their mechanical configurations. With reference to product design research, the latter is often referred to as “consequential sound” [11]. This short paper investigates the research question: How can consequential robot sound be used as a material for creative exploration of sound in HRI?

This research offers two key contributions: (1) an approach to using, rather than masking [9], sounds directly produced by the robot in real time, and (2) a way to explore those sounds through direct interactions with a robot. As an initial implication, this enables explorations of the sound through creative and open-ended prototyping. In the longer term, it has the potential to leverage and extend collaborators’ existing tacit knowledge about the sounds that mechanical systems make during particular task sequences, as well as during normal operation versus breakdowns. Examples of using other communication modalities exist, mostly relying on visual feedback. Visual feedback allows collaborators to see, e.g., the intended robotic trajectory and whether it is safe to move closer to the robot at any time. This assumes, however, that the human-robot collaboration follows a schedule in which the collaborator is aware of approximately when they can approach the robot. Sometimes this timing cannot be scheduled, and collaborators must maintain visual focus on their task. It is therefore crucial to investigate ways of providing information about the robot’s task flow and appropriate timings for collaborative tasks. In other words, there is a need for non-visual feedback modalities that enable collaborators to switch between coexistence and collaboration with the robot. To achieve this aim, it is necessary to make these non-visual modalities of robot interaction available for exploration as creative ‘materials’ for prototyping new forms of human-robot interaction.

Prototyping sound design for social robots has received particular attention in prior research, e.g., movement sonification for social HRI [4]. However, this knowledge cannot be directly transferred when designing affective communication, including sound, for robots that are not anthropomorphic, e.g., mobile field robots, industrial robots for manufacturing, and other typical utilitarian robots [1]. In prior research on consequential robot sound, Moore et al. studied the sounds of robot servos and outlined a roadmap for research into “consequential sonic interaction design” [6]. The authors state that robot sound experiences are subjective and call for approaches that address this rather than, e.g., upgrading the quality of a servo to reduce noise objectively. Frid et al. also explored mechanical sounds of the Nao robot for movement sonification in social HRI [4]. They evaluated this through Amazon Mechanical Turk, where participants rated the sounds according to different perceptual measures. Extending this into ways of modifying robot sounds, robotic sonification that conveys intent without requiring visual focus has been created by mapping movements in each degree of freedom of a robot arm to pitch and timbre [12]. The sound in that study, however, was created from sampled motor sounds as opposed to the actual, real-time consequential sounds of the robot. Another way this has been investigated is with video of a moving robot, Fetch, overlaid with mechanical, harmonic, or musical sound to communicate the robot’s inner workings and movement [8]. This previous research indicates that people can identify nuances of robotic sounds but has yet to address whether that is also the case for real-time consequential robot sounds.

Robotic Blended Sonification
Robot sound has received increasing interest throughout the past decade, particularly for designing sounds uttered or performed by robots, background sound, sonification, or masking consequential robot sound [9]. Extending this previous research, we contribute a novel approach to utilising and designing with consequential robot sound. Our approach, ‘Robotic Blended Sonification’, bridges prior research on consequential sound, movement sonification, and sound that is responsive to human actions. Furthermore, it relies on the real-time sounds of the robot as opposed to pre-made recordings that are subsequently aligned to movements. A challenge for selecting the sounds a robot could make is that people have a strong set of pre-existing associations between robots and certain kinds of sounds. On one hand, this might provide a basis for helping people interpret an intended meaning or signal from a sound (e.g., a danger signal); on the other, it risks robot sounds remaining clichéd (beeps and boops), which may ultimately limit the creative potential of robotic sound design. In this sense, Robotic Blended Sonification is an appealing approach because it offers the possibility of developing a sonic palette grounded in the physical reality of the robot, while also allowing aspects of these sounds to be amplified, attenuated, or manipulated to create new meanings. Blended sonification has previously been described as “the process of manipulating physical interaction sounds or environmental sounds in such a way that the resulting sound signal carries additional information of interest while the formed auditory gestalt is still perceived as coherent auditory event” [10]. As such, it is an approach to augmenting existing sounds for purposes such as conveying information to people indirectly.

To achieve real-time robotic blended sonification, we use a series of electromagnetic field microphones placed at key articulation points on the robot. Our current setup uses a Universal Robots UR10 collaborative robotic arm. The recorded signals are amplified and sent to a Digital Audio Workstation (DAW), where they can be blended with sampled and synthesised elements and processed in distinct ways to create interactive soundscapes. In parallel with the real-time capture of the robot’s audio signals, we enable direct interactions with the robot through the Grasshopper programming environment within Rhinoceros 3D (Rhino) and the RobotExMachina bridge and Grasshopper plugin [3]. We capture the real-time pose of the robot’s Tool Center Point (TCP) in Grasshopper. Interaction is made possible via the Open Sound Control (OSC) protocol, with the Grasshopper environment sending a series of OSC values for the TCP. The real-time positional data also includes the pitch, roll, and yaw of each section of the robotic arm. Interaction with the robot arm is enabled through the Fologram plugin for Grasshopper and Rhino. The virtual robot is anchored to the position of the physical robot. The distance between the base of the robot and a smartphone is then calculated and used to direct the TCP towards the collaborator. This enables real-time interaction for exploring sounds at different motions and speeds. For our prototype, OSC messages from the robotic movements are received in the Ableton Live DAW, along with the Max/MSP programming environment, and then assigned to distinct parameters of digital signal processing tools to alter elements of the soundscape. The plan for the initial prototype setup is to use five discrete speakers: a quadraphonic setup to allow for 360-degree coverage in a small installation space, along with a point-source speaker located at the base of the robotic arm. The number of speakers is scalable to the size of the installation space and the intent of the installation. The point-source speaker alone is enough to gather data on the effects of robotic blended sonification on HRI, while multi-speaker configurations allow for better coverage in larger environments, enable investigations of non-dyadic human-robot interactions, and provide more creative options when designing soundscapes.
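
The following is a minimal sketch of the kind of OSC plumbing this data flow implies, written with the python-osc package. The address patterns, ports, and the distance-to-cutoff mapping are our illustrative assumptions rather than the prototype’s actual configuration, which lives in Grasshopper, Ableton Live, and Max/MSP.

```python
# Sketch of the OSC link described above: receive the robot's TCP pose
# and forward a mapped control value to a DAW (pip install python-osc).
# Address patterns and ports are illustrative assumptions.
import math
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

daw = SimpleUDPClient("127.0.0.1", 9001)  # DAW listening for control data

def on_tcp_pose(address, x, y, z, roll, pitch, yaw):
    # Map the TCP's distance from the robot base to a 0..1 parameter,
    # e.g. a filter cutoff applied to the captured motor sound.
    distance = math.sqrt(x * x + y * y + z * z)
    cutoff = max(0.0, min(1.0, distance / 1.3))  # UR10 reach is ~1.3 m
    daw.send_message("/sound/cutoff", cutoff)

dispatcher = Dispatcher()
dispatcher.map("/robot/tcp", on_tcp_pose)  # pose stream from Grasshopper

server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
server.serve_forever()
```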

Directions for Future Research
Ways of using non-musical instruments for musical expression have a long history within sound and music art. Early examples include the work of John Cage, e.g., Child of Tree (1975), where a solo percussionist performs with electrically amplified plant materials [2], or the more recent concert Inner Out (2015) by Nicola Giannini, where melting ice blocks are turned into percussive elements [5]. In a similar manner, our approach enables performance with robotic sound, allowing for a creative exploration of how those sounds affect, and could be utilised to improve, human-robot collaboration. With the proposed approach, we identify new immediate avenues for research in the form of the following research questions:

Robot Sound as Creative Material
In what ways can the consequential sound of a robot be used as a creative material in explorations of robot sound design? This can entail investigations of different configurations, including dyadic and non-dyadic interactions, levels of human-robot proximity, and different spatial arrangements. Furthermore, the interaction itself will play a crucial part in the way that the sound is both created and experienced, e.g., whether a collaborator is touching the robot physically or, as in our current setup, interacting at a distance.

Processing Consequential Robot Sound
In what ways can or should we process the consequential sound material? Two key points are connected to this. First, the consequential sound forms a basis for the resulting sound output, which can be modified in various ways. Future research can explore these modifications, including the fact that different robots produce different consequential sounds, which will subsequently lead to different meaningful modifications. Second, our approach can be complemented by capturing data from the surrounding environment to use as input for sound processing.

Engaging People in Reflection
How can we prompt people’s reflections about consequential robot sounds through direct interaction? While prior research has demonstrated ways to investigate consequential robot sound, e.g., through overlaying video with mechanical sounds, our approach enables people to explore sounds that result from their own interactions with a robot. This can be utilised in both structured and unstructured setups, depending on the purpose of the investigation. In our current setup, we invite artistic exploration and expression. For more utilitarian purposes, the setup can be created in the context within which a robot is or could be present. This could support other existing methods for mapping and designing interventions into soundscapes.

Conclusion
In this short paper, we have described a novel approach for exploring and prototyping with consequential robot sound. This approach extends prior research by providing a technique for capturing, processing, and reproducing sounds in real-time during collaborators’ interactions with the robot.

Acknowledgments
This research is jointly funded through the Australian Research Council Industrial Transformation Training Centre (ITTC) for Collaborative Robotics in Advanced Manufacturing under grant IC200100001 and the QUT Centre for Robotics.

References
[1] Bethel, C. L., and Murphy, R. R. 2006. Auditory and other non-verbal expressions of affect for robots. In AAAI fall symposium: aurally informed performance, 1–5.
[2] Cage, J. 1975. Child of Tree. Peters Edition EP 66685. https://www.johncage.org/pp/John-Cage-Work-Detail.cfm?work_ID=40.
[3] del Castello, G. 2023. RobotExMachina. GitHub repository. https://github.com/RobotExMachina.
[4] Frid, E.; Bresin, R.; and Alexanderson, S. 2018. Perception of mechanical sounds inherent to expressive gestures of a Nao robot – implications for movement sonification of humanoids.
[5] Giannini, N. 2015. Inner Out. Nicola Giannini. https://www.nicolagiannini.com/portfolio/inner-out-2/.
[6] Moore, D.; Tennent, H.; Martelaro, N.; and Ju, W. 2017. Making noise intentional: A study of servo sound perception. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, HRI ’17, 12–21. New York, NY, USA: Association for Computing Machinery.
[7] Robinson, F. A.; Bown, O.; and Velonaki, M. 2023. The robot soundscape. In Cultural Robotics: Social Robots and Their Emergent Cultural Ecologies. Springer. 35–65.
[8] Robinson, F. A.; Velonaki, M.; and Bown, O. 2021. Smooth operator: Tuning robot perception through artificial movement sound. In Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, HRI ’21, 53–62. New York, NY, USA: Association for Computing Machinery.
[9] Trovato, G.; Paredes, R.; Balvin, J.; Cuellar, F.; Thomsen, N. B.; Bech, S.; and Tan, Z.-H. 2018. The sound or silence: investigating the influence of robot noise on proxemics. In 2018 27th IEEE international symposium on robot and human interactive communication (RO-MAN), 713–718. IEEE.
[10] Tünnermann, R.; Hammerschmidt, J.; and Hermann, T. 2013. Blended sonification: Sonification for casual interaction. In ICAD 2013 – Proceedings of the International Conference on Auditory Display.
[11] Van Egmond, R. 2008. The experience of product sounds. In Product experience. Elsevier. 69–89.
[12] Zahray, L.; Savery, R.; Syrkett, L.; and Weinberg, G. 2020. Robot gesture sonification to enhance awareness of robot status and enjoyment of interaction. In 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), 978–985. IEEE.

Author Biographies
* Stine S. Johansen is a Postdoctoral Research Fellow in the Australian Cobotics Centre. Her research focuses on designing interactions with and visualisations of complex cyberphysical systems.
* Yanto Browning is a Lecturer in music and interactive technologies at Queensland University of Technology, with extensive experience as an audio engineer.
* Anthony Brumpton is an artist academic working in the field of Aural Scenography. He likes the sounds of birds more than planes, but thinks there is a place for both.
* Jared Donovan is an Associate Professor at Queensland University of Technology. His research focuses on finding better ways for people to interact with new interactive technologies in their work, currently focusing on the design of robotics to improve manufacturing.
* Markus Rittenbruch, Professor of Interaction Design at Queensland University of Technology, specialises in the participatory design of collaborative technologies. His research also explores designerly approaches to study how collaborative robots can better support people in work settings.

ARTICLE: Reflections from the 2023 OZCHI workshop on Empowering People in Human-Robot Collaboration

This article is written by Stine Johansen, Postdoctoral Research Fellow (Human-Robot-Interaction Program) at Australian Cobotics Centre.


At the OzCHI 2023 conference, researchers from the Australian Cobotics Centre (QUT and UTS) and CINTEL (CSIRO) co-organised a workshop on the topic of “Empowering People in Human-Robot Collaboration: Why, How, When, and for Whom”. Our previous workshop at the OzCHI 2022 conference showed that there is a growing interest in the area from both researchers and practitioners across Oceania. In the 2022 workshop, discussions centred around human roles in human-robot collaboration, empathy for robots, approaches to designing and evaluating human-robot collaboration, and ethical considerations. With the 2023 workshop, we aimed to take a step further by (1) discussing underlying assumptions that shape our research and (2) identifying pathways towards shared visions for future research. While it is impossible to capture all the nuances of our discussions here, I will use the limited space in this article to provide a peek into two of the topics that emerged. I hope this can serve as an inspiration to anyone who is reflecting on the why, when, how, and who of empowering people in human-robot collaboration.

Topic 1: Robots as tools for creativity

As an increasing number of digital tools to support creative work come into the world, there are still questions to be answered about how that support can or should be designed. While a robot might aid someone in drawing, 3D printing, milling furniture, etc., it is up to people to ask the right kinds of questions for artistic expressions and experiences. Furthermore, while a robot might be able to manipulate physical materials, the processes of moulding, cutting, drawing, painting, etc., are part of an artistic conversation that artists and creative professionals have with those materials. Workshop participants proposed that there is potential for further empirical studies of how creativity works, as a basis for designing robots that can support it.

There are a number of examples out there where designers, developers, and artists explore roles that robots can play for creative work. Here are some that I have come across:

YouTuber and artist Jazza tried to evaluate the drawing capabilities of a small desk robot by Line-us. The video starts with a highly unsuccessful replication of Jazza’s drawings and moves into an interactive game session, e.g., playing hangman. It seems that replicating an artist’s drawings is a fun gimmick but perhaps does not offer any further space for creativity. (See the video here)

The humanoid robot Ai-Da paints “self”-portraits, which seems ironic when a robot inherently does not have a self or an identity—at least from the perspective of current understandings of consciousness. The artist, Aidan Meller, states that the point of Ai-Da is to raise questions about what role people have if robots are able to replicate our work. (The Guardian published this article about Ai-Da in 2021)

By the way, on the topic of robot consciousness, our workshop panel member Associate Professor Christoph Bartneck, University of Canterbury, hosts a podcast in which the topic was discussed. You can listen to the episode here.

In a more academic direction, the MIT Media Lab has conducted research on ways that robots can help children be creative. They designed a set of games that support children either through demonstrating how to implement a creative idea or by prompting children to reflect by, e.g., asking them questions. (Read about the research here)

Topic 2: Assumptions about robots

Even though much research and development has already shown a multitude of ways that robots can perform tasks in work and everyday life, underlying assumptions about robots and people still drive these developments. The phrases we use among ourselves, participants, collaborators, industry partners, etc., to describe a design concept or how a robot could solve a problem are part of a larger storytelling. Such storytelling comes through in narratives of, e.g., robots taking jobs from workers. We might ask ourselves how we contribute to these narratives, both in public forums and in research publications.

As a side note to this, fiction and ‘speculation’ are increasingly utilised as tools for designing human-robot interaction. Some examples include Auger, 2014, Luria et al., 2020, and Grafström et al., 2022. Speculative design is not a new method, but it is becoming a well-established approach within human-computer interaction (HCI), interaction design, and now also human-robot interaction.

What are our visions and how can we get there?

Our shared visions for the future of human-robot collaboration are not necessarily surprising, but they are reassuringly aligned: collaborative robots should support people. There are, however, a multitude of ways that people can be supported. These range from support (1) during an actual task, e.g., heavy lifting, improving work safety, and providing effective communication, (2) by fitting into dynamic and unstructured environments, and (3) as part of the foundation for people to have a healthy and rewarding work life.

Different pathways exist towards making this a reality. Here are a few examples taken from the workshop discussion. First, while the Australasian context might present some unique challenges, we can still learn from other parts of the world, e.g., in terms of the socio-economic pressures that drive robotic development. Second, we can continuously reframe the problems we choose to prioritise. There are perhaps opportunities to move away from the framing of robots performing “dull, dirty, and dangerous” work towards robots performing collaborative, inclusive, and even creative work. Third, increasingly dynamic settings require robotic interfaces that provide modular solutions. This prompts the question of how end users might use modular robotic systems, and whether this approach is best suited to certain problems and contexts. Finally, participants agreed that we increasingly need a network of researchers in this area to support each other.

In the spirit of the last point, I invite researchers and practitioners to visit the Australian Cobotics Centre at QUT, Brisbane. You are also welcome to join our public seminars, both as audience and presenter. I look forward to continuing this crucial conversation.

References

James Auger. 2014. Living with robots: a speculative design approach. J. Hum.-Robot Interact. 3, 1 (February 2014), 20–42. https://doi.org/10.5898/JHRI.3.1.Auger

Anna Grafström, Moa Holmgren, Simon Linge, Tomas Lagerberg, and Mohammad Obaid. 2022. A Speculative Design Approach to Investigate Interactions for an Assistant Robot Cleaner in Food Plants. In Adjunct Proceedings of the 2022 Nordic Human-Computer Interaction Conference (NordiCHI ’22). Association for Computing Machinery, New York, NY, USA, Article 50, 1–5. https://doi.org/10.1145/3547522.3547682

Michal Luria, Ophir Sheriff, Marian Boo, Jodi Forlizzi, and Amit Zoran. 2020. Destruction, Catharsis, and Emotional Release in Human-Robot Interaction. J. Hum.-Robot Interact. 9, 4, Article 22 (December 2020), 19 pages. https://doi.org/10.1145/3385007

Online links

Jazza trying the line-us robot:

https://www.youtube.com/watch?v=oZYqrPnpDoY

Article about Ai-Da:

https://www.theguardian.com/culture/2021/may/18/some-people-feel-threatened-face-to-face-with-ai-da-the-robot-artist

MIT Media Lab projects on child-robot interaction for creativity:

https://www.media.mit.edu/projects/creativity-robots/overview/

Christoph Bartneck’s podcast episode on robot consciousness:

https://open.spotify.com/episode/5sFNVXTiv9Sh3u360DlZFy?si=808266bb27ea4b73

ARTICLE: Human-Robot Collaboration in Healthcare: Challenges and Prospects

This article is written by Amir Asadi, PhD researcher at the Australian National University (ANU) and a visiting researcher at Australian Cobotics Centre. It draws upon the introduction section of a paper he co-authored with Associate Professor Elizabeth Williams from the Australian National University, Associate Professor Glenda Caldwell from the Queensland University of Technology, and Associate Professor Damith Herath from the University of Canberra.

Today’s global healthcare system faces a pressing challenge: ensuring equitable access to healthcare amidst a severe workforce shortage. The World Health Organization predicts a shortfall of 10 million healthcare workers by 2030 [1], a situation worsened by an ageing population, increasing demand for medical services, and the COVID-19 pandemic. This shortage leads to a heavy workload for existing healthcare professionals, which research indicates can severely affect patient care quality [2].

In response to the challenges caused by the shortage of healthcare professionals, technological innovations offer a viable approach to reduce the workload on healthcare workers, which could ultimately improve patient care and health service quality. Among many cutting-edge technologies suggested for healthcare, robotics has emerged as a particularly promising area. Robots can assist in a variety of tasks, ranging from surgical procedures to patient care and physical rehabilitation. This leads us to the Human-Robot Collaboration (HRC) concept, where humans and robots work together, leveraging each other’s strengths to achieve shared goals [3]. HRC focuses on augmenting human efforts with robotic assistance in a safe, flexible, and user-friendly manner, thereby enhancing the efficiency and effectiveness of tasks, operations, and workflows [4].

In healthcare, HRC aims to create a symbiotic relationship between healthcare professionals and robots to improve patient care. This approach spans a wide array of applications, including physical rehabilitation, support for the elderly and disabled, surgical assistance, and responses to COVID-19, such as patient handling and disinfection tasks. The breadth of HRC research reflects a commitment to addressing the healthcare system’s immediate and long-term needs.

Despite the clear advantages highlighted by research into HRC in healthcare, its integration has been gradual, reflecting the healthcare sector’s traditionally cautious approach towards new technologies [5]. This slow pace of adoption is multifaceted. The initial aspect encompasses general challenges associated with introducing new technologies into healthcare, such as infrastructure limitations, resistance from healthcare professionals, complex market dynamics, and regulatory barriers [6]. Following this, concerns particular to robots in healthcare, including safety issues, questions of effectiveness, public acceptance, and fears that robots may replace human caregivers, further slow the adoption process within healthcare environments [7]. The next dimension involves the distinct challenges of fostering a collaborative relationship between robots and human users. These challenges include developing intuitive interfaces for seamless human-robot collaboration, ensuring the reliability of robots in diverse healthcare scenarios, and addressing ethical considerations around autonomy and collaborative decision-making in patient care.

Together, these facets of challenges underscore the complexity of integrating HRC in healthcare settings and, therefore, necessitate a comprehensive approach that extends beyond mere technological considerations. This approach must encompass aspects such as regulatory compliance, ethical standards, stakeholder engagement, and infrastructural adaptation. To move forward and advance research in this field, it is crucial to adopt a holistic socio-technical perspective that acknowledges the complex interconnectedness between people, technology, environments, and workflows.

Furthermore, fostering a dialogue among multiple disciplines is imperative for the successful adoption of HRC in healthcare. The diversity of challenges that HRC is facing makes it crucial to bridge fields such as robotics, Human-Robot Interaction (HRI), human factors, medicine, nursing, social sciences, psychology, and ethics. By integrating insights from these diverse fields, the aim is to design and implement robotic technologies in a manner that not only addresses practical challenges but also enriches the efficiency and quality of healthcare services.

To conclude, we can safely say that while the journey to fully realise HRC’s potential in healthcare faces numerous obstacles, its effective adoption could transform healthcare delivery significantly, a process that requires both a socio-technical approach and a broad multidisciplinary dialogue.

References:

[1] World Health Organization (WHO), ‘Health workforce’. Accessed: Jan. 19, 2024. [Online]. Available: https://www.who.int/health-topics/health-workforce

[2] D. J. Elliott, R. S. Young, J. Brice, R. Aguiar, and P. Kolm, ‘Effect of Hospitalist Workload on the Quality and Efficiency of Care’, JAMA Internal Medicine, vol. 174, no. 5, pp. 786–793, May 2014, doi: 10.1001/jamainternmed.2014.300.

[3] J. Arents, V. Abolins, J. Judvaitis, O. Vismanis, A. Oraby, and K. Ozols, ‘Human–Robot Collaboration Trends and Safety Aspects: A Systematic Review’, Journal of Sensor and Actuator Networks, vol. 10, no. 3, Art. no. 3, Sep. 2021, doi: 10.3390/jsan10030048.

[4] L. Lu, Z. Xie, H. Wang, L. Li, E. P. Fitts, and X. Xu, ‘Measurements of Mental Stress and Safety Awareness during Human Robot Collaboration – Review’, Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 66, no. 1, pp. 2273–2277, Sep. 2022, doi: 10.1177/1071181322661549.

[5] K. Nakagawa and P. Yellowlees, ‘Inter-generational Effects of Technology: Why Millennial Physicians May Be Less at Risk for Burnout Than Baby Boomers’, Curr Psychiatry Rep, vol. 22, no. 9, p. 45, Jul. 2020, doi: 10.1007/s11920-020-01171-2.

[6] A. B. Phillips and J. A. Merrill, ‘Innovative use of the integrative review to evaluate evidence of technology transformation in healthcare’, Journal of Biomedical Informatics, vol. 58, pp. 114–121, Dec. 2015, doi: 10.1016/j.jbi.2015.09.014.

[7] I. Olaronke, O. Ojerinde, and R. Ikono, ‘State Of The Art: A Study of Human-Robot Interaction in Healthcare’, International Journal of Information Engineering and Electronic Business, vol. 3, pp. 43–55, May 2017, doi: 10.5815/ijieeb.2017.03.06.

ARTICLE: Navigating Augmented Reality: User Interface and UX in Cobotics

Written by Postdoctoral Research Fellow, Dr Alan Burden from the Designing Socio-technical Robotic Systems research program in the Centre.  

The rise of collaborative robots (cobots) is a game-changer for various industries. These robots are designed to work alongside humans, enhancing productivity and efficiency. However, the real challenge lies in making this human-robot interaction as seamless as possible. Augmented Reality (AR) is a technology that has the potential to revolutionise this space by overlaying digital information onto our physical environment.

The Shift in Cobot Interfaces

Traditionally, human-cobot interactions have been facilitated through screen-based interfaces or specialised hardware. While these methods are functional, they often involve a steep learning curve and can be less intuitive. Augmented Reality offers a paradigm shift. By overlaying digital guides, data, or even real-time analytics onto a workspace, AR can make the interaction with cobots more straightforward and efficient. This reduces the time needed for task completion and makes the process more intuitive, reducing the need for extensive training. As we move forward, we are poised to transition from digital 2D interfaces to more immersive 3D interfaces, further enhancing the user experience.

UX Design Principles in AR

User Experience (UX) design is pivotal in making AR-based cobot interaction effective. The objective is to create interfaces that are not just visually appealing but also user-friendly and functional. This involves a deep understanding of the user’s needs, their tasks with the cobot, and the environmental factors at play. For example, an AR interface for a cobot in a medical lab would need to consider sterility and precision. At the same time, one in a manufacturing setting might focus on speed and durability. The design process should be iterative, continually involving users in testing to refine the interface.

User Journey Mapping

Mapping the user’s journey is an invaluable tool in this design process. It involves creating a visual representation of all the interaction points between the user and the cobot facilitated by the AR interface. This helps identify potential issues, bottlenecks, or areas for improvement in the interaction process. For instance, if users find it challenging to access certain information quickly, the interface can be tweaked to make that data more readily available. The ultimate aim is to make the AR interface a tool that enhances, rather than hinders, productivity and user satisfaction.

Safety and Ethics

While AR offers many advantages, it raises important ethical and safety considerations. Data privacy is a significant concern, especially when sensitive or proprietary information is displayed in a shared workspace. The AR interface must also be designed to minimise distractions that could lead to safety hazards. For example, overly flashy or intrusive graphics could divert the user’s attention from critical tasks, leading to accidents. Therefore, ethical guidelines and safety protocols must be integrated into the design process.

What’s Next?

As AR technology continues to evolve, the possibilities for its application in cobotics are virtually limitless. Future developments could include gesture-based controls, adaptive learning algorithms that tailor the interface to individual user preferences, and even real-time collaboration features that allow multiple users to interact with a single cobot. These advancements will make the interaction more seamless and open new avenues for automation and efficiency in various industries.

As we stand on the brink of a new era in human-robot collaboration, enabled by the transformative power of Augmented Reality, we must pause to consider some critical questions.

Will AR interfaces become the new standard in cobotics, making traditional interfaces obsolete?

If we integrate more advanced features like gesture controls and adaptive learning algorithms, are we also prepared to address the complex ethical and safety considerations that come with them?

These questions serve as a reminder that while technology offers immense potential for improvement and innovation, it also demands a level of responsibility and foresight. As we navigate this exciting frontier, let’s ensure our approach is technologically advanced, ethically sound, and user-centric.


ARTICLE: How to ensure quality assurance when integrating a cobot

Written by Postdoctoral Research Fellow, Dr. Anushani Bibile and Research Program Co-Lead, Dr. Michelle Dunn, both from SUT

A collaborative robot (or cobot) is designed to work side-by-side with people and can support applications including welding, pick and place, injection moulding, CNC, packaging, palletising, assembly, machine tending and materials handling. The integration of cobots enables the delegation of many human skill-based activities, with cobots able to undertake a range of repetitious tasks while offering high flexibility and increased productivity.

A collaborative robot arm is compact, occupying a smaller floorspace than a conventional robot and can offer great flexibility for ‘low-volume, high-mix’ production, or high specialisation environments.

It is easier to re-program and re-tool a cobot to undertake a range of actions, providing greater agility as well as reductions in cost of operation. As cobots are also designed to work safely side-by-side with human operators, reduced safety measures are required when compared with a conventional robot.

If you are thinking of integrating a cobot into your manufacturing process, it is important to look at the quality assurance of your system. When implementing a conventional robot, you would ensure quality assurance was satisfied during initial setup; but when you use a cobot, which can be reconfigured for different processes, you need to consider quality assurance every time you make a change. Changes to the code by non-experts will have to be checked and verified very closely, and safety always needs to be considered. Therefore, quality assurance is critical for human-cobot systems in automated processes, as it ensures that the products or services produced meet the required specifications and are safe for use.
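
To make this concrete, here is a hypothetical sketch of what an automated pre-deployment check on a reconfigured cobot program might look like. The workspace envelope, speed limit, and data format are illustrative assumptions only; they are not drawn from any standard or any specific cobot controller, and such checks would complement, not replace, human verification.

```python
# Hypothetical pre-deployment QA check: every waypoint in a reconfigured
# program must stay inside an approved workspace envelope and respect a
# TCP speed limit. All limits and names here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float      # mm
    y: float      # mm
    z: float      # mm
    speed: float  # mm/s

# Approved envelope and speed cap, as signed off during initial QA.
ENVELOPE = {"x": (0, 600), "y": (-400, 400), "z": (50, 500)}
MAX_SPEED = 250.0  # mm/s, e.g. a collaborative-mode limit

def validate_program(waypoints):
    """Return human-readable violations (empty list means the program passes)."""
    violations = []
    for i, wp in enumerate(waypoints):
        for axis, (lo, hi) in ENVELOPE.items():
            value = getattr(wp, axis)
            if not lo <= value <= hi:
                violations.append(f"waypoint {i}: {axis}={value} outside [{lo}, {hi}]")
        if wp.speed > MAX_SPEED:
            violations.append(f"waypoint {i}: speed {wp.speed} exceeds {MAX_SPEED}")
    return violations

program = [Waypoint(300, 0, 150, 100), Waypoint(300, 500, 150, 300)]
for problem in validate_program(program):
    print("QA check failed:", problem)
```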

Why are continuous quality assurance checks important for human-cobot systems?

  • Productivity: Quality assurance measures can help optimise the performance of a human-cobot system, improving productivity and reducing waste. This can include monitoring and controlling the system to ensure that it is working efficiently and identifying areas where improvements can be made.
  • Safety: Safety is a critical concern when it comes to human-cobot systems. A cobot does not need to be caged, therefore a malfunctioning or improperly programmed cobot can cause serious injury or damage to humans or equipment. Quality assurance measures help ensure that the cobot system is designed and programmed correctly, and that it is safe for use.
  • Compliance: Quality assurance measures can help ensure that a human-cobot system meets regulatory and industry standards. This can include performing audits and inspections to ensure that the system is operating within the required parameters and that all safety regulations are being followed.

If proper quality assurance measures are not in place, there are potential risks associated with human-cobot systems. Some of these risks include:

  • Malfunctioning: A cobot that is not properly programmed or maintained can malfunction, causing damage or injury to humans or equipment.
  • Inaccuracy: A poorly calibrated or inaccurate cobot can produce defective products or services, leading to waste, customer dissatisfaction, and potentially legal liabilities.
  • Cybersecurity: Human-cobot systems are susceptible to cyber threats, which can lead to system failures, data breaches, and other security issues. Quality assurance measures can help ensure that the system is secure and that appropriate cybersecurity protocols are in place.

Design of safety mechanisms must meet the corresponding industrial standards, which are exemplified in the figure below. First, a cobot must meet the relevant safety requirements, laws and directives for general machinery, such as the European Machinery Directive (2006/42/EC). Basic safety rules and regulations (known as Type A standards) must also be met. Specific applications of a cobot system must meet Type B standards. Finally, cobots as products must meet Type C standards.

Safety Assurance standards and regulations for human and machine collaboration [1]

Finally, it is important to regularly review and update quality assurance protocols to keep pace with evolving technologies and changing workplace conditions. By remaining vigilant and proactive in preserving the quality assurance of cobots in automated processes, organisations can reap the benefits of cobot automation while minimising risks and maximising productivity.

[1] Bi, Z. M., et al. (2021). “Safety assurance mechanisms of collaborative robotic systems in manufacturing.” Robotics and Computer-Integrated Manufacturing 67.

[2] Vicentini, F. (2021). “Collaborative Robotics: A Survey.” Journal of Mechanical Design 143(4).

[3] Cobot – Wikipedia


ARTICLE: Cobots in manufacturing: Good for skill shortages and much more.

Written by Research Program Co-Lead Professor Greg Hearn and PhD Candidate Nisar Ahmed Channa both from the Human Robot Workforce research program in the Centre.  

In this era of a rapidly evolving technology landscape, almost every industry sector needs to keep pace with technological advancements to prosper and remain competitive. However, many companies struggle to develop or even adopt innovations in technologies, processes, or business models. COVID-19 is one recent example: manufacturing companies found it extremely challenging to generate an innovative response to the labour shortages caused by lockdowns and movement restrictions across many countries. As a result, many production units of large as well as small to medium manufacturing companies shut down for significant periods. This negatively affected global supply chains in many other sectors, because manufacturing industries supply usable goods and services to many other industries. Soon after the global financial crisis, manufacturing companies were already facing issues such as rising production costs caused by the unavailability of raw materials and increased labour costs; the COVID-19 pandemic further fuelled these issues through disruptions to global supply chains and restrictions on the movement of workers. Even after the pandemic, various countries still face increased labour costs and shortages of skilled labour. As a result, companies are now investing substantial financial resources to future-proof their manufacturing capability, reduce inputs and increase outputs.

One innovative solution to these labour and skills shortages, on which academics and industry experts are working, is the adoption of collaborative robots (Cobots) in manufacturing. A Cobot is a special kind of robot, with context awareness, which can safely share a workspace with other Cobots or with human operators. Recent research suggests that Cobots can be used as alternatives to skilled human workers and thus can supplement the shortage of workers across industries. For instance, to cope with labour shortages caused by the pandemic and to meet increased demand, manufacturing companies in North America spent around 2 billion USD in 2021 to acquire 40,000 robots[i],[ii],[iii]. Similarly, rising labour costs and an ageing workforce have led to an increase in the demand for Cobots in the automobile sectors of Europe and the Asia–Pacific region[iii].

Some labour economists believe that the introduction of technologies like artificial intelligence (AI) and robots increases production and efficiency in manufacturing through the displacement of jobs traditionally performed by human workers. However, under certain conditions, these technologies can create new jobs and upskill other jobs across the ecosystem of related suppliers and service providers.

In line with the priorities for Australian manufacturing formulated by the Australian Advanced Manufacturing Growth Centre[iv], we argue Cobots could be “creatively productive” for Australian manufacturing, not only because of their potential to improve production cost efficiency but also to enhance value differentiation and potentially open up new revenue segments, including through export[v]. Efficiencies can be achieved by optimising human-robot workflow design; accelerating workforce acceptance of robot-driven process efficiencies; reducing human errors in automation documentation; and reducing downtime through enhanced work safety. Value differentiation can be achieved by integrating Cobots into product design for rapid prototyping; by developing autonomous quality assurance systems and better data analytics as value-adding services; by improving capabilities for just-in-time and mass-customised products in existing markets; and by upskilling the manufacturing workforce for innovation leadership, which is itself a value differentiator. Because Cobots are designed to work alongside and close to people, they can help companies integrate and digitalise their business operations without sacrificing the human aspects of the job. In many respects, Cobots are the hardware equivalent of augmented intelligence, supplementing people rather than replacing them with autonomous equivalents. Cobots can augment human skills with super-strength, accuracy, and data capabilities, allowing workers to do more and add more value to the production process and to the final product itself. This creates strategic business value and improves efficiency, resulting in better, quicker delivery of products to customers.

[i] North American companies send in the robots, even as productivity slumps | Reuters

[ii] Robots marched on in 2021, with record orders by North American firms | Reuters

[iii] Rise of The Cobots in Automotive Manufacturing | GEP

[iv] https://www.amgc.org.au/our-purpose/about-advanced-manufacturing/

[v] Microsoft Word – Hearn et al ACRA Final Submission.docx (linklings.net)

ARTICLE: 6 Reasons Why We Need a Prototyping Toolkit for Designing Human-Robot Collaboration

Written by Postdoctoral Research Fellow, Dr Stine S Johansen and PhD Researcher, James Dwyer

In this short article, we will share 6 benefits of having a prototyping toolkit for designing human-robot collaboration (HRC). We will lift the curtain on our planned activities to work towards this in Program 2 of the Australian Cobotics Centre.

What type of human-robot collaboration are we talking about?

The Australian Cobotics Centre focuses on cobots in manufacturing settings. In these settings, robots are most often big and locked away in cages for safety reasons. They are useful for highly defined and repeatable tasks that require strength. In contrast, cobots are typically smaller and allow for people to safely carry out a task by handing over items to the robot or even by physically handling the robot.

Cobots address an increasing need for more adaptable robotic systems for customised and bespoke products. These types of products still require people in the manufacturing line to accommodate changes from product to product.

So, what could a prototyping toolkit look like?

Imagine a toolbox with screwdrivers, a hammer, cutters, etc. Similar to that, we already have tools in our design toolbox that work at a generic level or are appropriated to suit particular problems. But a toolkit for prototyping human-robot collaboration is still left for us to investigate. In Program 2, James Dwyer (PhD student) will contribute to our knowledge about how different prototyping tools can facilitate design processes of HRC. The goal is to develop a practical and affordable toolkit that can be used to enable designers, engineers, and end-users to work together towards human-robot collaboration in manufacturing settings and beyond.

What are the benefits of having a prototyping toolkit?

Knowing how a cobot can fit into an existing or new manufacturing setting requires substantial research. What if we had a way to make that process easier and more efficient for designers and clients as well as more accommodating for the final end-users of the cobot? This is the broad aim of a HRC prototyping toolkit. Here are 6 concrete benefits that we aim to support through our work in Program 2.

1) Accessible end-user engagement

Manufacturers often lack the expertise to define how a cobot could be used. They are, however, experts in their respective domains. Domain knowledge is not always something that can be documented in written reports; it is also the tacit knowledge that workers build through years of experience. A prototyping toolkit can enable that knowledge to play a role very early in the design and development process by lowering the currently high technical barriers to understanding how a robot works. In Program 2, we rely on principles from participatory design, a design practice in which tangible outcomes are produced together with end-users.

2) Cost and time efficiency

A cobot integration project can require substantial cost and time, which makes it non-viable for some manufacturers. Hardware investments mean committing to a particular setup, and such investments carry risk if the feasibility of the concept has not been investigated early on. It is therefore beneficial to have prototyping tools for conducting such investigations without needing the actual hardware; prototyping tools furthermore allow for quick and cheap iterations. Beyond that, there is a need for tools that facilitate the transition from early concepts to implementation and testing.
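As a toy illustration of what a hardware-free feasibility check could look like, the Python sketch below estimates the cycle time of a hypothetical pick-and-place routine from waypoint distances and a trapezoidal speed profile, then compares it against a target takt time. Every number in it (speeds, distances, takt) is an invented assumption for illustration, not the specification of any particular cobot.

```python
# Toy feasibility check: estimate pick-and-place cycle time before buying hardware.
# All numbers are illustrative assumptions, not the specs of any particular cobot.

def move_time(distance_m: float, v_max: float = 0.5, a_max: float = 1.0) -> float:
    """Time for a point-to-point move under a trapezoidal velocity profile."""
    d_accel = v_max ** 2 / a_max          # distance spent accelerating plus braking
    if distance_m < d_accel:              # triangular profile: never reaches v_max
        return 2 * (distance_m / a_max) ** 0.5
    return 2 * v_max / a_max + (distance_m - d_accel) / v_max

# Hypothetical routine: home -> pick -> place -> home, plus gripper actuation.
segments_m = [0.40, 0.60, 0.60]           # straight-line distances between waypoints
gripper_s = 0.8                           # assumed open/close time per grasp or release

cycle_s = sum(move_time(d) for d in segments_m) + 2 * gripper_s
takt_s = 6.0                              # required seconds per part, for example

print(f"Estimated cycle time: {cycle_s:.2f} s (takt target: {takt_s} s)")
print("Feasible at this takt." if cycle_s <= takt_s else "Too slow; rethink the layout.")
```

Even a crude model like this lets a team compare layouts or gripper choices cheaply before any hardware commitment; higher-fidelity simulation can follow once a concept survives this first filter.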

3) Flexibility

Given the opportunity for cobots to assist in the manufacturing of customised products, there is a strong need for flexible solutions. Crucial to realising flexibility is the establishment of design processes that bridge the gap between early-stage conceptual development and technical integration. For cobots to effectively contribute to customised production, they must be grounded in a rich understanding of the work practices, production methods, and customisation requirements entailed in the manufacturing. This understanding can be developed through iterative design and a holistic approach covering conceptualisation, prototyping, and implementation. This will ensure that cobots are versatile, adaptable, and able to meet changing production needs.

4) Risk mitigation

Even though cobots are generally equipped with safety measures such as a safe stop button and sensors to detect and stop collisions with people, it is still possible to get hurt by a faulty cobot that has not been adapted to its environment. Prototyping tools allow us to mitigate this risk in two ways. First, it is possible to create virtual models of the environment and cobot, meaning that we can simulate tasks and clarify potential safety risks we might not otherwise have detected purely from prior experience and safety standards. This allows us to develop safety measures long before anyone gets hurt. Second, while engaging end-users in the design process has many benefits, people with non-technical backgrounds are not necessarily comfortable interacting with a robot – especially an unfinished robot solution. Therefore, prototyping tools can support our engagement with end-users by removing the potential fear of getting hurt.
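As a minimal sketch of the first of these two points, the Python snippet below replays a simulated person's path through a planar workcell and scales the cobot's speed by separation distance, in the spirit of the speed-and-separation monitoring described in ISO/TS 15066. The positions and thresholds are invented for illustration; real limits come from a proper risk assessment, not hard-coded constants.

```python
# Minimal virtual safety check: scale cobot speed by distance to a simulated person.
# Thresholds are invented for illustration; real systems derive them from a risk
# assessment and standards such as ISO/TS 15066, not from hard-coded constants.
import math

STOP_DIST = 0.5    # metres: protective stop inside this radius (assumed)
SLOW_DIST = 1.5    # metres: reduced speed inside this radius (assumed)

def speed_scale(cobot_xy, human_xy):
    """Return a 0..1 multiplier on programmed speed given planar positions."""
    d = math.dist(cobot_xy, human_xy)
    if d <= STOP_DIST:
        return 0.0                        # protective stop
    if d <= SLOW_DIST:                    # linear ramp between the two radii
        return (d - STOP_DIST) / (SLOW_DIST - STOP_DIST)
    return 1.0                            # full programmed speed

# Replay a scripted human path against a fixed cobot workcell position and
# flag where the task would have slowed or stopped.
cobot = (0.0, 0.0)
human_path = [(3.0, 0.0), (2.0, 0.5), (1.2, 0.2), (0.4, 0.1), (1.8, -0.5)]

for step, pos in enumerate(human_path):
    s = speed_scale(cobot, pos)
    state = "STOP" if s == 0.0 else ("SLOW" if s < 1.0 else "FULL")
    print(f"t={step}: human at {pos}, speed scale {s:.2f} -> {state}")
```

Running such simulations against recorded or scripted movement patterns makes it possible to surface layout problems, such as a stop zone that workers cross constantly, before anyone stands next to a real robot.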

5) Enhanced creativity

As design researchers, we often engage in generative ideation activities to address research questions. Prototypes enable us to see facets of an idea that were not previously obvious. This is sometimes referred to as ‘filtering’ (for further reading on this topic, see our list of references). It’s like putting on special glasses that highlight the specific qualities we want to explore further while still capturing the essence of the entire concept. To use prototypes as filters, it is necessary to have a holistic understanding of the context within which the cobot will operate and how that context can change with the introduction of the cobot. A prototyping toolkit can give us different lenses for exploring facets of the context in early prototypes, thereby becoming a creative extension for designers. This could include prototyping tools such as Wizard-of-Oz methods, video prototyping, or virtual simulations.
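As one example of what such a tool can be, here is a minimal, purely hypothetical sketch of a text-based Wizard-of-Oz console: a hidden operator triggers canned robot behaviours that a participant experiences as autonomous, while everything is logged for later analysis. The behaviours and logging format are made up for illustration.

```python
# Sketch of a text-based Wizard-of-Oz loop: a hidden operator chooses the
# "robot's" next action while the participant sees only the output.
# Purely illustrative; a real study would drive an actual robot or simulation.
import time

ACTIONS = {"1": "Robot: moving to fetch the part.",
           "2": "Robot: holding the part steady for you.",
           "3": "Robot: placing the part in the fixture."}

log = []
print("Wizard console: press 1/2/3 to trigger a behaviour, q to quit.")
while True:
    choice = input("> ").strip()
    if choice == "q":
        break
    if choice in ACTIONS:
        print(ACTIONS[choice])                      # what the participant would see or hear
        log.append((time.time(), ACTIONS[choice]))  # timestamped for later analysis

print(f"Session ended; {len(log)} wizarded actions logged.")
```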

6) Facilitating internal communication

Prototyping is an activity that allows us to both internalise and externalise ideas. In other words, prototypes enable us to internally reflect on what works and what does not work as well as communicate ideas to team members, clients, or anyone interacting with them. Prototypes have always had that role in design research, but with the technical barriers to quick prototyping for human-robot collaboration, there is a need to identify new ways to facilitate this role of prototypes.

We look forward to sharing our progress throughout the next few years. Please reach out to us for further discussion, questions, or other inquiries.

Further reading:

Lim, Y. K., Stolterman, E., & Tenenberg, J. (2008). The anatomy of prototypes: Prototypes as filters, prototypes as manifestations of design ideas. ACM Transactions on Computer-Human Interaction (TOCHI), 15(2), 1-27.

Wensveen, S., & Matthews, B. (2014). Prototypes and prototyping in design research. In The Routledge Companion to Design Research (pp. 262-276). Routledge.

Odom, W., Wakkary, R., Lim, Y.-K., Desjardins, A., Hengeveld, B., & Banks, R. (2016). From research prototype to research product. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16), 2549-2561. https://doi.org/10.1145/2858036.2858447

Ajaykumar, G. (2023). Supporting end-users in programming collaborative robots. Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’23), 736-738. https://doi.org/10.1145/3568294.3579969

ARTICLE: Can we Unlock the Potential of Collaborative Robots?

Written by Dr Marc Carmichael and Louis Fernandez from the Australian Cobotics Centre.

Collaborative robots, or cobots for short, have gained significant attention in recent years due to their potential to work in close proximity and collaboration with humans. However, despite their name, there seems to be a lack of actual collaboration between humans and cobots in many, if not most, industrial settings.

The Australian Cobotics Centre aims to transform the Australian manufacturing industry through the deployment of collaborative robots, and in a recent webinar we discussed how significant benefits may be possible if more sophisticated forms of collaboration between humans and cobots can be practically achieved.

In this article we discuss this, starting with the basics of cobots, exploring the untapped potential of cobot-human collaboration, and outlining how we hope to develop new ways of enabling humans and cobots to collaborate.

Defining Cobots and Industrial Robots:

Before we talk about the untapped potential of cobot-human collaboration, let’s start by understanding the basic differences between cobots and regular industrial robot arms.

Industrial robot arms are typically big, heavy machines found in factories and other environments with repetitive, predictable jobs. They are great at moving heavy things quickly and accurately. However, this is also what makes them dangerous around people, so they must be kept separated from workers.

On the other hand, cobots are much smaller and lighter. They also have sensing, such as force and torque monitoring, that lets them ‘feel’ their surroundings; these functionalities allow them to work alongside humans. On top of that, they’re easier to program than industrial robots, so they can quickly be put to work on different tasks, making them well suited to flexible jobs.
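To make the ‘feel’ part concrete, the toy Python sketch below scans a stream of simulated external-force estimates and flags where a protective stop would fire, in the spirit of the power-and-force-limiting behaviour that cobots implement in certified firmware. The readings and the force limit are invented for illustration.

```python
# Toy illustration of how a cobot can "feel" contact: watch an estimated
# external force and trigger a protective stop past a threshold. The readings
# and limit are made up; real cobots do this in certified safety firmware.

FORCE_LIMIT_N = 30.0                       # assumed contact-force limit

def check_contact(force_readings_n):
    """Return the time step where a protective stop would fire, or None."""
    for t, f in enumerate(force_readings_n):
        if f > FORCE_LIMIT_N:
            return t
    return None

# Simulated external-force estimates: free motion, then an unexpected bump.
readings = [2.1, 2.4, 2.0, 3.1, 28.5, 41.7, 5.0]

hit = check_contact(readings)
print("No contact detected." if hit is None
      else f"Protective stop at step {hit} (force {readings[hit]} N).")
```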

The Current State of Cobot Collaboration:

Even though cobots are capable of working beside people, they rarely truly work with people. Feedback from experts and users, as well as the research literature, suggests that cobots are mostly being used like traditional industrial robots. For example, cobots are often deployed on pick-and-place or palletising jobs. These applications look much like conventional industrial robot installations, except that cobots don’t need a protective cage around them. This raises the question: “Are we really using cobots to their full potential?”

Don’t get us wrong, using cobots as cageless industrial robots has real advantages. Not needing a cage means more space on your shop floor for other equipment and less time spent on installation. In addition, cobots are generally easier and faster to program than industrial robots. For example, cobots can be programmed by physically grabbing and moving the arm to show it where to go. This easy form of programming allows cobots to be set up and deployed quickly, a benefit for small businesses getting into automation. Cobots are also getting better, with some now having the reach and strength to handle a wider range of jobs. As they improve, cobots and industrial robots may become harder to tell apart, and using cobots like cageless industrial robots might become the norm.
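As a rough sketch of that record-and-replay style of programming, the snippet below mimics a teach phase (in reality the operator hand-guides the arm and taps a record button) followed by an autonomous replay. The MockRobot class is a stand-in we invented for illustration, not any vendor’s API, though real cobots expose equivalent teach and move functions.

```python
# Sketch of programming-by-demonstration: record waypoints while the arm is in
# a freedrive/teach mode, then replay them. MockRobot is an invented stand-in.

class MockRobot:
    def __init__(self):
        self.pose = (0.0, 0.0, 0.0)        # toy 3-coordinate pose

    def read_pose(self):
        return self.pose                   # on real hardware: the current TCP pose

    def move_to(self, pose):
        print(f"moving to {pose}")
        self.pose = pose

robot = MockRobot()
waypoints = []

# "Teach" phase: the operator would physically guide the arm to each pose and
# tap record; here we simulate three guided poses.
for guided_pose in [(0.3, 0.1, 0.2), (0.5, 0.1, 0.05), (0.5, 0.4, 0.05)]:
    robot.pose = guided_pose               # stand-in for hand-guiding the arm
    waypoints.append(robot.read_pose())

# "Replay" phase: the recorded program runs autonomously.
for wp in waypoints:
    robot.move_to(wp)
```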

However, using cobots like industrial robots doesn’t make the most of what they can do. We should explore the challenges and opportunities of making cobot-human collaboration better.

Defining Levels of Human-Robot Collaboration:

Before we continue, it is important to define collaboration in the context of cobots. What collaboration means depends on the discipline, and terms are often used inconsistently or interchangeably. A classification that is becoming increasingly common, and which we personally like, is the following:

Level 0: Cell – the traditional approach used with industrial robots, where humans are isolated from the robot, often by physical cages or fences.

Level 1: Co-existence – the human and cobot share the workspace but work on a task in a sequential fashion. For example, a cobot performs a packing task, with a human entering the workspace only to restock items. Sensors such as a safety area scanner slow or stop the cobot when someone is in the vicinity.

Level 2: Co-operation – the human and cobot operate in a shared space, with the worker guiding or influencing the cobot’s operation via inputs such as force, speech, or gesture. The cobot may adapt its motion based on measurements of the human.

Level 3: Collaboration – the human and cobot cooperate on a joint task. The cobot learns and adapts by observing the human, achieving a dynamic and supportive collaboration in which both parties are responsive to each other and actively contribute to the task at hand.

Although the boundaries between levels are sometimes difficult to draw, these definitions help distinguish the different degrees of interaction and collaboration between cobots and humans.
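To make the taxonomy concrete, here is one way the four levels might be encoded in software, say to drive logging or a safety configuration. The enum and the level-to-behaviour mapping are our own illustrative sketch, not part of any standard.

```python
# Illustrative encoding of the four interaction levels described above. The
# mapping from level to runtime behaviour is our own sketch, not a standard.
from enum import IntEnum

class HRCLevel(IntEnum):
    CELL = 0           # human fenced out entirely
    COEXISTENCE = 1    # shared space, sequential work, slow/stop on proximity
    COOPERATION = 2    # human input (force, speech, gesture) steers the cobot
    COLLABORATION = 3  # joint task; cobot observes, learns, and adapts

def runtime_behaviour(level: HRCLevel) -> str:
    return {
        HRCLevel.CELL: "run at full speed; interlock the cage door",
        HRCLevel.COEXISTENCE: "monitor area scanner; slow or stop near humans",
        HRCLevel.COOPERATION: "accept live human inputs; adapt motion to them",
        HRCLevel.COLLABORATION: "track human actions; re-plan the shared task",
    }[level]

for level in HRCLevel:
    print(f"Level {int(level)} ({level.name.title()}): {runtime_behaviour(level)}")
```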

Exploring the Potential Gains and Barriers to Collaboration:

We would consider most cobot use cases to sit at Level 1, where, other than the cobot adapting its pre-programmed routine based on the presence of a human, there is next to no real collaboration between the two. To rephrase the question we raised earlier: “What are we missing out on by not pursuing Level 2 and Level 3 collaboration?”

There are some interesting and compelling proofs-of-concept by robotics researchers that demonstrate what could be achieved; see Further Reading for some examples. One study estimated a potential reduction in task completion time of up to 20%, suggesting that significant productivity benefits can be unlocked. Unfortunately, relatively few examples of high-level collaboration have made their way into practical use.

In our program (Human-Robot Interaction) at the Australian Cobotics Centre, our goal is to increase the scope of genuine collaboration. Our efforts focus on novel interaction approaches using multi-sensory interfaces, gesture-control devices, and augmented reality, which can reduce training costs, enable rapid prototyping, and make robots safer and easier to use in production tasks.

It is our belief that addressing these challenges will lead to new methodologies for enabling rich and beneficial forms of human-robot collaboration. Combined with the work of our colleagues at the Australian Cobotics Centre, whose programs address technical, social, and organisational challenges, we look forward to sharing the outcomes we achieve and are excited about the future of cobotics.

Further reading:

Michaelis, J. E., Siebert-Evenstone, A., Shaffer, D. W., & Mutlu, B. (2020). Collaborative or simply uncaged? Understanding human-cobot interactions in automation. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3313831.3376547

Guertler, M., Tomidei, L., Sick, N., Carmichael, M., Paul, G., Wambsganss, A., Hernandez Moreno, V., & Hussain, S. (2023). When is a robot a cobot? Moving beyond manufacturing and arm-based cobot manipulators. Proceedings of the Design Society, 3, 3889-3898. https://doi.org/10.1017/pds.2023.390

Kopp, T., Baumgartner, M., & Kinkel, S. (2020). Success factors for introducing industrial human-robot interaction in practice: an empirically driven framework. The International Journal of Advanced Manufacturing Technology, 112(3-4), 685-704. https://doi.org/10.1007/s00170-020-06398-0

Male, J., & Martinez-Hernandez, U. (2021). Collaborative architecture for human-robot assembly tasks using multimodal sensors. 2021 20th International Conference on Advanced Robotics (ICAR). https://doi.org/10.1109/icar53236.2021.9659382

Carmichael, M. G., Aldini, S., Khonasty, R., Tran, A., Reeks, C., Liu, D., … & Dissanayake, G. (2019). The ANBOT: an intelligent robotic co-worker for industrial abrasive blasting. 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). https://doi.org/10.1109/iros40897.2019.8967993

Zhuang, Z., Ben-Shabat, Y., Zhang, J., Gould, S., & Mahony, R. (2022). Goferbot: a visual guided human-robot collaborative assembly system. 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). https://doi.org/10.1109/iros47612.2022.9981122