ARTICLE: Beyond Efficiency: Ethical Considerations of Adopting Cobots

Collaborative robots, commonly referred to as “Cobots,” are among the most groundbreaking technological advancements of our time. Academics and industry experts firmly believe that Cobots have the potential to revolutionise global manufacturing. A Cobot is a context-aware robot equipped with artificial intelligence and vision capabilities, enabling it to safely coexist with both human operators and machines in the same workspace.

The adoption of Cobots in manufacturing is a key enabler of Industry 5.0. The concept of Industry 5.0 was first proposed by Michael Rada[i] in 2015, after it became clear that Industry 4.0, its predecessor, could not meet the growing demand for personalisation and customisation of goods. By incorporating highly advanced systems such as artificial intelligence, automation, the Internet of Things, and cloud computing, Industry 4.0 aimed to enhance operational efficiency and productivity by connecting the physical and virtual worlds. However, rapidly evolving global business dynamics shifted the industry paradigm from efficient production alone towards high-value mass customisation and personalisation of goods, and it was widely believed that Industry 4.0 could not address this shift. The term Industry 5.0 was therefore coined to address these changing industrial dynamics, with a focus on collaboration between advanced production systems, machines, and humans.

Reaping the enormous benefits of this technology requires careful consideration of the risks that could affect the well-being of the human operators who work alongside Cobots.

Ethical Considerations of Adopting Cobots

Ethical considerations when adopting Cobots encompass a wide range of social factors[ii]. As defined by the British Standards Institution[iii], ethical hazards are any potential source of harm that compromises psychological, societal, or environmental well-being. While collaborative settings involving Cobots offer benefits such as reducing physically demanding tasks for humans, they also introduce new risks and ethical considerations that demand attention during planning and use. In the following sections, I discuss some of the ethical considerations of adopting Cobots:

Emotional Stress

Understanding the emotional stress that workers may experience can inform the design of human-Cobot interaction systems that minimise stress and enhance the overall user experience. Cobots may cause emotional stress among users for several reasons. For instance, users might feel they have less control over their work environment when Cobots are involved, especially if the Cobots operate autonomously; this can lead to feelings of anxiety and stress. Moreover, Cobots are often used for tasks that require high precision and concentration, and the pressure to perform these tasks accurately can be mentally exhausting. The constant need to monitor and interact with Cobots can also trigger physiological stress responses, such as increased heart rate and tension. Organisations can consider these factors when designing and implementing Cobots.

Social Environment

By understanding potential disruptions to the social environment, manufacturers can develop strategies to mitigate workers’ concerns and create a harmonious work environment. Unless workers are involved in the design and planning of Cobot implementations, Cobots may disrupt the social harmony of the workplace in several ways, for example by raising concerns about job security or by causing anxiety and tension over the fear of being replaced by robots. This can lead to confusion and ambiguity about job roles, causing stress and disrupting team cohesion. Furthermore, the presence of Cobots can alter social interactions in the workplace, with some workers viewing them as teammates while others see them as intruders, potentially leading to conflicts. Additionally, the increasing autonomy of Cobots raises ethical questions about decision-making and accountability.

Social Acceptance

By understanding the community factors that shape social acceptance, strategies can be developed to enhance the acceptance of Cobots. Communities play a crucial role in determining the acceptance of new technologies, and several key factors influence the acceptance of Cobots. Different cultures exhibit varying levels of comfort with technology; some place greater trust in, and show more enthusiasm for, technological advancements, which can lead to greater acceptance of Cobots. The opinions and behaviours of peers, family, and colleagues can significantly influence an individual’s acceptance of Cobots. Communities with higher levels of education and awareness of the benefits and functionality of Cobots tend to accept them more readily. Finally, government policies and incentives that promote the use of Cobots can positively influence community acceptance; supportive regulations and funding for Cobot integration can encourage businesses and individuals to adopt the technology.

Data Collection

Firms adopting Cobots need to devise clear data management policies and assure workers that the data collected will not be used by, or sold to, third parties. Because Cobots collect a variety of data through their safety systems, there is a risk that operator and user data could be collected, used, and sold without consent. Research indicates that many industry organisations are already interested in the potential value of this data for developing future products and services.

Addressing these ethical considerations can help ensure that the adoption of Cobots contributes positively to society and aligns with our social values. By prioritising ethics, we can foster trust in, and acceptance of, Cobots in manufacturing.

[i] https://www.linkedin.com/pulse/industry-50-from-virtual-physical-michael-rada/

[ii] https://www.centreforwhs.nsw.gov.au/__data/assets/pdf_file/0019/1128133/Work-health-and-safety-risks-and-harms-of-cobots.pdf

[iii] https://knowledge.bsigroup.com/products/robots-and-robotic-devices-guide-to-the-ethical-design-and-application-of-robots-and-robotic-systems

Meet our E.P.I.C. Researcher, Mariadas Roshan


Mariadas Roshan is a Postdoctoral Research Fellow in the Quality Assurance and Compliance program and is currently involved in an industrial-based cobot automation project.

We interviewed Mariadas recently to find out more about why he does what he does.

Tell us a bit about yourself and your research with the Centre, including the long-term impact of what you are doing.

I am a Postdoctoral Researcher in the Quality Assurance and Compliance program (P4) at the Australian Cobotics Centre, based at Swinburne University of Technology. I hold a Bachelor’s degree in Mechatronics Engineering and completed my PhD in Robotics, where I focused on developing autonomous ultrasound imaging systems using computer vision and collaborative robots.

At the Centre, my research focuses on integrating collaborative robotics and intelligent vision systems into real-world manufacturing environments to improve quality control, ensure compliance, and enhance human-robot collaboration. I am currently working on an industry-based project that aims to automate and optimise quality assurance in manufacturing. In parallel, I’m involved in projects exploring how collaborative robots can be used for both autonomous and teleoperated ultrasound imaging in healthcare.

I believe that the potential of robots is still underutilised or not fully understood in many real-world applications. My long-term goal is to help bridge that gap by driving the adoption of robotics across diverse sectors like manufacturing and healthcare, ensuring that these technologies are accessible, effective, and aligned with real human needs.

Why did you decide to be a part of the Australian Cobotics Centre?

I’ve always had a passion for robotics, which led me to pursue a Bachelor’s degree in Mechatronics Engineering and later a PhD in Robotics. During my PhD, I became increasingly aware that much of the research being done either didn’t reach industry or lacked practical impact. I’ve always been motivated by research that addresses real-world problems and delivers tangible solutions to industry challenges.

When I was looking for a postdoctoral opportunity, the Australian Cobotics Centre stood out because of its strong focus on implementing collaborative robotics in Australian manufacturing. The Centre’s mission aligns closely with my own values, ensuring that cutting-edge research leads to meaningful, real-world outcomes. What also attracted me was the Centre’s diverse, interdisciplinary team, which considers not only the technical aspects of cobot implementation but also the human, organisational, and design perspectives. It’s an environment where I can contribute while also learning and growing alongside experts from various fields.

What project are you most proud of throughout your career and why?

One of the projects I’m most proud of took place during my PhD, when I worked part-time as a mechatronics engineer at a start-up mask manufacturing company at the height of the COVID-19 pandemic. At the time, Australia had only one local mask production facility, and the manufacturing machines were imported, which meant we had very limited technical support. I was part of the team responsible for designing and deploying the automation of the mask production process, an effort that was both urgent and technically demanding. Despite the limited resources and high-pressure environment, we successfully established a functioning production line. Even though regulatory compliance wasn’t my core area, I also took on the responsibility of overseeing lab testing and navigating the TGA approval process, as our team lacked expertise in that area. It was a demanding but incredibly rewarding experience, especially knowing the direct impact it had during a national health crisis.

Another project close to my heart, and one I’m currently involved in, is a teleoperated and autonomous collaborative robot system for ultrasound imaging. The goal is to create a solution that can support healthcare professionals and provide better access to diagnostic services in regional and remote communities. I’ve been contributing to this outside of my main projects because I strongly believe in its potential to make a real impact in the healthcare sector.

What do you hope the long-term impact of your work will be?

Through my experience working on industry-focused projects, I’ve come to realise that many companies, especially SMEs, are still hesitant to adopt robotics due to factors like perceived risk, high initial investment, and a general lack of awareness, particularly in Australia.  I hope the long-term impact of my work will be to bridge that gap by demonstrating the real-world value and practicality of robotics across industries.

I want to support wider acceptance and adoption of robots, not just in manufacturing but also in underexplored areas like healthcare, where robotics can have a significant impact. Ultimately, I aim to contribute to a future where robotics is seen not as a complex or risky investment, but as a valuable and accessible tool that can enhance productivity, improve safety, and create better outcomes for people and businesses alike.

Aside from your research, what topic could you give an hour-long presentation on with little to no preparation?

Without a doubt, cricket! I’ve been a huge fan since I was young. While I don’t get as much time these days to watch full matches, I still closely follow scores, player stats, and expert analyses. I especially enjoy watching podcasts and technical breakdowns of games, where strategies, player skills, and match dynamics are discussed in depth. Whether it’s team strategies, player performance trends, or predicting outcomes based on pitch conditions and line-ups, I could easily give a detailed and passionate talk on any aspect of the game.

What Would Jim Henson Do? Roleplaying Human-Robot Collaborations Through Puppeteering

By James Dwyer and Dr Valeria Macalupu (both QUT)

A tangible, adaptable and modular interface for embodied explorations of human-robot interaction concepts.

As robots become increasingly integrated into various industries, from healthcare to manufacturing, the need for intuitive and adaptable tools to design and test robotic movements has never been greater. Traditional approaches often rely on expensive simulations or complex hardware setups, which can restrict early-stage experimentation and limit participation from non-expert stakeholders. The kinematic puppet offers a refreshing alternative by combining hands-on prototyping with virtual simulation, making it easier for anyone to explore and refine robot motion. This work is particularly critical for exploring intuitive ways surgeons can collaborate with robots in the operating room, improving Robot-Assisted Surgery (RAS).

What is the kinematic puppet?

The kinematic puppet is an innovative tool that combines physical prototyping and virtual simulation to simplify the design and testing of robot movements and human-robot interactions. The physical component is a modular puppet constructed from 3D-printed joints equipped with rotary encoders and connected by PVC linkages. This flexible and cost-effective setup allows users to customise a robot arm to suit a variety of needs by adjusting linkage lengths and joint attachments.

On the digital side, a virtual simulation environment (developed in Unreal Engine) creates a real-time digital twin of the physical puppet. This integration, via Wi-Fi/UDP, enables immediate visualisation and testing of human-robot interaction (HRI) concepts. By bridging the gap between physical manipulation and digital analysis, the kinematic puppet makes it easier for anyone to experiment with and refine robot motion in an interactive and accessible way.
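To make the encoder-to-twin link concrete, here is a minimal sketch of how joint angles might be streamed as UDP datagrams; the address, joint layout, update rate, and read_encoder helper are illustrative assumptions, not the authors’ actual implementation.

```python
import json
import socket
import time

# Hypothetical address of the Unreal Engine listener and an assumed
# four-joint puppet layout; neither is taken from the published demo.
SIM_ADDR = ("192.168.0.10", 7777)
PUPPET_JOINTS = ["base", "shoulder", "elbow", "wrist"]

def read_encoder(joint: str) -> float:
    """Placeholder for sampling one rotary encoder, in degrees."""
    return 0.0  # real hardware would return the measured joint angle

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
while True:
    # Sample every joint and send one JSON datagram per frame so the
    # digital twin can mirror the physical puppet in real time.
    angles = {joint: read_encoder(joint) for joint in PUPPET_JOINTS}
    sock.sendto(json.dumps(angles).encode("utf-8"), SIM_ADDR)
    time.sleep(1 / 60)  # assumed ~60 Hz update rate
```

Sending self-describing JSON per frame keeps the physical and virtual sides loosely coupled, which suits a modular puppet whose joint count can change between sessions.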

Figure 1. The physical and virtual components of the kinematic puppet.

How does the user interact with the puppet?

In the demonstration, users engage with the system by physically manipulating the kinematic puppet to control a digital twin of the robot arm, guiding it through a virtual cutting task. As they direct the arm’s movements, a virtual cutting tool simulates material removal in real time.

The system provides continuous feedback through both visual displays and haptic responses, creating an immersive and intuitive experience. This interactive environment challenges participants to balance precision and speed, highlighting the importance of both accuracy and efficiency in robotic tasks.

By making the abstract process of programming robotic movements tangible, the kinematic puppet empowers users to experiment and learn in a dynamic environment.

Figure 2. James showing how the kinematic puppet works.

Demonstration at HRI 2025 – An experience for HDR students

Presenting the Kinematic Puppet at the Human-Robot Interaction Conference 2025 provided valuable insights into how our research resonates with the broader robotics community. Attendees were particularly drawn to the system’s modularity and reconfigurability and appreciated the puppetry-based approach as an intuitive method for exploring human-robot interaction concepts.

The demonstration wasn’t without challenges. Technical issues required some mildly frantic rebuilding of the code on the morning of the demo, highlighting a common research reality: experimental prototypes often accumulate small bugs through iterative development that compound unexpectedly. It’s an all-too-common challenge that reflects the messy nature of research, and one that isn’t always visible in polished publications.

Reviewer feedback highlighted potential applications we hadn’t considered, particularly around improving the accessibility of research technologies. While most attendees engaged enthusiastically with the concept, some struggled to connect it to their own work. It took time for me to find effective ways to explain the purpose and value of the approach—a good reminder that not every method resonates equally in a diverse field, and that it is important to tailor explanations to your audience, even within a given research community.

For an HDR student, this experience underscores the importance of exposing your work to the research community early. The value lies not in positive reception, but in the process of presenting the work itself. Getting to explain my work to others forced me to articulate and refine my thinking, an opportunity that is missed when work is conducted in isolation. These interactions helped me understand how my work fits within the broader landscape and sparked new reflections on its purpose and potential applications that I might have missed otherwise.

You can read more about this demo here: https://dl.acm.org/doi/10.5555/3721488.3721764

Dwyer, J. L., Johansen, S. S., Donovan, J., Rittenbruch, M., & Gomez, R. (2025). What Would Jim Henson Do? Roleplaying Human-Robot Collaborations Through Puppeteering. In Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction, Melbourne, Australia.

TL;DR

  1. Accessible Design: The kinematic puppet combines physical prototyping with virtual simulation for intuitive human-robot interaction design.
  2. Intuitive Feedback for Seamless Experience: Users control a customisable robot arm through hands-on manipulation while receiving real-time visual and haptic feedback. This novel approach supports Robot-Assisted Surgery design processes by enabling the intuitive exploration of human-robot interactions.
  3. Creative Inspiration: Inspired by film animation techniques and puppeteering, this low-cost, adaptable tool enables rapid prototyping and innovative experimentation in human-robot interaction research more broadly.
  4. Communicating Complex Research Concepts: Explaining novel work often requires tailoring explanations to diverse audiences. Even within a specialised community like HRI, individuals connect with ideas differently, and finding effective ways to articulate the purpose and value of novel methodological approaches is an ongoing challenge that improves with practice.
  5. Early Exposure of Research Work: Presenting research work to the community provides invaluable benefits beyond simply positive reception. The process of presenting forces the articulation and refinement of ideas, reveals how your work fits within the broader research landscape, and often uncovers applications and connections you might otherwise miss when working in isolation.

HRI 2025: A Successful Conference for our researchers!

The ACM/IEEE International Human-Robot Interaction Conference (#HRI2025) in Melbourne has wrapped up and our team had a fantastic time at the conference.

Special congratulations to our researchers who were awarded prizes for their submissions:
* Best Demo for “What Would Jim Henson Do? Roleplaying Human-Robot Collaborations Through Puppeteering”, by PhD researcher James Dwyer, Stine Johansen, Jared Donovan, Rafael Gomez & Markus Rittenbruch (QUT (Queensland University of Technology))
* Best Late-Breaking Report for “Redrawing Boundaries: Systemic Impacts of Rehabilitation Robots in Clinical Care Settings”, by visiting PhD researcher Amir Asadi, Damith Herath, Grant Shaw, Glenda Caldwell & Elizabeth Williams

It was also exciting to have several of our researchers present their work:
* PhD researcher Jasper Vermeulen and postdoctoral researcher Alan Burden led the in-person part of the Virtual, Augmented, and Mixed Reality for Human-Robot Interaction Workshop (VAM-HRI), organised by Selen Turkay, Maciej Woźniak, Gregory LeMasurier, Glenda Caldwell, Jasper Vermeulen, and Alan Burden
* PhD researcher Yuan Liu from QUT presented a paper in the workshop: “Augmented Reality for Human Decision Making and Human-Robot Collaboration: A Case Study in A Gasket Room in Manufacturing”
* Müge Belek Fialho Teixeira was part of the Cultural Robotics: Diversified Sustainable Practices workshop, organised by Belinda Dunstan, Jeffrey TKV Koh, Hooman Samani, and Müge Belek Fialho Teixeira
* Jasper Vermeulen presented a late-breaking report, “Investigating Human Factors in Mako-Assisted Total Knee Arthroplasty Surgeries”, by Jasper Vermeulen, Alan Burden, Glenda Caldwell, Müge Belek Fialho Teixeira, & Matthias Guertler
* Stine Johansen, Markus Rittenbruch & Jared Donovan presented their paper on Embodied Composition for Imagining Robotic Sound Space
* Fouad (Fred) Sukkar & Teresa Vidal Calleja shared their insights on enabling safe, active, and interactive human-robot collaboration via smooth distance fields
* Dong An & Markus Rittenbruch explored the design of extended reality-enabled tangible interaction to enhance collaborative robot interaction.
* The Swinburne team, led by Mats Isaksson, hosted a tour of the Intelligent Robotics Lab, giving attendees a chance to see some more robot demos.

Another highlight of the conference was the opportunity for our teams from University of Technology Sydney, Swinburne University of Technology, and QUT—along with friends, both old and new, from all over Australia and the world—to come together for a dinner.

The full conference proceedings can be found HERE.

We are already looking forward to #HRI2026 in Edinburgh!

Swinburne Secures AEA Ignite Grant for AI-Enhanced Teleoperated Echocardiography

Congratulations to our Swinburne University of Technology lead, Prof Mats Isaksson, and postdoc Mariadas Capsran Roshan on their successful Australian Research Council AEA Grant!

Our researchers have received significant funding through the AEA Ignite grant for their project titled “AI-Enhanced Haptically-Enabled Robot for Teleoperated Echocardiography”. The funding will support early-stage research commercialisation, allowing the team to advance their innovative robotic platform for remote cardiac imaging.

The AEA Ignite grants provide up to $500,000 over 12 months to assist researchers at Australian universities in developing proof-of-concept solutions in industry-relevant settings. This grant ensures that the team can continue their work in refining their AI-driven robotic system, which aims to enhance the accessibility and accuracy of echocardiography procedures.

The project is led by a team of researchers, including Mats Isaksson, Mariadas Capsran Roshan, Mauricio Hidalgo Florez, Hailing Zhou, and Gavin Lambert from Swinburne University of Technology; Adrian Pranata from RMIT University; and Tom Marwick and Leah Wright from the Baker Heart and Diabetes Institute. Their collaborative efforts aim to push the boundaries of medical robotics and improve patient outcomes through cutting-edge technology.

With this funding, the team can dedicate their focus entirely to advancing their robotic platform, bringing them closer to commercial viability and real-world application in the healthcare sector.

For more information on other funded projects, visit: AEA Ignite Funded Projects.

ARTICLE: Integrating Vision-Guided Cobots into Steel Manufacturing

A cobot equipped with a laser-mounted end-effector points at a detected short bar. A green dot marks the identified short bar, providing a clear visual cue for operators.

A demonstration of vision-guided collaborative robotics has shown what the future of automation in steel product manufacturing could look like. As part of the Australian Cobotics Centre’s (ACC) Biomimic Cobots Program, this research initiative fosters university-industry partnerships to drive technological advancements.  Researchers from the Robotics Institute at UTS and the Research Engineering Facility at QUT, in collaboration with InfraBuild, deployed a custom AI-based “shorts” detection system integrated with a collaborative robot (cobot) that aims to enhance safety and maintain quality control in an active production environment.

The industry partner, InfraBuild, operates a manufacturing process that involves producing hot steel bars in various shapes and sizes. Quality control is maintained through a manual process where workers, operating in 12-hour rotating shifts, identify and remove defective short-length bars, known as “shorts”, from a conveyor. This task is both physically demanding and requires continuous focus to reduce errors and ensure workplace safety.

A key requirement for the solution was that it integrate seamlessly into existing operations without necessitating extensive modifications to plant, equipment, or processes. However, due to the wide variety of products that InfraBuild manufactures, off-the-shelf automation solutions were not suitable for accurately identifying and removing every type of bar produced. Given these requirements, a vision system comprising various sensing modalities, paired with a cobot, was selected. This choice minimises disruption to InfraBuild’s current workflow, since cobots can operate safely alongside human workers without extensive guarding, and it offers the flexibility to revert to manual operation if needed.

A major milestone was achieved during the demonstration, which showcased the “shorts” detection and cobot bar-tracking system functioning in a live factory environment. During the demonstration, the AI-based “shorts” detection system detected short bars in real time. This information was communicated through a graphical user interface displaying live video streams from two cameras mounted on InfraBuild’s conveyor line. The interface also featured coloured indicators: dots marking the detected start and end of a bar, its corresponding length displayed in the centre, and the average length per run. If a short bar was detected, a red bounding box highlighted it and its length measurement changed from green to red, providing a clear visual cue for operators. The additional information provided for each production run offers valuable insights for InfraBuild’s quality assurance processes. Additionally, InfraBuild noted that the vision system alone was a valuable addition, as it would enable operators on the factory floor to identify and remove defective bars more quickly when necessary.
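As a rough illustration of the overlay logic described above, the sketch below colour-codes a detected bar with OpenCV; the length threshold, pixel-to-metre scale, and drawing details are assumptions for illustration, not InfraBuild’s deployed system.

```python
import cv2  # OpenCV, used here only to draw the overlay

MIN_LENGTH_M = 6.0       # hypothetical minimum acceptable bar length
METRES_PER_PIXEL = 0.01  # assumed calibration factor for the fixed camera

def annotate_bar(frame, start_px, end_px):
    """Draw start/end dots, the length label, and a red box for shorts."""
    length_m = abs(end_px[0] - start_px[0]) * METRES_PER_PIXEL
    is_short = length_m < MIN_LENGTH_M
    colour = (0, 0, 255) if is_short else (0, 255, 0)  # BGR: red vs green
    cv2.circle(frame, start_px, 6, colour, -1)
    cv2.circle(frame, end_px, 6, colour, -1)
    if is_short:
        # Highlight the whole bar with a bounding box, as in the GUI.
        cv2.rectangle(frame, (start_px[0], start_px[1] - 20),
                      (end_px[0], end_px[1] + 20), colour, 2)
    midpoint = ((start_px[0] + end_px[0]) // 2, start_px[1] - 10)
    cv2.putText(frame, f"{length_m:.2f} m", midpoint,
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, colour, 2)
    return frame, is_short
```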

By leveraging real-time detections from the vision system, the cobot dynamically adjusted its actions, indicating the bars identified as “shorts” by pointing at them. A laser mounted on the cobot’s end-effector highlighted these bars, allowing staff from the ACC and InfraBuild to clearly see the identified short bar. This milestone demonstrated the adaptability of vision-guided cobots, which, unlike traditional automation systems requiring structured environments, can respond dynamically to changing conditions in manufacturing processes.
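A simplified sketch of how a detection might be turned into a pointing target for the cobot is shown below; the camera-to-base transform, frame names, and stand-off height are hypothetical, and a real system would hand the resulting pose to the cobot’s motion planner.

```python
import numpy as np

# Assumed 4x4 homogeneous transform from the camera frame to the robot
# base frame, e.g. from an offline hand-eye calibration (identity here
# purely as a placeholder).
T_BASE_CAMERA = np.eye(4)

def camera_to_base(point_camera_m):
    """Map a 3D point from the camera frame into the robot base frame."""
    p = np.append(np.asarray(point_camera_m, dtype=float), 1.0)
    return (T_BASE_CAMERA @ p)[:3]

def pointing_target(short_bar_centre_camera_m, stand_off_m=0.3):
    """Place the laser above the detected short bar, aimed downwards."""
    target = camera_to_base(short_bar_centre_camera_m)
    target[2] += stand_off_m  # keep a safe stand-off above the hot bar
    return target  # a motion planner would drive the end-effector here
```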

This trial serves as a proof of concept for integrating robotic vision systems into InfraBuild’s broader production lines and offers valuable insights for other SME manufacturers looking to implement similar cobot-enabled automation solutions. By demonstrating the potential of vision-guided cobots, this initiative represents a step toward smarter, safer, and more flexible manufacturing systems. Showcasing a live cobot system in a factory was a first and a major milestone for the ACC, proving that it is possible to address challenging problems found in industry. This achievement provides insight into the commercial viability of such technologies, marking a step forward for InfraBuild as they move toward the next phase of development.

Graphical user interface of the AI-based “shorts” detection system. The top image displays a run with no short bars detected. In the bottom image, a short bar is identified, highlighted by a red bounding box, and its length measurement in the centre of the interface changes from green to red, providing a clear visual indicator.