POSTED: 28 Oct, 2024
Artificial Intelligence (AI) is appearing in many aspects of our life and work, and advancements are rapid and continuous. For most of us, it has been hard to keep up. Regulations designed to protect our way of life and conditions of work have also struggled to keep pace with AI's development, both in reducing the harm that can arise from its use and in ensuring Australia can capitalise on the possibilities that AI offers.
Recognising that Australia’s current regulatory environment has not kept pace with AI capability, and following extensive consultations, the Australian Government recently released proposed guardrails for the safe and responsible development and deployment of AI. These guardrails, which set out what constitutes ‘high-risk AI’, are put forward in the proposals paper titled Introducing mandatory guardrails for AI in high-risk settings, which can be found here.
The guardrails complement the previously released Voluntary AI Safety Standards and provide guidance to developers, organisations and individuals on how to build and use AI responsibly and safely. Unfortunately, like many technologies, even when created with the best of intentions, AI can be used in ways that are deliberately or inadvertently harmful, with negative consequences for individuals or society. For example, case studies and a substantial body of academic research have already demonstrated that AI can not only replicate existing biases but embed them in automated decisions, resulting in individuals being excluded or otherwise discriminated against on the basis of race or gender. This has significant implications, especially when AI is used to automate decisions that affect the lives or livelihoods of individuals.
One situation that has been explored in academic studies is the use of AI to automate recruitment shortlisting or hiring decisions. Research has shown that AI training data can contain pre-existing biases and that, without human oversight, these biases may exclude under-represented groups from the AI-compiled shortlist for a job. This has obvious implications for individuals’ or particular groups’ access to employment and an income, and it also has implications for diversity and the associated benefits of innovation, creativity and idea generation within organisations. Organisations may also experience more direct effects arising from the malicious use of AI to expose enterprise vulnerabilities, or as they are subjected to more sophisticated scams, fraud and cyber-security attacks.
Taking a risk-based approach to regulation, similar to that adopted by several US states and by the European Union in the EU AI Act 2024, the guardrails proposed in Australia focus on the development and deployment of AI in high-risk settings. While the Australian guardrails are still in development, the proposals paper provides a useful summary of high-risk settings identified in other countries. These include (among others):
- biometrics used to assess behaviour, mental state or emotions;
- AI systems used to determine access to education or employment (as in some automated recruitment systems);
- AI systems used to determine access to public assistance or benefits; and
- AI systems used as safety components in critical infrastructure.
Research currently being undertaken by Australian Cobotics Centre researchers suggests that some organisations in Australia are using AI for biometric identification, for recruitment, or in other ways that may be considered ‘high-risk’ under the use cases applied in other jurisdictions. It is therefore critical for Australian organisations to monitor the Australian Government’s Consultation Hub and its ongoing work on Artificial Intelligence to keep abreast of proposed regulatory changes, and to consider how any current or planned use of AI within their organisation aligns with principles for promoting the safe and responsible use of AI in Australia.