It was a full house at our Building Workforce Capability with AI event in Adelaide this week, where the atmosphere was charged with equal parts curiosity and excitement.
As seen in previous stops on this tour, the conversation in the care sector has shifted from "What is AI?" to a much more practical discussion on how to make it work for our people.
From the nuances of digital twin residents to the psychological safety of our frontline staff, here are ten key takeaways from the Adelaide speakers.
1. Focus on the Improvement, Not Just the Tech
Ausmed CEO Will Egan began by highlighting that technology should always serve a purpose rather than being the focus itself. He explained that successful organisations do not simply look at technology for its own sake, but rather consider what amazing improvements can be offered to people's lives.
By focusing on the experience rather than the tool, providers can better identify where AI truly adds value to care delivery.
"We must sit here and say 'What are some amazing experiences or improvements we want to give people in their life, and do we have the technologies available to enable that?'"
— Will Egan, CEO, Ausmed
2. Move from Administrative Minutes to Care Minutes
Craig Carter from ACH Group emphasised that the primary goal for aged care providers should be to convert time spent on paperwork into time spent with residents.
He argued that if tasks that do not add value are removed or automated, more minutes can be spent on the frontline. This shift allows the workforce to make a greater impact where it matters most, improving both the worker experience and customer satisfaction.
"Our focus is particularly on efficiencies, the people we do have, how do we best utilise them, and how do we turn every admin minute into a care minute, because that's where they make the greatest impact."
— Craig Carter, Executive Manager Transformation & Digital, ACH Group
3. Replace Tasks, Not Roles
The panellists were quick to debunk the idea of robot carers taking over entire professions.
Margeaux Bartholomew-Carle pointed out that while AI can handle repeatable and replaceable administrative tasks, it is fundamentally incapable of replacing clinical judgment or reasoning. This technology is a tool that evolves a role rather than eliminating it, allowing staff to focus on high-value care.
"AI doesn't replace clinicians, and AI doesn't replace clinical judgment, clinical reasoning and evidence-based practice. It does however replace the replaceable, repeatable tasks that make clinicians want to quit."
— Margeaux Bartholomew-Carle, Founder, Ardant
4. Treat AI Like an Intern, Not a Manager
To safely integrate AI into a clinical workflow, Craig Carter suggested a helpful mental model: view AI as a highly capable intern.
It can draft reports, suggest next steps, and gather information with incredible speed, but it should never be the one to sign off on a decision. The ultimate responsibility and clinical accountability must always remain with the human professional.
"I would just caution everyone to treat AI as more like an intern, just suggesting what the best thing would do, not the manager telling you what to do."
— Craig Carter, Executive Manager Transformation & Digital, ACH Group
5. Address "Shadow AI" with Clear Guardrails
One of the more concerning insights was the prevalence of "shadow AI": staff using personal, unsecured tools to complete work because official systems are too slow or restrictive.
Margeaux highlighted that without clear, documented guardrails, clinicians might upload sensitive data to public models on their own devices.
The solution is not a total ban, but providing safe, approved alternatives and clear instructions on permissible use.
"I've had a real big problem with shadow AI, which is basically when you haven't really put really strict guardrails in place, saying you can do this, only that you are not allowed to do that."
— Margeaux Bartholomew-Carle, Founder, Ardant
6. Bridge the Gap Between Completion and Comprehension
Will Egan shared data from early AI learning simulations showing that high completion rates in traditional training often mask low actual comprehension. By using AI to facilitate verbal practice, such as a staff member simulating a difficult conversation with a family member, organisations can identify where their workforce is actually struggling to apply their knowledge.
"The bad news is that, underlying the compliance rates that we see, there are much lower comprehension rates. It's much lower than you'd expect."
— Will Egan, CEO, Ausmed
7. Leverage AI for Proactive Decision Support
At the point of care, AI's greatest strength is its ability to turn reactive data into proactive insights. Craig Carter explained that while sensors for movement and pulse rates have existed for years, AI is the component that finally makes that data useful. It can prompt staff when a resident is at high risk of a fall, rather than just recording the event after it occurs.
"With AI, we can now do something with [the data] because whilst we know it's there and we can go check on it, it's not actually prompting us and it's not telling us what we need to do next, or telling us that there's a high likelihood that somebody's at risk of a fall for instance."
— Craig Carter, Executive Manager Transformation & Digital, ACH Group
8. Use AI for Ambient Knowledge Transfer
Handover is a critical yet vulnerable time in care settings, especially for agency staff who may arrive late and miss the briefing.
Craig shared a use case in which AI gathers information from various sources to provide a consistent handover to any worker, regardless of when their shift starts. This ensures that essential details regarding resident safety and overnight activity are never lost.
"What we're able to do now is to replay that handover, and it's actually the AI that gathers the information and says, this is what happened overnight."
— Craig Carter, Executive Manager Transformation & Digital, ACH Group
9. Digital Etiquette is the New Literacy
As client portals make organisational actions more visible to families, the concept of digital etiquette has become essential. Craig Carter noted that it is no longer just about knowing how to use a tool, but about understanding that digital delays or actions are now visible to the people being served.
This visibility means that digital responsiveness is now a direct part of the customer experience.
"Digital literacy and understanding exactly how to use the tools and what the tools mean is essential. But I think one we don't talk about enough is digital etiquette."
— Craig Carter, Executive Manager Transformation & Digital, ACH Group
10. Start with Low-Hanging Fruit to Build Trust
The consensus for providers just starting out was to avoid trying to change everything at once and instead focus on low-risk applications.
Margeaux Bartholomew-Carle recommended using AI for drafting social media posts or marketing content to build workforce confidence. By starting in areas where errors have low consequences, organisations can build the psychological safety needed to eventually tackle more complex clinical tasks.
"I go to the low risk - this is where I feel comfortable - applications of AI. The low hanging fruit, like drafting content and things like that."
— Margeaux Bartholomew-Carle, Founder, Ardant
In summary
The Adelaide event underscored a critical truth: the successful integration of AI in the care sector is a human-centric project, not a technological one. By focusing on practical applications that convert administrative time into care minutes, treating AI as a capable intern, and carefully implementing guardrails to manage 'shadow AI,' providers can empower their workforce and deliver demonstrably better care.
The conversation is clearly moving beyond theory to execution. Join us as we continue this essential discussion and explore local innovations at our upcoming events in Perth and Sydney. Find more details and register at ausmed.com.au/organisations.events.

