Artificial intelligence is edging closer to everyday use in Australian aged care and retirement living, with industry voices predicting a rapid expansion in robot companions, behaviour monitoring tools and pain management apps. Supporters say the technology could lift quality of life and help tackle loneliness, while critics argue the ethics and safety of AI use are still too murky, particularly without stronger rules.

The Federal Government is still working through how AI might be introduced safely and effectively. In July 2024, former aged care minister Anika Wells unveiled Australia’s first Aged Care and Digital Strategy, designed to modernise the system and backed by an AI framework. As her office stated at the time: “The focus is on preserving choice and leveraging technology to make in-person and face-to-face services more accessible and efficient,” with the Department of Health and Aged Care reviewing research, guidance and safety controls. The five-year plan also proposed a pilot program of up to 20 health professionals to test “the potential of AI in providing better information”.

The Neo robot by Californian company 1X was launched in November 2025 to handle chores for everyday users

With around 1.35 million Australians using some level of aged care, the appeal of tools that reduce paperwork and streamline tasks is obvious. Liana Donleavy from Aged Care Research and Industry Innovation Australia says AI already used in health settings can translate into “workforce efficiency”, helping frontline staff get back to hands-on support. Digital scribes, translation aids and tools that assist clinical assessments are among the common examples. “They’re trying to support the workforce to remove the burden of particular activities,” she said. “Removing those tasks then enables the workforce to spend more time with the residents and their clients.”

AI is also being linked with cameras and wearable devices to flag risks such as falls, spikes in pain, or changes in routine that may indicate a problem. Some providers build their own platforms, while others partner with tech firms.

For loneliness, some services are experimenting with AI-enabled robots that can offer company and prompt social interaction, intended to complement, not replace, professional care. Ageing Australia CEO Tom Symondson argues wellbeing is often underfunded compared to physical health. “We spend all of our time and money [on] keeping people physically well … We don’t spend nearly enough time or money on [their] wellbeing,” he said.

He points to the flow-on effects when people are isolated. “A lot of people end up needing residential care because they’re malnourished and depressed because living on their own is not what works for them – it’s not what works for most of us.

“If you can reduce that somehow, you have huge benefits … a life requires vibrancy and engagement and interaction, and so anything that can help with that is beneficial.”

Symondson says robots can help cover the social gaps when staff are stretched. “You don’t want this kind of technology taking away from the human carer,” he said. “But, if all of your carers are doing other tasks, there isn’t as much time to spend on social benefits.

“Some of the things that robots can do is they can walk down the corridor and can just check in on people while the human staff are busy, maybe helping someone with medication or feeding somebody or taking them to the shower.”

Two models reportedly in use in Australia are Abi and Daisy. Abi, developed by Australian start-up Andromeda, was created to “provide personalised companionship to residents in their rooms more often” and can converse across many languages while remembering previous interactions. Daisy, built by Singapore’s Dex-Lab with the Australian Nursing Home Foundation, is designed for older people living with dementia and can lead group activities and speak languages including Mandarin and Cantonese.

How quickly could this scale up? Symondson believes it could happen very fast: “in a flash”. “That’s what happened in Asian countries, in places like Japan,” he said. “They’ve gone, ‘We’ve got a workforce crisis, we don’t have enough people, so we have to do something different – we’ll invest in this.’”

The urgency is real. A 2023 State of the Nation report from Ending Loneliness Together found 39 per cent of Australians over 65 reported feeling lonely, with 16 per cent saying they felt that way often or always. Still, Monash University sociologist Pei-Chun Ko cautions that evidence is limited on whether AI delivers lasting relief from loneliness, and some older people may struggle to integrate the tools into daily life. “The interest is there … It’s an emerging topic in this field,” she said. “This is a mixed bag because loneliness is really about the quality of meaningful relationships.

“We need to understand that these technological adaptations are connected to the user’s experience.”

Safety and regulation are now central to the debate. The office of eSafety Commissioner Julie Inman Grant has raised concerns about AI chatbots, robots and apps that mimic human relationships, warning: “If these technologies are used in a caring capacity over time, there could also be legitimate concerns about social engineering and acting in Australian adults’ best interests.”

The office also said: “We are also concerned about risks relating to harmful or inappropriate content, manipulation, or over-reliance on these systems for emotional support.

“These risks reinforce the need for providers to take a Safety by Design approach, embedding strong safeguards, transparency measures and user protections into their products from the outset, rather than responding to harms after they occur.”

Ending Loneliness Together chair Michelle Lim also worries about products built outside care settings, where the underlying design priorities may not match real-world needs. “We don’t know who is behind the algorithms and who is behind the technology of these AI companions – and I’m pretty sure they’re not health or community workers,” she said. “If there’s something wrong or someone’s in trouble, how do you flag that?

“AI companies tell you what you want to hear; they don’t disagree with you, they don’t challenge you.

“The use of these companions will affect the way we think, the way we behave, it might even make some of our more unusual beliefs more concrete.”

Lim wants tighter oversight, especially when mental health is involved. “When you deal with emotions, and particularly if someone’s depressed, the risk goes up,” she said. “The technology, the frameworks which AI is based on – they’re not developed by psychologists, they’re not developed by healthcare workers or doctors, they’re developed by engineers.”

Ko also highlights privacy risks when sensitive personal data is collected and stored by tech companies rather than health professionals, and warns poor outcomes are possible without proper supervision. “There are some case studies that provide quite negative outcomes, meaning that if the AI robots are without any regulations or control or monitored by health professionals, the consequences might be very bad,” she said.

Donleavy argues many providers are trying to proceed carefully, focusing on whether technology is safe, accepted and genuinely useful. “There’s no point in providers implementing something that the end users aren’t going to want to use or like,” she said.

She supports standards, but cautions against making implementation impossible. “There definitely needs to be best practice guidelines and frameworks around the use of it … [but] if there are too many rules and regulations, I am not sure whether it would become too difficult for people to implement.

“If there is something that is low-risk that can be implemented, that’s actually going to support the way the workforce works, that improves quality of care for older people because it means the staff are there [and] freed up – why wouldn’t anybody implement that?”