An ANZSOG-funded research project is exploring the increasing use of robots in care services to replace or complement the roles of humans. In this article, researchers Helen Dickinson, Nicole Carey, Catherine Smith and Gemma Carey explore some of the long-term implications for governments from the rise of robots.
The rise in number of citizens needing government-provided care services and advances in technology make it inevitable that robots will play a far greater role in care services, including services most of us will access at some point in our lives (e.g. education and health) and those that only a small proportion of the population will access (e.g. disability services or prison).
Since at least the 1970s, many countries have experienced significant changes in relation to care services. Groups needing care services are growing in number, becoming older, experiencing greater levels of disability and chronic illness, and holding higher expectations about the quality of services that should be delivered. At the same time, care services are finding it increasingly difficult to recruit appropriate workforces.
Horizon-scanners and futurists have told us for some time that robots will play a larger part in our everyday lives and will replace some of us in our current jobs. For all the attention these predictions have gained in the media, many of us have not yet seen the dramatic changes promised. However, a combination of forces including technological development, pressure on governments to contain costs and rising public expectations means that we will likely see greater use of robots across many more facets of public services in the coming years. Our research examines the implications of this for the delivery of care services and the role that government should play in stewarding these innovations.
Robots are already here
Robots already have a number of applications in the provision of care services, broadly defined. Applications include manual tasks such as transporting goods, meals and linens (e.g. Robocart), conducting surgery (e.g. ZEUS), dispensing medication (e.g. CONSIS), checking on residents of residential homes and sensing for fall hazards (e.g. SAM), providing rehabilitation (e.g. Hand of Hope), serving as learning tools in the classroom (e.g. NAO, Pepper), acting as a virtual assistant for the National Disability Insurance Scheme (Nadia) and providing social interaction (e.g. Zorabot, PARO, Matilda).
Advances in artificial intelligence mean that many new care applications will take on more sophisticated roles, combining the execution of particular tasks with social functions in which these technologies learn about individuals from previous interactions. One of the first tasks of our research project is to develop a typology of robots in care services that can provide a way of differentiating between these different technologies and their functions.
Can machines really care?
Some of the developments in care robotics will undoubtedly drive efficiencies and improve some services and outcomes for those who use them. However, others may bring unanticipated or unintended consequences. As MIT Professor Sherry Turkle argues, we need to consider the human value of different care activities and whether an activity retains that value when it is carried out by a machine. There is a risk that if we do not suitably consider which tasks are being substituted by technology, we could inadvertently lose some of the value in the delivery system.
As an example of these issues, the greatest recent expansion of applications in aged care is in the social domain, seeking to reduce social isolation. Robots such as Matilda are being used to engage people with dementia through play, dancing, and making Skype calls to family members. Some of these robots have sensors that can detect aspects of individuals’ emotions and daily schedules, and use this data to interact with people in a way that is perceived as consistent with the act of caring. Other robots, such as ElliQ, serve monitoring, communication and well-being purposes, aiming to keep older people living independently for longer while maintaining engagement with their family and friends.
In these applications, we believe there is a need to investigate a number of factors in more detail. One facet of this is the implications of surveillance in private/public geographies of care. Although the ability to monitor people in their homes may seem a helpful development, what are the implications for privacy and security? Moreover, does surveillance equate to the care that would otherwise be provided in person?
There is a substantial literature arguing that care is a reciprocal activity, not simply something that is done to a person, so what might be lost if care is carried out by a machine? Additionally, we need to consider the embodied experience of touch and expression of care, and what the trade-offs are in safety and security for the cared-for in the different iterations of these arrangements.
Working to protect the rights of vulnerable groups
Many of these applications seem helpful ways to prevent social isolation in aged care and disability services, yet in other spaces there have been significant concerns expressed surrounding their application. In the US, similar technology that is being used in nursing homes to connect older people to families and friends has been rolled out to an estimated 600 prisons across the country, where in-person visits have either been significantly restricted or stopped entirely, in favor of video calls.
While the prisons cite security concerns, experts and the public alike have deemed the move inhumane and counter-productive. There are important differences between the prison and nursing home examples (although both constitute different forms of care). In the latter, family and friends do not just Skype but physically inhabit an avatar in the same room, and this is intended to supplement, not replace, face-to-face contact.
Yet there are also worrying similarities, in both public framing and recipient demographics. Both groups are psychologically and physically vulnerable, and prone to social exclusion. Both groups are likely to be in need of training or therapy programs which can be mediated digitally or in-person. And while both technologies are presented to the public as a way of increasing family connection, they’re sold to the purchasers (prison and nursing home administrators, or government departments) as cost-saving measures.
There might be nothing new in this, but it means there is an important balance to be maintained in stewarding these technologies: opening additional avenues for social inclusion and communication without reducing physical interaction or offering an excuse to multiply the barriers to it. This is where governments play an important role as stewards of technologies, developing guidelines, recommendations, and legal baselines. Our project will be a step in supporting this endeavor.
Authors and affiliations:
Helen Dickinson, Public Service Research Group, University of New South Wales, Canberra
Nicole Carey, Self-Organizing Systems Research Group, Harvard University
Catherine Smith, Youth Research Centre, University of Melbourne
Gemma Carey, Centre for Social Impact, University of New South Wales