Let’s get real about measuring service outcomes
We love evidence at Power to Persuade, and advocate for evidence-based approaches. But there are times when the definition of “evidence” can detract from effectiveness. In today’s post, Lanie Stockman, Good Shepherd Australia New Zealand’s Outcomes and Evaluation Specialist, argues for ensuring evidence collection remains manageable and respectful of programs and their clients.
In 2012 ACOSS – the Australian peak body for social service organisations – surveyed community organisations about their readiness to meet the obligations of taking on government-funded contracts. Respondents identified government contract requirements as the “single most significant area of regulation and reporting across the community sector.” These requirements place greater demands on a sector that already has limited resources. Moreover, extensive reporting requirements arguably yield little knowledge about the effectiveness of the sector.
At that time ACOSS urged, “[given the] understanding from government funders that community services are embedded in the communities they serve and their ability to identify and respond to community needs is a core value, it is important for any programme evaluation or performance indicators to be suitable for the type of service contracted”.
Government funders must, of course, be able to vouch for the efficient and effective outlay of public funds and manage the risks of outsourcing services. Yet despite government department assurances that ‘red tape’ for community organisations collecting service data would be reduced, there is still some way to go in evidencing service outcomes in practical and meaningful ways.
Yes, this is what data collection can look like…
Good Shepherd Australia New Zealand provides a number of services aimed at improving the safety, well-being and life opportunities of vulnerable children and their families. One such service – a supported playgroup for young mothers and their children run by a specialist facilitator – focuses on developing children’s social, emotional and physical skills through play, and enables the development of mothers’ social and support networks and parenting skills.
The playgroup uses the Abecedarian method, employing teaching and learning strategies that emphasise children’s language acquisition. This approach is backed by rigorous longitudinal research, running in the US since 1972.
The researchers found that mothers who used Abecedarian strategies with their children were more likely to undertake post-secondary education and experience better employment outcomes than those who did not. The children were more likely to experience educational benefits, better physical and mental health, and a reduction in risky behaviours by the time they reached early adulthood, compared to children who did not attend early childhood education using the Abecedarian approach.
Despite the rigorous findings underpinning the Abecedarian approach, the service is still required to demonstrate that the playgroup approach is ‘evidence based’: a concept appropriated from the medical field to denote programs that are ‘scientifically proven’ by experimental study designs such as randomised controlled trials. While the notion that evidence should inform services is axiomatic, concepts of ‘robust’ evidence are highly contested, as has been discussed elsewhere.
In the day-to-day reality of providing a playgroup one afternoon each week for young mothers and their children, practitioners are potentially drawn away from the core work of supporting children’s development and mothers’ social connections when data collection consumes the time allocated to providing the service. The required data sets include:
· Clients’ details including income source, level, payment and frequency;
· Playgroup attendance;
· Clients’ self-rating of changes as a result of attending the playgroup;
· Referrals of clients to specialist services;
· Case notes;
· Evidence of children’s improvements in verbal and non-verbal language;
· Evidence of mothers’ use of Abecedarian strategies at home; and
· Client satisfaction surveys.
While meaningfully adding to the research that underpins the service approach is important, it is unrealistic for a very small service to produce evidence meeting the standards of experimental study designs without significant additional resources. Moreover, such methodologies may not be the most suitable way of evidencing outcomes for young mothers and their children. There is the additional worry that providing this data becomes so onerous for playgroup participants that they cease participating, or don’t fully experience the potential benefits.
A new mantra: meaningful but feasible?
The Government’s key research body on family wellbeing – the Australian Institute of Family Studies – supports an approach that is less burdensome and more realistic about the outcomes of services such as this playgroup.
Previous evaluations of the scheme that funds the playgroup and similar services acknowledge the challenges of measuring outcomes as well as the gaps and inconsistencies in local level data.
Contributing to myriad data sets may meet funding requirements but may not achieve much more. Worse, it may actually detract from the effectiveness of the intervention. We must find feasible ways to access meaningful information that evidences what works (and what doesn’t) for children and families. Alternatives may include:
1. Undertaking (and using) well-designed, strategic and appropriately-funded service evaluations to allow a deeper examination of service impacts.
2. Setting service performance and accountability measures with respect to the real cost of data collection, such as through cost-benefit analyses.
3. Valuing anecdotes and observations of clients and staff to elicit views on changes – whether negative or positive.