This article first appeared in Interaction v26, 4/13 – Australian Institute on Intellectual and Developmental Disabilities

The Department of Families, Housing, Community Services and Indigenous Affairs (FaHCSIA) is currently examining ways in which long-term care and support can be provided to people with disabilities through an insurance approach. One such initiative is the Practical Design Fund (PDF), which identified practical ways to prepare people with disabilities, carers, family members, the disability sector workforce and disability sector organisations for the transition to DisabilityCare Australia, the national disability insurance scheme (formerly known as the NDIS).  Projects funded through the Practical Design Fund gave individuals and organisations the opportunity to contribute to the transition to self-directed and/or individualised funding, support people with disabilities to exercise choice and control, support the growth and skilling of the disability workforce, and assist disability organisations to transition into the NDIS environment.  Most projects funded through the PDF were completed by 30 April 2013.

One of the outcomes of a Practical Design Fund project[1] was to highlight the extent of monitoring activity routinely undertaken in self-directed / self-managed arrangements that people have managed for a number of years.  Monitoring activities commonly incorporate daily, weekly and monthly reviews of financial and staff management practices, and people involved in the project explained that they successfully manage these arrangements within existing quarterly funding agreements with a host service provider or a government department, or directly within their own company structure.

The focus of the project, however, was on exploring the ways in which family members and people with disabilities evaluate their self-directed / self-managed arrangements.  Evaluation serves a number of purposes, not least of which is providing essential information about the value of initiatives (whether programs, services, or individual responses such as those associated with self-directed / self-managed practices), about the extent of progress being made towards large or long-term goals and what is still needed, and about the impact of what is being done.   Evaluations also provide an opportunity to ask whether a process or program has achieved what it set out to achieve.  Examples of standard evaluation questions include:

  • Did we deliver what we said we were going to deliver?
  • How well are resources used?
  • What difference did it make?
  • Are we meeting the identified need?
  • How well did we meet expected outcomes?
  • What have we learned?
  • What impacts have we seen?
  • What can we take from this approach & use to inform future decisions?

Outcomes from interviews with people who have been self-directing / self-managing arrangements for some time indicate that while they undertake extensive monitoring practices, limited attention is given to evaluating these arrangements.   In fact, interview questions about evaluation of these arrangements invariably resulted in participants referring to this process as an audit, which is the prevalent method used by governments to monitor the activity and practices of funded organisations.   However, there are key differences between monitoring, evaluation and audit processes.  As Table 1 shows, audits tend to focus on compliance and provide the funding body with reassurance about the organisation’s accountability for inputs, activities and outputs.  Evaluations, on the other hand, identify lessons learned, focus on outcomes and overall goals, and are conducted on a periodic basis.

Table 1:  Comparing key features of audits, monitoring / review, and evaluation

Why?
  • Monitoring / review: Checks progress; informs decisions and remedial actions; updates plans; supports accountability.
  • Evaluation: Assesses progress and worth; identifies lessons learned and makes recommendations for longer-term planning and organisational learning; provides accountability.
  • Audit: Ensures compliance; provides assurance; provides accountability.

When?
  • Monitoring / review: Ongoing during the life of the program.
  • Evaluation: Periodic; can occur after a program has been completed.
  • Audit: According to (donor / funder) requirement.

Who?
  • Monitoring / review: Internal; involves key stakeholders involved in the program.
  • Evaluation: Can be internal or external to the organisation or service.
  • Audit: Typically external to the program; can be internal or external to the organisation / service.

Link to logical hierarchy
  • Monitoring / review: Focuses on inputs, activities, outputs and short-term outcomes.
  • Evaluation: Focuses on outcomes and overall goal/s.
  • Audit: Focuses on inputs, activities and outputs.

(Adapted from White & Wiles 2008)

When asked about evaluating their arrangements, a number of people stated there had been no specific interest from government or service providers in learning what has made these arrangements work, what is done to address the challenging times that invariably arise over the years, or what improvements these people could suggest for their particular arrangement or for self-directed / self-managed practices in general.  This represents a significant loss of information from people who have expertise in managing these forms of support arrangements.  At the same time, these participants expressed resistance to sharing any further information with government. This resistance was largely based on the poor treatment participants, particularly those living in the eastern Australian states, said they had received both in seeking to assume control of their lives and support arrangements and in the practical implementation and maintenance of these arrangements over time.   As the environment surrounding the initiatives undertaken by DisabilityCare Australia unfolds, there will be a corresponding requirement to move from a deficit-based and disempowering management model to a professional, empowering, respectful and proactive approach to managing these forms of arrangements.

The important role of including the lived experience of people with disabilities and family members or significant other people in the way forward for Australia’s disability sector was acknowledged in the recent National Disability Insurance Scheme update 44 (20.6.2013), which states that “the most important things we learn will come from the experiences of people with disability participating in the scheme, as well as their families and carers, service providers and community organisations”.  The value of incorporating the lived experience of people directly involved in managing these forms of arrangements cannot be emphasised enough.  While much of this learning will come from (amongst other sources) the yet-to-be-established self-directed arrangements in the launch sites, there is also an ideal opportunity for DisabilityCare Australia to develop a broader evaluation initiative that encompasses the voice and experience of people with disabilities, family members and significant other people involved in existing self-directed / self-managed arrangements, representatives of service provider organisations, peak bodies, and government representatives.

Research indicates that this approach requires a monitoring and evaluation plan that is ideally established during the planning stage of the overall evaluation process, and certainly before implementation commences. This includes (amongst other components) designing the methodology, identifying and managing actual and potential risks, and planning data collection.  One way this could be achieved is through a robust and comprehensive co-produced evaluation methodology.  Co-production refers to the delivery of public services in an equal and reciprocal relationship between professionals, people using services, family members, and significant other people (Boyle & Harris 2009), while also drawing learning from individual arrangements.   This means that people who have been managing their self-directed / self-managed arrangement for some time should be respected and recognised for their expertise in the development of an effective and meaningful evaluation methodology, rather than being disempowered yet again by having their opinion or voice disregarded within the process.  Moreover, incorporating a participatory methodology, increasingly viewed as ‘best practice’ in evaluation research (Fisher & Robinson 2010), would ensure that meaningful participation of a range of stakeholders occurs throughout the evaluation process.  This form of methodology encourages the lived experience of people with disabilities, family members and other stakeholders to be incorporated into the qualitative data collection process.
Similarly, a structure for integrating program learning, reflection and improvements arising from evaluation strategies would be strengthened by a broader educative role: assisting people with disabilities and family members to understand and appreciate the importance of evaluation within their particular arrangement. This means that all stakeholders (including those individuals who initially appear resistant to sharing their wisdom with other people, including the government) would need to understand the importance of a long-term focus on outcomes and impacts.  In addition, evaluation questions need to be seen as relevant and useful to the person with disabilities and/or family members and significant other people, as well as to government and non-government organisations.  As a result, the program can be more effective in the results it achieves, and can add value to the monitoring and evaluation of development programs whose results and achievements are not easily understood through quantitative indicators alone, but require the deeper insights of a qualitative, conceptualised story of the development process (Earl, Carden & Smutylo 2001).

Obtaining this higher-level, relevant feedback drawn from the lived experience and learning of these people would also allow information to be shared with people who are interested in these arrangements but have not yet commenced them, and would provide practical assistance to people who need further information to support their current practices.   This means that the agency (in this context, DisabilityCare Australia, the national disability insurance scheme) needs to start collecting performance information now, so that useful information is available to government and to the evaluators who will later want to assess the effectiveness and appropriateness of implementation and service delivery (Ryan 2003).  Moreover, such approaches would help ensure that policy linked to the implementation of ongoing strategies is no longer developed, both geographically and in consciousness, at a distance from the lived reality of people with disabilities and their families (Chenoweth 1997).

In conclusion, providing an opportunity to draw on the expertise of people who have been self-directing / self-managing for some time could go a long way towards informing not only DisabilityCare Australia’s overarching evaluation process, but also the Australian government and the wider community, about the successes and learning acquired from implementing and maintaining self-directed / self-managed practices over time.  Similarly, the opportunity to build on Australia’s knowledge base about what helps to make these arrangements work, and what can be done to minimise the likelihood of failure, will support the continual development of support responses for people with disabilities, family members or significant other people who may require additional assistance due to factors including (amongst others) age, cultural background, ethnicity, limited literacy and/or numeracy comprehension, or limited family supports for undertaking these arrangements.

 

References:

Boyle, D. & Harris, H. (2009). The Challenge of Co-production: how equal partnerships between professionals and the public are crucial to improving public services, nef/NESTA, London, UK.

Chenoweth, L. (1997). ‘Policy Skills for Human Services’, in O’Brien, P. & Murray, R. (eds), Human Services: Towards Partnership & Support, Dunmore Press, Annandale, Australia.

Earl, S., Carden, F. & Smutylo, T. (2001). The Challenges of Assessing Development Impacts, [online] www.idrc.ca/evaluation.

Fisher, K. & Robinson, S. (2010). Will policy makers hear my disability experience? How participatory research contributes to managing interest conflict in policy implementation, Social Policy & Society, 9 (2), 207-220.

National Disability Insurance Scheme update 44 [online] http://us6.campaign-archive2.com/?u=055092cc7e42efbfc41d80045&id=c7fddd259c&e=4976acac80

Rees, K. (2012). “It’s not just about the support: Exploring the ways in which family members and people with disabilities evaluate their self-directed / self-managed arrangements”, commissioned by the Practical Design Fund, FaHCSIA, Canberra, ACT. [online] www.gitana.com.au

Ryan, B. (2003). Death by evaluation? Reflections on monitoring and evaluation in Australia and New Zealand, Evaluation Journal of Australasia, 3 (1), 6-16.

White, G. & Wiles, P. (2008). Monitoring Templates for Humanitarian Organisations, commissioned by the European Commission Directorate-General for Humanitarian Aid (DG ECHO), p. 40.

[1] Rees, K. (2012). “It’s not just about the support: Exploring the ways in which family members and people with disabilities evaluate their self-directed / self-managed arrangements”, commissioned by the Practical Design Fund, FaHCSIA, Canberra, ACT.

 

From initial funding managed through fault-based compensation mechanisms and more recently through the receipt of government funds, Kathy Rees has established and maintained a successful self-directed arrangement for her daughter for the past thirteen years.  Kathy is committed to the reality of people with disabilities and family members being able to successfully manage their own funds and to have control of their own lives. While she continues to assist individuals and organisations where she can with the transition from standard service delivery to self-directed / self-managed arrangements, Kathy continues to research and explore ways to improve how these arrangements can be managed over a long period of time.  Kathy received funding through the Practical Design Fund to explore what people are doing to evaluate and maintain these arrangements over time.
