

Disability Equality Specialist Support Agency


Evaluation as a method of research

Evaluation as a method is useful for examining a policy or programme and making an assessment of it. According to Weiss:

Evaluation is a systematic assessment of the operation and/or the outcomes of a programme or policy, compared to a set of explicit or implicit standards, as a means of contributing to the improvement of the programme or policy.


The research team were aware that, in the case of an external review, many of those invited to become involved in the evaluation might feel they were being judged. It was therefore important to reassure staff and participants of the importance and value of the evaluation for the future implementation of the Steps to Mainstreaming Participation Programme in their centres. Murray et al. (1994) identify a number of resources needed to carry out an effective evaluation: financial resources, time, skills, availability of and access to information, and key informants.


Aim of the evaluation

The aim of this evaluation is to review and make recommendations on the overall Steps to Mainstreaming Participation Framework in relation to its potential to become a community development mainstreaming strategy, and to review the key elements of the Making Choices and Step Forward training programmes and the Mentoring Programme with ten participating Community Development Projects (CDPs) and Family Resource Centres (FRCs). In the course of the evaluation it was agreed that, due to its very small size, the evaluation of the Mentoring Programme would be discontinued.


To undertake this evaluation a number of key tasks were identified and incorporated:

  • Examine the impact of the Making Choices and Step Forward training programmes in terms of the first steps to engagement
  • Examine and identify the specific challenges for stakeholders in delivering the programmes, and make recommendations based on best practice
  • Examine and make recommendations on the overall content of the Steps to Mainstreaming Participation Framework, including its purpose as a community development mainstreaming strategy
  • Identify the resources, technical support and expertise required to maintain the ongoing development of the Framework
  • Examine and make recommendations regarding the optimum partnership element of this work
  • Compare and contrast the projects using a community development framework


Type of evaluation method employed in this research

The methods employed in this evaluation were:

  • Roundtable discussions with project participants, facilitators and staff in two regions
  • Phone interviews with individual members of project teams
  • Discussions with project promoters, funders and facilitators
  • Site visit to one project
  • Document examination


Particular emphasis was placed on evidence, opinions or data that would signal a need to adjust or alter the Programme in the light of practice.


Evaluation Design - strengths and weaknesses

The evaluation used a ‘look back’ technique to ascertain how the participants and staff of the projects recalled their experiences and attitudes at the time. The research team were aware that this technique carries a number of weaknesses: individuals may not remember specific details, may have selective recall, or may adjust their views with the benefit of hindsight.


The design of the evaluation presumed – wrongly, as it turned out – that local area-based projects would have maintained detailed records of participants, their backgrounds and the outcomes of their participation in the programme. The absence of such records proved a disadvantage in making a comprehensive assessment.


The time that had elapsed between the end of some projects’ activities and the commencement of the evaluation meant that not all key individuals were available for interview.


Roundtable discussions

To facilitate the evaluation, two roundtable discussions were organised, one in Galway city and one in Limerick city. Staff and up to three participants from CDPs and FRCs in or near these areas were invited to attend. Additionally, one programme in Raphoe, Co. Donegal was taken as a case study: staff and participants were met and interviewed in their Family Resource Centre, which hosts the Steps to Mainstreaming Participation Framework.


Three general open questions were used in the roundtable discussions with staff and participants. These were:

  1. Looking back, what were your expectations of the programme?

  2. Looking back, what was your experience of the programme?

  3. How did it turn out?


Two information notes inviting staff and participants were issued: one from DESSA and one describing the role of Ralaheen in the research (Appendix 2).


Telephone interviews were conducted with projects. A feedback form was given to those attending the meetings, requesting data on the numbers who initially expressed interest in the programme, the numbers who attended the programme, and the numbers who opted out of the programme before completion.


Tables 2 and 3 outline the projects invited to attend the roundtables.

Table 2 - Limerick Roundtable discussion 10th October, 2008

Venue: Maldron Hotel, Limerick City

Projects invited            Projects attended
Hospital FRC                Hospital FRC
West Limerick CDP           West Limerick CDP
Our Lady of Lourdes CDP     –
Southill CDP                Southill CDP
Southill FRC                Southill FRC

Participants who attended: 5
Staff who attended: 5

Source: Information provided on scheduling roundtable discussions and attendance at the meeting.


Table 3 - Galway Roundtable discussion 20th October, 2008

Venue: Marriott Hotel, Galway City

Projects invited            Projects attending
East Clare CDP              –
Ballybane CDP               Ballybane CDP
Aonad Resource Centre       Aonad Resource Centre
Na Calai CDP                Na Calai CDP
Cosgallen CDP               –
–                           Mentors Tuam
–                           Sligo Northside CDP
–                           Loughrea FRC

Participants who attended: 4
Staff who attended: 5
Support persons who attended: 3

Source: Information provided on scheduling roundtable discussions and attendance at the meeting.


Discussions with Programme promoters and funders

The evaluation team met with the Programme promoters and funders, with the evaluation Advisory Group, with the Board of DESSA, with the Manager of DESSA and with Programme facilitators. Two written progress notes on the evaluation were prepared for these meetings and oral presentations were made.


Analysis of the principles underpinning the programme

At the start of the evaluation, the team requested documents relating to the Programme from DESSA and as many as were available were sent to the research team for review and analysis. It was useful to review the background documents to the Programme in order to ascertain how it unfolded in each site.


The use of evidence in social initiatives

The review found very little use of social ‘evidence’ in the Programme. By evidence we mean statistics on disability, the prevalence of disability in geographical areas, benefit and allowance claims, informal accounts from beneficiaries of the Programme, and information from Local Partnership or Leader Groups, which have usually completed socio-economic profiles. For example, Census 2006 provides extremely important statistical evidence on people with disabilities by age group and small area of residence or district. This evidence is free and public, and a regular computer and Internet user can learn to download it with about an hour’s training. Evidence strengthens the sense of solidarity – ‘we are not alone’ – provides a stimulus to other potential partners, and generates legitimacy for co-funding applications.
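By way of illustration, the kind of local prevalence evidence described above can be produced with a very short script once area-level counts have been downloaded. The sketch below is purely illustrative: the area names and counts are invented placeholders, not actual Census 2006 figures.

```python
# Illustrative sketch only: the records below are hypothetical placeholders,
# not real Census data. It shows how downloaded small-area counts could be
# turned into local disability-prevalence evidence for a funding application.

# (area name, total population, persons reporting a disability)
records = [
    ("Area A", 1200, 132),
    ("Area B", 950, 87),
    ("Area C", 2100, 260),
]

def prevalence(total, with_disability):
    """Disability prevalence as a percentage of the area's population."""
    return round(100 * with_disability / total, 1)

for area, total, disabled in records:
    print(f"{area}: {prevalence(total, disabled)}% of {total} residents")
```

A project could substitute the figures for its own catchment area to show, for instance, that local prevalence exceeds the national average, strengthening the ‘we are not alone’ argument to partners and funders.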