An Evaluation of the Personal Development Partnership

This report reviews the management and outcomes of the Personal Development Partnership (PDP) programme.


Appendix 1: Methods

In this annex we outline the key elements of our approach to the evaluation, its design and its methods.

Overall Approach

Our approach was both innovative and practical. We used a pluralistic evaluation combined with contribution analysis (see Mayne, 2008). This provided a strong, multi-layered framework for the evaluation. It ensured that the evaluation worked with a theory of change agreed among participants and made explicit the links between inputs, outputs and outcomes. The combined approach was applied, for example, in workshops and in the on-going reviews of the development matrices and success criteria.

The PDP aims to enhance the lives of young people facing a range of problems and issues. The evaluation approach, and the many tools and processes used, were predicated on our recognition of the realities of a multi-organisation, multi-site partnership. Given the varying agendas and expectations surrounding such developments, the evaluation team had to be flexible, appreciate the wider context of issues, and draw on a range of research approaches and skills.

The Stated Aims and Objectives of the Evaluation

The aim was to map and explore the implementation, delivery and effectiveness of the PDP Project.

The objectives were to:

  • map and assess the process of setting up and implementing the PDP, including examining the cost and use of funding;
  • explore the effectiveness of partnership working between organisations and providers;
  • identify any emerging approaches and good practice demonstrated by providers and/or partnership organisations, along with any areas needing improvement;
  • examine young people's aims, experiences and reflections on their participation in programmes offered by providers;
  • determine the potential value and impact of project activities on young people's attitudes, social behaviours and subsequent engagement with education, training and employment;
  • facilitate exploration and understanding of the issues affecting the effectiveness of both individual providers and the wider Junction project;
  • disseminate findings to delivery partners throughout the life of the project to inform its on-going development and delivery.

The Evaluation Framework

We opted for a parallel framework that combined pluralistic evaluation with contribution analysis.

Pluralistic Evaluation Research Design

Pluralistic evaluation (see Smith and Cantley, 1985; Moss et al., 2008) takes account of the various stakeholders who may well be in general agreement with the project aims, objectives and outcomes but might still have to reconcile those with their own service parameters and political operating environment. It allows for variation in the notion of success for the project, teasing out the common core across the stakeholders as well as addressing the specifics of each stakeholder. Included in that are young people's and parents' criteria of what would make this a successful project. The success criteria can shift over time: some are met and dropped, while others come on board as a result of changes in the service, partnership or policy environment. The criteria are therefore dynamic and are kept under review. Coupling this with contribution analysis allows us to develop theories of change for each group, and to identify successful outcomes from varied perspectives.

Contribution Analysis

The revival of logic model approaches to planning evaluations (see Kaplan and Garrett, 2005; Morrison, 2009) has seen a recent refinement through the development of contribution analysis. This has at its core the question of attribution: 'to what extent are observed results due to programme activities rather than other factors?' (Mayne, 2008: 1). Contribution analysis is predicated on developing and clarifying the theory of change underpinning a service or policy and establishing chains of potential causality: what does the project set out to achieve, what steps will lead to these outcomes, and why do we think that? Developing a clear attribution chain enables the identification of suitable evidence for each stage of the project.

Contribution analysis also allows consideration of which elements of potential outcomes the project has direct control over, and which can only be achieved through direct or indirect influence. In this way we can isolate contributing factors and offer a sharper research design, including an understanding of the risks to achieving outcomes and how to minimise them. Contribution analysis is predicated on involving all of the partners in discussion about the theory of change, thus encouraging investment and ownership in the project aims and processes, as well as in the evaluation.

In summary, we asserted that using these two approaches in tandem allows the evaluation to recognise the political realities of the programme and the consequent range of agendas, alongside the theoretical underpinnings. This allows us to illuminate causality more readily. The approaches are not mutually exclusive or in conflict; they are complementary ways of looking at outcomes and outcome assessment: the pluralistic model elicits the success criteria that can be used to evaluate the project, while contribution analysis lets us unpick the steps to success and understand the theories of change at work in the project.

Contribution Analysis and Pluralistic Evaluation in Practice

As noted, contribution analysis (CA) involves developing and clarifying the theory of change underpinning a service and establishing chains of potential causality: what does the project set out to achieve, what steps will lead to these outcomes, and why? Three results chains were developed: one for young people, one for the main project workers (PDAs) and one for the partnership. The risks and assumptions for the results chains were assessed and are set out in Appendix 2, together with the monitoring criteria for the project that they provided. This report assesses the extent to which assumptions were correct and risks were mitigated, and provides an overall assessment of the contribution of the project.

Again, as noted above, pluralistic evaluation takes account of the various stakeholders who may well be in general agreement with the project aims, objectives and outcomes but might still have to reconcile those with their own service parameters and political operating environment. The stakeholders' success criteria helped to inform the CA assessment, and were also revisited by stakeholders at various points during the evaluation. The results from this assessment are integrated into the report and presented in Appendix 3.

Data Collection

At every stage consent was sought and documented. Ethical guidelines were adhered to and procedures were kept under constant review. All data was anonymised and pseudonyms were used.

Data was collected using mixed methods from staff, partners, referral agents, young people and their families, and from project materials and workshops. Methods and sources are presented in Table 1 below. A project database was also established, although it was not up and running until September 2012; it now provides statistical, assessment and outcome data on the young people involved in the project.

Table 1: Data Sources

Source of Evaluation Data | Numbers and Rounds | Total Different Interviewees | Total Interviews
Interviews with PDP Partners, Managers and Key External Partners | 1st round = 16; 2nd round = 11; 3rd round = 10 | 16 | 37
Interviews with Referral Agents | 1st round = 10; 2nd round = 45; 3rd round = 10 | 65 | 65
Bi-monthly Catch-ups with the PDP Staff | 12 x 3 PDAs, plus new PDAs x 2 | 5 | 38
Case Study Interviews | 6 young people, their parents, PDAs, referrers, activity providers and others | 33* | 27
The Views of Young People | 3 discussion groups, one in each area; one Scottish Power discussion group | 19* | –
PDP Evaluation Workshops | 4 workshops | – | –
PDP Development Workshops | 2 workshops | – | –
Statistical Analysis of PDP Quantitative Data | 5 overviews | – | –
Secondary Analysis of PDP Quarterly Reports; PDP Partner Data | 3 reports; various partner data | – | –
Secondary Analysis of Other Relevant Databases and Justice-related Data Sets (juvenile and adult) | 2 databases | – | –
PDP Database Development | 6 meetings | – | –
Evaluation Input into PDP Development Plans 2012 | 4 meetings | – | –

Combined Approaches and On-going Development of PDP

The evaluation team took a developmental approach to the evaluation, particularly as the team began work at the outset of the project. Findings were fed back to staff, managers and stakeholders in workshops throughout the project in order to maximise learning. This allowed the programme to adapt in response, which is particularly noticeable in the development of its management structures. It also allowed for respondent validation of the evaluation findings, following MacPherson and Williams (1990).

The evaluation team also intended to involve young people directly in the evaluation, and a 'big brother room' with a video camera was set up at each site for young people to record feedback. These rooms relied on the project workers at the sites encouraging young people to use them. No feedback was recorded in this way and, as a result, the idea was abandoned.

An overview of the data types and sources is presented in Table 2 below.

Table 2: Data overview

Type | Description | Sources
Administrative data | Referrals, case-level data, exit data, young people's reviews, staff time and costs | Database; staff time monitoring exercise
Operational and partnership data | Service implementation | Staff interviews; observation of partnership meetings; staff exit interviews
Client-level data | Perspectives of young people and families; journeys through the PDP; outcomes | Interviews with young people, families and referrers; focus groups with young people; database

Analysis

Data was collected by specific members of the project team, drawing on existing expertise, and analysed thematically. Analysis was conducted regularly, in particular to feed into the workshops. Both contribution analysis and pluralistic evaluation generated categories and themes for the analysis.

This helped to draw together the various datasets: qualitative and quantitative, client and service, partner and service, descriptive and explanatory, statistical and non-statistical, theories of change and wider notions of success. In consultation with the evaluation commissioners, we set up a series of vertical and horizontal analytic zones under which to capture the aims and objectives; these are listed in Table 3 below.

Table 3: Analytic Zones

Vertical analytic zones: service pathways; role of the PDAs/PDC; role of the YPDA; role of the MCMC; role of the activity providers; young people and their families; partnership working.

Horizontal analytic zones: implementation; results chains; success criteria; costs; lessons.

References

Kaplan, S. and Garrett, K. (2005) 'The Use of Logic Models by Community-Based Initiatives', Evaluation and Program Planning, 28: 167-172.

MacPherson, I. and Williams, P. (1990) '"Not Quite What I Meant!" The Use of Respondent Validation', Research Policy and Planning, 10(1): 10-13.

Mayne, J. (2008) Contribution Analysis: An Approach to Exploring Cause and Effect. ILAC Methodological Brief 16. Available at: http://www.cgiar-ilac.org/files/publications/briefs/ILAC_Brief16_Contribution_Analysis.pdf

Morrison, A. (2009) Contribution Analysis: An Introduction. Presentation to the Scottish Government. Edinburgh: Scottish Government.

Moss, C., Walsh, K., Jordan, Z. and MacDonald, L. (2008) 'The Impact of Practice Development in an Emergency Department: A Pluralistic Evaluation', Practice Development in Health Care, 7(2): 93-107.

Smith, G. and Cantley, C. (1985) Assessing Health Care: A Study in Organisational Evaluation. Milton Keynes: Open University Press.

Contact

Email: Ban Cavanagh
