The Report

Section 2 - Process Lessons

2.3 Setting Targets and Measuring Performance

The performance targets set within the sample fall largely within two categories:

  • output and expenditure targets on which SRB funding depends; and

  • scheme lifetime targets derived from baseline indicators, as illustrated in the example from Barnsley below.

Good Practice Example

Barnsley's Highway to Success - Performance Targets

Scheme lifetime targets include:

  • to improve take-up of further education among school-leavers from 47 per cent to 62 per cent;
  • to reduce the percentage of students dropping out in the first year from 17 per cent to 10 per cent;
  • to increase the number of schools offering GNVQ from 3 to 13 (out of 14);
  • to increase the skill levels of Barnsley managers:
      • to increase those with qualifications at NVQ level 3 from 55 per cent to 61 per cent;
      • to increase those with qualifications at NVQ level 4 from 42 per cent to 46 per cent.

Both sets of indicators are necessary but also have their limitations, particularly as tools of project or scheme management. SRB output definitions largely relate to throughput or input measures (e.g. 'number of training weeks') or are imprecise (e.g. 'number of pupils benefiting from projects to promote personal and social development').

Scheme lifetime impact measures are more likely to relate closely to learning objectives. However, there are a number of factors to take into account.

  • By the time the evidence is collected, it may be too late to adjust the programme (though this can be eased by interim evaluations).

  • A wide range of external factors invariably has an impact - for example, on the skill levels of managers or on college staying-on rates - making it difficult to disentangle the scheme's impact from all the other influences.

  • In some cases the scheme may simply be incapable of delivering the impacts identified.

  • There is at best an indirect relationship between improvements in these high level impact measures and specific projects. It is rarely possible to identify particular individuals responsible for achieving a given impact indicator.

At the time of the research for this study, only a few partnerships had identified measurable performance indicators (beyond SRB outputs) that could be related to project performance. Examples include:
  • Looked After Children, which tracks changes in the exam results of its highly specific client group relative to the county average;

  • White City Partnership's CASE (Raising Achievement in Primary Schools), where the project team undertook its own baseline testing of the pupils to ensure consistency across the ten participating schools;

  • A self-assessment of the impact of Investment in Excellence by participating teachers; and

  • Hattersley's Early Years Project, which has developed a 'Co-ordinated Individual Record System' (CIRS), designed to co-ordinate and update information held separately by Education, Social Services and the Health Authority.

Most of the schemes have plans for formal mid-term evaluations, although many are unclear about the approach to adopt. When the research for this study was undertaken, only one external evaluation was available (for the Co. Durham Partnership Scheme for Young People). It relied largely on a combination of output review and key actor interviews to assess perceptions of performance. Such interviews are critical: the views of those most closely involved are most likely to illuminate how results were achieved (or impeded), as well as what those results were. Nevertheless, there is scope to improve significantly the understanding of, and approach to, target setting and monitoring as a tool of project management.

Key Lessons

Much work remains to improve regeneration partnerships' approach to the design and use of performance indicators. Indicators need to be readily measurable during the scheme's lifetime, to provide a guide to progress towards the scheme's ultimate objectives. They also need to go beyond the SRB outputs and their concentration on throughput measures. This is discussed further in Section 4.
