
Pupil premium reviews: using system leaders

Categories: Pupil premium review, School-led system

John Dunford

Sir John Dunford discusses the value of using system leaders to review your use of the pupil premium.

Narrowing the gap between pupil premium pupils and their peers is not an easy task, but a substantial number of schools have shown that it can be done: the attainment of pupil premium-eligible pupils can be raised to the national average for non-pupil premium pupils and beyond. Doing so requires a rigorous approach to spending pupil premium funding, and every school should evaluate its policies regularly.

This rigorous evaluation must be carried out within schools by the leadership team and governing body. It may also be beneficial – even if the school is doing quite well for pupil premium-eligible pupils – to commission an external review from time to time. Using system leaders like national leaders of education to provide an objective assessment can help you to ensure that the funding is making as much impact as possible.

The review

The purpose of a pupil premium review is to use an evidence-based approach to assess how much impact a school is making when spending its pupil premium, and how it might increase its effectiveness. When it comes to the pupil premium, all schools should be using proven intervention strategies rather than simply doing more of what they've always done. Trying something different which is known to be effective, rather than staying with well-established approaches that are comfortable, is a key principle in effective pupil premium use.

It’s important that the review be led by an experienced, independent system leader with a good track record in improving outcomes for disadvantaged pupils. That way you get a fresh, objective view from someone who really knows what works.

To support you, NCTL offers an online directory of reviewers where you can find system leaders it has designated as expert reviewers. All of these reviewers have met stringent achievement criteria with their disadvantaged pupils, and are used to working with other schools as headteachers of teaching schools or as national or local leaders of education.

I recently worked with the Teaching Schools Council and NCTL to produce a guide to effective pupil premium reviews, which offers advice on the commissioning and content of reviews. It provides a rigorous and tested framework for school and system leaders to use to make the most of their review. You might also find the following examples useful in giving you an idea of the review process.

Holbrook Primary School, Coventry - Michelle Harris and Dawn Lama

Ofsted recommended a pupil premium review for our school because the good progress of disadvantaged pupils in some year groups wasn't consistent across the school. We found the review to be a very positive experience that helped to move the school forward.

Holbrook Primary is a larger-than-average primary school where pupils speak 39 different first languages. We have an above-average number of pupils attracting the pupil premium. Our local authority put us in touch with an NLE (Michele Marr from Caludon Castle School) and an HMI who had not been involved with the inspection, and we invited them to carry out the review as a team.

The reviewers looked at our data and visited the school to talk to staff and pupils. They made a series of recommendations based on their own experience and evidence of what has been shown to work well in other schools. We created an action plan that has resulted in innovation and changes in emphasis. Data analysis has been intensified, and it focuses on two new questions:

  1. Is the attainment gap closing?
  2. If not, why not?

The senior leadership team was restructured to create an additional assistant headteacher with specific responsibility for the pupil premium. Year leaders have new powers and new responsibilities for the progress and outcomes of disadvantaged pupils in their care.

The action plan suggested a new focus on reading, including extending learning hours before and after school. Teachers plan the activities and work with teaching assistants specifically trained in supported and guided reading who deliver the programme. We've seen a rapid improvement in pupils’ reading which has laid the foundation for further progress across the curriculum.

Birches Head Academy, Stoke on Trent - Janet Hetherington and Karen Healey

Birches Head Academy is a secondary school with a well above average proportion of students attracting the pupil premium.

We used the NCTL reviewer directory and found an NLE, Margaret Yates, whom we knew had the relevant expertise, and she agreed to carry out the review.

After a full day discussing the experiences of pupils, teachers and leaders, Margaret created a report that acknowledged what had already been accomplished, and offered a consistent set of improvement recommendations for us to work into our strategy and practice. These included:

  1. empowering middle leaders, senior leaders and governors in new ways
  2. changing the way data was monitored and used
  3. adding attendance and behaviour to the consideration of pupil outcomes
  4. a fresh focus on progress, especially in maths and English

Margaret visited the school again after two months to see how the new strategy was working out. It has had a positive impact on the objectives we're pursuing with pupil premium funding in English and maths, and has brought improvements for target groups in attendance, behaviour and engagement.

A successful review rests on each school making it work for them: having clear objectives, ensuring the report addresses the school context, and working in collaboration with the reviewer all make it a positive exercise.


I hope these examples and the information above help you to evaluate the effectiveness of your pupil premium strategies, both internally and, where you consider it beneficial, from an external perspective.

Good luck!

If you have any questions or feedback, please comment below. To keep up to date with this blog, you can sign up for email updates or follow NCTL on Twitter.

For more information or opportunities to get involved with our work, visit our pages on GOV.UK.


1 comment

  1. Comment by Neil Donkin posted on

    As a system leader conducting a review, or a school commissioning one, I recommend that you look carefully at your RAISE data to establish what the real gap is. Firstly, consider your disadvantaged scatter graph for 2014 and your FSM scatter graphs for earlier years. You are highly likely to find that, once the prior attainment shown by the "expected score" is taken into account, the overwhelming majority of students' actual scores relate to their expected scores in the same way, whether they are in the premium (DfE disadvantaged) group or the others group. Also, the progress of the overwhelming majority of students in the premium group and in the others group is very similar when measured by the overall aggregation of the subjects which count in the value added measure. In addition, the outliers on the scatter graph are likely to include pupils from both groups.

    Secondly, look carefully at the prior attainment of pupils in the two groups. For most schools, that of the premium group is lower on average points score, as it is when measured by the prior attainment bands. Your RAISE summary document indicates that nationally, the lower the prior attainment the lower the percentage chance of conversions and that in constructing value added estimates, the lower the prior attainment the lower the estimate. Finally, RAISE demonstrates that attainment of the high band is greater than that of the middle band and that of the middle band greater than that of the low band.

    It is dishonest not to take a more detailed analysis into account if you are reviewing the difference. In addition, though complex, to make sense of a comparison over time, it is necessary to consider the relative difference of the prior attainment, as well as the relative differences in the various measures which are shown in RAISE.
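The comparison the comment describes can be illustrated in miniature: control for prior attainment by working with the residual (actual score minus expected score) for each pupil, then compare the average residual of the premium group with that of the others group. The data and column layout below are invented for illustration and are not the real RAISE format.

```python
# Illustrative sketch: compare actual vs expected scores for the
# premium and others groups, controlling for prior attainment.
# The pupil records here are made up for the example.
import statistics

# Each tuple: (expected_score, actual_score, in_premium_group)
pupils = [
    (28.0, 29.0, True),
    (30.0, 31.0, False),
    (26.0, 26.5, True),
    (32.0, 31.0, False),
    (27.0, 28.0, True),
    (31.0, 32.0, False),
]

def mean_residual(group):
    """Average of (actual - expected) for a group of pupil records."""
    return statistics.mean(actual - expected for expected, actual, _ in group)

premium = [p for p in pupils if p[2]]
others = [p for p in pupils if not p[2]]

print(f"premium group mean residual: {mean_residual(premium):+.2f}")
print(f"others group mean residual:  {mean_residual(others):+.2f}")
```

If the two mean residuals are close, the groups are making similar progress relative to their prior attainment, and a raw attainment gap mostly reflects differences in starting points rather than differences in progress.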

