The recently published 2014 post-16 performance tables show that A-level students at Newham Sixth Form College (NewVIc) improved their performance faster than the national average for the second year running. They also show that our advanced vocational students performed better than the national average for the second year running. We’re obviously delighted, but is this part of a long term trend?
NewVIc is a large provider with substantial numbers of both A-level and vocational students, and national post-16 performance tables have been published for the last 21 years, so it should be easy to get robust data. Unfortunately the tables do not give us 21 years of comparable measures.
To be useful and comprehensible, performance tables should measure the things we value, show them clearly and help us to see the trends we are most interested in. This means having measures which are stable and comparable over a period of years. When the tables become too complex or too many changes are made to what is being measured from year to year, the danger is that they can lose their value and we can lose confidence in them.
England’s annual post-16 tables have seen their fair share of changes in the 21 years they have been published and it is hard to find any data which have been recorded in a stable enough way to get a 21 year trend.
The points system has changed at least twice in that time, although this can be overcome by converting a provider’s score to a percentage of the national average score for that year. The big change has been the combining and then separating of A-level and vocational performance. In the first four years (1994-1997), only A-level points were recorded; vocational achievement was measured as a % pass rate. In 1998, a single new combined advanced measure was introduced for A-level and vocational performance. From 2002, separate A-level and vocational point scores were no longer published and for 10 years only the combined measure was available. A separate A-level measure was reintroduced in 2012, followed by a separate vocational measure in 2013, and the combined measure was withdrawn in that year.
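The conversion mentioned above can be sketched in a few lines of Python. The point scores and national averages below are hypothetical, chosen only to illustrate the arithmetic, not taken from the published tables:

```python
def pct_of_national(score, national_average):
    """Express a provider's point score as a % of that year's national average."""
    return 100 * score / national_average

# Before a points-system change: a hypothetical 180 points against a
# national average of 200 puts the provider at 90% of the national average.
early = pct_of_national(180, 200)

# After the change the scale differs, but a hypothetical 270 points
# against a national average of 300 is the same relative performance,
# so the two years remain comparable.
later = pct_of_national(270, 300)
```

Because each year is normalised against its own national average, scores from different points systems can sit side by side in one trend line.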
These changes reflect the changing views about the equivalence of the two types of qualification. The current approach regards them as so different that to combine scores would be an ‘apples and pears’ exercise. However, this issue was also addressed through a review of vocational point scores which led to downgraded values for vocational distinctions, merits and passes.
In the case of our college, we have 8 years of A-level point scores from 1994 to 2001, which show NewVIc students consistently achieving around 70% of the English average points. We then have no data for 10 years, followed by 76%, 81% and 85% of the national average points per student for 2012-14. We can therefore claim to have improved faster than the national average since 2001, although there’s a big gap in the middle of the story.
For vocational courses, NewVIc students were regularly performing around the national average with point scores averaging 100% up to 2001. Since the ‘big gap’ we have seen two years of 112% vocational point scores – well above the national average.
So what happened in the 10 year ‘big gap’? In our case, the combined advanced point score rose fairly steadily from 66% in 2002 to 88% in 2011. Again, a faster than average improvement over a long period.
So, despite the shortcomings we can try to establish longer term relative trends. To help sixth form providers with both A-level and vocational provision convert their ‘separated’ data back to ‘combined’ data and establish their long term performance trends, I offer a very simple methodology* which allows them to bring the 11 ‘lost’ years (2002 to 2012) into their run of data. See below for the guidance.
In NewVIc’s case, this re-combined data suggests that our advanced students taken as a whole have continued the steady trend of improvement and are now achieving 98% (2013) and 99% (2014) of the national average points per student – our best results ever, despite a cohort with lower-than-average GCSE point scores on entry.
The other concern about these tables is that unlike their key stage 4 equivalents they don’t represent the performance of a whole age cohort, only of those who are in education and taking advanced qualifications. Level 2 vocational qualifications put in a brief appearance early on and are set to return. But with the raising of the participation age we should now be able to compare the performance of a whole age cohort of students at all levels, area by area rather than simply provider by provider – some providers are very selective and keep out a large proportion of the cohort.
We celebrate the improvement in our students’ exam scores and we want the best for all our students but we also need to remember that exam performance is only one dimension of educational success and keep it in perspective.
So let’s keep analysing the data we have while making the case for simpler and more comprehensive tables that tell us what we need to know to help make the system more successful for all young people.
*How to calculate combined advanced points scores for any particular year or institution from data in the performance tables.
A: number of A-level students (institution)
B: number of vocational students (institution)
C: institution’s average A-level point score per student (the same method applies to points per entry)
D: institution’s average vocational point score per student
E: national average A-level point score per student
F: national average vocational point score per student
Institutional combined score: [(A x C) + (B x D)] / (A + B)
Weighted national average for comparison: [(A x E) + (B x F)] / (A + B)
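Both formulas are the same weighted average, applied once to the institution’s scores and once to the national averages. A minimal Python sketch follows; the student numbers and point scores are hypothetical, purely to show the calculation:

```python
def combined_score(a_students, b_students, a_points, b_points):
    """Weighted average points per student across the two cohorts:
    [(A x C) + (B x D)] / (A + B)."""
    return (a_students * a_points + b_students * b_points) / (a_students + b_students)

# Hypothetical institution: A = 500 A-level students averaging C = 210 points,
# B = 300 vocational students averaging D = 230 points.
A, B, C, D = 500, 300, 210.0, 230.0
# Hypothetical national averages E and F.
E, F = 215.0, 220.0

institution = combined_score(A, B, C, D)  # institutional combined score
national = combined_score(A, B, E, F)     # national average weighted by this cohort mix
relative = 100 * institution / national   # institution as a % of the national average
```

Weighting the national averages by the institution’s own A-level/vocational mix means the comparison reflects the provider’s actual cohort rather than the national balance of the two routes.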