One of the biggest claims against the EMA is that of ‘deadweight’. That is, whilst EMA might persuade some kids who wouldn’t have otherwise stayed on to do so, an awful lot of kids who get it would have stayed on without it. Therefore, a large chunk of it is wasted.
Anecdotally, this completely chimes with my experience. I would guess about 90% of the kids I have taught who received EMA would have stayed on without it. Some research does bear me out – in a rigorous survey commissioned by the last government and published by this one, the National Foundation for Educational Research suggest that 88% of pupils receiving EMA would stay on without it.
On their website and in the Guardian, the NUS reject that survey for methodological reasons, some of which I believe do have some foundation. They also cite their own research which shows only 45% of pupils would be able to continue without it. However, I cannot find the methodology or the provenance of this research. All I can find is a page on their website that suggests all they did was set up an online survey and ask students to respond. I hardly think the way to respond to a survey you think has methodological flaws is to conduct one with many more methodological flaws. I am therefore going to ignore this research.
They do however cite two respected academic papers as backing up their point, one from the IFS and one from the CfBT education trust. Based on these, they make this semi-literate point attacking the decision to axe EMA:
To make such a far-reaching decision based on one survey, particularly when all other research - including that by the Institute for Fiscal Studies, which found EMA increased participation by up to 7.3 per cent - is astonishingly irresponsible.
Let's have a look at this other research then. The IFS did indeed find that EMA increased participation by up to 7.3 percentage points. That 'up to' is quite misleading: from what I can see, the IFS say that EMA increased participation by 5.5 percentage points for males and 7.3 percentage points for females, which suggests the overall figure sits somewhere between those two numbers. Moreover, the IFS analysed the participation rates in two different ways, and the other method gave an improvement of nothing for males and 2 percentage points for females. Anyway, I'll take the most generous figure of 7.3 percentage points because I am generous, and even if you do this, then far from contradicting the NFER, this almost completely reinforces its point about deadweight. 43% of all 16-17 year old students get EMA. Reduce that by 7.3 percentage points and it leaves 35.7 percentage points of kids who are getting EMA but who would presumably still have been at school. Note that I have stressed percentage points. A little stats lesson here for the NUS: 35.7 percentage points is NOT the same as 35.7%. Allow me to explain with an example:
We have 100 sixth form students. 35.7 get EMA but would be at sixth form anyway. 7.3 get EMA and wouldn't be in the sixth form without it. The other 57 are at sixth form and don't get the cash. Of the 43 kids who get EMA, only 7.3 are actually being incentivised by it to stay on at sixth form. The other 35.7 would be there anyway. Thus, the deadweight cost is 35.7/43, or 83% - almost exactly the same as my anecdotal guess and the more rigorous NFER survey! That is, the NUS cite a paper and a statistic that entirely support the theory they are trying to knock down! And that's with me selecting the data that most supports their case - if you take the 5.5 percentage point figure that the IFS give as the male improvement rate, or if you take their figures from the other type of analysis they did, then the deadweight problem looks even worse.
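If you want to check the arithmetic yourself, here is a minimal sketch of it, using the figures quoted above (the 43% take-up rate and the IFS's most generous 7.3 percentage point estimate):

```python
# Figures quoted in the post (both in percentage points of the cohort):
receiving_ema = 43.0   # share of 16-17 year old students who get EMA
incentivised = 7.3     # share who stay on only because of EMA (IFS, most generous)

# Deadweight: recipients who would have stayed on anyway.
deadweight_pp = receiving_ema - incentivised            # in percentage points
deadweight_pct = deadweight_pp / receiving_ema * 100    # as a share of recipients

print(f"{deadweight_pp:.1f} percentage points of deadweight")   # 35.7
print(f"{deadweight_pct:.0f}% of EMA recipients are deadweight")  # 83%
```

The point of the last line is exactly the percentage-points-versus-per-cent distinction: 35.7 percentage points of the whole cohort is 83 per cent of the group actually receiving the allowance.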
As if that were not hilarious enough, the NUS also quote from another rigorous survey of EMA which defends its deadweight costs. I disagree with this paper, but it is certainly numerically correct. It defends EMA on the deadweight charge not by pretending deadweight doesn’t exist, but by saying that there is a lot of deadweight in a lot of other policy areas. This appears to me a very weak argument, but I will not get into that here. The point is that again, a paper the NUS cites as refuting the deadweight charge actually agrees that there is significant deadweight. Even more amusingly, the NUS manage to completely mis-cite it – either deliberately or because they are innumerate. The paper actually says this:
Unlike most other areas of policy however, the level of deadweight in EMA allowances can be calculated (which may be the reason why it is the focus of attention). The most recent figures show that 43% of full-time students aged 17-18 receive EMAs. Taking an average of recent evaluations EMA may have increased participation by up to 7 percentage points, leaving some 36 percentage points as deadweight. Although probably better than many other areas of policy this may still be thought too high.
You’ll notice that this says 36 percentage points – as I have shown above, in this case those 36 percentage points equate to roughly 83 per cent. But how is this statistic reported in the NUS briefing paper? Yep, you guessed it:
the work carried out by Mick Fletcher, a leading authority on EMAs, where he found that the 'deadweight cost' is about 36 per cent.
What are the chances of kids who can’t spell incentive being able to work this out? Remote, I would say.
I wonder if the NUS are being ‘astonishingly irresponsible’, or simply innumerate.
I despair, I really do.