How to assess the effectiveness of the government’s Prevent counter-extremism programme

In the aftermath of the Manchester bombing, many questions were raised regarding the effectiveness of the government’s counter-extremism programmes, with particular focus on the Prevent arm of that strategy. For example, the FT reported that Salman Abedi had been referred to Prevent but that the referral was not followed up (although note that the police state that they cannot find any record of such a referral). More generally, the Prevent programme has been called “toxic” by Andy Burnham, now Mayor of Greater Manchester, and by the Home Affairs Select Committee.

However, there are also a number of examples of the Prevent programme having done a lot of good. For example, two teenagers were stopped from travelling to Syria after being referred to Prevent by their parents in 2015, and the programme is credited with helping to stop 150 people (including 50 children) from going to fight in Syria in 2016.

Importantly, much of the discussion of Prevent’s effectiveness has been based on anecdotal evidence: the odd stylised fact here and there, a couple of case studies. Most criticism or praise of Prevent focuses on a few examples where it has failed, been implemented badly, or succeeded. There are even calls to expand or to shut down Prevent without any evidence as to whether it is actually an effective programme.

Indeed, as far as I’m aware, there has been no (publicly available) rigorous or systematic assessment of Prevent’s effectiveness. (Note that although the recently-launched book “De-Radicalisation in the UK Prevent Strategy: Security, Identity and Religion” by M. S. Elshimi claims to constitute such an assessment, its results are based on an absurdly small sample of only 27 people and therefore cannot be considered a systematic analysis.) However, conducting a systematic assessment could be a relatively simple procedure.

In particular, if data are available on the number of extremist / terror convictions and/or number of people successfully and unsuccessfully “treated” by Prevent at the level of individual local authorities, then it would be possible to use variations across those local authorities to assess Prevent’s effectiveness.

Put simply, the “outcome” variable (i.e. the metric that assesses Prevent’s success) could be the number of extremist/terror convictions or the proportion of people referred to Prevent who are successfully treated. (Obviously, if the number of convictions is used, it would be important to allocate those convictions to the local authority in which the extremist grew up and/or resided, rather than to where the extremist activity was carried out.) Of course, there would also be technical considerations regarding whether the outcome variable is a “count” variable, is bounded due to being expressed in percentage terms, etc., but those can be dealt with relatively easily.

The explanatory “variable of interest” that would then measure the actual effectiveness of spending on the Prevent strategy would be each local authority’s annual budget for Prevent. If Prevent were effective, one would expect this variable to be negatively related to the number of convictions (since a successful Prevent would stop people before they committed a crime) and positively related to the proportion of successfully treated people. Alternative variables of interest could include the number of Prevent-dedicated personnel in each local authority or the amount of Prevent training that is provided to practitioners – each of these could be investigated to try to identify the most effective/important aspects of the Prevent strategy.

Note that it is unlikely that there would be any simultaneity between the Prevent budget (or other variable of interest) and the outcome variable – although it is plausible that current Prevent spending would be based on past extremist activity in a local area (i.e. local areas with higher extremist activity get more money for Prevent), it is unlikely that current Prevent spending reacts quickly enough to be affected by current extremist activity. Nonetheless, this could be investigated by using lags of the Prevent budget variable as instruments, or as the variables of interest themselves (since it could well be the case that Prevent takes time to have an impact).

As Prevent has been running since 2003, and there are roughly 400 local authorities in the UK, that should give a sizeable panel of data on which to conduct some relatively simple regression analyses. Of course, a number of other factors would need to be taken into account – for example, the population of each local authority, the average income within it, any changes to Prevent guidelines and/or the introduction or suspension of other counter-extremism strategies. The “identification” of the impact of Prevent would therefore come through variation in Prevent spending (or other Prevent-related variables of interest) and outcomes across local authorities and across time.
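To make the identification strategy concrete, here is a minimal sketch of the “within” (fixed-effects) estimator on entirely synthetic panel data; the numbers, the assumed negative effect of lagged spending and the scaled-down panel size are all invented for illustration:

```python
import numpy as np

# Entirely synthetic sketch of the "within" (fixed-effects) estimator: local
# authorities observed over several years, with the budget lagged one year
# since Prevent may take time to have an impact. All numbers are invented.
rng = np.random.default_rng(0)
n_auth, n_years = 50, 10                          # scaled down from ~400 authorities

budget = rng.uniform(50, 500, (n_auth, n_years))  # hypothetical budgets (£000s)
authority_effect = rng.normal(0, 2, (n_auth, 1))  # unobserved local heterogeneity

# Assumed true model: higher *lagged* spending -> fewer convictions.
budget_lag = budget[:, :-1]                       # years 0..T-2 explain years 1..T-1
convictions = (20 - 0.02 * budget_lag + authority_effect
               + rng.normal(0, 1, (n_auth, n_years - 1)))

# Demeaning by authority sweeps out the fixed effects; OLS on the demeaned
# data then recovers the spending coefficient.
y = convictions - convictions.mean(axis=1, keepdims=True)
x = budget_lag - budget_lag.mean(axis=1, keepdims=True)
beta = (x * y).sum() / (x * x).sum()
print(round(beta, 3))                             # close to the assumed -0.02
```

In a real application, year effects, the controls mentioned above (population, income, policy changes) and clustered standard errors would all be needed; a two-way fixed-effects package would handle these directly.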

Now, I don’t have the data to conduct this analysis. However, I suspect that the data are out there – I understand some organisations have created databases containing details of all extremist-related convictions over a reasonably lengthy period (for example, the Henry Jackson Society has a dataset on all Islamist-related convictions from 1998 to 2015, though this would need to be supplemented with data on the other forms of extremism covered by Prevent). Moreover, local authorities, the Home Office, or the relevant government authority no doubt have records of the amounts spent on Prevent by local authorities (as well as the number of Prevent-related personnel, etc.) on an annual basis. As such, combining the two (along with the various controls) should yield a usable dataset fairly easily.

Hence, if the government and/or organisations with an interest in Prevent really do want to assess how effective the Prevent strategy is, doing so actually isn’t very difficult.

Fact-checking a few claims about the NHS

With campaigning for the general election having got into full swing last week, many claims have been made regarding which Party would be better for security, the economy, education and so on. One particular video regarding the NHS started doing the rounds on Facebook a few days ago. This video makes a number of claims regarding the supposed impact that the recent Coalition and Conservative governments have had on the NHS, and goes on to suggest that a Conservative government would be bad for the NHS.

The claims made in that video are many. Some are valid, whereas others are not. Let’s take each of them in turn.

Claim 1: We are experiencing the largest sustained drop in NHS funding as a percentage of GDP since the NHS was founded.

Reality: This claim is false. As the graph below (from the Institute for Fiscal Studies) shows, NHS spending as a proportion of GDP has been stable over the past couple of years, and the decrease between 2009 and 2012 was no larger or longer than the decreases in the mid-to-late 1970s or mid-1990s.

[IFS chart: NHS spending as a proportion of GDP over time]

Moreover, the more relevant metric of NHS spending per capita continues to increase – in other words, more is spent per person on the NHS than ever before, although the rate of that increase has slowed in recent years.

[IFS chart: NHS spending per capita over time]

Claim 2: If the internal market was abolished we [i.e. the NHS] could save billions.

Reality: This claim is also false. The internal market actually creates savings and is not “wasteful” as is claimed in the video. On the contrary, it promotes competition and stimulates the NHS to provide better services – importantly, the benefits of competition in healthcare are well established. Furthermore, it is actually the refusal of many within the NHS to accept the proven benefits of competition that is causing some harm to the NHS – indeed one of NHS Improvement’s main aims is to promote and encourage “buy-in” of competition among those in the NHS. Hence, abolishing the internal market would actually cost billions rather than save them.

Claim 3: Health tourism costs the NHS £200 million per year, which is insignificant in terms of the overall cost of the NHS.

Reality: This is generally true – although the costs to the NHS associated with people who are not ordinarily resident in the UK are of the order of £2 billion per year, that includes many people who did not come to the UK specifically and solely to use the NHS (i.e. it includes people who are not “health tourists”). Instead, estimates put the upper bound of the costs associated with those who travel to the UK for the sole purpose of using the NHS at around £300 million per year. When compared to the total annual NHS budget of about £90 billion, the costs associated with health tourism are indeed a trivial amount.
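A quick back-of-envelope comparison, using only the (approximate) figures quoted above, shows just how small the amount is:

```python
# Rough comparison using the figures quoted above (all approximate).
health_tourism_upper = 300e6   # upper-bound cost of deliberate "health tourists"
non_resident_cost = 2e9        # cost of all non-ordinarily-resident users
nhs_budget = 90e9              # total annual NHS budget

share = health_tourism_upper / nhs_budget
print(f"{share:.2%}")          # 0.33% of the NHS budget
```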

Claim 4: Immigrants are not ruining the NHS, they’re running the NHS.

Reality: True. Immigrants from within the EU currently represent about 10% of doctors and 4% of nurses; if non-EU immigrants are included, the figures are likely to be higher still (although probably not by a huge amount). Given that there are already quite severe labour shortages within the NHS, it is clear that without the immigrants currently working in it, the functioning of the NHS would be severely hampered. Moreover, immigrants are net contributors in terms of taxes versus benefits, so they also contribute to the NHS in that way. Hence, the claim that immigrants are not ruining the NHS is clearly valid.

Claim 5: 1 in 10 nursing posts are vacant and the nursing bursary has been scrapped.

Reality: True. The nursing bursary was indeed scrapped at the start of the year – this means there is a much-reduced incentive to train as a nurse, as trainees will now have to pay £9,000 per year in tuition fees to do so. This is likely to lead to problems recruiting sufficient nurses in future. Notwithstanding that, there are also problems recruiting nurses now – the Royal College of Nursing suggests that 1 in 9 nursing posts is currently vacant. That is actually marginally worse than claimed (an 11% vacancy rate versus the 10% claimed).

Claim 6: Tens of thousands of sick patients waited on A&E trolleys this past winter.

Reality: Likely to be true. Using data from Quality Watch (and a bit of approximation/extrapolation), roughly 6 million people attended A&E last winter. Of these, around 15% were not seen within the government target of four hours – i.e. about 900,000 people waited more than four hours in A&E. It seems unlikely that all of these people waited on trolleys specifically, but even if only 10% of them (i.e. 1.5% of all attendances) did, the “tens of thousands” figure would be accurate. Hence, this claim seems plausible.
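The arithmetic above can be checked directly (the 10% trolley share is, as in the text, an assumption rather than a measured figure):

```python
# Back-of-envelope check of the trolley claim, using the figures above.
attendances = 6_000_000         # approximate A&E attendances last winter
share_over_four_hours = 0.15    # share not seen within the four-hour target

waited = attendances * share_over_four_hours
print(round(waited))            # 900000

# Assume (as in the text) that only 10% of those who waited did so on a trolley.
on_trolleys = waited * 0.10
print(round(on_trolleys))       # 90000 - comfortably "tens of thousands"
```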

Conclusion: As with most of these election video type things, the video contains some claims that are true, some that are likely to be true, and some that are demonstrably false. Does this mean that the Conservatives are the worst Party for the NHS? Who knows?! That’s for you to decide and take into account (if you want to) when you vote. But at least you’ll now have a more complete set of facts when you do.

 

Evidentiary standards are slipping

Over the past month, there have been a number of instances in which a politician or journalist has made a bold claim, and then ignored or been unable to provide any evidence to support those claims.

For example, Fraser Nelson claimed that being in the EU had been a net detriment to the UK’s trade, and that the evidence he had seen supports that view. However, when provided with evidence that contradicted his claim, and when challenged to provide the evidence to which he referred, Nelson did not provide any sort of response. Likewise, Michael Gove claimed that there was evidence to indicate that leaving the EU would provide the UK with a “net dividend”. However, when pressed to provide the evidence that he claimed existed, Gove did not do so; nor did he respond to the provision of evidence that contradicted his view.

This is not just a problem for right-leaning opinion makers either; it affects left-leaning ones just as much. For example, despite copious evidence (from the Low Pay Commission) that increasing the minimum wage too far would be detrimental to the employment of low-income earners, Jeremy Corbyn claimed that increasing the minimum wage to £10 per hour would raise their living standards. Again, Corbyn provided no evidence to support his claim.

This seems to be part of a wider, and long-running, malaise, in which policymakers can make a bold claim without any evidence to support it, yet said claim is taken at face value and isn’t challenged by the media nearly as often as it should be. Even worse (and a point made by Jonathan Portes in his recent discussion with Michael Gove), when challenged to provide evidence to support their views many in the media and political sphere tend to rely on a single statistic or anecdote even if copious evidence exists that contradicts their claim.

That’s assuming that the personalities concerned respond at all. Much of the time, they simply remain silent, letting their original claim stand as though it had never been challenged.

This isn’t just a point of pedantry – quite clearly, claims made by those covering and participating in campaign trails have real implications. For example, Vote Leave’s claim that Turkey would join the EU (despite all evidence to the contrary) likely played on some voters’ desire to reduce immigration (according to Ashcroft, immigration was a major concern for roughly one third of voters), despite the fact that immigration has continually been shown to benefit the UK and everyone in it. Similar points can be levelled against various claims that the current level of trade between the EU and the UK could easily be replaced by trade with Commonwealth countries (despite the fact that the well-established gravity model of trade directly contradicts this). And it seems likely that the upcoming election will be rife with claims and counter-claims that are (un)supported by evidence to varying degrees.

In essence, it is at least plausible that false claims made by opinion formers were taken to be true by some members of the voting public who based their decisions accordingly, and might have voted differently had they been informed of the actual evidence.

Now, what can be done to ensure that voters (and the general public as a whole) have actual evidence available rather than simply the claims of journalists and politicians?

Well, for a start, the press regulators (IPSO and Impress), the Electoral Commission, and the likes of the Office for National Statistics need to take on a much more proactive role. They should not wait for complaints to be submitted by the general public, but should take it upon themselves to investigate and penalise public figures who make misleading or unsupported claims, with punishments far more severe than those currently used (for example, newspapers cannot continue to get away with publishing retractions in the bottom corner of a page in the middle of the publication).

Second, political programmes like Newsnight, Question Time, and the Daily Politics should do far more to challenge politicians and journalists to support any claims they make with sufficient evidence (i.e. more than just a single anecdote or statistic). In other words, any journalist or politician appearing on such shows must be able to demonstrate that their claims are valid. The presenters of such shows should spend far more effort researching the actual evidence, and questioning their guests on any claims they make.

Third, the Parliamentary Standards Committee needs to realise that its role in holding MPs accountable extends to claims made by MPs that are not supported by any evidence. Such claims violate the MPs’ Code of Conduct and should be treated as such, with punishments that amount to more than the usual slap on the wrist.

Finally, and a much more long-term remedy, the general public should be provided with far greater training in the use and abuse of statistics. This should start from an early age and not only train people in how to calculate various (simple) statistics, but also provide information concerning how to spot when a commenter is using misleading figures or is relying solely on anecdotes to try to substantiate their points.

If these suggestions were implemented, the ability of journalists and politicians to deliberately obfuscate and mislead would be markedly reduced. That can only be a good thing.

Grammar Schools: Sam Freedman really should know better

Over the past few days there has been quite a bit written about whether selective schools (i.e. allocating children to schools at age 11 based on ability) are beneficial, either in terms of social mobility, educational outcomes or other areas. This stems from rumours that Theresa May is reviewing the current ban on new grammar schools.

A number of commentators have claimed that re-introducing academic selection at 11 years old is a bad idea. For example, Sam Freedman, an executive director of Teach First and someone who really should know better, has claimed that selective education is bad for social mobility and societal integration, is poor at assessing ability, and works against parental choice of school.

However, none of Freedman’s criticisms is supported by the evidence.

First, there is strong evidence to support the idea that grammar schools actually improve social mobility, and countries with selective systems tend to be no less integrated than those without. In making his claim that grammar schools harm social mobility and lead to decreased integration, Freedman cites this webpage. However, the results displayed on that webpage rely solely on correlations and do not control for any other factor that might account for the apparent relationship between deprivation and performance. For example, the difference in wages between grammar- and comprehensive-educated people could simply reflect the fact that grammar schools select those who are more likely to obtain a better wage anyway and enable them to reach their full potential, whereas those students would be held back if they were forced to attend a comprehensive. The webpage also does nothing to account for different demographics beyond an entirely arbitrary and undefined measure of “deprivation”.

Indeed, the webpage cited by Freedman seems to view social mobility as being achieved by “preventing the gifted from reaching their full potential” rather than “allowing everyone to reach their maximum”. However, there is a substantial weight of evidence indicating that selective schools not only enable the most-skilled to achieve their full potential, but also substantially improve outcomes for the less-skilled. For example, Dale & Krueger state that “students who attended more selective colleges earned about the same as students of seemingly comparable ability who attended less selective schools. Children from low-income families, however, earned more if they attended selective colleges.”

Similarly, Galindo-Rueda & Vignoles find that “the most able pupils in the selective school system did do somewhat better than those of similar ability in mixed ability school systems. Thus the grammar system was advantageous for the most able pupils in the system, i.e. highly able students who managed to get into grammar schools.”

In other words, selective schools incontrovertibly enable the highly-skilled to achieve their full potential as well as benefiting children from low-income families. This result is also supported by a study commissioned by the Sutton Trust – despite their avidly anti-selective school bias leading them to try to weasel their way out of the positive grammar school effect, the study finds that grammar schools tend to increase student performance by roughly two grades per subject taken at GCSE.

Second, Freedman’s claim that the 11-plus is poor at assessing ability does not stand up to scrutiny. Freedman claims that 70,000 students are wrongly classified by the 11-plus test, but it is not clear whether he means 70,000 over the entire span of grammar schools’ existence or 70,000 “mistakes” every year. If the former, the proportion of mistakes is clearly tiny, as millions of people have taken the 11-plus since it was first used. If the latter, then assuming that all 700,000 11-year-olds take the 11-plus each year (not an unreasonable assumption), that gives a “failure rate” of just 10%. Clearly this is not very large. And those who suggest that even a single failure is unacceptable when it comes to a child’s education are being completely impractical, since no educational system exists that can completely eradicate failures.
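The two possible readings of the 70,000 figure can be made explicit; the 50-year span in the second reading is an invented illustrative number, since no span is actually specified:

```python
# The two possible readings of the 70,000 figure, under the assumption
# that all 700,000 children in a year group sit the 11-plus.
misclassified = 70_000
annual_cohort = 700_000

# Reading 1: 70,000 mistakes every year.
print(f"{misclassified / annual_cohort:.0%}")            # 10%

# Reading 2: 70,000 mistakes over the test's whole history (50 years is an
# invented illustrative span).
years = 50
print(f"{misclassified / (annual_cohort * years):.2%}")  # 0.20%
```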

Finally, Freedman claims that grammar schools are “anti-choice”. However, this is clearly false – there is an obvious mechanism by which grammar schools promote choice of school. Specifically, the presence of an 11-plus test gets parents thinking about what will happen after the test, encourages them to research different schools and think about what school(s) would be best for their child. In other words, the 11-plus exam incentivises parental involvement in school choice, thereby promoting it.

Hence, Freedman is incorrect on every single point he mentions about selective schools. From someone that high up in Teach First, that is simply unforgivable.

George Osborne: A solid, but not spectacular Chancellor

As announced last night, George Osborne is no longer Chancellor of the Exchequer. Plenty of articles have already been written regarding how he’ll be remembered and whatnot (see, for example, here), but what really matters in an evaluation of his performance as Chancellor is focusing on the long-term impact of his main policies.

Of course, the main focus of Osborne’s term as Chancellor was “austerity” (or, in technical terms, a “fiscal consolidation”). There is much debate as to whether austerity harms or benefits growth in the short run – for example, Alesina & Ardagna, and some parts of the IMF, find that fiscal consolidations actually increase short-term growth, whereas the likes of Guajardo et al. and other parts of the IMF find that fiscal consolidations harm short-term growth.

However, what really matters in evaluating the impact of austerity is its likely effect on long-term growth. Here, none of the aforementioned studies has anything to say, but there are good reasons to believe that austerity is beneficial for long-term growth. For example, it seems plausible that the amount of time required for a country to re-establish any credibility (either with taxpayers or the central bank) lost by running continually large fiscal deficits could be considerable – convincing people that a country is now fiscally responsible is unlikely to be a matter of a few years’ work.

In other words, it is plausible that it could take longer than just a few years for people to change their opinion of a country’s fiscal responsibility, such that the full impact of a fiscal consolidation is only likely to be felt far into the future. Moreover, even though a recent working paper (by Fatás & Summers) suggests that fiscal consolidations hamper long-run growth, that paper is based on a methodology that is fundamentally flawed. Hence, austerity per se could have been a good policy of Osborne’s.

However, Osborne erred when he cut government spending on investments and infrastructure. At a time of incredibly low interest rates, it would have made sense to borrow to invest in projects that would have reaped a return in the future – the costs of borrowing are low, while the expected future benefits of such investments are likely to be high (in terms of their impact on future growth and on future tax revenues). Therefore, Osborne’s focus on cutting all, rather than just day-to-day, spending was misguided. Just as misguided (for the same reasons, since it prevented Osborne from borrowing to invest in infrastructure) was his Fiscal Charter.

Similarly, protecting spending on the NHS and on international development meant that there was little incentive for those departments to find savings, despite the fact that they, and the NHS in particular, are bloated and full of inefficiencies (witness the large NHS deficits). If those departments had not had their budgets protected, a more efficient and equitable distribution of the cuts to day-to-day spending could have been achieved (since if the NHS or development budgets had been cut slightly, other departments’ budgets would not have needed to decrease as much). The same goes for the triple lock on pensions. So, another negative point for Osborne there.

On the other hand, Osborne did set up the Office for Budget Responsibility (OBR), which was undoubtedly a very good thing. Although not quite as dramatic as Labour granting the Bank of England (instrument) independence in 1997, this step was important since it enabled and promoted independent oversight of government forecasts and spending plans. Moreover, it added much-needed rigour to Treasury analysis, evaluation of government performance against fiscal targets, etc., since those working in the Treasury know that people at the OBR will review and evaluate any plans and forecasts.

Getting on to some of the smaller issues, the pasty-tax debacle was also a negative point. The introduction of the tax was actually a decent idea – it removed some of the myriad exemptions that apply to VAT, thereby simplifying the tax system – but the subsequent reversal of the policy in the face of a (relatively small) public backlash was weak and disappointing. Likewise, the introduction of the National Living Wage was a good idea, but restricting it to over-25s seems rather a cop-out; instead, the minimum wage should (and easily could) have been increased to the level of the NLW, thereby benefiting more people without substantially increasing businesses’ costs.

There are also things that Osborne couldn’t really do much about, but for which some might blame him anyway. The lack of productivity growth might be one, but that’s more the responsibility of other departments than it is the Treasury. Failing to meet, or continually adjusting, his fiscal targets could be another – but Osborne was hampered in meeting those because of sluggish growth in the global economy.

Overall, then, it seems as though there are plenty of things over which Osborne can be criticised (e.g. refusing to borrow to invest, protecting certain departments’ budgets), but equally there are plenty of policies he introduced that are worthy of praise (e.g. the OBR, consolidating day-to-day fiscal spending). As such, Osborne will most likely go down in history as fairly middle of the road – some good bits, some bad bits, but generally not outstanding in either category.

The cost of Brexit (part 2 of who knows how many)

In response to the Treasury’s report on the costs of Brexit (and, obviously, to my blog post covering that report) a group calling themselves “Economists for Brexit” published a pamphlet which they claim contains a more reasonable estimate of the impact of Brexit on the UK economy.

Unsurprisingly, they find, contrary to the Treasury’s report (and, indeed, the vast majority of economic reports published on this issue), that Brexit would benefit the UK economy by increasing GDP growth by about 0.5 percentage points per year on average (with the majority of this increase coming in 2020, the final year of their forecast).

Equally unsurprisingly, their estimate is fundamentally flawed. In an impressive attempt to hide these flaws, the report contains only a two-page summary of the model used to obtain the results, but even then numerous flaws are apparent.

First, the report assumes that leaving the EU would mean the UK could remove EU-set trade barriers against non-EU countries while still keeping the same terms of trade it currently has with EU countries. Moreover, it assumes that all trade barriers will fall by half over the next five years. These assumptions drive the report’s “finding” that Brexit would increase UK living standards by 3.2% by 2020. However, the report does not provide any evidence to support the validity of either assumption. Indeed, there is plenty of evidence to suggest that they are not valid – for example, they imply a rate of decrease in trade barriers not seen since the 1960s.

Second, the report assumes that the 0.8% of GDP net saving from the UK not having to contribute to the EU budget would be passed on entirely to taxpayers in the form of an income tax cut. This is extremely unlikely to happen – given the current government’s austerity policies, any savings from Brexit are likely to be used to reduce the government deficit rather than to hand out a (potentially politically damaging) tax cut.

Third, not only does the report assume that there would be a reduction in regulation if the UK were to leave the EU (an unproven assumption), it then assumes that this reduction in regulation would have exactly the same effect as a 2 percentage point decrease in the employer rate of National Insurance (NI). One hopes that those writing the report realised how barmy such an assumption is – the report doesn’t contain even a passing attempt to justify why a decrease in regulation would have exactly the same impact as a reduction in employer NI. Indeed, it is barely possible to conceive how anyone could think this was a reasonable assumption.

Anyway, moving on. Finally, the report assumes that the government deficit is unchanged, on the basis that the aforementioned assumptions leave the government’s revenues unchanged. However, this fails to recognise the possibility that some of the money previously spent on EU goods and services could now be spent on UK goods and services, thereby potentially increasing tax receipts. Conversely, the report also assumes that non-UK people and businesses would not move away from the UK – if they did, tax revenues would decrease.

And all of this is to say nothing of the fact that the report excludes countless other factors that could be detrimental to the UK. For example, it does not even mention the potential impact Brexit could have on immigration (note that the vast majority of studies find that immigration is beneficial for the country to which immigrants relocate, and that this is true even for low-skilled workers in that country). Nor does it cover the costs associated with the uncertainty that would be created, and would persist for a number of years, regarding exactly what form of agreement between the UK and the EU would be put in place post-Brexit.

In essence, the study published by the “Economists for Brexit” group is so full of holes that it is no surprise they were only able to find eight professional economists to support it. Contrast this with the almost 200 economists (including yours truly) who are signatories to a letter in the Times stating that “[l]eaving would entail significant long-term costs.” That in itself should be damning enough.

The cost of Brexit

How much does the UK’s membership of the EU actually cost? And, in fact, does being in the EU represent a net economic benefit, rather than a net cost?

If you were to believe the information provided by Vote Leave, you might think you knew the answer. Vote Leave has claimed that membership of the EU costs the UK about £18 billion per year – the equivalent of about £280 per person per year. However, this figure does not include the substantial rebates and public/private sector receipts that the UK receives from the EU – once these are taken into account, the actual direct budgetary cost of the UK’s membership of the EU is about £8.4 billion, or £131 per person, per year (i.e. less than half of the original Vote Leave claim).
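The per-person figures are easy to reconcile with the budget figures; the population of roughly 64 million used here is an assumption implied by the quoted per-person amounts:

```python
# Reconciling the per-person and aggregate figures quoted above, assuming a
# UK population of roughly 64 million (implied by the per-person amounts).
population = 64_000_000

gross_claim = 18e9   # Vote Leave's headline annual figure
net_cost = 8.4e9     # after rebates and public/private sector receipts

print(round(gross_claim / population))  # 281, close to the £280 claimed
print(round(net_cost / population))     # 131, matching the £131 figure
```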

Moreover, the Vote Leave figure only includes the direct budgetary costs of being part of the EU. Importantly, it does not include, nor does the Vote Leave campaign attempt to include, any “indirect” benefits that result from EU membership. Such indirect benefits include, for example, any jobs or exports resulting from trade with EU countries that would not otherwise occur absent EU membership. If membership of the EU increases UK output above what it would have been if the UK were not part of the EU (which is likely to be the case), then leaving the EU would result in a decrease in UK output.

This could happen for wide-ranging reasons. For example, EU consumers have more diverse tastes than UK consumers alone, allowing a larger number of different firms to flourish in the UK and export their output to the EU than would be the case if the UK left the EU and UK firms faced reduced demand from EU countries. Similarly, collaboration between EU and UK firms enables a wider spread of technology than would be possible after Brexit, such that UK productivity is higher than it would be outside the EU. Membership of the EU also encourages investment not just from EU firms but from firms located in the rest of the world that would not occur if the UK were to leave. There are plenty of other potential mechanisms through which EU membership increases UK output.

Importantly, although Vote Leave has not attempted to include such factors, the Centre for Economic Performance (CEP) has done so and finds that leaving the EU would reduce the UK’s output by at least £850 per household per year. That is the best-case scenario for the Vote Leave supporters. Note, too, that this only includes “static trade consequences” – i.e. the impact that can be attributed just to losing the ability to trade freely with EU countries; it does not include any of the costs associated with reduced migration, technology transfer, investment, etc. that would also result from leaving the EU. In fact, once these factors are taken into account, the cost of leaving the EU could be as high as £6,400 per household per year.
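To get a sense of scale, the CEP’s per-household range can be converted into a rough aggregate annual figure. The sketch below assumes roughly 27 million UK households (an approximate figure, not stated in the text):

```python
# Illustrative aggregate of the CEP's per-household estimates.
# Assumption (not from the source): roughly 27 million UK households.
UK_HOUSEHOLDS = 27_000_000

low = 850     # static trade consequences only (£ per household per year)
high = 6_400  # including dynamic effects (£ per household per year)

print(f"Lower bound: £{low * UK_HOUSEHOLDS / 1e9:.0f}bn per year")   # ≈ £23bn
print(f"Upper bound: £{high * UK_HOUSEHOLDS / 1e9:.0f}bn per year")  # ≈ £173bn
```

On these assumptions, even the CEP’s lower bound dwarfs the £8.4 billion net direct budgetary cost discussed above.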

As such, the £850 figure likely underestimates the true cost of leaving the EU.

Nonetheless, that is not to say that every part of the CEP analysis is beyond criticism. For example, the study assumes that intra-EU trade costs will continue to fall as they have done in the past, but does not provide any evidence to suggest that such an assumption is reasonable. If, in fact, intra-EU trade costs were to fall less quickly than the study assumes, then the costs of leaving the EU would be lower than estimated.

Moreover, little information is provided regarding how the estimates of the cost of leaving the EU that account for the aforementioned “dynamic” factors (such as migration and investment) are obtained. Given that those estimates are likely to be based (at least in part) on complex (albeit commonly used) statistical methods, greater transparency regarding the approach used would be welcome, as it would give greater confidence that the estimates have been obtained in a reasonable way.

Overall, therefore, although the Vote Leave figure regarding the benefits of leaving the EU is an egregious over-estimate, and it is highly likely that leaving the EU would entail a large net cost, exactly what that cost per household per year would be is unclear. However, this uncertainty regarding the exact figure should not detract from the fact that the cost of leaving the EU is large.