How to value infrastructure: Improving cost benefit analysis

Graham Atkins, Nick Davies and Tess Kidney Bishop

20 September 2017

About this report

Evidence suggests that the UK performs poorly on infrastructure compared with some other wealthy countries. This Institute for Government work programme, which is supported by the Project Management Institute, explores how UK economic infrastructure policymaking can be improved.

The aim of this report is to evaluate how the UK appraises infrastructure projects, and in particular the Government’s use of cost benefit analysis. We offer recommendations on how the Government might improve the way it uses cost benefit analysis to accurately include relevant benefits and costs, assess them consistently, and communicate results clearly.

The first publication in this series looked at some of the main flaws in recent and controversial UK ‘megaprojects’.

Subsequent publications will look at:

•	how infrastructure projects are financed, and government’s ability to strike good deals

•	politics and institutions, including public consultation, devolution, and the role of politicians and experts in decision making.

Follow our work on infrastructure policy:  www.instituteforgovernment.org.uk/infrastructure

Contents

List of figures
Summary
1 Introduction
2 Context
3 Cost benefit analysis does not account for all relevant impacts
4 Cost benefit analysis routinely underestimates costs
5 Impacts are not consistently valued
6 Cost benefit analysis is not communicated clearly and transparently
7 Recommendations: getting the most out of cost benefit analysis
References

List of figures

Figure 1: Cost benefit analysis in Whitehall
Figure 2: Predicted value for money of approved transport projects, 2004–16
Figure 3: Schemes in the National Infrastructure and Construction Pipeline
Figure 4: Network Rail Great Western Railway electrification cost estimates, 2014–16
Figure 5: Optimism bias recommended adjustment ranges
Figure 6: Highways England, average error in cost forecasts by appraisal year, 2000–09

Summary

The UK Government spends tens of billions of pounds every year on economic infrastructure, including energy, transport, water, utilities and digital communication.

Picking the wrong projects risks wasting money now and damaging the economy in the future, but picking the right projects can boost productivity and economic growth over the long term.

Cost benefit analysis (CBA) is a way of summing up the positive and negative impacts of a project. It is one of the key tools the Government uses to decide between competing projects. Yet, as this report demonstrates, it can be misused, inconsistent and poorly communicated. It does not always play a central role in decision making and is too often used to justify decisions that have already been made. As the expensive overruns of the Channel Tunnel and the embarrassing u-turns of the rail electrification programme show, progressing with projects before a thorough analysis has been undertaken can lead to problems.

But despite these issues, CBA is still the best tool available for helping government to appraise and prioritise potential investments. Improving its use is therefore crucial.

In this report, we identify four problems and offer recommendations for overcoming these.

I)    Problem: difficulties capturing impacts

Ministers are interested in projects that can create jobs and increase productivity and Gross Domestic Product (GDP). But existing techniques for estimating ‘dynamic effects’ – those that change the structure of the economy – are costly to apply, difficult to undertake and relatively underdeveloped.

Some industry models produce high numbers that lack credibility.

Recommendations

• The Treasury and relevant departments should produce updated guidance on estimating the dynamic effects of infrastructure projects. This should include clear advice on the reasonable assumptions and models that can be used, and on what kinds of projects merit this broader analysis.

• Ministers and senior civil servants should ensure that claims about dynamic effects are subjected to the same scrutiny as other economic claims, either by including them in the economic case or by formalising an Infrastructure and Projects Authority review of the strategic case.

• Departments should undertake more research into dynamic effects, including evaluating the dynamic effects of projects after construction.

II)   Problem: unrealistic cost estimations

Cost and time estimates for infrastructure projects are almost always over-optimistic.  This misleads decision makers and can lock ministers into undeliverable targets. It also reduces public trust when these are not met.

Recommendations

• Ministers should be honest about cost uncertainty and potential overruns when making public announcements, particularly for big projects.

• Government must consistently evaluate infrastructure projects. All projects should systematically collect data on cost outturns against estimates, delivery times against estimates, the size of project teams and project length. Thorough post-project evaluations should be undertaken for all major projects.

These data should be used to inform appraisals, particularly of cost estimates. The Infrastructure and Projects Authority should collate this information centrally and ensure that it is used by departments to enhance reference class forecasting.

III)  Problem: lack of consistency between project assessments

Some costs and benefits, such as impacts on health, safety and the environment, are hard to monetise. Both the public and government care about these impacts, but they are not consistently measured across projects.

Recommendations

• Departments should follow the Department for Transport and produce standardised guidance on valuation and presentation for their appraisals.

• Departments must consistently incorporate these valuations into decision making. We recommend a five-step approach, including producing a narrative, benefit–cost ratios and an explanation of whether uncertain effects justify moving a project into a different value-for-money category.

• The Treasury should streamline The Green Book guidance and make it more user friendly. Its rollout should be accompanied by an extensive training programme with appropriate support provided for senior civil servants and ministers, as well as analysts.

IV)  Problem: poor communication within and outside Whitehall

CBA results are not always well understood by either decision makers or the public. Information may not be presented transparently or communicated clearly, leading to misunderstandings about critical assumptions and uncertainties.

Recommendations

• Analysts should present and communicate their work more transparently within Whitehall, being clear about critical assumptions, and what might happen if assumptions changed. Departments should learn from the Bank of England and the Office for Budget Responsibility about how to communicate technical information in a way that people will understand.

• Ministers must be fully transparent in communicating to the public the assumptions that underpin CBA results, making the relevant data available to the public and presenting benefit–cost ratios as a range of possible outcomes rather than a single number.

• Increased training and knowledge transfer programmes should be made available to policy, delivery and commercial teams, as well as senior civil servants and ministers, so that they better understand analysts’ work.

• There should be a clearer separation between project promoters and analysts to strengthen the integrity of analysis.

• The Infrastructure and Projects Authority should independently assess CBAs conducted for major projects.

Taken together, these recommendations provide a route map for improving the use of CBA, leading to a better selection of infrastructure projects, fewer cost overruns and, ultimately, a stronger economy.

1. Introduction

Cost benefit analysis (CBA) – a way of summing up the positive and negative impacts of a project – is one of the key tools the Government uses to decide between competing projects. CBA is much maligned but it is one of the best tools that governments have to appraise and prioritise investments; most states1 and international organisations2 use a variant of it.

Where the methodology and results of CBA are publicly available, they make explicit the judgements that governments make about the scope and size of project costs and benefits,3 enabling public debate and democratic oversight. In the absence of a tool for systematically appraising investments, deciding which projects go ahead could slip into approving projects with the loudest promoters, or the promoters most willing to make outlandish claims about the low costs and high benefits of their project.

CBA is a force for improving government decision making, but several problems undermine its use:

•         difficulties capturing benefits

•         unrealistic cost estimations

•         lack of consistency between project assessments

•         poor communication within and outside Whitehall.

These problems risk projects being wrongly approved, turned down or delayed. The problems, and how they might be mitigated, are the focus of this report.

Methodology

This report was informed by an in-depth review of the academic literature and broader work in this field as well as interviews with current and former senior civil servants, politicians, academics and consultants.  The report focuses on transport infrastructure because the economic appraisal of transport projects is most developed in comparison with other infrastructure sectors.

Transport infrastructure tends to have the largest social and environmental impacts, and the greatest likelihood of inducing ‘dynamic changes’* in land use, employment, investment and productivity. As a result, CBA debates have focused on transport projects.

Similarly, we focus on appraisal in Whitehall as that is where decisions about the largest and most controversial projects are currently made. CBA, though, is carried out by many public bodies – including local authorities and local enterprise partnerships – seeking funding from central government.

Our analysis is informed by practice across infrastructure sectors and departments, and our recommendations apply to government investment decisions in transport, energy, communications and utilities, across all levels of government.

This report is the second in a series of reports on infrastructure. The first report was entitled What’s Wrong with Infrastructure Decision Making? Conclusions from six UK case studies.4

Future reports will look at:

•         the decision-making process that departments follow when deciding how to finance infrastructure projects

•         how government can best engage with the capital markets to secure private finance for new infrastructure, should it decide that this is the most appropriate option

•         how to reduce national infrastructure policy instability, develop better relationships between central government and various tiers of local government, and increase public engagement in decision making.

A final report will draw together the evidence we have gathered throughout this programme of work to make final recommendations on how to improve infrastructure decision making in the UK. This will be launched at an event in early 2018.

2. Context

Cost benefit analysis in Whitehall

CBA attempts to judge the net present value of an investment5 – whether benefits outweigh costs, discounted over time. For publicly funded infrastructure projects such as road and rail, CBA has been in use since the late 1960s.6
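To make the arithmetic concrete, the sketch below discounts an invented stream of costs and benefits to present values and computes a net present value and benefit–cost ratio. All cash flows are illustrative, and the 3.5% rate is used only as an example of a standard social discount rate; real appraisals follow The Green Book’s detailed discounting guidance.

```python
# Minimal sketch of discounting in a cost benefit analysis.
# All figures are illustrative; real appraisals use detailed, year-by-year
# forecasts and The Green Book's guidance on discount rates.

DISCOUNT_RATE = 0.035  # illustrative social discount rate

def present_value(cash_flows, rate=DISCOUNT_RATE):
    """Discount a list of annual values (year 0 first) back to today."""
    return sum(value / (1 + rate) ** year for year, value in enumerate(cash_flows))

# Hypothetical project: £500m spent over two years, then £60m of benefits a year for 30 years.
costs = [250, 250] + [5] * 30      # £m: construction, then maintenance
benefits = [0, 0] + [60] * 30      # £m: benefits start once the asset opens

pv_costs = present_value(costs)
pv_benefits = present_value(benefits)

print(f"Net present value: £{pv_benefits - pv_costs:.0f}m")
print(f"Benefit–cost ratio: {pv_benefits / pv_costs:.2f}")
```

A ratio above 1 simply means that discounted benefits exceed discounted costs; the Department for Transport’s value-for-money categories discussed below are bands of this ratio.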

CBA forms an important part of how Whitehall decides which projects to back, and where to allocate limited resources, with guidance provided by the Treasury’s Green Book.7* It also forms part of the accountability framework that civil servants and ministers operate within.

The civil service has put substantial efforts into improving CBA. The Department for Transport has consulted8 and worked extensively9 on the wider effects of transport investment. Its guidance for analysing transport investments – WebTAG10 – is often used as a benchmark for other countries.11

Within Whitehall, CBA is mainly used in three separate stages (see Figure 1):

• First, departments use CBA to value projects.

• Second, the Treasury uses CBA results to help allocate spending under fiscal constraints.

• Third, after the Treasury allocates its capital budget, departments should then revisit their project selection and use CBA to reprioritise the optimal mix of projects.

Figure 1: Cost benefit analysis in Whitehall (simplified)

Source: Institute for Government analysis

[The figure shows three stages. Analysis: departments appraise potential projects and rank and compare them by looking at their net impact. Prioritisation: departments bid for Treasury funding and the Treasury uses CBA results to prioritise spending. Reprioritisation: departments finalise their mix of projects, reprioritising under the budgets set by the Treasury.]

* The Green Book is the Treasury’s guidance on appraising project or policy proposals before public funds are committed.

For ‘megaprojects’ – those with a whole-life cost of more than £1 billion (bn) – there is a longer analysis stage. This is because a department’s analysis is usually revised following review by the Treasury and Infrastructure and Projects Authority. Their recommendations are made throughout the policymaking and business case stages,12 usually at the standard Treasury approval points – the strategic, outline and full business cases.13 In addition to the Treasury, major projects require approval from the Infrastructure and Projects Authority, which is granted following scrutiny by a panel of experts from the public and private sectors: the Major Project Review Group.14

In all business cases, CBA is referred to as the ‘economic case’, and is only one of five cases, sitting alongside the strategic, commercial, financial and management cases.15

Both during analysis and at the final decision, the results of CBA, in particular the benefit–cost ratio, will never be the sole criterion for a green light. Deliverability, affordability and the prevailing politics of the day all matter too.

Alignment with the Government’s policy priorities, outlined in the strategic case for a project, is usually the main reason for going ahead with a project. Nonetheless, decision makers would require significant justification to proceed with a project if analysts thought that the costs exceeded the benefits. Indeed, a benefit–cost ratio that suggests this – less than 1 – allows permanent secretaries to request a 'ministerial direction' on grounds of value for money.*

At the Department for Transport, most approved projects have ‘high’ or ‘very high’ benefit–cost ratios – meaning that they have ratios over 2:1 – according to their collated value-for-money data (see Figure 2).16

Figure 2: Predicted value for money of approved transport projects, 2004–16

Note: SR = Spending Review. Before 2011, the Department for Transport reported the value for money of schemes approved in Spending Reviews, rather than annually. ‘Very high’ was introduced as a category in April 2009. We report schemes approved before 2011 by Spending Review, noting that these were not eligible for ‘very high’ ratings.

Source: Institute for Government analysis of Department for Transport (2010) Resource Accounts 2009–2010; Department for Transport (2017) Transparency Data: DfT value for money indicator

* A ministerial direction is a request for written grounds to continue with a policy, transferring responsibility for spending money to the relevant minister. Data collected by the Institute for Government in 2017 suggest that value for money is the most common criterion for ministerial directions. See Institute for Government (2017) ‘Ministerial directions’, explainer, 24 August, retrieved 6 September 2017, https://www.instituteforgovernment.org.uk/publications/whitehall-monitor/whitehall-explained/ministerial-directions


What does good cost benefit analysis in government look like?

• It includes all relevant impacts. Some major projects have dynamic effects, which a static analysis will not account for. Wherever possible, these dynamic effects should be in the economic case so that they are subject to a high level of internal scrutiny.

• It is realistic about costs. Over-optimistic estimates can lock ministers into undeliverable targets and reduce public faith in government when these are not met. Whitehall must try to mitigate this tendency wherever possible, using data from past projects.

• It is consistent. Costs and benefits should be valued consistently across projects to ensure comparability when prioritising.

• It is transparent and well communicated. Analysts’ work must reflect forecasting risks and be transparent enough to allow assumptions to be debated. Outside Whitehall, ministers must be more honest with the public about uncertainties in costs, benefits and timescales.

Improving the use of cost benefit analysis

As CBA results influence decision making, it follows that CBAs should be as good as they can be. In the box above we set out four criteria for a good CBA.

Better evaluation underpins the first two criteria. Comparing forecasts with actual costs and benefits, and feeding this analysis back into appraisals is crucial for helping departments to produce more accurate estimates.

Sometimes government CBAs meet the four criteria but, as we show in the following chapters, they often fall down on one or more criteria.

3. Cost benefit analysis does not account for all relevant impacts

Infrastructure projects and the governments proposing them often have transformational objectives. High-speed rail, new nuclear power plants and broadband have all been seen by ministers as investments with dynamic effects, such as increasing growth, boosting productivity or creating jobs. But these effects, which could represent a significant proportion of a project’s benefits, are not fully accounted for in a static, conventional CBA,* which assumes that these aspects of the economy will remain constant. A good CBA should include all relevant impacts, including dynamic effects.

What are dynamic effects?

Dynamic effects change the behaviour of firms and consumers, and the structure of an economy. They are typically contrasted with static effects.17

Static effects are the direct effects of an investment – in the case of transport, changes that emerge due to a reduction in travel time and costs. This will incorporate some wider economic impacts beyond the individuals and firms using the transport. For example, static effects include increased tax returns from wage and productivity gains created when individuals and firms are brought ‘closer together’ (that is, when the time it takes to travel from place to place is reduced, but firms and consumers do not relocate).

Dynamic effects are the induced effects of an intervention – changes that emerge as a result of capital and labour movement. For example, individuals and firms may decide to relocate in response to a change in transport. This will change economic activity in certain areas, sometimes leading to further productivity gains. Static effects do not include these relocations, and the resulting changes in land use, and so miss part of the economic impact.

Transformative projects represent a sizable proportion of planned infrastructure spending

Transformative infrastructure projects – those most likely to induce dynamic effects – are investments that are ‘large enough to make a significant (or observable) and sustained impact on economic performance’.18 Such projects are likely to be new assets, rather than upgrades or maintenance,19 and cost is a rough guide for identifying them, even if cost is no guarantee of dynamic effects.

* Throughout this report we use the term ‘conventional CBA’ to refer to CBA that includes only static effects.

‘Megaprojects’ – those worth more than £1bn – account for a sizeable proportion of total spend in the National Infrastructure and Construction Pipeline.20 There are 89 schemes (where one scheme may include multiple projects) in the pipeline with a value of more than £1bn and their combined value is more than £450bn (see Figure 3), representing over 80% of the total pipeline spend. It is therefore particularly important to try to get appraisal right – these are costly investments.

Figure 3: Schemes in the National Infrastructure and Construction Pipeline

Source: Institute for Government analysis of HM Treasury and Infrastructure and Projects Authority (2016) National Infrastructure and Construction Pipeline 2016, GOV.UK

In addition to accurately incorporating into CBA the genuine dynamic effects of larger schemes, we should also care about ensuring that only relevant benefits are incorporated into the CBAs of smaller schemes. Non-existent dynamic effects should not be used to boost the benefit–cost ratios of the vast majority of projects that are not transformational. Not incorporating dynamic benefits when they are unlikely is as important as including them when they are.

Conventional cost benefit analysis does not capture dynamic effects

Conventional CBA does not fully account for dynamic effects and cannot fully assess transformative projects. Most UK transport projects will not have significant dynamic effects, so this is not always a problem. But for larger projects, dynamic effects may provide significant additional benefits that need to be included to accurately convey a project’s value.21

Professor Diane Coyle puts the problem plainly:

Cost benefit analysis is the main tool economists provide for assessing projects, but it is inadequate for long-term infrastructure projects… [conventional CBA] is a method for looking at incremental changes, not system-wide ones. It does not try to account for the significant changes in behaviour that new infrastructure causes.22

Dynamic effects happen over a lengthy period and, by definition, involve structural changes to the economy. Conventional CBA can only analyse the static impacts of projects because the Treasury – and most departments – outline strict conditions as to when investments can be assumed to change the structure of the economy.*

In building models based on the existing economy, conventional CBA will not be able to properly assess impacts in a different future economy, or impacts that would make the economy different in the future. The strategic case for High Speed Two (HS2) explicitly acknowledges this, noting that ‘the benefit–cost ratio methodology was not developed [for schemes] on the scale of HS2’.23

[Figure 3 shows the combined value (£bn) of pipeline schemes by size: 134 small schemes (<£100m), 264 major schemes (£100m–£1bn) and 89 mega schemes (>£1bn).]

Could measuring impacts on productivity or GDP be an alternative to CBA?

As conventional CBA cannot capture the dynamic effects referred to above, some academics have suggested that attempting to measure how far a particular project increases productivity or GDP might be a better alternative.24

In doing this, however, the impacts that CBA can capture might well be lost. CBA is conceptually different from measuring productivity or GDP. It is built on social welfare assumptions. This means it looks at the impact on individuals’ wellbeing, and then tries to monetise this. As such, it includes things that are not traded in the money economy, such as impacts on health or the environment.25

While measuring impact on productivity or GDP can be a useful supplement, it is not complete and cannot substitute for judging the overall value of transport projects, which should incorporate monetised and non-monetised impacts.

There is a case, however, for giving this supplement more weight alongside CBA, if economic growth is the main objective of infrastructure investment.

Transport for Greater Manchester (TfGM), for example, uses productivity measures alongside CBA. It still uses CBA as a ‘sifting test’ to decide whether an investment should be considered, but benefit–cost ratios are not used to prioritise.26 Instead, TfGM prioritises investments based on impact on gross value added (GVA) for every pound of investment – a metric developed following direction from Greater Manchester politicians to use the capital budget allocated to support economic growth.27 It incorporated this measure to show the impact of transport on the real economy, helping to gain political and public buy-in. Its 2014 transport strategy and investment plan28 provides a clear overview of scheme objectives, monetised benefits, a narrative outlining the case for wider benefits, and a judgement of appraisal robustness.
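As a rough, invented illustration of prioritising by GVA per pound of investment in the spirit of the TfGM approach described above (the scheme names and figures below are hypothetical):

```python
# Minimal sketch of ranking schemes by estimated GVA per pound invested.
# All names and figures are made up for illustration.

schemes = [
    # (name, capital cost in £m, estimated GVA uplift in £m per year)
    ("Scheme A", 120, 18),
    ("Scheme B", 45, 9),
    ("Scheme C", 300, 30),
]

# Rank by GVA generated per pound of investment (higher is better).
ranked = sorted(schemes, key=lambda s: s[2] / s[1], reverse=True)

for name, cost, gva in ranked:
    print(f"{name}: £{gva / cost:.2f} of annual GVA per £1 invested")
```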

* To its credit, the Department for Transport has set out guidance on when this assumption can be relaxed but it remains the default position for analysis. See Department for Transport (2016) Understanding and Valuing Impacts of Transport Investment: Updating wider economic impacts guidance, GOV.UK, retrieved 30 June 2017, www.gov.uk/government/consultations/transport-investment-understanding-and-valuing-impacts

Capturing dynamic effects matters

In the UK, there is often criticism from academics, consultants and politicians about the strictness of the WebTAG guidance.* Critics argue that the emphasis on travel-time savings does not fully capture the dynamic effects of infrastructure.29 It does not, for example, fully account for the way that infrastructure projects might increase productivity and drive economic growth.

This is pertinent because economic transformation dominates political discussion about infrastructure. One interviewee argued, bluntly, that politicians care about jobs and GDP, not the social welfare improvements that conventional CBA captures.30 This may be why dynamic effects are frequently offered as rationales for infrastructure projects. The £23bn National Productivity Investment Fund, for example, announced in the 2016 Autumn Statement,31 promised infrastructure investment to boost productivity and economic growth.

Transformative objectives are often central to the strategic case – the first case in the five-case business model.32 They are rarely, however, fully incorporated into the  economic case – the CBA. This leads to three problems:

•      Not including some touted dynamic effects in benefit–cost ratios33 could have a major impact on which projects get approved and which do not. Including these benefits may change the results, sometimes dramatically. For instance, including dynamic effects in the CBA of High Speed 1 (HS1) almost doubled the benefit–cost ratio from below 1 (poor value) to over 1.7 (medium value).

•      Dynamic effects should be subject to the same scrutiny as any other benefits. It is easy to claim that a new transport link will create jobs, but it is important to know how many, what kinds of jobs they will be, and whether they are just jobs moved from elsewhere. The economic case is rigorously scrutinised, adjusted for optimism bias** and requires extensive sensitivity testing.*** It is important that this rigorous scrutiny happens as a matter of course for dynamic effects mentioned in the strategic case. The best way to do this is to try to include them in the economic case, where more rigorous scrutiny is the norm.

•      There is a risk that uncertain dynamic effects will be overvalued without a standardised way to include them.34 Without standardisation, there is a dual perverse incentive. Consultants who provide models to estimate dynamic effects to project promoters have a clear interest in developing models that give high estimates. In turn, project promoters, in both the public and private sectors, have an incentive to choose a model known for producing large estimates of dynamic effects, to paint a project in the best possible light. To the extent that appraisal ends up in a competition to find ever-more impressive dynamic effects based on dubious evidence,35 this will damage decision making by overstating project benefits.

To ensure that capturing dynamic effects is valuable, consultants and their clients need to make sure that they are transparent about how they model these effects. If clients do not have access to models and their underlying assumptions, they will not be able to ask constructive questions.36 Without transparency there is a risk that in trying to model dynamic effects,37 imperfect CBAs will be replaced with more complicated black box models.38 This shifts the debate into a much less accessible, technical space, constraining healthy policy debate.

These problems emerge clearly when the touted dynamic effects of a project diverge significantly from conventional CBA results.39

* WebTAG’s default assumption about wider economic activity is that it represents activity that was occurring in other areas and is therefore 100% displaced, rather than additional. The Department for Transport’s latest guidance sets out the circumstances in which departing from this assumption would be appropriate, however. See Department for Transport (2016) Understanding and Valuing Impacts of Transport Investment: Updating wider economic impacts guidance, Department for Transport, p. 20, https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/554783/transport-appraisal-guidance-webtag-consultationdocument.pdf

** Optimism bias is the phenomenon that analysts are likely to make overly optimistic estimates: overestimating benefits and underestimating costs. How government adjusts for this is explained in the next chapter.

*** Sensitivity testing involves analysts adjusting variables in a model to test their impact on the outcome. This is used to see how much a certain cost or benefit would need to change for the benefit–cost ratio to change significantly.
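The sketch below illustrates the kind of sensitivity test described in the footnote above: hold the present values fixed, vary how much of an uncertain benefit is realised, and see how the benefit–cost ratio responds. All figures are invented.

```python
# Minimal sketch of a sensitivity test, assuming present values have already
# been calculated: vary one uncertain benefit and watch the benefit-cost ratio.

pv_costs = 800          # £m, illustrative present value of costs
pv_core_benefits = 900  # £m, benefits analysts are confident about
pv_uncertain = 500      # £m, central estimate of a disputed benefit

for share_realised in (0.0, 0.25, 0.5, 0.75, 1.0):
    bcr = (pv_core_benefits + share_realised * pv_uncertain) / pv_costs
    print(f"{share_realised:.0%} of uncertain benefit realised -> BCR {bcr:.2f}")
```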

Case study: The transformative effects of HS2

Economic growth and rebalancing the economy have been cited as major objectives of HS2.40 Because of the assumptions in conventional CBA, dynamic effects were not included in the 2013 economic case for HS2. An analysis of the regional economic impacts of HS2 was therefore commissioned to complement the economic case.41 The report, produced by KPMG and published in September 2013, claimed that HS2 could generate a £15 billion a year net productivity boost to the UK economy.42

The size of the benefits estimated by KPMG heightened attention on the modelling used. The estimates became the focus of public and parliamentary debates and challenges to HS2. Henry Overman, a former adviser to High Speed Two (HS2) Limited, called the procedure used to generate the figures ‘essentially made up’.43

In October 2013, following a Freedom of Information request, critics argued that there had not been enough focus on the impact of HS2 on those regions not on the line,44 with the report, and subsequent government responses to it, skimming over negative impacts.

KPMG denies that it misrepresented any figures under pressure from HS2 Limited;45 almost all the criticisms it received were addressed in the technical details of the original report.* But where models are experimental, there needs to be more scrutiny and a clearer explanation of how estimates were generated, and their level of uncertainty. Without this, it can easily appear that transformational effects are being wilfully overvalued, rather than just uncertain.

The HS2 case highlights the need for clear guidance on including dynamic effects in economic appraisals. There must be a clear method of communicating uncertainty and different scenarios, including a narrative explanation. It also points to the importance of transparency and communication in promoting constructive debate, as we discuss further below.

* The criticism that KPMG skimmed over negative impacts is a presentation issue. These impacts were included in a map on page 56 of the report, but the areas affected and the scale of the impact were slightly unclear. A subsequent Freedom of Information release showed the precise areas that were impacted, and what the projected output change would mean as a proportion of each area’s GDP. See BBC News (2013) ‘HS2 “losers” revealed as report shows potential impact’, BBC News, 19 October, retrieved 8 August 2017, www.bbc.co.uk/news/uk-24589652

The Department for Transport has made progress in including dynamic effects and other departments should learn from it

Guidance on valuing wider impacts has improved significantly over the past 20 years and the Department for Transport has been at the forefront of the debate.

The 1999 Standing Advisory Committee on Trunk Road Appraisal (SACTRA) report was its first major consideration of dynamic effects. Its remit was to consider what impact transport projects might have on the performance of the economy.46 It proposed that impacts on the economy should only be included for large projects. In most cases, dynamic effects are covered by time-savings benefits; to include them would be double counting. This is still the de facto standard position in the main body of The Green Book, although there is now greater acknowledgement of dynamic effects in the supplementary guidance. The 2015 guidance on ‘valuing infrastructure spend’ suggests that ‘supplementary analysis’ of dynamic effects should be presented alongside CBA and used to prioritise certain projects.47

The Department for Transport first published guidance on including – static – wider economic impacts in 2005.48 It is now undertaking a major update of this guidance, which includes dynamic effects; a consultation was published in September 2016.49 This, and the greater clarity on dynamic effects (‘indicative monetised impacts’) in the new Value for Money Guidance,50 are promising. The call for clear understanding and communication of the way dynamic impacts are expected to play out, and how they relate to conventional CBA, is positive. This follows through on the recommendations of the Department for Transport-commissioned Transport Investment and Economic Performance report, which noted that ‘appraisal techniques [can be] insufficiently context and project specific; they need to be informed by a clear narrative about likely economic impacts of the project’.51

The Department for Transport has made good progress in attempting to capture all relevant benefits in the economic case, ensuring that they receive the necessary scrutiny. It is important now that this is done routinely on projects that claim dynamic effects. Where other kinds of economic infrastructure projects have transformational objectives – for example the rollout of full-fibre broadband52 – other departments should learn from the Department for Transport’s practice.

4. Cost benefit analysis routinely underestimates costs

Whether Whitehall includes all relevant impacts is only one half of whether CBA provides useful information. It is also critical that the costs are assessed accurately. Over-optimistic cost estimates mislead decision makers, locking ministers into undeliverable targets or unrealistic promises, reducing public faith in government when they are not met.

Cost underestimation is a significant and systematic problem in government

Cost underestimation refers to the tendency for cost forecasts to be lower than final costs incurred. The UK is far from alone in suffering from infrastructure cost underestimation. Internationally, megaproject academic Bent Flyvbjerg estimates that nine out of ten projects costing more than £1bn go over budget.53 Flyvbjerg’s evidence suggests that railways and tunnels are the transport projects most likely to overrun, followed by bridges and roads.54 In Australia, the Grattan Institute found that projects announced early – those announced without a budget commitment – account for a disproportionate amount of cost overruns. In a dataset of all transport projects valued at more than AUS$20 million between 2001 and 2015, projects with an early announcement (32%) accounted for 74% of the total value of cost overruns.55

There is no single explanation for cost underestimation; academics point to three main causes:

• strategic misrepresentation56 – where analysts understate costs to make a project look good and push it through the approval process

• optimism bias57 – a systematic tendency to underestimate the time and costs of a project; for megaprojects, this can often be because of an (in)ability to assess all that is required to make a very large project work

• ‘anchoring and adjustment’58 – the difficulty that people face with adjusting away from an initial – usually low – estimate (the ‘anchor’ number), and a subsequent tendency not to adjust their estimates far enough away from it.

Our interviewees suggested that most errors are due to a combination of all three, with particular blame attributed to strategic misrepresentation – usually under pressure from more senior civil servants and ministers – and optimism bias.59

Over-optimistic cost estimates cause significant problems. They damage decision making and create political problems for ministers; and they make it harder to manage departmental budgets.


First, governments that commit to infrastructure investments based on low early costs may end up selecting the wrong projects. They can find themselves locked into projects that no longer look a good bet, as spiralling costs reduce projected value for money and cost-effectiveness.60 Ministers and senior officials will be held accountable by Parliament and the public when this happens, as shown by recent rail electrification projects (see the box below).

Second, misleading analysis leads to difficulties in managing departmental budgets. The Treasury cannot make appropriate capital funding allocations during Spending Reviews if it cannot rely on the cost estimates provided by departments. There is also a danger that cost underestimations make infrastructure projects look better value than they are, relative to other spending priorities.61

Rail electrification

Cost estimates for an ambitious set of electrification plans62 set out in July 2012 were serially underestimated, locking successive transport ministers into undeliverable commitments. Escalating costs on the Great Western electrification project raised concerns, and led Sir Patrick McLoughlin, the then Secretary of State for Transport, to pause it in June 2015.

Between July 2012 and June 2015, the costs of the Great Western electrification project rose substantially. Due to a multitude of factors, including immature cost estimates and project management shortcomings, Network Rail revised its £1.1bn October 2013 cost estimate to £1.8bn in September 2014. These revised estimates were assessed and approved as plausible by the Office of Rail and Road. But even these ‘mature’ cost estimates were over-optimistic. These revisions rose again to £2.8bn when re-planned guidance was issued in 2016, as illustrated in Figure 4.63

The reasons for the increases are multifaceted. Part of the cost rise was due to weak project management and planning: detailed requirements were not agreed until after construction had started.64 But even taking this into account, cost underestimation mattered.

Network Rail was too optimistic about the scope of cost reductions that could be achieved using a factory train, an alternative to labour-intensive construction. Designs were drawn up and equipment was purchased before ground-level research, requiring costly equipment modifications and redrawn specifications. Network Rail further underestimated the costs of ‘route clearance’ – removing obstacles to electrification – and the associated costs of getting the 1,800 separate consents to perform work on the line.

Since McLoughlin’s June 2015 pause, subsequent transport ministers have performed further u-turns. Midland Mainline and TransPennine electrification was paused alongside Great Western electrification in June 2015, before being restarted in September 2015. Four Great Western projects were delayed in November 2016. Finally, electrifying the Great Western line west of Cardiff was cancelled in July 2017, alongside parts of the Midland Mainline and Lake District railways. At present, the Government has not committed to funding any infrastructure enhancements in Network Rail’s next five-year funding period, and claims that it will deal with them separately.65

This whirlwind of promises has confused almost all external observers, and the Government now faces criticism from the media,66 regional government67 and mayors.68 If ministers had given Network Rail enough time to develop rigorous cost estimates, and the Office of Rail and Road enough time to scrutinise them, they could have announced a more realistic programme. Ministers should care about whether cost estimates are accurate: if they are not, they will only lead to political trouble further down the line.

Figure 4: Network Rail Great Western Railway electrification cost estimates, 2014–16

Source: Adapted from National Audit Office analysis of Network Rail information in National Audit Office (2017) Modernising the Great Western Railway, Figure 13

The Government has recognised the cost-underestimation problem but should do more to embed reference class forecasting

Dealing with unrealistic cost estimates is nothing new. Whitehall is aware of the problem and the Treasury has standardised advice for addressing it. Based on research initially conducted by engineering and management consultants Mott MacDonald in 2002,69 the supplementary guidance to The Green Book on dealing with optimism bias acknowledges that:


Project appraisers have the tendency to be over optimistic. Explicit adjustments should therefore be made to the estimates of a project’s costs, benefits and duration, which should be based on data from past or similar projects…70

The guidance stipulates that an uplift within a specified range (as shown in Figure 5) be made to forecast costs at the outline business case stage of projects (the second stage at which projects seek Treasury approval).71 The precise value of the uplift depends on the amount of risk that has been mitigated. This new figure is used to confirm whether the economic case for a project would still be robust if historically observed cost overruns were repeated.
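A minimal sketch of how such an uplift might be applied, assuming the uplift is interpolated within a range according to how much risk has been mitigated. The ranges below are placeholders rather than the official figures; analysts should take the actual bounds for each project type from the Treasury’s supplementary guidance.

```python
# Minimal sketch of applying a Green Book-style optimism bias uplift.
# The ranges below are illustrative placeholders, not the official figures.

ILLUSTRATIVE_RANGES = {
    # project type: (lower uplift, upper uplift) - placeholder values
    "standard civil engineering": (0.03, 0.44),
    "non-standard civil engineering": (0.06, 0.66),
}

def adjusted_cost(estimate, project_type, risk_mitigated):
    """Uplift a cost estimate; risk_mitigated runs from 0 (none) to 1 (fully mitigated)."""
    lower, upper = ILLUSTRATIVE_RANGES[project_type]
    uplift = upper - (upper - lower) * risk_mitigated
    return estimate * (1 + uplift)

# A £500m scheme with roughly half of its identified risks mitigated.
print(f"£{adjusted_cost(500, 'non-standard civil engineering', risk_mitigated=0.5):.0f}m")
```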

Figure 5: Optimism bias recommended adjustment ranges

Source: HM Treasury (2013) Green Book Supplementary Guidance: Optimism bias, HM Treasury

This guidance was a welcome step forward. It aligns with the principles of reference class forecasting72 – a widely recommended way to improve accuracy by adjusting forecast project costs and benefits using data from similar past projects, albeit with rather blunt categories.

However, there are ongoing questions about how far this guidance is applied in practice,* and how much sensitivity analysis is put at the forefront of cost estimates. More broadly, it is unclear whether the guidance has improved the accuracy of cost estimates in government. There is no publicly accessible database of cost estimates at different project stages, although departments and agencies responsible for infrastructure projects should hold such data. As a result, it is not clear to external observers whether the data are being aggregated and used to inform cost estimates.

* See, for example, the debate between Heathrow Ltd and the Airports Commission over applying optimism bias to the rail infrastructure costs associated with a third runway. See Pickard J (2016) ‘Heathrow runway “faces £16bn black hole”’, Financial Times, 24 April 2016, retrieved 4 September, www.ft.com/content/9ba757ce-0a22-11e6-b0f1-61f222853ff3

Within government departments, there is evidence that more specific data are being used to inform projects. The Department for Transport’s WebTAG tool uses more precise reference categories: roads, light rail, conventional rail, ‘fixed links’ (bridges and tunnels), buildings and information technology.73 The Department keeps these data under review,74 including, for example, Network Rail’s enhancement projects.75 Similarly, the Department for Environment, Food and Rural Affairs (Defra) has a lower uplift rate for flood and coastal defence projects, informed by its data.76 But there is little publicly available evidence in either case that more precise guidance has led to more realistic cost estimates.

More promisingly, there is evidence from Highways England that cost estimates for major roads are improving due to reference class forecasting.77

Reference class forecasting in practice

Highways England completes a Post Opening Project Evaluation (POPE) survey of all road schemes one and five years after completion. This compares pre‑project appraisals and post-project evaluations with the aim of reviewing whether ‘the expected impacts of highway schemes have materialised’.78

POPE has improved appraisal through the collation and use of evaluation data.

Road evaluation follows a standardised procedure, which feeds back into appraisal in a transparent way. Highways England maintains a list of all the issues raised by POPE, tracks its response to them and shares best practice with project managers and specialists. According to our interviewees, Highways England no longer uses The Green Book’s supplementary guidance to apply optimism bias, aside from at the earliest stages, but instead uses its own database on comparable projects, which it has found more accurate.79

The most recent POPE meta-report80 found evidence of more accurate cost forecasting since 2004, and an increase in the overall value for money achieved by schemes since 2008. This suggests that Highways England’s evaluation and use of data in appraisals has improved the way it appraises projects. Figure 6 illustrates how Highways England’s cost estimates have improved.


Figure 6: Highways England, average error in cost forecasts by appraisal year, 2000–09

Source: Institute for Government visualisation of Highways England (2015) Post Opening Project Evaluation (POPE) of Major Schemes: Main report: Meta-analysis 2015, Highways England, Figure 6-15, ‘Margin of error of capital cost estimates by appraisal period’

Systematic evaluation with data feeding back into appraisal may not be feasible for all projects. But it should be the norm for major projects that we undertake regularly – such as roads, rail enhancements, mid-size utilities investments and flood defences.

Megaprojects will require a different approach. Extra care should be taken not to commit too early, and to consult with operational and delivery officials before announcements. As megaprojects often have few directly comparable predecessors in the UK, reference class forecasting based on past UK projects is unlikely to be as useful. But the general principles of reference class forecasting should still apply. Cost estimates for high-speed rail or major nuclear power plants could be stress-tested by looking at appraisal accuracy in similar international projects.
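The sketch below illustrates the general principle of reference class forecasting: stress-test a promoter’s estimate against the distribution of overruns observed on comparable past projects. The reference data, the 80th-percentile threshold and the base estimate are all invented for illustration.

```python
# Minimal sketch of reference class forecasting with made-up data.
# Real use would draw on a curated database of comparable past projects
# rather than the illustrative figures here.

import statistics

# Percentage cost overruns observed on hypothetical comparable past projects
# (outturn cost / estimated cost - 1).
reference_class_overruns = [0.05, 0.12, 0.20, 0.35, 0.50, 0.08, 0.60, 0.15]

def uplifted_estimate(base_estimate, overruns, percentile=0.8):
    """Uplift an estimate so that, had it been applied to the reference class,
    roughly `percentile` of past projects would have come in within budget."""
    ranked = sorted(overruns)
    index = min(int(percentile * len(ranked)), len(ranked) - 1)
    return base_estimate * (1 + ranked[index])

base = 1_100  # £m, the promoter's estimate
print(f"Median overrun in reference class: {statistics.median(reference_class_overruns):.0%}")
print(f"P80 stress-tested estimate: £{uplifted_estimate(base, reference_class_overruns):.0f}m")
```

In practice the reference class would come from systematic evaluation data of the kind POPE provides, as discussed above.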

5. Impacts are not consistently valued

For some costs and benefits, such as operating costs and journey-time savings, departments endorse a standard way of valuing, monetising and including them in a CBA. Others, however, are harder to value, monetise and include. It is not obvious, for example, how to monetise impacts on health, safety and the environment.

Departments with specific guidance such as Defra and the Department for Transport endorse particular ways of valuing impacts, but none are without dispute. There is no ‘right’ way to value these impacts; but they are impacts that both the public and the Government care a lot about.81 They should be included in CBA.

For impacts that are difficult to put a monetary value on, technical improvements will only go so far. It is important that government analysts value these impacts consistently and departments draw on the expertise of others. Encouraging consistency with more guidance will help to ensure that the Treasury is able to compare projects fairly when choosing which to proceed with. It should also help to avoid disputes about valuation methods, which can obscure more fundamental challenges.

The civil service has guidance to encourage consistent valuation

For costs and benefits that are harder to monetise and include, economists and statisticians have proposed many different valuation methods.82 The Green Book guidance83 prioritises two: revealed preferences and stated preferences.

Revealed preference methods infer the price of a good by observing consumer behaviour.84 This is where analysts deduce a non-monetary good’s price by looking at the variation in the price of a market good due to the non-market good. Variation in house prices tells us something about how much people will pay to avoid noise and pollution, for example. Stated preference methods use surveys to ask a representative group how much they value the good. The Green Book prioritises revealed over stated preference methods.85
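A minimal sketch of the revealed preference logic, using invented data: regress house prices on dwelling attributes including noise exposure, and read the coefficient on noise as an implicit price. A real hedonic study would use a large sample and a much more carefully specified model.

```python
# Minimal sketch of a revealed preference (hedonic pricing) estimate with made-up data.

import numpy as np

# Hypothetical sample: [floor area (m2), distance to station (km), noise (dB)]
X = np.array([
    [70, 1.0, 55],
    [85, 0.5, 60],
    [60, 2.0, 50],
    [95, 0.8, 65],
    [80, 1.5, 58],
], dtype=float)
prices = np.array([310_000, 355_000, 280_000, 380_000, 330_000], dtype=float)

# Ordinary least squares with an intercept term.
design = np.column_stack([np.ones(len(X)), X])
coefficients, *_ = np.linalg.lstsq(design, prices, rcond=None)

# The coefficient on noise is the implied marginal price of a decibel
# (with real data this would typically be negative).
print(f"Implied price of one extra decibel of noise: £{coefficients[3]:,.0f}")
```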

Where a reliable valuation does not yet exist, departments can commission studies to determine how to value a good.86 The Green Book recommends looking at which valuation technique is appropriate on a case-by-case basis.87

If it is just not possible to monetise a cost or a benefit sensibly88 in any of these ways, it can nonetheless be described in its own units (decibels for noise, for example) or described qualitatively. In this case, The Green Book guidance recommends that appraisals explain whether the resulting situation will be better or worse than at present.89 Comparisons of unvalued costs and benefits are formalised in models for multi-criteria analysis.90


Multi-criteria analysis: a better decision-making tool?

Some analysts have suggested that multi-criteria analysis could be a better appraisal tool than CBA, given the struggle CBA faces to consistently include costs and benefits which are not market goods.91 Multi-criteria analysis involves analysts assigning numerical values to all project criteria, explicitly weighting each element of the criteria, and then aggregating them to produce a final set of numbers to rank projects against one another.*
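As a simple illustration of the weighting and aggregation step, the sketch below scores invented projects against explicitly weighted criteria and ranks them by their aggregate score.

```python
# Minimal sketch of a weighted-sum multi-criteria analysis with invented data.
# Each project is scored 0-10 against each criterion; explicit weights are
# applied and the weighted scores aggregated to rank projects.

weights = {"economy": 0.4, "environment": 0.3, "safety": 0.2, "deliverability": 0.1}

projects = {
    "Bypass":       {"economy": 7, "environment": 3, "safety": 8, "deliverability": 6},
    "Light rail":   {"economy": 6, "environment": 7, "safety": 7, "deliverability": 4},
    "Flood scheme": {"economy": 4, "environment": 9, "safety": 9, "deliverability": 7},
}

def aggregate(scores):
    return sum(weights[criterion] * score for criterion, score in scores.items())

for name, scores in sorted(projects.items(), key=lambda p: aggregate(p[1]), reverse=True):
    print(f"{name}: {aggregate(scores):.1f}")
```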

Unlike CBA, multi-criteria analysis is presented as a decision-making rather than decision-support tool. Current UK guidance is that multi-criteria analysis may be used alongside CBA where costs and benefits cannot be valued in monetary terms.92 It has been used more extensively in Australia,93 New Zealand94 and for the Republic of Ireland’s road network.95

Multi-criteria analysis is said to be more transparent than CBA because it forces decision makers to put an explicit weight on the project criteria that matter to them. If the numbers behind these weights are published, this should make decision makers more accountable.

However, the theoretical benefits of multi-criteria appraisal are undermined in practice. It is prone to the same errors as CBA – uncertain data and biased use of appraisal models – but its greater complexity makes it easier to manipulate. It can make comparing projects difficult, because the weighting given to various factors is not always consistent, changing as government priorities change. CBA, done well, makes comparison possible because the values for different benefits can be standardised.

The five-case business case model that the UK Government now endorses means that the economic case (the CBA) is now only one of five parts of the overall business case. As stressed to us by interviewees, decisions are not based on the economic case alone (although they can be presented this way publicly).96

As such, the UK is effectively already using a form of multi-criteria analysis, albeit without the explicit weights required for a full analysis of this kind. This is preferable as ‘implicit weighting’ requires ministers to outline a clear narrative for their decisions.

Impacts are not valued or included consistently

Despite extensive guidance, it remains inherently difficult to monetise many of the impacts of an infrastructure project. For example, valuing loss of life or the loss of access to a nearby park will always be subjective. Monetising these impacts ‘can work as long as everyone agrees on the measures used [but] if there is disagreement, proxy measures of value are very easy to pour scorn onto. CBA and any attempt to monetise costs and benefits are very bad at measuring things that are priceless’.97

* It is possible to undertake a multi-criteria analysis without aggregating the numbers (partial multi-criteria analysis) and without explicitly weighting the different options. The results of a government ‘five-case model’ for business cases look similar to an unaggregated multi-criteria analysis that has not been weighted.

But CBA should do its best to include all relevant impacts. That some impacts are difficult to monetise should not be a reason to neglect them, especially when decision makers care a lot about them.98 Given that affected communities often raise objections about what is, and is not, included in a CBA, some consistent numbers would be better than no, or inconsistent, numbers.

The main problem that analysts currently face is the lack of an agreed technique for valuing these impacts. This can, quite unintentionally, lead to inconsistency. As noted above, The Green Book recommends assessing which valuation technique is appropriate on a case-by-case basis.99 While there needs to be a degree of flexibility given variations in size of project and resources available, this flexibility can cause problems. In the past, it has led to inconsistency in how different departments include costs and benefits, especially non-market ones.100

Inconsistency makes comparing projects more difficult, and creates room for unnecessary valuation disputes, which obscure more fundamental areas of disagreement. Dispute, in turn, leads to delay, and delays increase the likelihood of cost overruns.101

In addition to the above, we heard that the complexity and presentation of the Government’s guidance can be challenging for analysts and policymakers to master. The Green Book provides helpful guidance, which is widely used, but mastery of its contents is low. One civil servant described it as like the Highway Code:102 some follow it to the letter, the majority follow its spirit and some flout the rules completely.103 The Treasury could do more to make The Green Book more accessible.

New guidance and data from Defra and the Department for Transport is welcome

The civil service has done a lot of work to promote appraisal consistency, mainly through additional guidance and suggested models for presentation.104

The work of Defra and the Department for Transport has been particularly valuable. Defra has provided an environmental issues checklist105 to help analysts determine which impacts are relevant and need to be assessed; the Department for Transport poses similar questions in its guidance for practitioners on deciding which economic impacts to consider.106 The Department for Transport’s WebTAG guidance also provides appraisal tables107 to promote consistency when analysing impacts that are hard to monetise.

Using standardised data is key to comparability. Defra is developing a new dataset for environmental impacts as part of its work with the Natural Capital Committee, an independent government advisory committee,108 in partnership with the Office for National Statistics. This entails attempting to value natural assets: ‘soil, air, water and all living things’.109 To do so, it combines different methods: estimates of both the market and non-market values of benefits from the UK’s natural capital. Non-market examples include the value of natural capital as a source of recreation and air filtration.110 Defra plans to make these data publicly available in the future, which should help analysts to consistently include environmental impacts.111

In transport, the Department for Transport frequently updates its TAG data book,112 which includes data to value the impacts referred to in appraisal guidance: journey-time savings, operating costs, carbon emissions, noise and safety improvements. The department has also developed a public Wider Impacts Dataset, which should facilitate consistency.113 Ensuring that this guidance is followed is crucial.

6. Cost benefit analysis is not communicated clearly and transparently

Incorporating all relevant benefits, being realistic about costs and valuing impacts

consistently should improve the accuracy of CBAs. But more should be done to

improve the way CBA results are used in decision making.

This requires the results to be well understood, not only by analysts but also by the senior civil servants and ministers who use them. Analysts must therefore present their work transparently and communicate it clearly to policymakers and ministers. Decision makers must understand the critical assumptions and uncertainties in CBA, and what CBA does, and does not, explain.

Beyond Whitehall, ministers must improve how CBA’s role in decision making is

communicated to ensure that they do not mislead the public when explaining why particular projects have been selected.

Cost benefit analysis is not properly understood by all in Whitehall

From our interviews, we identified a series of problems stemming from a mixed

understanding of CBA on the part of policymakers, and inadequate communication on the part of analysts.

It is not clear that policymakers always properly understand the purpose of economic

appraisal: to provide an estimate of the net impact of projects, allowing them to be

compared consistently. We heard from analysts that policymakers too often see CBA

as a tool that answers whether they should go ahead with a project,114 and they

dismiss it if analysts are unable to provide a certain, monetisable, figure for all

impacts. If policymakers view CBA as a hurdle to jump, they can put unhelpful

pressure on analysts to find favourable results.

We heard from departmental analysts that they sometimes feel pressured to find

favourable benefit–cost ratios for a minister’s or senior civil servant’s favoured

projects. They told us that this pressure is frequently compounded by additional

pressure – from the same sources – to turn the analysis around quickly and with a high

degree of certainty.115 Analysts become torn between the competing pressures of

ensuring analytical rigour, and being seen to ‘get things done’ and contribute to

securing sign-off for a project.* Given that CBA results are used by departments to

appraise projects and by the Treasury to prioritise spending, it is important that they

are robust, and that analysts do not feel pressured to find particular results.


* This is not exclusively a public-sector problem; there is evidence of similar behaviour in the private sector. See Bain R (2015) ‘Ethics and advocacy in forecasting revisited – consultants in the dock’, Transport Xtra, LTT 680, September; Wachs M (1990) ‘Ethics and advocacy in forecasting for public policy’, Business & Professional Ethics Journal, vol. 9, pp. 141–57.

Seeing CBA as a hurdle to jump can lead to poor decision making. We were told that it

is not unusual, particularly in larger projects, for CBA to be used to justify a project

that has already been selected for political reasons. One former senior minister told us

that CBA is useful as an audit trail but that many projects are a gut decision.116 There

are legitimate political reasons for proceeding with a project despite a relatively low

benefit–cost ratio – a government may, for example, wish to prioritise investment in a

particular region. It would, however, be preferable if ministers were open about their

reasoning, rather than attempting to hide behind a retrofitted CBA.

Policymakers should not dismiss the results of CBA that do not give them the

outcomes they want. One senior interviewee responsible for a major project

expressed cynicism about whether CBA could really tell decision makers about the

impact of projects. Decision makers would, in their words, be “kidding [themselves]

that it [CBA] will be a true representation of the value that will be delivered by a

scheme”.117 To some degree, this is reasonable, given the limits of what CBA can

accurately capture. But it suggests a wholesale rejection of the value of CBA, which is unwarranted and unhelpful.

This could be mitigated if policymakers better understood the uncertainty in CBA and

long-term forecasts. As we discussed in our first report in this series on infrastructure

projects,118 ministers and civil servants may not always fully understand project risks.

Their understanding will be hindered if CBA results are presented uncritically.

Better understanding requires better presentation. Results should not simply be a

single number: this kind of presentation implies a certainty that is unjustified. It also

makes it harder for policymakers to challenge critical assumptions and see how results

would change if certain assumptions – such as travel demand – varied. The Green Book explicitly acknowledges that:

sensitivity and scenario analyses should generally be included in presentations

and summary reports to decision makers, rather than just single point estimates…

there are ranges of potential outcomes, and hence [decision makers need] to judge

the capacity of proposals to withstand future uncertainty.119
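To make the point concrete, the short sketch below (in Python, with invented figures rather than any departmental model) shows how a headline benefit–cost ratio shifts when a single assumption, such as travel demand, is varied; presenting the resulting range alongside the central estimate is the kind of sensitivity reporting The Green Book asks for.

    # Illustrative only: invented present values, not from any appraisal.
    base_benefits = 3.0   # present value of benefits under central demand, £bn
    costs = 1.5           # present value of costs, £bn
    for label, demand_factor in [("Low demand", 0.8), ("Central", 1.0), ("High demand", 1.2)]:
        bcr = base_benefits * demand_factor / costs
        print(f"{label}: BCR = {bcr:.1f}")
    # Reports a range (1.6 to 2.4) rather than only the central figure of 2.0.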

Most departments have their own guidance on how to present uncertainty. The

Department for Transport’s guidance120 states that results should be presented in an

appraisal summary table, showing all key impacts, quantitatively and qualitatively

assessed, monetised where relevant, alongside distributional effects.

Some transport project appraisals, usually larger ones, fit The Green Book’s criteria

for sensitivity analyses. The strategic cases for the different HS2 phases clearly

illustrate the range of benefit–cost ratios they could achieve,121 and their relative

likelihood.122 But the summaries for smaller projects often do not meet these criteria.

The Lower Thames Crossing business case provides an example.123 Here, the range of

costs and benefits was less clear because they were presented as single figures in different tables.

CBA results are normally presented in tables rather than as graphs or charts.

Departments could learn from other organisations such as the Bank of England and

the Office for Budget Responsibility, which present results in more visually appealing


ways (see the examples below). The overarching aim of presentation should be to

make CBA as accessible as possible to those without an economics background. The

Department of Energy and Climate Change’s 2050 emissions model124 – which

anyone can use to see the impact of altering demand- and supply-side policies –

should be considered the benchmark for presenting highly technical information in an

easily understandable manner.125 If CBA is an opaque process, churning out a single

number that only analysts understand, it will alienate decision makers and ultimately make CBA less useful, and less used, in Whitehall.

Presenting uncertainty

Office for Budget Responsibility: Economic and Fiscal Outlook (EFO) forecasts for

public sector net borrowing (PSNB)126

These two charts compare median forecasts with other possible forecasts. In

the first chart, each lighter shade of the fan represents a 10% lower probability

of GDP being within that band. This illustrates increasing uncertainty over time.

The second chart shows the level of uncertainty for one particular forecast, for 2016–17.

Chart 2.4: March 2012 EFO GDP fan chart (percentage change on a year earlier, 2006–16; central forecast shown against probability bands)

Chart 2.5: March 2012 EFO probability projections for 2013 GDP (probability density against percentage change on a year earlier; central forecast marked)
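Fan charts of this kind are straightforward to produce once a set of forecast simulations or percentile bands is available. The sketch below (Python with matplotlib, using invented numbers rather than OBR data) shows one way to build a simple fan chart around a central forecast.

    # Illustrative fan chart: darker bands cover narrower central percentile ranges.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    years = np.arange(2012, 2018)
    central = np.array([0.6, 1.8, 2.3, 2.6, 2.7, 2.8])      # central forecast, % growth (invented)
    horizon_sd = 0.5 * np.arange(1, len(years) + 1) ** 0.5   # uncertainty grows with the horizon
    sims = central + rng.normal(0.0, 1.0, size=(5000, len(years))) * horizon_sd

    for lo, hi, shade in [(10, 90, 0.15), (20, 80, 0.30), (30, 70, 0.45), (40, 60, 0.60)]:
        plt.fill_between(years, np.percentile(sims, lo, axis=0),
                         np.percentile(sims, hi, axis=0),
                         color="tab:blue", alpha=shade, linewidth=0)
    plt.plot(years, central, color="black", label="Central forecast")
    plt.ylabel("Percentage change on a year earlier")
    plt.legend()
    plt.savefig("fan_chart.png", dpi=150)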

Highways England: benefit–cost ratios (BCRs) for the Lower Thames Crossing127

In these two tables, PVB = present value of benefits, PVC = present value of

costs, WEBs = wider economic benefits, R = route, WSL = Western Southern Link and ESL = Eastern Southern Link.

Here, the ‘most likely’ benefit–cost ratios for different options are listed in one

table, and a separate table outlines the more cautious P90 benefit–cost ratios

(P90 indicates 90% confidence that the benefit–cost ratio will be above this

figure). Probability projection charts could visualise the likelihood of different

options achieving particular benefit–cost ratios but none are provided.

Table 3.7 – Most likely BCRs for shortlisted routes (£bn PVB, 2010 present value prices) [BR = bridge; BT = bored tunnel]

                                          R1      R2 WSL  R2 ESL  R3 WSL  R3 ESL  R4 WSL  R4 ESL
    Crossing type                         BR      BT      BT      BT      BT      BT      BT
    PVB (excl WEBs & Reliability)         1.995   3.483   3.745   3.300   3.856   3.353   3.837
    PVC (1)                               1.222   1.578   1.672   1.564   1.656   1.757   1.858
    Initial BCR                           1.6     2.2     2.2     2.1     2.3     1.9     2.1
    WEBs                                  0.737   1.264   1.626   1.353   1.677   1.678   1.735
    Reliability                           0.135   0.142   0.146   0.143   0.147   0.146   0.150
    Adjusted BCR                          2.3     3.1     3.3     3.1     3.4     2.9     3.1
    (1) PVC calculation includes discounted revenues from user charges

Table 3.8 – P90 BCRs for shortlisted routes (£bn PVB, 2010 present value prices) [BR = bridge; BT = bored tunnel]

                                          R1      R2 WSL  R2 ESL  R3 WSL  R3 ESL  R4 WSL  R4 ESL
    Crossing type                         BR      BT      BT      BT      BT      BT      BT
    PVB (excl WEBs & Reliability) (£bn)   1.995   3.483   3.745   3.300   3.856   3.353   3.837
    PVC (1)                               2.053   2.235   2.334   2.185   2.284   2.465   2.570
    Initial BCR                           0.97    1.6     1.6     1.5     1.7     1.4     1.5
    WEBs (£bn)                            0.737   1.264   1.626   1.353   1.677   1.678   1.735
    Reliability (£bn)                     0.135   0.142   0.146   0.143   0.147   0.146   0.150
    Adjusted BCR                          1.4     2.2     2.4     2.2     2.5     2.1     2.2
    (1) PVC calculation includes discounted revenues from user charges

The above charts and tables have been reproduced with permission.
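The figures in the two tables appear consistent with the standard construction of the ratios: the initial BCR divides the present value of benefits by the present value of costs, and the adjusted BCR adds wider economic benefits and reliability to the numerator. A quick check for Route 1 (a sketch using the published values, not part of the business case):

    # Route 1, 'most likely' case (Table 3.7), £bn in 2010 present values
    pvb, pvc, webs, reliability = 1.995, 1.222, 0.737, 0.135
    initial_bcr = pvb / pvc                           # 1.6, as reported
    adjusted_bcr = (pvb + webs + reliability) / pvc   # 2.3, as reported

    # Route 1, P90 case (Table 3.8): same benefits, higher risk-adjusted costs
    pvc_p90 = 2.053
    initial_bcr_p90 = pvb / pvc_p90                           # 0.97, as reported
    adjusted_bcr_p90 = (pvb + webs + reliability) / pvc_p90   # 1.4, as reported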

Cost benefit analysis is not explained clearly in public

Outside of Whitehall, there is miscommunication and misunderstanding about the role

that CBA plays in decision making. Our interviews with civil servants and ministers –

backed up by research at the local level128 and in other countries129 – indicated that

ministers tend to place most emphasis on the strategic case and wider political

considerations when making decisions. This is not necessarily wrong – the economic


case is only part of the reason to go ahead with a project. But the role that CBA plays is

sometimes misrepresented by ministers. This tends to happen in two ways, both of which are damaging.

First, politicians sometimes publicly overemphasise CBA results when they provide

high benefit–cost ratios. When benefit–cost ratios are high, politicians often find it

easy to reach for the CBA as an argument to persuade voters of the merits of a project.

When communicating with the public, the benefit–cost ratio is an easy-to-understand

metric – it sounds authoritative and persuasive – and a scan of press releases and

speeches around major project announcements reveals a tendency for politicians to emphasise it.

But the quantitative certainty it provides is illusory, and focusing on a single

benefit‑cost ratio contradicts Treasury guidance, which states that ‘the output of the

economic case should never be a one number answer’.130 Overemphasising the

certainty of CBA results can also limit political space for change if projects do not

proceed as planned, and the costs and benefits start to look different from those

announced at the beginning of a project, as successive transport ministers found with rail electrification.

Second, when benefit–cost ratios are low, the principles of CBA are often rejected

wholesale. It is frequently implied that the UK would underinvest in infrastructure if

we relied on conventional CBA. Lord Snape, the former Labour MP, has argued that:

If it were left to the Treasury, we would not have built the M25, the Jubilee line, the

Docklands Light Railway and various other schemes that most people would agree

are essential… [there are few projects that] would ever have passed the

preposterous cost benefit analysis so beloved of Her Majesty’s Treasury.131

Likewise, in a debate over the East Anglia Rail Franchise, Claire Perry, the former

Parliamentary Under Secretary of State for Transport and now Minister of State for

Climate Change and Industry, argued that:

[E]very major infrastructure project in this country has failed the economic value

test that the Government have imposed on it, because such projects are looked at

through a very narrow prism that does not factor in the economic value added that good transport investment brings.132

Both approaches – either using CBA as a full explanation or rejecting it as a decision

support tool – do more to confuse than to inform the public about the reasons for

going ahead with specific projects. If the public have been misinformed about the

reasons for going ahead with projects, it is much harder for them to make constructive

challenges. It also does a disservice to CBA. By failing to understand, or failing to

communicate, that CBA is not the sole basis for decisions, ministers encourage undue

scepticism towards CBA, both within and beyond Whitehall.

The Government has introduced measures to reduce the prevalence and impact of poor communication and understanding

The civil service has recognised the sometimes-poor communication and understanding of CBA on the part of analysts, policymakers and ministers. To date, it has focused on two kinds of reform to mitigate these problems: greater scrutiny and building understanding. Although these have not been specific to infrastructure, they could be replicated to improve the use of CBA for infrastructure projects.

Greater scrutiny

The independent Regulatory Policy Committee (RPC) highlights how greater scrutiny

can incentivise policymakers to take CBA seriously, reducing unwarranted pressures

they may place on analysts. Established in 2009, the RPC scrutinises the regulatory

impact assessments written by departments – the business cases for regulatory

changes – including a specific focus on the evidence base.133 RPC scrutiny appears to

have led to more rigorous analysis. Sense about Science, the Institute for Government

and the Alliance for Useful Evidence found that undertaking impact assessments for

regulations improved the transparency of the CBAs used to support them.134 Those of

our interviewees who had prepared impact assessments noted that the fear of

receiving a Red or Amber rating on an RPC evaluation motivated analysts and

policymakers to treat the evidence base seriously.135

Separating project promoters and analysts can also strengthen the integrity of

analysis. Both the Department for Transport and Transport for London have separate

teams with responsibility for reviewing appraisals. Such an approach could be adopted more widely.

Building understanding

Attempts to build understanding of CBA beyond analysts have been less formal, but they are just as important: those outside the analytical community need to understand CBA if they are to challenge it constructively.

The Treasury has led central efforts to build civil servants’ understanding of business

cases by providing training courses for civil servants across the professions, and

drawing up new guidance on writing business cases. The latter culminated in an

extensive 2015 supplement to The Green Book on how to write business cases – the

‘five-case model’ – to ensure that they deliver public value.136

For project senior responsible officers, the Infrastructure and Projects Authority

co-ordinates the Major Projects Leadership Academy, which has helped civil servants

to develop skills to manage projects effectively. It would be beneficial if scrutinising

economic analysis was made a larger part of the commercial leadership competence137

to ensure that senior responsible officers can ask analysts constructive questions.

At the same time, there have been bottom-up efforts to improve relationships through

knowledge transfer. Transport for London, for example, runs training sessions for staff

in policy, delivery and commercial teams on how to create effective business cases.

Project managers and senior leadership are also invited to attend, helping to spread

knowledge of what analysts measure, and why.138

7. Recommendations: getting the most out of cost benefit analysis

CBA is a valuable tool in the Government’s arsenal and the best tool available for

helping government to appraise and prioritise potential investments. Getting it right is therefore crucial.

Our recommendations in this chapter focus on how government can improve CBA’s contribution to appraisal and to deciding which projects to greenlight. We focus on

reforms that will help government to meet our criteria for good CBA – that it includes

all relevant benefits and is realistic about costs, consistent, transparent and well communicated.

Including all relevant impacts

Recommendation: Departments should produce updated guidance on the dynamic effects of infrastructure

Analysts must advance the way they model the impact of transformational projects.

Ministers are interested in projects that can improve productivity, jobs and GDP. Yet,

analysts struggle to analyse these dynamic effects. This can complicate project

prioritisation and lead to unnecessary delays.

Techniques for estimating dynamic impacts do exist but they are costly, difficult to

undertake139 and relatively underdeveloped. Industry models tend not to be

transparent about the assumptions used, and often produce high numbers that lack

credibility. There is a risk of both undervaluation and overvaluation.

In light of this, departmental guidance on how to include dynamic effects must

improve both to ensure that all relevant benefits for transformational projects are

included, and to ensure that analysts do not assert that marginal projects will have

implausibly large dynamic effects. The Department for Transport’s WebTAG

guidance140 considers dynamic impacts but does not include them in the economic case for projects.

Government is aware of these challenges and has taken positive steps to improve

practice. It is important that these efforts retain momentum and receive the necessary internal resources.

To this end, the next versions of both WebTAG (following the Department for

Transport’s response to the feedback received in its recent consultation141) and The

Green Book should include clear advice for officials on reasonable assumptions and

models that can be used.


i)        Recommendation: Wherever possible, claims about dynamic effects should be included in the economic case for infrastructure projects

There needs to be much greater scrutiny and review of claims about dynamic effects, particularly for transport projects, where these effects can be extensive. Claims about dynamic effects can be central to the strategic case for large infrastructure projects, but the strategic case is not generally subjected to the same levels of scrutiny as the economic case. These effects should therefore be included in the economic case wherever possible.

Where these impacts cannot be included in the economic case, government should formalise an Infrastructure and Projects Authority review of the strategic case, drawing on external academic expertise. Promoters and analysts should also include a narrative of what dynamic effects they hope to achieve, how they aim to realise these, and what they will do to increase their likelihood.

ii)       Recommendation: Departments should undertake more research into dynamic effects

Systematic evaluation can play a role in refining analysis of dynamic impacts. The 2012 National Audit Office report on the completion and sale of High Speed 1 highlighted the need for this, especially where these benefits were used to justify a project.142 Ideally, such evaluations would create a ‘reference class’, against which claims for new projects could be judged. Alongside evaluation, it would also be beneficial, drawing on the example of the Industrial Strategy Challenge Fund,143 for government to commission further research into the dynamic effects of infrastructure.

Producing realistic cost estimates

iii)      Recommendation: Ministers must be upfront about uncertainty when decisions are based on early cost estimates

In the UK and internationally, cost and time estimates for infrastructure projects are almost always over-optimistic. While there are mitigating steps that analysts can take, over-optimism is to some degree inevitable, particularly when estimates must be turned around quickly for ambitious projects with large scopes. Ministers should appreciate this in their public announcements.

This does not mean delaying announcements until every penny is accounted for. But it does mean that ministers should be more honest about cost uncertainty and potential overruns in their public announcements, particularly for big projects. Being more upfront about the range of cost estimates will create the political space for ministers to make amendments or cancellations at a later stage. It is right that governments should revisit decisions in the light of the latest developments, and they will find it easier to do so if the limitations of early cost estimates are communicated more clearly.

iv)     Recommendation: Departments must consistently evaluate infrastructure projects

Learning lessons from the past is critical. Previous projects, whether a stunning success or embarrassing failure, can provide invaluable guidance to those assessing costs and making infrastructure decisions today.

Personal experience is a powerful ally. A civil servant or minister is more likely to challenge and attempt to correct an overly positive assessment if they have been caught out by optimism bias in the past. The challenge that Whitehall faces is institutionalising this memory.

Post-project evaluation should become standard. We rightly expect that large infrastructure projects are subject to thorough economic assessments before securing approval. Yet, comparatively little effort is expended checking whether these assessments were accurate. This is not simply a matter of holding yesterday’s decision makers to account, but of ensuring that tomorrow’s decision makers do not make the same mistakes.

This is not a novel observation. In the most recent review of optimism bias in Network Rail, one of three main recommendations was that Network Rail should improve its data capture for future projects through standardised data entry, recording any applied adjustments and final costs.144 Similarly, in his 2013 review of major projects, Lord Browne recommended that the Major Projects Authority (now the Infrastructure and Projects Authority) should:

carry out or mandate a post project audit for most major projects, especially when they are novel or could bear lessons for other current projects. The findings from the audits should inform the future work of the authority and be incorporated into training for project leaders. Only the MPA can consistently ensure that post project audits take place, and it is a telling weakness of the current approach that they do not happen as a matter of course.145

As the Government’s centre of expertise for infrastructure, the Infrastructure and Projects Authority should ensure that thorough post-project evaluations are undertaken for all projects that have appeared on the Government Major Projects Portfolio. All should include an assessment of the extent to which the project achieved its objectives. All projects, of whatever size, should systematically collect data on cost outturns against estimates, delivery times against estimates, the size of project teams and project length.
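As an illustration of the kind of standardised record this implies, the sketch below defines a minimal data structure in Python; the field names are hypothetical, not an existing government schema.

    from dataclasses import dataclass

    @dataclass
    class ProjectEvaluationRecord:
        project_name: str
        estimated_cost_m: float    # cost estimate at approval, £m
        outturn_cost_m: float      # final outturn cost, £m
        estimated_months: int      # estimated delivery time
        outturn_months: int        # actual delivery time
        team_size: int             # size of the project team
        objectives_met: bool       # did the project achieve its stated objectives?

        @property
        def cost_overrun_pct(self) -> float:
            # Outturn cost relative to the original estimate, as a percentage overrun.
            return 100 * (self.outturn_cost_m / self.estimated_cost_m - 1)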

v)      Recommendation: Departments must use the data from project evaluations to improve future appraisals

Data from infrastructure project evaluations should be used to inform appraisals, especially cost estimates. Departments should use this information, collated centrally by the Infrastructure and Projects Authority, to update their optimism bias guidance and enhance reference class forecasting. The example of Highways England is instructive.

By learning from frequent, similar projects, Highways England has significantly improved the accuracy of its assessments. The second Road Investment Strategy has been costed based on predicted and actual cost outturn data for over 400 recent schemes. Indeed, aside from in the preliminary assessment of options, Highways England no longer uses Green Book guidance to estimate optimism bias as it has found its own estimates to be more accurate.

Such an approach will be less applicable to big, one-off projects. For these megaprojects, reference class forecasting from past UK projects may not be possible.

But the general principles should still apply, using examples from other countries.
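As a minimal sketch of the underlying mechanics, assuming a department holds outturn-versus-estimate data for comparable past schemes (the figures below are invented), a reference-class uplift can be read off the empirical distribution of past overruns and applied to a new estimate.

    import numpy as np

    # Percentage cost overruns (outturn vs estimate) for past comparable schemes - invented data.
    past_overruns_pct = np.array([5, 12, -3, 25, 18, 40, 8, 15, 30, 22])
    new_estimate_m = 250.0   # raw cost estimate for a new scheme, £m

    # One common treatment: uplift the estimate to a chosen percentile of past overruns.
    p80_uplift = np.percentile(past_overruns_pct, 80)
    adjusted_estimate_m = new_estimate_m * (1 + p80_uplift / 100)
    print(f"P80 uplift: {p80_uplift:.0f}% -> adjusted estimate £{adjusted_estimate_m:.0f}m")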

Consistently valuing impacts

vi)      Recommendation: Analysts across all levels of government must have a consistent and structured approach to valuing impacts

There is no right way to monetise many impacts, but there should be a consistent way.

Greater consistency across departments would help the Treasury to prioritise between projects, particularly where the choice is between completely different options, for example whether to prioritise a transport project or an energy project.

We recommend that other departments follow the Department for Transport in producing standardised guidance on valuation and presentation for their appraisals.

This should build on the expertise that already exists within government, for example Defra’s expertise on the environment. Other parts of the public sector should draw on this knowledge and data where relevant.

To ensure that impacts are valued consistently, government needs to make sure that this guidance is followed. Analysts should include all relevant impacts, based on the latest data. Where impacts are not deemed significant enough to be quantified or described in detail, the reasons for this should be explained.

vii)    Recommendation: Whitehall must have a consistent way of incorporating the valuations of impacts into the decision-making process

We recommend that the Government takes the following five-stage approach to incorporating the valuations of impacts into the decision-making process (a simple illustrative sketch follows the list).

1. Calculate a benefit–cost ratio that only includes monetary impacts, such as train fares. This would show the strictly commercial return on investment, and the payback period.

2. Calculate a benefit–cost ratio that also includes those impacts that can be straightforwardly and robustly monetised, such as journey-time savings.

3. Show the effect of including impacts where monetisation is possible, but contested – such as environmental impacts – as a sensitivity test.

4. If relevant, show the impact of including uncertain but important dynamic effects as a sensitivity test.

5. Present these sensitivity tests to decision makers alongside the two benefit–cost ratios. Both should include a narrative as well as numerical figures, explaining whether uncertain effects justify moving the project into a different value‑for‑money category. The Green Book should be clearer on how to present these effects, drawing on the similar method suggested by the Department for Transport’s Transport Appraisal and Strategic Modelling team.146
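A minimal sketch of this staged presentation, using invented figures rather than any departmental model, is set out below.

    # Illustrative present values, £bn (invented)
    costs = 1.5
    monetary_benefits = 0.9      # e.g. fare revenue
    robustly_monetised = 1.8     # e.g. journey-time savings
    contested_monetised = 0.4    # e.g. environmental impacts (sensitivity test)
    dynamic_effects = 0.6        # uncertain dynamic effects (sensitivity test)

    stage1 = monetary_benefits / costs                                   # commercial return only
    stage2 = (monetary_benefits + robustly_monetised) / costs            # core benefit-cost ratio
    stage3 = (monetary_benefits + robustly_monetised + contested_monetised) / costs
    stage4 = (monetary_benefits + robustly_monetised + contested_monetised + dynamic_effects) / costs

    print(f"Stage 1 (monetary impacts only):     {stage1:.1f}")
    print(f"Stage 2 (plus robustly monetised):   {stage2:.1f}")
    print(f"Sensitivity: plus contested impacts  {stage3:.1f}")
    print(f"Sensitivity: plus dynamic effects    {stage4:.1f}")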

viii)    Recommendation: The Treasury should make The Green Book more user friendly

Streamlining The Green Book and making it more user friendly will help to improve how consistently it is applied. We recommend that The Green Book should give more advice on the strategic case, use more case studies, provide guidance on the time required to appraise different types and sizes of projects, and include key information currently found in the supplementary guidance – for example on risk – in the main guide.

It is essential that the new guidance is understood by policymakers. Its rollout should therefore be accompanied by an extensive training programme with appropriate support provided for senior civil servants and ministers, as well as analysts.

The Government should also explore new ways to present the guidance. To this end, we are pleased to see that Exploring Economics – a network of economics professionals across the civil service – is working to create an accessible guide to The Green Book.147 We recommend that the Treasury should work closely with Exploring Economics and the wider community of practice to update The Green Book.

This should draw on good service design principles. Starting with a clear understanding of Green Book users’ needs and the problems to be overcome, the Treasury and its partners should design the guidance and its presentation so that it fits with existing ways of working, continually assessing whether it is performing as intended. For example, this might mean giving users only the information they need at any given stage, with a clear route to additional information when required.

The Green Book has been under review since early 2016 and we hope that many of the changes listed above will be reflected in the new guidance. It is important that the UK’s uncertain political environment does not delay sign-off. Otherwise, there is a risk that the prospect of an updated version undermines the credibility of the existing guidance, further reducing how consistently it is applied.

Improving transparency and communication

ix)     Recommendation: Analysts should communicate and present their work more transparently

Our research found that policymakers and ministers do not always seem to have a good grip on CBA, viewing it either as a hurdle to jump or as a tool that makes the decision for them. Some interviewees had concerns that uncertainties and assumptions were not fully understood at the senior level.

Analysts can help here, by being clear about which assumptions are critical and what might happen to the value of a project if assumptions changed. Some of this is already done: The Green Book discusses scenario analysis, and the guidance produced by the Transport Appraisal and Strategic Modelling team within the Department for Transport requires the production of a benefit–cost ratio range, not just the central estimate.148 But it could be done better. Results still tend to be presented in tables rather than charts and graphs, yet the latter can communicate uncertainties in a more visually appealing way. Departments could learn from the Bank of England and the Office for Budget Responsibility about how to communicate technical information in an understandable way.

x)      Recommendation: Ministers need to be honest about the data and assumptions underpinning CBA

Ministers must improve the way they communicate CBA to the public. This means being transparent about the assumptions that underpin CBAs, making the relevant data available to the public and presenting benefit–cost ratios as a range of possible outcomes rather than a single number. A better understanding of CBA within Whitehall should facilitate better ministerial communication with the public.

Greater transparency can strengthen both the quality of decision making and public trust in the process. Local communities affected by infrastructure projects, academics and other stakeholders can act as auditors, refining assumptions, providing additional data and identifying issues that may have been missed by officials. Making greater use of the expertise that lies outside of government departments and agencies will help to expose dubious early cost estimates in early-stage business cases,149 improving project design and selection.

Furthermore, greater honesty with the public about the uncertainty of costs, timescales and likely benefits when projects are announced would provide more political space for ministers to make amendments or cancellations at a later stage. The Government should revisit decisions in the light of new technological, social or economic developments, and it will find it easier to do so if the assumptions underpinning CBAs are communicated more clearly.

xi)      Recommendation: Training for officials and ministers should be expanded

Training for analysts has improved over time and is generally of a high quality. But it is critical that training is not limited to analysts. Senior civil servants and ministers, the people making decisions, need to understand the sensitivity of estimates and have the confidence to challenge underpinning assumptions. The establishment of the Major Projects Leadership Academy is a positive step forward for developing the skills of senior responsible officers, and we would like to see the Academy expand its knowledge transfer programmes for project leaders. The programme of activities for the Academy alumni group should also be expanded. The skills of ministers are rarely nurtured in a formal way. We recommend that ministers with infrastructure responsibility should have deeper and more regular engagement with the Academy.

Decision making is improved if officials in policy, delivery and commercial teams have a better understanding of analysts’ work. At Transport for London, these staff are invited to attend sessions on forming business plans that explain the creation of the economic case. Similar training should be made available more widely. The Infrastructure and Projects Authority is well placed to provide this, with priority given to civil servants from departments such as the Department for Transport and the Department for Business, Energy and Industrial Strategy, which undertake significant economic modelling for infrastructure projects.

The Infrastructure and Projects Authority already co-ordinates Transforming Together, a community of practice that brings together staff working on transformation projects to share their work and develop a joint approach to delivering transformation. We recommend that an equivalent is created for infrastructure, potentially in partnership with Exploring Economics. Such a network would enable analysts and others to discuss best practice, including the latest modelling tools and evidence.

xii)     Recommendation: Government must bolster the independence of analysis

Creating a clearer separation between project promoters and analysts can strengthen the integrity of analysis. We recommend that government departments and agencies not currently set up in this way should follow the example of the Department for Transport and Transport for London, which have separate teams with responsibility for reviewing appraisals. A balance must be struck between institutional separation and joint working. Good economic analysis will be an iterative process between analysts and policymakers, but it is important that CBAs are not just post-hoc justifications of decisions already made.

The Infrastructure and Projects Authority should also independently assess the CBAs conducted for major projects to ensure that they ‘deliver the option that enacts the objectives of ministers with the highest benefits to cost ratio that is possible.’150 To effectively scrutinise modelling undertaken by departments, the Infrastructure and Projects Authority should be given the resources to hire a team of economists. This ongoing additional expenditure on central assurance is more than justified by the medium- to long-term project savings that could be realised.

References

1 Mackie P and Worsley T (2013) International Comparisons of Transport Appraisal Practice:

Overview report, Institute for Transport Studies, University of Leeds, https://workspace.

imperial.ac.uk/ref/Public/UoA%2014%20-%20Civil%20and%20Construction%20

Engineering/Wider%20economic%20Impacts/K%20Worsley%20report.pdf

2 Sartori D, Catalano G, Genco M, Pancotti C, Sirtori E, Vignetti S and Del Bo C (2014)

Guide to Cost–Benefit Analysis of Investment Projects: Economic appraisal tool for

cohesion policy 2014–2020, European Commission,

http://ec.europa.eu/regional_policy/sources/docgener/studies/pdf/cba_guide.pdf

3 Coyle D (2016) ‘A land built by economists?’, The Enlightened Economist, 10 October,

retrieved 1 September 2017, www.enlightenmenteconomics.com/blog/index.

php/2016/10/a-land-built-by-economists

4 Atkins G, Wajzer C, Hogarth R, Davies N and Norris E (2017) What’s Wrong with

Infrastructure Decision Making? Conclusions from six UK case studies, Institute for

Government, www.instituteforgovernment.org.uk/sites/default/files/publications/

Infrastructure%20report%20%28final%29r.pdf

5 Gallo A (2014) ‘A refresher on net present value’, Harvard Business Review, 19 November,

retrieved 26 June 2017, https://hbr.org/2014/11/a-refresher-on-net-present-value

6 Worsley T and Mackie P (2015) Transport Policy, Appraisal and Decision-Making, RAC

Foundation, p. 2, www.racfoundation.org/assets/rac_foundation/content/

downloadables/Transport_policy_appraisal_decision_making_worsley_mackie_May_2015_final_report.pdf

7 HM Treasury (2003) The Green Book: Appraisal and evaluation in central government, The

Stationery Office, www.gov.uk/government/uploads/system/uploads/attachment_

data/file/220541/green_book_complete.pdf

8 Venables AJ, Laird J and Overman H (2014) Transport Investment and Economic

Performance: Implications for project appraisal, Department for Transport, www.gov.uk/

government/uploads/system/uploads/attachment_data/file/386126/TIEP_Report.pdf

9 Department for Transport (2016) Understanding and Valuing the Impacts of Transport

Investment: Updating Wider Economic Impacts Guidance, Department for Transport

https://www.gov.uk/government/uploads/system/uploads/attachment_data/

file/554783/transport-appraisal-guidance-webtag-consultation-document.pdf

10 Department for Transport (2013) Transport Analysis Guidance: WebTAG, GOV.UK,

updated 2017, retrieved 11 August 2017,

www.gov.uk/guidance/transport-analysis-guidance-webtag

11 Mackie P and Worsley T (2013) International Comparisons of Transport Appraisal Practice:

Overview report, Institute for Transport Studies, University of Leeds, p. 3,

https://workspace.imperial.ac.uk/ref/Public/UoA%2014%20-%20Civil%20and%20Construction%20Engineering/

Wider%20economic%20Impacts/K%20Worsley%20report.pdf

12 HM Treasury (2016) Treasury Approvals Process for Programmes and Projects,

HM Treasury, p. 24, https://www.gov.uk/government/uploads/system/uploads/

attachment_data/file/567908/Treasury_approvals_process_guidance_final.pdf

13 Atkins G, Wajzer C, Hogarth R, Davies N and Norris E (2017) What’s Wrong with

Infrastructure Decision Making? Conclusions from six UK case studies,

Institute for Government, p. 13, www.instituteforgovernment.org.uk/sites/default/

files/publications/Infrastructure%20report%20%28final%29r.pdf

14 HM Treasury (2016) Treasury Approvals Process for Programmes and Projects,

HM Treasury, https://www.gov.uk/government/uploads/system/uploads/attachment_

data/file/567908/Treasury_approvals_process_guidance_final.pdf

15 HM Treasury (2013) Public Sector Business Cases: Using the five case model, HM Treasury,

www.gov.uk/government/uploads/system/uploads/attachment_data/file/469317/

green_book_guidance_public_sector_business_cases_2015_update.pdf

16 Department for Transport (various years) DfT Value for Money Indicator, Department for

Transport, retrieved 2 August 2017, https://www.gov.uk/government/publications/

percentage-of-dft-s-appraised-project-spending-that-is-assessed-as-good-or-verygood-

value-for-money

17 Department for Transport (2016) Understanding and Valuing Impacts of Transport

Investment: Updating wider economic impacts guidance, GOV.UK, pp. 15–16,

https://www.gov.uk/government/uploads/system/uploads/attachment_data/

file/554783/transport-appraisal-guidance-webtag-consultation-document.pdf

18 Cox E and Davies B (2014) Transformational Infrastructure for the North: Why we need a

Great North Plan, IPPR North, p. 14, www.ippr.org/files/publications/pdf/

transformational-infrastructure-North_Aug2014.pdf

19 Eddington R (2006) The Eddington Transport Study: The case for action:

Sir Rod Eddington’s advice to government, HMSO, p. 9,

http://webarchive.nationalarchives.gov.uk/20081230093524/http:/www.dft.gov.uk/

about/strategy/transportstrategy/eddingtonstudy/

20 HM Treasury and Infrastructure and Projects Authority (2016) National Infrastructure

and Construction Pipeline 2016, GOV.UK, retrieved 7 August 2017, www.gov.uk/

government/publications/national-infrastructure-and-construction-pipeline-2016

21 The Industrial Strategy Commission (2017) Laying the Foundations, The Industrial

Strategy Commission, pp. 28–30, http://industrialstrategycommission.org.uk/

wp-content/uploads/2017/07/Laying-the-Foundations-the-Industrial-Strategy-Commission.pdf

22 Coyle D (2016) ‘Infrastructure cannot be built on empty purses and big promises’,

Financial Times, 16 March, retrieved 30 June 2017,

www.ft.com/content/4bb57c70-e9cd-11e5-888e-2eadd5fbc4a4

23 Department for Transport and High Speed Two (HS2) (2013) The Strategic Case for HS2,

GOV.UK, p. 31, para. 79, www.gov.uk/government/uploads/system/uploads/

attachment_data/file/260525/strategic-case.pdf

24 Byett A, Laird J, Stroombergen A and Trodd S (2015) Assessing New Approaches to

Estimating the Economic Impact of Transport Interventions Using the Gross Value Added

Approach, research report 566, NZ Transport Agency,

www.nzta.govt.nz/assets/resources/research/reports/566/docs/566.pdf

25 Laird JJ, Nash C and Mackie PJ (2014) ‘Transformational transport infrastructure: cost

benefit analysis challenges’, Town Planning Review, vol. 85, no. 6, pp. 709–30, retrieved

11 August 2017, http://eprints.whiterose.ac.uk/87446/

26 Institute for Government, Infrastructure Project interview, 2017.

27 Stewart J (2013) ‘Give the economic benefits of infrastructure a chance to breathe’,

The Guardian, 11 February, retrieved 9 August 2017, www.theguardian.com/publicleaders-

network/2013/feb/11/economic-benefits-infrastructure-greater-manchester

28 Greater Manchester Combined Authority, Association of Greater

Manchester Authorities, Greater Manchester Local Enterprise Partnership

and Transport for Greater Manchester (2014) Greater Manchester Growth

and Reform Plan: Transport Strategy and Investment Plan, retrieved 14 September 2017,

http://www.tfgm.com/ltp3/Documents/GMGRP-Transport-Strategy.pdf

29 Threlfall R (2016) Oral Evidence: Autumn Statement 2016, HC837,

House of Commons Treasury Committee, p. 32,

http://data.parliament.uk/writtenevidence/committeeevidence.svc/

evidencedocument/treasury-committee/autumn-statement-2016/oral/44537.pdf

30 Institute for Government, Infrastructure Project interview, 2017.

31 HM Treasury (2016) Autumn Statement 2016, GOV.UK, retrieved 30 June 2017,

www.gov.uk/government/publications/autumn-statement-2016-documents/

autumn-statement-2016

32 Lowe J (2014) ‘Developing better business cases using the five case model; for better

decision making and better outcomes’, APMG International webinar, 4 September,

retrieved 7 August 2017, www.slideshare.net/apmg-inter/developing-better-businescases-

using-the-five-case-model-for-better-decision-making-and-better-outcomes

33 Department for Transport (2015) Value for Money Framework: Moving Britain ahead,

Department for Transport, www.gov.uk/government/uploads/system/uploads/

attachment_data/file/630704/value-for-money-framework.pdf

34 Dobes L and Leung J (2015) ‘Wider economic impacts in transport infrastructure cost

benefit analysis – a bridge too far?’, Agenda, vol. 22, no. 1, retrieved 30 June 2017,

http://press-files.anu.edu.au/downloads/press/p332753/html/argument02.

xhtml?referer=1750&page=11#

35 Institute for Government, Infrastructure Project interview, 2017.

36 Institute for Government, Infrastructure Project interviews, 2017.

37 Dobes L and Leung J (2016) ‘Wider economic impacts in transport infrastructure cost

benefit analysis – a bridge too far?’, Agenda, vol. 22, no. 1, retrieved 30 June 2017,

http://press-files.anu.edu.au/downloads/press/p332753/html/argument02.

xhtml?referer=1750&page=11#

38 Rosewell B (2012) ‘21st century policy development’, comment, 21 March, retrieved 21

June 2017, www.bridgetrosewell.com/21st-century-policy-development

39 Institute for Government, Infrastructure Project interview, 2017.

40 Department for Transport and High Speed Two (HS2) (2013) HS2: Strategic case, GOV.UK,

retrieved 7 August 2017, www.gov.uk/government/publications/hs2-strategic-case

41 High Speed Two (HS2) Limited (2013) The Economic Case for HS2, High Speed Two (HS2)

Limited, p. 59, http://assets.hs2.org.uk/sites/default/files/inserts/S%26A%201_

Economic%20case_0.pdf

42 KPMG (2013) High Speed Two (HS2) Limited: HS2 regional economic impacts, High Speed

Two (HS2) Limited, http://assets.hs2.org.uk/sites/default/files/inserts/HS2%20

Regional%20Economic%20Impacts.pdf

43 Topham G (2013) ‘KPMG to face MPs again over HS2 report’,

The Guardian, 11 November, retrieved 30 June 2017,

www.theguardian.com/uk-news/2013/nov/11/kpmg-hs2-mps-forecast-report

44 BBC News (2013) ‘HS2 “losers” revealed as report shows potential impact’, BBC News,

19 October, retrieved 8 August 2017, www.bbc.co.uk/news/uk-24589652

45 Threlfall R (2013) Oral Evidence: Economics of HS2, HC 788, House of Commons Treasury

Committee, retrieved 7 August 2017, www.parliament.uk/business/committees/

committees-a-z/commons-select/treasury-committee/inquiries1/parliament-2010/

economics-of-hs2

46 Department for Transport (1999) Transport and the Economy: Full report (SACTRA), The

National Archives, retrieved 30 June 2017, http://webarchive.nationalarchives.gov.

uk/20090224130217/http://www.dft.gov.uk/pgr/economics/sactra/

transportandtheeconomyfullre3148

47 HM Treasury (2015) Valuing Infrastructure Spend: Supplementary guidance to The Green

Book, GOV.UK, p. 33, www.gov.uk/government/uploads/system/uploads/attachment_

data/file/417822/PU1798_Valuing_Infrastructure_Spend_-_lastest_draft.pdf

48 Department for Transport (2005) Transport, Wider Economic Benefits, and Impacts on

GDP, The National Archives, retrieved 7 September 2017, http://webarchive.

nationalarchives.gov.uk/+/http:/www.dft.gov.uk/pgr/economics/rdg/webia/

webmethodology/sportwidereconomicbenefi3137.pdf

49 Department for Transport (2016) Understanding and Valuing Impacts of Transport

Investment: Updating wider economic impacts guidance, GOV.UK, retrieved 30 June 2017,

www.gov.uk/government/consultations/transport-investment-understanding-andvaluing-impacts

50 Department for Transport (2016) Value for Money: Supplementary guidance on

categories, Department for Transport, www.gov.uk/government/uploads/system/

uploads/attachment_data/file/627490/value-for-money-supplementary-guidanceon-

categories.pdf

51 Venables AJ, Laird J and Overman H (2014) Transport Investment and Economic

Performance: Implications for project appraisal, Department for Transport, p. 4,

https://www.gov.uk/government/uploads/system/uploads/attachment_data/

file/386126/TIEP_Report.pdf

52 HM Treasury, Infrastructure and Projects Authority and Jones A (2017) ‘Billion pound

connectivity boost to make buffering a thing of the past’, news story, GOV.UK, 3 July,

retrieved 7 August 2017, www.gov.uk/government/news/billion-pound-connectivityboost-

to-make-buffering-a-thing-of-the-past

53 Flyvbjerg B (2014) ‘What you should know about megaprojects, and why: an overview’,

Project Management Journal, vol. 45, no. 2, p. 9,

https://arxiv.org/ftp/arxiv/papers/1409/1409.0003.pdf

54 Ibid., p. 12.

55 Terrill M (2016) Cost Overruns in Transport Infrastructure, Grattan Institute, p. 3,

https://grattan.edu.au/report/cost-overruns-in-transport-infrastructure

56 Flyvbjerg B, Garbuio M and Lovallo D (2009) ‘Delusion and deception in large

infrastructure projects: two models for explaining and preventing executive disaster’,

California Management Review, vol. 51, no. 2, pp. 170–93.

57 National Audit Office (2013), Over-Optimism in Government Projects, National Audit

Office, p. 9, https://www.nao.org.uk/wp-content/uploads/2013/12/10320-001-Overoptimism-

in-government-projects.pdf

58 Kahneman D (2011) Thinking, Fast and Slow, Allen Lane, pp. 119–29.

59 Institute for Government, Infrastructure Project interviews, 2017.

60 Brown R, Ashley R and Farrelly M (2011) ‘Political and professional agency entrapment:

an agenda for urban water research’, Water Resources Management, vol. 25, no. 15,

pp. 4037–50.

61 Terrill M (2016) Cost Overruns in Transport Infrastructure, Grattan Institute, p. 25,

https://grattan.edu.au/report/cost-overruns-in-transport-infrastructure

62 House of Commons (2017) Rail Electrification, Parliament UK, retrieved 8 August 2017,

http://researchbriefings.parliament.uk/ResearchBriefing/Summary/SN05907

63 Ibid., pp. 33–5.

64 National Audit Office (2016) Department for Transport and Network Rail: Modernising the Great Western Railway, HC 781, The Stationery Office, p. 21.

65 Department for Transport (2017) Railways Act 2005 Statement: High level output

statement, Department for Transport, p. 5, www.gov.uk/government/uploads/system/

uploads/attachment_data/file/630674/high-level-output-specification-web.pdf

66 BBC News (2017) ‘Crossrail 2: support by government “outrageous” after northern

snub’, BBC News, 25 July, retrieved 8 August 2017,

www.bbc.co.uk/news/uk-england-40708531

67 Welsh Government (2017) ‘Ken Skates demands £700m from UK Government for

scrapping electrification across South Wales’, news release, 21 July 2017, retrieved 1

September 2017, http://gov.wales/newsroom/transport/2017/170721infra/?lang=en

68 Burnham A (2017) Letter to Chris Grayling, 22 July, retrieved 1 September 2017,

https://drive.google.com/file/d/0BxfytYfkWADUVjE4eDVIZS1FR1U/view

69 Mott MacDonald (2002) Review of Large Public Procurement in the UK, Mott MacDonald.

70 HM Treasury (2013) Green Book Supplementary Guidance: Optimism bias, HM Treasury,

retrieved 1 September 2017, www.gov.uk/government/publications/green-booksupplementary-

guidance-optimism-bias

71 Atkins G, Wajzer C, Hogarth R, Davies N and Norris E (2017) What’s Wrong with

Infrastructure Decision Making? Conclusions from six UK case studies, Institute for

Government, p. 13, www.instituteforgovernment.org.uk/sites/default/files/

publications/Infrastructure%20report%20%28final%29r.pdf

72 Flyvbjerg B (2013) ‘Quality control and due diligence in project management: getting

decisions right by taking the outside view’, International Journal of Project Management,

vol. 31, no. 5, pp. 760–74.

73 Transport Appraisal and Strategic Modelling Team (2017) TAG Unit A1.2: Scheme costs,

Department for Transport, p. 15, www.gov.uk/government/uploads/system/uploads/

attachment_data/file/625380/TAG_unit_a1.2_cost_estimation_jul17.pdf

74 Transport Appraisal and Strategic Modelling Team (2017) WebTAG: Forthcoming changes,

Department for Transport, retrieved 4 September 2017,

www.gov.uk/government/publications/webtag-forthcoming-changes

75 De Reyck B, Grushka-Cockayne Y, Fragkos I, Harrison J and Read D (2017) Optimism Bias

Study: Recommended adjustments to optimism bias uplifts, Department for Transport,

https://www.gov.uk/government/uploads/system/uploads/attachment_data/

file/576976/dft-optimism-bias-study.pdf

76 Department for Environment, Food and Rural Affairs (2003) Flood and Coastal Defence

Project Appraisal Guidance, Department for Environment, Food and Rural Affairs,

www.gov.uk/government/uploads/system/uploads/attachment_data/file/181443/ fcd3update0303.pdf

77 Highways England (2015) Post Opening Project Evaluation (POPE) of Major Schemes:

Executive summary, Highways England, p. 6, www.gov.uk/government/uploads/system/

uploads/attachment_data/file/497240/POPE___Meta_2015___summary___final.pdf

78 Highways England (2015) Post Opening Project Evaluation (POPE) of Major Schemes:

Main report: Meta-analysis 2015, Highways England, p. 3,

https://www.gov.uk/government/uploads/system/uploads/attachment_data/

file/497241/POPE___Meta_2015_Final_210116_-_FINAL.pdf

79 Institute for Government, Infrastructure Project interviews, 2017.

80 Highways England (2015) Post Opening Project Evaluation (POPE) of Major Schemes:

Main report: Meta-analysis 2015, Highways England,

https://www.gov.uk/government/uploads/system/uploads/attachment_data/

file/497241/POPE___Meta_2015_Final_210116_-_FINAL.pdf

81 Topham G and Elgot J (2017) ‘Heathrow night flights to continue until third runway is

built’, The Guardian, 13 July, retrieved 1 September 2017, www.theguardian.com/

uk-news/2017/jul/13/heathrow-night-flights-to-continue-until-third-runway-is-built;

Wharf H (2013) ‘Reaction to the Government’s HS2 project’, The Independent,

3 February, retrieved 7 August 2017, www.independent.co.uk/news/uk/home-news/

reaction-to-the-governments-hs2-project-8478614.html

82 Baker R and Ruting B (2014) Environmental Policy Analysis: A guide to non‑market

valuation, staff working paper, Productivity Commission, pp. 24–30,

www.pc.gov.au/research/supporting/non-market-valuation/non-market-valuation.pdf

83 HM Treasury (2011) The Green Book: Appraisal and evaluation in central government,

The Stationery Office, retrieved 30 June 2017, www.gov.uk/government/publications/

the-green-book-appraisal-and-evaluation-in-central-governent

84 Ibid., p. 23, Box 10.

85 Ibid., pp. 23, 57.

86 Ibid., p. 58.

87 Ibid., p. 57.

88 Ibid., p. 58.

89 Ibid., pp. 64–5.

90 Ibid., p. 35.

91 Annema JA, Mouter N and Razaei J (2015) ‘Cost–benefit analysis (CBA), or multi-criteria

decision-making (MCDM) or both: politicians’ perspective in transport policy appraisal’,

Transportation Research Procedia, vol. 10, pp. 788–97, p. 790, retrieved 11 August 2017,

www.sciencedirect.com/science/article/pii/S2352146515002197

92 Department for Communities and Local Government (2009) Multi-Criteria Analysis: A

manual, Department for Communities and Local Government, p. 5, retrieved 11

September 2017, https://www.gov.uk/government/uploads/system/uploads/

attachment_data/file/7612/1132618.pdf

93 Australian Bureau of Agricultural and Resource Economics and Sciences (2017)

‘Multi-criteria analysis (MCAS-S)’, Department of Agriculture and Water Resources,

Australian Government, retrieved 7 August 2017,

www.agriculture.gov.au/abares/aclump/multi-criteria-analysis

94 NZ Transport Agency (2017) ‘Multi criteria analysis for transport business cases

guidance’, New Zealand Government, retrieved 7 August 2017, http://www.nzta.govt.

nz/resources/17-02-new-multi-criteria-analysis-for-transport-business-casesguidelines-for-consultation/

95 Gühnemann A, Laird JJ and Pearman AD (2012) ‘Combining cost–benefit and multicriteria

analysis to prioritise a national road infrastructure programme’,

Transport Policy, vol. 23, pp. 15–24, retrieved 7 August 2017,

www.sciencedirect.com/science/article/pii/S0967070X12000753

96 Institute for Government, Infrastructure Project interviews, 2017.

97 Durrant D (2013) ‘Evaluating grand infrastructure projects like HS2 using cost benefit

analysis may actually make the case harder to make’, blog,

LSE British Politics and Policy, 12 September 2013, retrieved 30 June 2017,

http://blogs.lse.ac.uk/politicsandpolicy/36311/

98 Nellthorp J and Mackie P (2000) ‘The UK roads review—a hedonic model of decision

making’, Transport Policy, vol. 7, no. 2, pp. 127–38, retrieved 7 August 2017,

www.sciencedirect.com/science/article/pii/S0967070X00000020

99 HM Treasury (2003) The Green Book: Appraisal and evaluation in central government,

The Stationery Office, p. 57, www.gov.uk/government/uploads/

system/uploads/attachment_data/file/220541/green_book_complete.pdf

100 National Audit Office (2011) Options Appraisal: Making informed decisions in government, National Audit Office, pp. 15–17,

www.nao.org.uk/wp-content/uploads/2011/05/1012_Option_Appraisal.pdf

101 Flyvbjerg B, Skamris Holm MK and Buhl SL (2004) ‘What causes cost overrun in

transport infrastructure projects?’, Transport Reviews, vol. 24, no. 1, pp. 3–18, p. 4,

https://arxiv.org/ftp/arxiv/papers/1304/1304.4476.pdf

102 Department for Transport (2015) The Official Highway Code, The Stationery Office.

103 Institute for Government, Infrastructure Project interview, 2017.

104 For example: Dunn H (2012) Accounting for Environmental Impacts: Supplementary Green Book guidance, HM Treasury and Defra, www.gov.uk/government/uploads/system/uploads/attachment_data/file/191500/Accounting_for_enviornomental_impacts.pdf

105 Department for Environment, Food and Rural Affairs (2013) Assessing Environmental Impact: Guidance, GOV.UK, retrieved 7 August 2017, www.gov.uk/guidance/assessing-environmental-impact-guidance

106 Department for Transport (2014) TAG Unit A2.1: Wider impacts, GOV.UK, retrieved 7 August 2017, www.gov.uk/government/uploads/system/uploads/attachment_data/file/427091/webtag-tag-unit-a2-1-wider-impacts.pdf#nameddest=chptr03

107 Department for Transport (2013) WebTAG: Appraisal tables, GOV.UK, retrieved 7 August 2017, www.gov.uk/government/publications/webtag-appraisal-tables

108 GOV.UK (no date) Natural Capital Committee (NCC), GOV.UK, retrieved 7 August 2017, www.gov.uk/government/groups/natural-capital-committee

109 Office for National Statistics (2017) Methodology: Natural capital, Office for National Statistics, retrieved 30 June 2017, www.ons.gov.uk/economy/nationalaccounts/uksectoraccounts/methodologies/naturalcapital

110 Office for National Statistics (2016) UK Natural Capital: Monetary estimates, 2016, Office for National Statistics, retrieved 30 June 2017, www.ons.gov.uk/economy/environmentalaccounts/bulletins/uknaturalcapital/monetaryestimates2016

111 Curnow J (2017) ‘How do you put a value on “natural capital”?’, blog, Civil Service Quarterly, 2 August 2017, retrieved 7 August 2017, https://quarterly.blog.gov.uk/2017/08/02/how-do-you-put-a-value-on-natural-capital

112 Department for Transport (2017) WebTAG: TAG data book, July 2017, GOV.UK, retrieved 7 August 2017, www.gov.uk/government/publications/webtag-tag-data-book-july-2017

113 Department for Transport (2013) WebTAG: Economic impacts worksheets, GOV.UK, retrieved 30 June 2017, www.gov.uk/government/publications/webtag-economic-impacts-worksheets

114 Institute for Government, Infrastructure Project interviews, 2017.

115 Ibid.

116 Institute for Government, Infrastructure Project interview, 2017.

117 Ibid.

118 Atkins G, Wajzer C, Hogarth R, Davies N and Norris E (2017) What’s Wrong with Infrastructure Decision Making? Conclusions from six UK case studies, Institute for Government, www.instituteforgovernment.org.uk/sites/default/files/publications/Infrastructure%20report%20%28final%29r.pdf

119 HM Treasury (2003) The Green Book: Appraisal and evaluation in central government, The Stationery Office, p. 6, www.gov.uk/government/uploads/system/uploads/attachment_data/file/220541/green_book_complete.pdf

120 Department for Transport (2013) ‘WebTAG: appraisal tables’, Department for Transport, retrieved 10 August 2017, www.gov.uk/government/publications/webtag-appraisal-tables

121 Department for Transport (2013) The Strategic Case for HS2, Department for Transport, p. 109, www.gov.uk/government/uploads/system/uploads/attachment_data/file/260525/strategic-case.pdf

122 Department for Transport (2017) High Speed Two Phase Two: Strategic case, Department for Transport, p. 47, www.gov.uk/government/uploads/system/uploads/attachment_data/file/629393/high-speed-two-phase-two-strategic-case.pdf

123 Highways England (2016) Lower Thames Crossing: Summary business case, Highways England, p. 26, https://highwaysengland.citizenspace.com/cip/lower-thames-crossing-consultation/user_uploads/lower-thames-crossing-consultation-summary-business-case.pdf

124 Department for Business, Energy and Industrial Strategy (2013) 2050 Pathways, GOV.UK, retrieved 10 August 2017, www.gov.uk/guidance/2050-pathways-analysis

125 Greenway A (2016) ‘Policy without the hot air’, comment, 30 May, retrieved 10 August 2017, https://medium.com/@ad_greenway/policy-without-the-hot-air-8bb064dcdd62

126 Office for Budget Responsibility (2012) Briefing Paper No. 4: How we present uncertainty, Office for Budget Responsibility, p. 20, http://budgetresponsibility.org.uk/docs/dlm_uploads/Briefing-paper-No4-How-we-present-uncertainty.pdf

127 Highways England (2016) Lower Thames Crossing: Summary business case, Highways England, p. 26, https://highwaysengland.citizenspace.com/cip/lower-thames-crossing-consultation/user_uploads/lower-thames-crossing-consultation-summary-business-case.pdf

128 Mullen CA and Marsden G (2015) ‘Transport, economic competitiveness and competition: a city perspective’, Journal of Transport Geography, vol. 49, pp. 1–8, http://eprints.whiterose.ac.uk/90360/2/Transport_and_comp_text_Aug%2015_final%20%281%29.pdf

129 Annema JA, Mouter N and Rezaei J (2015) ‘Cost–benefit analysis (CBA), or multi-criteria decision-making (MCDM) or both: politicians’ perspective in transport policy appraisal’, Transportation Research Procedia, vol. 10, pp. 788–97, retrieved 10 August 2017, www.sciencedirect.com/science/article/pii/S2352146515002197

130 HM Treasury (2015) Public Sector Business Cases Using the Five Case Model: Green Book supplementary guidance on delivering public value from spending proposals, HM Treasury, www.gov.uk/government/uploads/system/uploads/attachment_data/file/469317/green_book_guidance_public_sector_business_cases_2015_update.pdf

131 Lord Snape, cited in Lords Hansard (2013) ‘High Speed Rail (Preparation) Bill: second reading (and remaining stages)’, retrieved 4 September 2017, www.publications.parliament.uk/pa/ld201314/ldhansrd/text/131119-0003.htm

132 Perry C (2015), cited in House of Commons Debate, 16 December 2015, c1665, retrieved 4 September 2017, www.theyworkforyou.com/debates/?id=2015-12-16c.1653.0&s=Jubilee+Line+Extension#g1665.0

133 Regulatory Policy Committee (2014) ‘Regulatory Policy Committee: recommendations used when scrutinising impact assessments’, GOV.UK, retrieved 27 June 2016, www.gov.uk/government/publications/how-the-regulatory-policy-committee-scrutinises-impact-assessments/regulatory-policy-committee-recommendations-used-when-scrutinising-impact-assessments

134 Sense About Science, Institute for Government and Alliance for Useful Evidence (2016) Transparency of Evidence: An assessment of government policy proposals May 2015 to May 2016, Sense About Science, http://senseaboutscience.org/wp-content/uploads/2016/11/SaS-Transparency-of-Evidence-2016-Nov.pdf

135 Institute for Government, Infrastructure Project interviews, 2017.

136 HM Treasury (2015) Public Sector Business Cases Using the Five Case Model: Green Book supplementary guidance on delivering public value from spending proposals, HM Treasury, www.gov.uk/government/uploads/system/uploads/attachment_data/file/469317/green_book_guidance_public_sector_business_cases_2015_update.pdf

137 Cabinet Office and Saïd Business School (no date) Major Projects Leadership Academy: MPLA handbook, Cabinet Office and Saïd Business School, www.gov.uk/government/uploads/system/uploads/attachment_data/file/405600/MPLA_Handbook_for_gov_uk.pdf

138 Institute for Government, Infrastructure Project interview, 2017.

139 Vickerman RW (2008) ‘Cost–benefit analysis and the wider economic benefits from mega-projects’, in Priemus H, Flyvbjerg B and van Wee B eds., Decision-Making on Mega-Projects, Edward Elgar, p. 74.

140 Department for Transport (2017) ‘Transport analysis guidance: WebTAG’, GOV.UK, retrieved 23 August 2017, www.gov.uk/guidance/transport-analysis-guidance-webtag

141 Department for Transport (2016) ‘Understanding and valuing impacts of transport investment: updating wider economic impacts guidance’, Department for Transport, retrieved 23 August 2017, www.gov.uk/government/consultations/transport-investment-understanding-and-valuing-impacts

142 National Audit Office (2012) The Completion and Sale of High Speed 1, HC 1834, The Stationery Office.

143 Innovate UK and Department for Business, Energy and Industrial Strategy (2017) ‘Industrial Strategy Challenge Fund: joint research and innovation’, Innovate UK and Department for Business, Energy and Industrial Strategy, https://www.gov.uk/government/collections/industrial-strategy-challenge-fund-joint-research-and-innovation

144 Read D (2017) Optimism Bias Study: Recommended adjustments to optimism bias uplifts, Department for Transport, https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/576976/dft-optimism-bias-study.pdf

145 Lord Browne of Madingley (2013) Getting a Grip: How to improve major project execution and control in government, www.gov.uk/government/uploads/system/uploads/attachment_data/file/205209/Getting_a_grip_Lord_Browne_major_project_review_Mar-2013.pdf

146 Department for Transport (2017) Value for Money Framework: Moving Britain ahead, Department for Transport, retrieved 23 August 2017, www.gov.uk/government/publications/dft-value-for-money-framework

147 Bearpark T, Heron A and Glover B (2017) ‘Economics in government: more open, more diverse, more influential’, blog, Civil Service Quarterly, 8 August, retrieved 23 August 2017, https://quarterly.blog.gov.uk/2017/08/08/economics-in-government-more-open-more-diverse-more-influential

148 Department for Transport (2017) Value for Money Framework: Moving Britain ahead, Department for Transport, retrieved 23 August 2017, www.gov.uk/government/publications/dft-value-for-money-framework

149 Atkins G (2017) ‘Garden Bridge raises questions about infrastructure decisions’, blog, Institute for Government, 13 February, retrieved 16 June 2017, www.instituteforgovernment.org.uk/blog/garden-bridge-raises-questions-about-infrastructure-decisions

150 Lord Browne of Madingley (2013) Getting a Grip: How to improve major project execution and control in government, p. 8, https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/205209/Getting_a_grip_Lord_Browne_major_project_review_Mar-2013.pdf

About the authors

Graham Atkins joined the Institute for Government as a Researcher in November 2016. Prior to this, he worked as a Researcher at DragonGate, a public sector consultancy, on a range of projects related to local government and the broader public sector; and at a research agency, Populus, on politics and corporate reputation.

Nick Davies joined the Institute as a Research Manager in March 2017. He currently leads the Institute’s work on infrastructure. Nick was previously Public Services Manager at the National Council for Voluntary Organisations. He has also worked at Children England, London Youth and as a parliamentary researcher.

Tess Kidney Bishop is a Researcher at the Institute, working on the infrastructure policymaking project. Prior to joining the Institute, Tess completed an internship in public diplomacy at the Australian High Commission in London.

Acknowledgements

The authors would like to thank colleagues Emma Norris, Raphael Hogarth, Bronwen Maddox, Jill Rutter and Nicole Valentinuzzi at the Institute for Government for invaluable advice throughout this research. Thanks to the communications team at the Institute for Government for their support. We are also grateful to those who read early drafts of this report, particularly Professor Denise Bower, Professor Henry Overman and Rachel Wolf. We would like to thank all those who generously agreed to talk to us throughout our research. Finally, we are very grateful to the Project Management Institute for supporting this work. All errors and omissions are, of course, our responsibility alone.

The Institute for Government is the leading think tank working to make government more effective.

We provide rigorous research and analysis, topical commentary and public events to explore the key challenges facing government.

We offer a space for discussion and fresh thinking, to help senior politicians and civil servants think differently and bring about change.

© Institute for Government 2017

The Institute for Government is a registered charity in England and Wales (No.1123926) with cross-party governance. Our main funder is the Gatsby Charitable Foundation, one of the Sainsbury Family Charitable Trusts.

instituteforgovernment.org.uk

enquiries@instituteforgovernment.org.uk

+44 (0) 20 7747 0400

+44 (0) 20 7766 0700

@instituteforgov

Institute for Government, 2 Carlton Gardens

London SW1Y 5AA, United Kingdom