DOI: 10.5553/EJLR/138723702017019102006

European Journal of Law Reform

Article

Why Better Regulation Demands Better Scrutiny of Results

The European Parliament’s Use of Performance Audits by the European Court of Auditors in ex post Impact Assessment

Keywords EU budget, European Parliamentary Research Service, policy evaluation, scrutiny, oversight
Suggested citation
Paul Stephenson, 'Why Better Regulation Demands Better Scrutiny of Results', (2017) European Journal of Law Reform 97-120

Abstract
    Ex post impact assessment (traditionally considered part of policy evaluation) received less attention in the preceding ‘Better Regulation’ package (2011) than ex ante impact assessment. Yet, the insights generated through ex post impact assessment provide crucial input for streamlining legislation. In recognition of its contribution, the current agenda (2015) extends the reach to policy evaluation, and from financial instruments to regulatory instruments. In light of existing experience with impact assessments in Commission Directorates-General (DGs), the European Union (EU) institutions have been increasingly aware of the need to develop staff expertise in ex post (policy) evaluation, which has in the past been largely outsourced to external parties. Making sense of collected input and incorporating it within impact assessment is time-consuming. Indeed, taking up the findings for practical use is a challenge for political decision makers but essential for the purposes of accountability, scrutiny and institutional learning. The challenge is all the greater given the wealth of information being generated by multiple parties and the increasing technical and financial complexity of certain policy areas. The role of the Commission as an advocate of ‘Better Regulation’ has been studied extensively. However, we know relatively little about the role of the European Parliament (EP) in ex post evaluation. This article contributes to the literature on ‘Better Regulation in the EU’ by shedding light on EP activities in the realm of scrutiny and evaluation. In particular, it looks at the Parliament’s use of special reports produced by the European Court of Auditors (ECA) through its performance audit work and how it takes on board the findings and recommendations in its scrutiny of budgetary spending. Moreover, it examines the emerging role of the European Parliamentary Research Service (EPRS) in monitoring the outputs of the ECA and other bodies engaged in audit and evaluation, and thereby, the way in which the EPRS is helping to increase the Parliament’s capacity for scrutiny and oversight.

    • A Introduction

      Ex post impact assessment (traditionally considered part of policy evaluation) received less attention in the preceding ‘Better Regulation’ package (2011) than ex ante impact assessment.1xEuropean Parliament Report of 18 April 2011 on guaranteeing independent impact assessments (2010/2016(INI)), Committee on Legal Affairs, ‘Rapporteur: Angelika Niebler’, available at: <www.europarl.europa.eu/sides/getDoc.do?type=REPORT&reference=A7-2011-0159&language=EN>. Yet, the insights generated through ex post impact assessment provide crucial input for streamlining legislation. In recognition of its contribution, the current agenda2xEuropean Commission, ‘Better Regulation Agenda: Enhancing Transparency and Scrutiny for Better Law-Making’, 19 May 2015, available at: <http://europa.eu/rapid/press-release_MEMO-15-4989_en.htm>. (2015) extends the reach to policy evaluation, and from financial instruments to regulatory instruments.
      In light of existing experience with impact assessments in Commission Directorates-General (DGs), the European Union (EU) institutions have been increasingly aware of the need to develop staff expertise in ex post (policy) evaluation, which has in the past been largely outsourced to external parties.3xM. Eliantonio & A. Spendzharova, ‘Introduction’, European Journal of Law Reform, in this Special Issue, 2017. Making sense of collected input and incorporating it within impact assessment is time-consuming. Indeed, taking up the findings for practical use is a challenge for political decision makers but essential for the purposes of accountability, scrutiny and institutional learning. The challenge is all the greater given the wealth of information being generated by multiple parties and the increasing technical and financial complexity of certain policy areas.
      The role of the Commission as an advocate of ‘Better Regulation’ has been studied extensively.4x See multiple authors in Special Issue on the Better Regulation Package: ‘How Much Better is Better Regulation?’, Vol. 6, No. 3, 2015. However, we know relatively little about the role of the European Parliament (EP) in ex post evaluation. This article contributes to the literature on ‘Better Regulation in the EU’ by shedding light on EP activities in the realm of scrutiny and evaluation. Its role in impact assessment work has been increasing, as reflected in the non-legislative inter-institutional agreement of 13 April 2016,5xEuropean Parliament, Council of the European Union and European Commission, ‘Inter-institutional Agreement on Better Law-Making’, 13 April 2016, OJ L 123/1, available at: <http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016Q0512(01)&from=en>. which states that “the three Institutions consider that public and stakeholder consultation, ex-post evaluation of existing legislation and impact assessments of new initiatives will help achieve the objective of Better Law-Making”.6x Ibid. The agreement includes a specific chapter on ‘ex post evaluations of existing legislation’, as one of the available better lawmaking tools, here understood as Commission or Commission-sponsored evaluations.7xIn Part III, ‘Tools for Better Law-Making’, sub-section ‘Ex Post Evaluations of Existing Legislation’, Arts. 20-24 state as follows: “20. The three Institutions confirm the importance of the greatest possible consistency and coherence in organising their work to evaluate the performance of Union legislation, including related public and stakeholder consultations. 21. The Commission will inform the European Parliament and the Council of its multiannual planning of evaluations of existing legislation and will, to the extent possible, include in that planning their requests for in-depth evaluation of specific policy areas or legal acts. The Commission’s evaluation planning will respect the timing for reports and reviews set out in Union legislation. 22. In the context of the legislative cycle, evaluations of existing legislation and policy, based on efficiency, effectiveness, relevance, coherence and value added, should provide the basis for impact assessments of options for further action. To support these processes, the three Institutions agree to, as appropriate, establish reporting, monitoring and evaluation requirements in legislation, while avoiding overregulation and administrative burdens, in particular on Member States. Where appropriate, such requirements can include measurable indicators as a basis on which to collect evidence of the effects of legislation on the ground. 23. The three Institutions agree to systematically consider the use of review clauses in legislation and to take account of the time needed for implementation and for gathering evidence on results and impacts. The three Institutions will consider whether to limit the application of certain legislation to a fixed period of time (‘sunset clause’). 24. The three Institutions shall inform each other in good time before adopting or revising their guidelines concerning their tools for Better Law-Making (public and stakeholder consultations, impact assessments and ex-post evaluations).”
      Scholars have recently been conducting their own meta-evaluations of ex post legislative evaluations, looking – perhaps wishfully – to the closing of the regulatory cycle.8xE. Mastenbroek, S. van Voorst & A. Meuwese, ‘Closing the Regulatory Cycle? A Meta Evaluation of Ex-post Legislative Evaluations by the European Commission’, Journal of European Public Policy, Vol. 23, No. 9, 2016, pp. 1329-1348. At the same time, in the policymaking arena, the 2016 inter-institutional agreement recognized the whole policy cycle, from agenda setting to evaluation; in theory at least, effective and timely evaluation should feed into agenda resetting and policy reformulation. Whilst the agreement acknowledges the role that ex ante impact assessment work conducted during the early stages of the legislative process can play at the evaluation stage, it also explicitly recognizes – finally, one might argue – the need for thorough ex post impact assessment, which is much broader than implementation, and pertains to the evaluation of policies and programmes that have resulted from legislative activity.
      Traditionally, the EP has had at its disposal its own impact assessment tools, independent of those of the Commission – the so-called ‘implementation reports’ – effectively own-initiative reports, each of which “allows parliamentary committees to scrutinise how EU legislation, soft law and international agreements have been transposed into national law, implemented and enforced”9xI. Anglmayer, ‘Evaluation and Ex-post Impact Assessment at EU level’, European Parliament Research Service (Ex-Post Impact Assessment Unit), September 2016, available at: <www.europarl.europa.eu/RegData/etudes/BRIE/2016/581415/EPRS_BRI(2016)581415_EN.pdf>. – but arguably these have proven to be insufficient to provide for full oversight and scrutiny. Rapporteurs of implementation reports can not only request factual information from national parliaments but can also obtain expertise from the EP’s own research facilities, in particular, the policy departments and the DG for European Parliamentary Research Services (EPRS), which provide background analysis drafted in-house based on desk research and surveys. The EP’s implementation assessments are tailored to the needs of Members of the European Parliament (MEPs). In short, at an institutional level, the EPRS is beginning to play an important role in learning: processing the information, seeing what is relevant, distilling the main findings and ‘feeding’ them to the committees or MEPs, where appropriate. It is arguably contributing to more informed and transparent lawmaking, in line with the goals of the Better Regulation agenda.
      More precisely, the EPRS now offers increased support to committees in their scrutiny work, by monitoring and assessing a wealth of studies, reviews, reports and audits conducted by other EU institutions and policy stakeholders. It maintains rolling checklists of a range of ‘scrutiny tools’, including the ongoing and planned evaluations of the European Commission and the special reports of the European Court of Auditors (ECA), which are the focus of this article. Yet, the EP has not always been so active in impact assessment activity (ex ante and ex post evaluations). In fact, it is only in the last 5 years that the EP has expanded its resources and expertise in this field. Whilst initially its focus from 2011 was on ex ante evaluation, since November 2013 it has broadened its remit to ex post evaluation, thus covering the whole legislative/policy cycle. Of course, one might question why the EP needs to be engaged in evaluation activity itself or whether it can simply rely on the evaluations produced by the Commission. Arguably, since it is the Parliament’s role to scrutinize how the executive has implemented the EU budget, it must also scrutinize the evaluations conducted and commissioned by the executive. Part of this process may be to seek alternative evaluations in order to test or triangulate the findings. This may mean turning to the work of the ECA or third parties, or even initiating its own evaluations.
      Much research to date has focused on ex ante financial scrutiny by the EP, and on ex ante impact assessment within the framework of the Regulatory Fitness and Performance Programme (REFIT) agenda.10xEliantonio & Spendzharova, in this Special Issue, 2017. By contrast, this article focuses specifically on ex post evaluation (ex post impact assessment) and the role that the audit findings of the ECA have played over time in the scrutiny work of the EP. It looks at the rise of performance audits and use of special reports as a ‘scrutiny tool’, examining the role of the newly established EPRS in both promoting the reports and complementing them through its own work. The article then explores the inter-institutional challenges of scrutiny over time, and considers, more fundamentally, the extent to which evaluations of policy performance can help the co-legislator contribute to delivering better regulation, together with the other EU institutions and the member states. After all, the Better Regulation agenda is cross-cutting and requires all institutions to cooperate. ECA performance audits and evaluations are an important input into EP oversight, alongside other stakeholders’ evaluations and reports within a larger ‘toolbox’ of informational resources.

    • B The Challenge of Financial and Budgetary Oversight for Parliaments

      First, EU legislators and policymakers are subject to fast-paced changes in the external environment and must keep up with transformations in social media. The legislature is under pressure to react more quickly, to deliver value judgements and opinions and to provide sound, evidence-based insights into policy performance. Second, since the global financial crisis and the turn to budgetary austerity, the EU has committed itself to a ‘Europe of Results’,11xCommunication from the Commission, ‘A Citizen’s Agenda – Delivering Results from Europe’, 10 May 2006, COM(2006) 211 final. , 12xCommunication from the Commission, ‘A Europe of Results – Applying Community Law’, 5 September 2007, COM(2007) 502 final. to do more with less and demonstrate the added value of EU policymaking. Again, this has placed pressure on the EU to put in place mechanisms that will help account for policy achievements. Third, although there is a clear political agenda for better and more streamlined legislation, there has arguably been a recognition of the limited capacity of the EU to engage in evaluation activity retrospectively, since traditionally ex post evaluation has been outsourced to external bodies, largely private or mixed consortia of academics and consultants. As a result, the EU institutions have never dedicated many resources to evaluation activities – even within the Commission’s Directorate-General for Regional Policy (DG REGIO), the unit for evaluating cohesion policy amounted to a mere handful of individuals.13xC. Mendez & J. Bachtler, ‘Administrative Reform and Unintended Consequences: An Assessment of the EU Cohesion Policy “Audit Explosion”’, Journal of European Public Policy, Vol. 18, No. 5, 2011, pp. 746-765. The Commission’s own framework for evaluation has developed over several decades, as Højlund argues, evolving from decentralized and sector-based (1980-1994) to centralized and driven by accountability concerns (1995-1999), and thereafter driven by institutional reform and the call for evidence-based policymaking (2000-2006), to the present-day trend of undertaking policy evaluation alongside regulatory evaluation in an era of fiscal constraint (2007-2014).14xS. Højlund, ‘Evaluation in the European Commission – For Accountability or Learning?’, European Journal of Risk Regulation, Vol. 6, No. 1, 2015, pp. 35-46. Fourth, the EP has never had much capacity for ex post programme evaluation and scrutiny, relying largely on the Budgetary Control Committee (CONT). This has also meant dependence over time on the commitment of individual MEPs and rapporteurs, input from the library and the ad hoc work of research staff within the parliamentary DGs and committee secretariats. These points are explored in the following sections.

      I The Capacity for Oversight and Function of Parliamentary Committees

      Parliamentary oversight has been conceptualized as the “legislative supervision and monitoring of the decisions and actions of executive agents”.15xN. Font & I.P. Durán, ‘The European Parliament Oversight of EU Agencies Through Written Questions’, Journal of European Public Policy, Vol. 23, No. 9, 2016, pp. 1349-1366, at 1350. , 16xM.D. McCubbins, R. Noll & B. Weingast, ‘Administrative Procedures as Instruments of Political Control’, Journal of Law, Economics and Organization, No. 3, 1987, pp. 242-279. To date, much of the research has been on ex ante scrutiny prior to the adoption of legislation, and the link between the involvement of national parliamentary committees ex ante and eventual compliance with EU legislation.17xD. Finke & T. Dannwolf, ‘Who Let the Dogs Out? The Effect of Parliamentary Scrutiny on Compliance with EU Law’, Journal of European Public Policy, Vol. 22, No. 8, 2015, pp. 1127-1147, at 1128. The question of EP oversight of EU executive actors has been addressed from parliamentary and agency perspectives, with recent studies refuting the notion of the EP as a unitary actor.18xFont & Durán, 2016, pp. 1351-1352. Indeed, parliaments specialize through committees, which “manage the workload of the institution, build coalitions and change legislation on a regular basis”.19xT. Winzen, ‘Technical or Political? An Exploration of the Work of Officials in the Committees of the European Parliament’, The Journal of Legislative Studies, Vol. 17, No. 1, 2011, pp. 27-44, at 27.
      By contrast, there are few studies on ex post scrutiny and the impact of parliamentary scrutiny of policy spending (as opposed to legislation itself) on future legislation and policy programming. The ability of parliamentarians to probe and ask questions of the executive pertaining to budgetary implementation ultimately rests on the capacity of the parliament to provide sufficient insight into how money was spent. The media often pays a disproportionate amount of attention to oral questions, even if the nature and consequences of questioning in parliament may remain obscure.20xS. Martin, ‘Parliamentary Questions, the Behaviour of Legislators, and the Function of Legislatures: An Introduction’, The Journal of Legislative Studies, Vol. 17, No. 3, 2011, pp. 259-270, at 259. EP committees are supported by permanent secretariats, whose officials have “long held a prominent role in supporting MEPs in drafting parliamentary reports”,21xWinzen, 2011, p. 28; C. Neuhold, ‘The “Legislative Backbone” Keeping the Institution Upright? The Role of European Parliament Committees in the EU Policy-Making Process’, European Integration Papers Online, Vol. 5, No. 10, 2001. particularly committee chairs, though this role has declined over time. Nonetheless, scholars have asked if they are fundamentally engaged in technical work to assist ‘the smooth functioning of the policy process’ or if their role might have a political dimension.22xWinzen, 2011, p. 28. Officials in the EP contribute to the ‘spanning of ideological and sectoral cleavages’, becoming more professionalized and helping to make the EP ‘less dependent upon the expertise and administrative capacity of the executive’.23xM. Egeberg et al., ‘Parliament Staff: Unpacking the Behavior of Officials in the European Parliament’, Journal of European Public Policy, Vol. 20, No. 4, 2013, p. 495.
      If we consider parliamentary committees to have informational resources at their disposal, how might we judge their effectiveness as forums for scrutiny? Most studies have tended to focus on the influence of particular reports, rather than determining the proportion of reports that are influential.24xD. Monk, ‘A Framework for Evaluating the Performance of Committees in Westminster Parliaments’, Vol. 16, No. 1, 2010, p. 2. There are no clear criteria against which committees are measured. In fact, some have argued that the notion of ‘effectiveness’, as used in a governmental context, is not even relevant to parliamentary committees.25x Ibid., p. 5. As Monk asserts, “audit offices often assess the effectiveness of these policies and programmes and usually manage to do so in a robust manner.”26x Ibid. By contrast, parliamentarians can ‘engage in oversight activity, advertise their concern for constituents, or seek advancement through the astute management of important issues’, and we should not forget that they have political aims.27x Ibid. The effectiveness of parliamentary committees may be ‘largely in the eye of the beholder’.28xP. Thomas, ‘Effectiveness of Parliamentary Committees’, Parliamentary Government, No. 44, 1993, pp. 10-11.
      In the case of the EP, scholars have tended to attribute much success to the energies of the committee chair and rapporteur, and in turn, looked to the level of interest from other stakeholders as well as the amount of media attention placed on reports. The interest often comes down to issue saliency, its human dimension, or the ease with which complexity can be reduced to an easily digested narrative. As far as budgetary scrutiny and oversight are concerned, money is easy to communicate, and paradoxically, the story may have greater impact with politicians and citizens where the scrutiny process shines a light on cases of ineffective spending, mismanagement and fraud. Arguably, the construction of the public sphere in the EU has been built on five decades of journalism around fraudulent activity; this is largely how the citizen knows of the EU; this has even shaped what he or she considers the EU to be. Indeed, perceptions of budgetary misspending have jeopardized the output legitimacy of the EU in terms of what it has produced.29xV. Schmidt, ‘Democracy and Legitimacy in the European Union Revisited: Input, Output and “Throughput”’, Political Studies, Vol. 61, 2013, pp. 2-22. However, arriving at any notion of what the EU delivers itself depends on throughput legitimacy, i.e. on ensuring the effective governance processes and practices of EU institutions and their administrative machinery. Throughput is “based on the interactions – institutional and constructive – of all actors engaged in EU governance”.30xSchmidt, 2013, p. 5. The quality of interaction and deliberation both matter. For our purposes then, throughput legitimacy relies on the effective scrutiny and oversight of the audits and evaluations of budgetary spending, by the EP, and in principle, also by national parliaments. In the EP, this ex post function has traditionally been carried out by the CONT committee, which takes a retrospective perspective, while select (also known as specialized, spending or sectoral) committees have tended to largely overlook questions regarding ex post evaluation.

      II Parliamentary Performance of ex post Budgetary Scrutiny

      Legislative scrutiny of public spending is a vital mechanism for holding the executive to account. A recent report by the Association of Chartered Certified Accountants (ACCA) has stressed that legislatures must improve their performance if scrutiny is to keep up with budget and accounting reforms, as well as with other financial developments.31xACCA (Association of Chartered Certified Accountants), ‘Parliamentary Financial Scrutiny in Hard Times’, 2011, available at: <www.accaglobal.com/content/dam/acca/global/PDF-technical/public-sector/tech-tp-pfs.pdf>. In the wake of the financial crisis, parliamentary scrutiny processes have been slow to evolve. The ACCA recognizes select committees as the ‘engine room’ of parliaments and an important forum for holding governments to account.32x Ibid., p. 1. Nonetheless, it recognizes the greater need for the training and professional development of parliamentarians to ‘promote a culture of financial awareness and to empower politicians to ask more searching questions on financial matters’, which requires high-quality accounting information and ‘effective independent audit’.33x Ibid. Systematic evaluation and monitoring are essential, but depend on good input and on a parliament’s own resources.
      Generally, national parliaments have become more active in budgetary matters, and more willing to engage with the budget process. This does not guarantee, however, that a more formalized system of scrutiny “automatically translate[s] into a more meaningful role for parliaments in effecting the budget”.34x Ibid., p. 5. Financial and budgetary scrutiny has long been considered “less compelling than the scrutiny of government policy and high-profile scandals”. In the UK in particular there has been a lack of interest in scrutinising government expenditure.35x Ibid., pp. 5-6. Technical support is a crucial factor, as is the allocation of sufficient time for debate, and access to appropriate resources and information, including high-quality analysis of policy performance by auditors, evaluators and third parties. Budget processes and policy programmes are complex, presenting a barrier to engagement by parliamentarians. The task of budget and budgetary control committees (also sometimes termed public accounts committees) is made more difficult by the fact that they are required to have oversight of a range of policy areas, not just one as most committees do; as such, they must cultivate expertise among committee members but also draw on the insight of external experts.
      As part of the ex post scrutiny role, budgetary control committees, therefore, look to the audit findings of supreme audit institutions (SAIs), or national audit offices/courts (the notion of a ‘court’ stemming from the Franco-Mediterranean model where audit was traditionally conducted by qualified lawyers, as opposed to accountants and financial controllers in the Anglo-Saxon or Nordic tradition). Audit findings help ‘provide public confidence and certainty in the systems of governance and public spending’, but the findings themselves depend on reliable and trusted systems of audit.36x Ibid., p. 20. SAIs are traditionally “the most effective legislative vehicle for scrutiny”,37x Ibid., p. 20. providing assurance of financial and regulatory compliance, but also, in recent years, paying greater attention to value-for-money examinations of transactions, otherwise known as performance audit. Performance audits of policies, programmes and other initiatives draw conclusions on performance and make recommendations to executive bodies. Their conduct is normally guided by the use of international standards within the audit profession. As such, the throughput legitimacy of ex post parliamentary scrutiny effectively depends on the throughput legitimacy of audit practice; likewise, there is a limit to the potential impact of audit reports if they are not taken up effectively by the legislature; audit only aids democratic accountability, it does not deliver it. Moreover, the potential for audit findings to be used in such a way as to lead to better policy programming, and even better regulation, depends on their quality, timeliness, relevance and overall ‘usability’ for parliamentarians.
      Performance audit in particular can help scrutinizers in parliament, by drawing attention to, and making value judgements on, policy outcomes, in terms of the three ‘E’s: economy, efficiency and effectiveness. The SAIs are thereafter dependent on the follow-up of their reports by parliament.38xA. Kanis, ‘Ex-Post Budgetary Oversight in Europe’, European Court of Auditors Journal, No. 6, 2011, pp. 15-17. Nevertheless, the role of SAIs and their relationship with their national parliament differ, with the literature on ex post budgetary oversight distinguishing between three models of SAI – Westminster, Board (Collegiate) and Court (Napoleonic) – though in practice the models, in their pure form, no longer strictly apply.39x Ibid., p. 16. In all EU parliaments, there is a committee responsible for reviewing and following up on audit reports. In most countries, in recent years there has been a greater focus placed on performance audit.

    • C The Historical Link between the Auditor and Legislature in the EU

      Making value judgements on policy outcomes, in terms of the three ‘E’s (economy, efficiency and effectiveness), is an important part of policy evaluation in pursuit of the goals of ‘Better Regulation’. Still, in order to reach meaningful conclusions on this matter, the EP needs to reach out to other institutional actors with professional expertise in this field, especially the ECA.

      I The External Audit Function: The Budget and the ECA

      The budgetary process of the Community was transformed with the treaty reforms of 1970 and 1975 (Brussels Treaties). Financial autonomy for the Community was closely linked to the notion of democratic budgetary control. The EP would need to establish new structures and procedures, including a ‘discharge procedure’ and a new committee, the Budgetary Control Committee (CONT). An independent body was required to perform the external control function over Community spending. The ECA was thus established in October 1977, following the EP’s acceptance of responsibility to subject the taxpayer’s money to control by the taxpayer’s representatives.
      The EP foresaw that the introduction of a system of own resources would increase its own powers. It is no surprise, therefore, that the Council initially objected to the creation of an independent audit body as this would disturb the balance of power amongst the three large Community institutions (the pre-existing Audit Board of the European Communities, operating from 1959, was effectively a non-independent, part-time body working under the auspices of the Council).40xP. Stephenson, ‘Starting from Scratch? Analysing Early Institutionalization Processes: The Case of Audit Governance’, Journal of European Public Policy, Vol. 23, No. 10, 2016, pp. 1481-1501. The ECA was eventually established at the instigation of the EP, and more particularly, thanks to pressure from the Bavarian President of the Budgetary Control Committee, Mr. Heinrich Aigner, whose 1973 report made a convincing case for the creation of a European audit office.41xEuropean Parliament, ‘The Case for a European Audit Office: Introduction by Heinrich Aigner (Vice-Chairman of the Committee on Budgets)’, Secretariat Directorate-General for Research and Documentation, September 1973.
      Traditionally, the EP has supported the ECA’s work and valued the contribution it makes in helping the legislature to hold the European Commission and the member states to account in the implementation of the EU budget, i.e. to help it deliver ‘accountability’. It might be tempting, therefore, to consider the ECA as an agent of the EP,42xB. Laffan, ‘Becoming a “Living Institution”: The Evolution of the European Court of Auditors’, Journal of Common Market Studies, Vol. 37, No. 2, 1999, pp. 251-268. since it is legally obliged to present its audit findings to the Parliament.43xArts. 285-287 TFEU state inter alia that ‘The ECA provides the European Parliament and the Council with a statement of assurance as to the reliability of the accounts and the legality and regularity of the underlying transactions which is published in the Official Journal of the European Union. This statement may be supplemented by specific assessments for each major area of Union activity’ and that ‘The European Court of Auditors also assists the European Parliament and the Council in exercising their powers of control over the implementation of the budget’. However, the ECA has, since its inception, closely guarded its independence, in the execution of its work, the establishment of work plans and the selection of audit topics, as well as in its internal organization. Its location in Luxembourg no doubt helped secure this independence but also encouraged a reputation as a distant, opaque and secretive body.
      From its early days, the ECA began to establish its own norms of ‘sound financial management’, drawing from the legal traditions and audit cultures of the member states, with an attempt to reconcile the focus on financial audit (do the sums add up?) and compliance audit (has money been spent according to the rules?) with a focus on performance audit (was the money spent effectively?).44xP. Stephenson, ‘Sixty Five Years of Auditing Europe’, Journal of Contemporary European Research, Vol. 12, No. 1, 2016. Generally speaking, it sought to combine the more legalistic approaches to audit within Franco/Latin cultures with the Anglo-Saxon approach from accountancy and public administration. The ECA carved out its ‘no surprises’ approach to audit that respected norms of transparency and good governance in relations between auditor and auditee.
      The ECA soon made its mark. In 1983, the ECA produced a general report on the sound financial management of Community spending.45xEuropean Court of Auditors, ‘Stuttgart Report’, 24 October 1983, OJ C 287. Directors of the audit groups inside the ECA were asked to check the three main areas of expenditure: Agricultural Guarantee Fund (CAP), Structural Funds and Development Aid. The ECA reports highlighted significant political and administrative shortcomings in the conduct of Community policies, which subsequently caused a chill in its relations with the Commission throughout much of the 1980s. Relations improved, however, and with the Maastricht Treaty, the status of the ECA was raised to an official institution from 1 November 1993, conferring on it new powers, and making its seat in Luxembourg permanent. It also introduced the notion of ‘statement of assurance’ (déclaration d’assurance or ‘DAS’), requiring the ECA to calculate individual error rates per area of spending, something that is not in fact done by national audit offices at the member state level.46xStephenson, 2016, supra note 44. , 47xArt. 285 TFEU asserts “The Court of Auditors shall examine the accounts of all revenue and expenditure of the Community. It shall also examine the accounts of all revenue and expenditure of all bodies set up by the Community insofar as the relevant constituent instrument does not preclude such examination.”

      II EP Scrutiny of the ECA Annual Report

      The ECA has arguably had an image problem and is often overlooked by decision makers. If its work is referred to in the media, it is usually once a year, in autumn, when the president of the ECA presents the findings of ECA audits, not only on the EU accounts, i.e. on the Commission’s implementation of the EU budget, but also on the institutional expenditure of all EU institutions and agencies. On this occasion, the ECA provides a ‘statement of assurance’ (DAS), essentially a certificate confirming that its report – traditionally based on a financial and compliance audit approach to EU budgetary spending across all areas – provides an accurate representation of spending in the previous year. Importantly, it provides ‘error rates’ per policy area, which, contrary to common understanding, do not represent or suggest the incidence of fraudulent spending, but indicate the likelihood of error in the accuracy of accounts. Errors that are identified in the accounts of various EU policy areas may have occurred within the Commission, but most likely at ‘street level’ within the member states. The errors may be administrative, accidental, unintentional, perhaps owing to a misunderstanding of accounting procedures, eligibility rules, tendering procedures, etc.
      The delivery of the ECA annual report on the EU budget to the CONT committee has become a core part of the ritual of the EP giving ‘discharge’ to the European Commission for the implementation of the EU budget. This process has traditionally focused on the scrutiny of quantitative data: statistics, percentages and numbers. This raises the question, first, of the usefulness of such information for the legislator in trying to draw up better regulation and effective policy. How does such knowledge empower the parliament? What can it do with such data? Second, since the causes of error rates are predictable and largely the same year on year, what is the value of dedicating so many resources within the ECA to financial and compliance audit every year? Although an annual statement of assurance might provide some symbol of accountability to the media and general public, it is questionable whether it leads to any broader learning amongst the financial beneficiaries or provides lessons for them to improve their financial management systems with the aim of lowering future error rates when implementing policies, programmes and projects paid for by the EU budget. Third, the annual report was for many years long, dense and complex, leading the EP to request more concise reports that provide audit insights that are easier to digest and more suited to the needs of the select committees and MEPs. This has led to a more condensed annual report, with much greater emphasis on presentation and clear communication. The ECA has also introduced a chapter on performance audit into the report to provide general conclusions as to the three ‘E’s of budgetary spending: effectiveness, economy and efficiency. This new dimension to the annual report has been introduced alongside the publication of an increasing number of special reports; indeed, the EP has called for a greater emphasis on performance audit, to better equip MEPs in their oversight and monitoring of EU budgetary spending.48xEuropean Parliament, ‘Resolution of 4 February 2014 on the future role of the Court of Auditors (2012/2064 (INI))’, available at: <www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+TA+P7-TA-2014-0060+0+DOC+XML+V0//EN>. In the section ‘The Court’s new dimension and challenges’, para. 9, the European Parliament: “Acknowledges the historic, constructive role of the DAS exercise in focusing on legality and regularity as useful indices of good financial practices and management performance at all levels of Union spending and in showing the way that EU funds have been used in accordance with the decisions of Parliament, acting as legislator and budget authority; underlines, however, that at this point, and in the future, the Court should devote more resources to the examination of whether economy, effectiveness and efficiency have been achieved in the use of the public funds entrusted to the Commission; the results of the findings obtained in Special Reports should imply corresponding adjustments in EU programmes.”

      III The Rise of Performance Audit and Use of Special Reports

      Performance audit is less concerned with whether the sums add up than with whether money was spent effectively.49xP. Stephenson, ‘Reconciling Audit and Evaluation? The Shift to Performance and Effectiveness at the European Court of Auditors’, European Journal of Risk Regulation, Vol. 6, No. 1, 2015, pp. 79-89. It focuses on the additionality of EU policy and what has ultimately been delivered to the taxpayer; in this sense, it comes closer to the general notion of policy evaluation, given the focus on results, impact and effectiveness. As Mendez and Bachtler state,50xC. Mendez & J. Bachtler, ‘Administrative Reform and Unintended Consequences: An Assessment of the EU Cohesion Policy “Audit Explosion”’, Journal of European Public Policy, Vol. 18, No. 5, 2011, pp. 746-765. performance audit has developed since the mid-1980s. It resembles evaluation in probing the efficiency and effectiveness of public programmes but is undertaken in a manner resembling auditing.51xM. Barzelay, ‘Central Audit Institutions and Performance Auditing: A Comparative Analysis of Organizational Strategies in the OECD’, Governance, Vol. 10, No. 3, 1997, pp. 235-260. Performance audits usually include evaluative elements of selected subjects and consider evaluation systems and information with a view to assessing their quality and, when they are considered to be satisfactory and relevant, use evaluation information as audit evidence. Less dense than the annual reports, the special reports provide general conclusions and can make for ‘arresting reading’.52xD. O’Keeffe, ‘The Court of Auditors’, in D.M. Curtin & T. Heukels (Eds.), Institutional Dynamics of European Integration: Essays in Honour of Henry G. Schermers, Vol. II, 1994, pp. 177-194.
      Special reports are not new. Since 1977, the ECA has produced “a myriad of special reports on policy programmes or financial procedures”.53xG. Karakatsanis & B. Laffan, ‘Financial Control: The Court of Auditors and OLAF’, in J. Peterson & M. Shackleton (Eds.), The Institutions of the European Union, Oxford, Oxford University Press, 2012, pp. 242-261. In its first 20 years, the ECA published 102 special reports and studies (1977-1996), followed by 112 special reports in the following 7 years alone (1997-2004) and 71 in the following 5 years (2005-2010). The ECA claims that special reports “provide a means to focus on specific topics reflecting a high-level of risk and public interest, in particular performance issues”.54xEuropean Court of Auditors, Gaps, Overlaps and Challenges: A Landscape Review of EU Accountability and Public Audit Arrangements, 2014, p. 7, available at: <www.eca.europa.eu/Lists/ECADocuments/LR14_01/QJ0214776ENC.pdf>. A House of Lords report (2001) found special reports to be of a “generally greater value than the Annual Reports”, whilst recognizing variations in quality.55xKarakatsanis & Laffan, p. 249. Reports examined the effectiveness of internal programme expenditure (e.g. European Regional Development Fund (ERDF) assistance, energy programmes, fisheries), external expenditure (e.g. development aid, PHARE (Poland and Hungary: Assistance for Restructuring their Economies) and TACIS (Technical Aid to the Commonwealth of Independent States), nuclear safety in the Central and Eastern European Countries (CEECs)), customs union/revenue (e.g. risk analysis in customs control, protection of EC financial interests, assessment of value-added tax (VAT) and gross national product (GNP)) and EU institutions (allowances of MEPs, the added value of EU agencies).56xB. Laffan, ‘Auditing and Accountability in the European Union’, Journal of European Public Policy, Vol. 10, No. 5, p. 772.
      In the 2012 report on the future role of the ECA, conducted by international peer reviewers, the Parliament’s Budgetary Control Committee held that the Court is in a pre-eminent position to provide the legislator and the Budgetary Authority, especially Parliament’s Budgetary Control Committee, with valuable opinions on results achieved by the Union’s policies, in order to improve the performance and effectiveness of Union-financed activities, identify economies of scale and scope, as well as spillover effects amongst national policies of member states, and provide Parliament with external assessments of the Commission’s own evaluations. Support for further concentrating on special reports persists amongst many in the ECA management. As Klaus-Heiner Lehne, the recently appointed president of the ECA (at the time a German member of the ECA and former MEP in the Committee on Legal Affairs), asserted in 2014,58xInterview with Klaus-Heiner Lehne in the European Court of Auditors Journal, May 2014. Available at: <www.eca.europa.eu/en/Pages/Journal.aspx>.

      …the Court of Auditor’s work must clearly shift towards special reports […] the results that it obtains in the form of error rates will have very limited added value from a political perspective. Politicians need to know specifically what has gone wrong and where; the Court of Auditors does not provide enough information on this. On the political side, be it at the Commission, the Parliament or the Council, it is extremely difficult to do anything sensible just with error rates. On the other hand, it is very helpful that special reports deal with substantive issues, draw substantive conclusions and put forward solutions.

      That the ECA increasingly makes value judgements in its special reports, and thereafter, policy recommendations (largely aimed at the Commission), creates a tension, in so far as the ECA is expressing how policy fared. The ECA has traditionally been the (supposedly) apolitical agent of the EP, itself a political and directly elected body. Moreover, that the EP has sought external assessments of the Commission’s own evaluations implies not only that performance audit is being carried out to ‘evaluate evaluations’ but that the EP is endorsing the ‘meta-auditing’ of the work of the executives responsible for implementation both at supranational and national levels.

    • D Scrutiny and ex post Budgetary Oversight in the European Parliament

      I MEPs and the Budgetary Control Committee

      Traditionally, scrutiny within the EP has been left to the Budgetary Control Committee (CONT), established in the late 1970s originally as a subcommittee of the Budgets Committee. Since its inception, it has been the one committee that adopts a retrospective perspective not only directly on spending but also indirectly on policy, in so far as its mandate is to ‘control’, i.e. monitor, oversee and scrutinize, past budgetary spending. Some might consider CONT weaker than the Budgets Committee (BUDG), and less powerful than the select committees, since it has no legislative powers. CONT is the odd one out since its remit vis-à-vis the select (spending) committees is strikingly different: whilst it seeks to hold the European Commission (and implicitly the member states) to account, via the reports of the ECA, with regard to the use of EU funds, the motive of the select committees is to draw greater attention to, and secure financial commitments for, their particular policy area.
      Arguably, there is a tension within the institutional machinery of the EP. First, as co-legislator, it decides upon new laws, which provide the legal basis for policies/programmes to be financed; then a few years down the line, it engages in scrutinizing the effectiveness of these very policies/programmes, drawing on the findings of audits on budgetary spending and ex post evaluations. The EP holds the Commission to account for implementing legislation and policies, but does it – and can it – hold itself to account? Is it able to conclude objectively whether its laws were effective policy instruments? There is arguably a role conflict given that the EP acts both as a (political, biased) decision maker and as a (supposedly objective, unbiased) overseer of policy performance. MEPs both legislate (before spending) and scrutinize (after spending), and make judgements on the value of the results and impacts of the policies that emerged from legislation. Yet, MEPs are not objective evaluators; they are biased, politically driven actors who can use ex post evaluation to advance their own political agendas and interests, often for short-term goals, including re-election.
      Second, the large majority of the EP committee machinery, with its forward-looking perspective, is engaged in tasks relevant to the agenda-setting, policy formulation and decision-making stages of the policy process, whereas CONT has traditionally been the sole committee explicitly working to examine implemented policy and engage in some sort of ex post evaluation. In short, CONT’s motives and rationale are markedly different from those of the select committees. In light of this set-up, should we be concerned about the EP’s capacity to scrutinize? What do we know about how it goes about its scrutiny work? And how does it access and process information?
      Like other committees, CONT works on the basis of a committee chair and rapporteurs. It is made up of MEPs with policy expertise or who have expressed an interest in committee membership. The strength of the committee in recent years has arguably been dependent on the personality of the chair and on the commitment of its rapporteurs to gather information on behalf of its members, and invite external experts and speakers to its meetings, including public hearings. The current committee chair, Mrs. Ingeborg Grässle, a German MEP from the European People’s Party (EPP) political group, has established a reputation as a fierce and persistent chair, not least in her relations with the ECA.
      A recent report by CONT on the future role of the ECA – the Sender report of 2014 – noted that the ECA mandate provides for significant flexibility to allow the Court to carry out its mission beyond the scope of the DAS. It recalled that the results of its performance audits in special reports ‘provide a significant opportunity to add value by focusing on and investigating high-risk areas’. In addition, it asserted that special reports “provide information to European citizens on the functioning of the Union and the use of European funds in many sectors, helping to bring Europe closer to its citizens and to make it more transparent and easier to understand”.59xEuropean Parliament, 2014, see section ‘The Court’s New Dimensions and Challenges’, note 10. , 60xThe 2014 report followed the 2012 public hearing of the Committee on Budgetary Control. The proceedings were titled ‘Future role of the European Court of Auditors: Challenges ahead and possible reform’. Rapporteur: Inés Ayala Sender, 30 May 2012.
      The motive of MEPs is, first and foremost, political. As elected representatives of their constituents, they belong to domestic political parties and transnational political groupings within the EP, whose aim is to secure more seats. As such, oversight and scrutiny by MEPs can never be a technocratic exercise. Even if objective, impartial evaluations, reviews, reports and audits are made available to the CONT committee, the findings, including conclusions and recommendations, will ultimately be used to secure political objectives in the pursuit of obtaining and maintaining political power. In short, MEPs do not have an altruistic incentive to hold the European Commission (and member states) to account, based on a fundamental commitment to good norms (accountability, transparency and openness). Nonetheless, the promotion of, and adherence to, these norms serves the individual and collective interests of the EP, regardless of the benefits of scrutiny for the taxpayer. One might even posit that working to ensure the legitimacy of Community spending itself reinforces the legitimacy of the institution.

      II CONT and Audit Findings: A Scrutiny Gatekeeper?

      If we consider the institutional machinery for scrutiny in the EP, then there has been a clear case of path dependence – and lock-in – since the establishment of CONT in the late 1970s. As such, CONT has remained the single formalized venue for scrutiny, and held the de facto monopoly on oversight within the committee system. This is not to suggest that the select committees do not read and discuss audits and reports on spending in their policy domain; nonetheless, their formal role has not been to monitor and gauge the results of spending. As such, the ECA and its representatives have, over the years, reported directly to CONT. The ECA president has traditionally presented the annual report to CONT, with each special report presented by a member of the ECA from one of the four (since 2017, five) audit chambers. As a result, the ECA has arguably had a narrow audience for its work, which might to some extent explain the perception by many stakeholders, including MEPs, of the limited relevance and impact of its work.
      In recent years, and soon after the appointment of Mr. Vítor Caldeira as president in 2007 (he left the ECA in January 2017), the ECA established a new communications unit to professionalize the quality of its reports, with greater consideration given to key stakeholders and how they might seek to use the reports. The ECA realized the need for a nuanced communications strategy and for taking an approach that is differentiated according to end user. The ECA also opened itself to external peer review by other SAIs in 200861xEuropean Court of Auditors, International Peer Review of the European Court of Auditors, 2008, available at: <www.eca.europa.eu/Lists/ECADocuments/PEERREVIEW2008/PEERREVIEW2008_EN.PDF>. and 2014,62xBundesrechnungshof, International Peer Review of the European Court of Auditors, 2014, available at: <www.eca.europa.eu/Lists/ECADocuments/2013_PEER_REVIEW/2013_PEER_REVIEW_EN.pdf>. which concluded that the ECA had to be more responsive to stakeholder needs, particularly the EP as its main ‘client’. The reviews advocated taking on higher-risk subjects for audit, securing efficiency gains lost after enlargement in 2004 and delivering more timely reports, including informing the EP of the ECA’s calendar and work schedule for future special reports so that it could anticipate the audits more effectively. Moreover, in the past, the EP has expressed frustrations about the quantity of reports and the committee’s ability to process the findings; hence the need for more concise and better-written audit reports.
      More controversial, perhaps, has been the pressure exerted by CONT on the ECA to launch audits into areas of most interest to the committee – which implicitly means areas of most political interest. The ECA continues to make clear its independence as an official EU institution. In so doing, it must strike a balance between appearing receptive and responsive to EP interests and maintaining control over its priorities and internal decision-making over audit. In some cases, however, we see evidence of the EP and ECA maintaining close and mutually reinforcing relationships, such as with the recent European Fund for Strategic Investments (EFSI, commonly known as the ‘Juncker Plan’), where the two institutions worked together to ensure some (albeit limited) mandate for the ECA as the EU external auditor for a new financial mechanism only partially funded by the EU budget, but nonetheless to the value of several billion euros. Given concerns over the implications for budgetary discharge and lack of clarity over the ECA’s own role in monitoring EFSI implementation, the ECA adopted a quick opinion63xEuropean Court of Auditors, Opinion No 4/2015 concerning the proposal for a Regulation of the European Parliament and of the Council on the European Fund for Strategic Investments and amending Regulations (EU) No 1291/2013 and (EU) No 1316/2013, 12 March 2015, available at: <www.eca.europa.eu/Lists/ECADocuments/OP15_04/OP15_04_EN.pdf>. and was able to influence decision-making by the Council and EP.64xProposal for a Regulation of the European Parliament and of the Council on the European Fund for Strategic Investments and amending Regulations (EU) No 1291/2013 and (EU) No 1316/2013, 31 January 2015, COM(2015) 10 final. A formal ECA mandate at least ensures a degree of oversight and scrutiny by the EP.
      A key component of the ECA communications strategy has been the appointment of a Member for Inter-institutional Relations, charged with consolidating relations with key stakeholders: the EP, Commission and Council. As well as working to establish closer personal relations with EU institutions and the media, the ECA member (together with a former journalist hired as a spokesperson) has also sought to build direct relations with the EP select committees to secure a direct audience for its reports (alongside CONT). In the past, ECA officials presented their reports to CONT, and members of other committees were invited, but on an ad hoc basis. By presenting a special report directly to the select committee – for example, a special report on the effectiveness of the EU budget in paying for rail freight infrastructure directly to the transport committee – it seeks to encourage the greater take-up of its work amongst parliamentarians. Such a strategy should empower the select committees and provide greater information of relevance to policymaking. If the EP as a whole has more insight into the success and failure of past policies (and the legislation from which they emanated), then arguably it should make more informed decisions about future policy, and perhaps even design more effective legislation, recognizing that legislation is but one policy instrument available to policymakers (not legislating but instead using voluntary mechanisms might be a more effective future tool).

    • E Can Performance Audit Findings Help Achieve ‘Better Regulation’?

      I The Use of ECA Special Reports by the Commission

      According to international audit standards, the follow-up of audit reports is the final stage in the performance audit cycle of planning, execution and follow-up. Follow-up is an essential element in the cycle of accountability. Since legislation provides the basis for policies and programmes, the performance audit of these programmes can yield several kinds of insight. Most directly, such audits should demonstrate whether or not the EU budget was well spent and, by implication, suggest whether the legislation that enabled the policy was itself effective. After all, EU budgetary spending is only eligible within the legal framework of the particular programme, i.e. auditors must acknowledge the regularity framework, not least for compliance audit, but also when seeking to assess value for money. In addition, the EP should be able to gain insights into the effectiveness, efficiency and economy of the spending programmes under review and, by implication, into the suitability of the underlying legislation. Beyond the EP, ECA performance audits should provide lessons for the Commission to improve the ways in which it monitors policy during implementation, so that new rules and procedures might be put in place to reduce the incidence of error.
      The ECA’s own analysis of the impact of its special reports has tended to focus on how they are received by the European Commission rather than by the EP. Indeed, the ECA’s fourth comprehensive review, in 2016, sought to find out whether the Commission had “taken the necessary actions to adequately manage and follow up the ECA’s recommendations”.65xEuropean Court of Auditors, ‘Special Report No 2/2016: 2014 report on the follow up of the European Court of Auditors’, 26 February 2016, available at: <www.eca.europa.eu/en/Pages/DocItem.aspx?did=35401>. First, had the Commission adequately addressed the recommendations? Second, was its follow-up system now robust? The ECA examined 44 recommendations from eight reports published in the period 2009-2012. Performance audits focus explicitly on the achievement of policy objectives and on the efficiency and effectiveness (including of management) of particular spending measures. The ECA found that the Commission had fully implemented 60% of its audit recommendations, with a further 29% implemented in most respects and just 3% not implemented at all. Particular improvements had been made with regard to the design, information, compliance and user-friendliness of the follow-up IT application. However, it found that the system ‘did not fully ensure a sufficient audit trail of actions taken, nor provide for an internal assessment of the adequacy of these actions, and the monitoring of partially-implemented recommendations’. In short, this kind of meta-audit suggests that the recommendations in ECA reports can help ensure better policy/programme implementation by the executive. One presumes that the lessons learned with regard to policy at the later stages of the policy cycle will trickle down to policy reformulation and the future drafting of legislation in the Commission’s legal service, though this process itself engages multiple institutions beyond the Commission.

      II The Use of ECA Special Reports by the EP

      As discussed earlier, the EP has traditionally engaged with the special reports of the ECA within the setting of the CONT committee, but in recent years the ECA has been pushing to secure ‘mainstream’ access to the select committees. Direct physical access will certainly help to build relationships between auditors and MEPs, encouraging greater interest in the ECA’s work. If the select committees are better informed about policy outcomes, then they should – in principle – be able to give more input into future legislative proposals.
      To assist in digesting and processing the audit reports, MEPs can now turn not only to the research services within the Parliament’s other DGs but also to the in-house research service, the EPRS. The EPRS provides added value by studying special reports in depth to extract the essential information, and then by placing the audit in its legal context through an overview of the committee working documents, draft reports and EP resolutions that emerged both before and after the publication of the ECA special report. This provides a huge amount of policy-relevant information for MEPs and committees, who can thus understand where the ECA findings come from and look for evidence that criticisms and shortcomings have been addressed by the legislature. In short, this close monitoring of deliverables by groupings inside the EP, as well as by external bodies, provides valuable insight for parliamentary scrutiny.
      The EPRS compiles and regularly issues comprehensive “rolling checklists” of what it refers to as “important, but otherwise largely inaccessible, material relating to various aspects of the EU lawmaking and policy cycles”.66xEuropean Parliament, ‘EPRS Scrutiny Toolbox’, European Parliamentary Research Service Blog, 2017, available at: <https://epthinktank.eu/scrutiny-toolbox/>. Cumulatively, these seven checklists already amount to about a thousand pages of often interactive text. Besides monitoring the implementation of Country Specific Recommendations (CSRs) within Economic and Monetary Union (EMU) and of the provisions of the Fiscal Compact Treaty, the EPRS monitors review clauses in EU legislation and EU international agreements, and the follow-up to European Council conclusions. As far as policy evaluation is specifically concerned, it monitors ex post evaluation exercises in the European Commission and the ECA special reports, of which around 15 are published each year. Though the special reports are easily retrievable from the ECA website, MEPs may arguably be less inclined to read them if they must seek them out themselves.
      The rolling checklist of ECA special reports published by the EPRS in March 201667x Ibid. provides essential insight into, and the ‘recent findings’ of, 24 special reports published between December 2014 and March 2016. The reports are sorted into 12 categories: (1) EU development aid / humanitarian aid / foreign affairs / European Development Fund / Enlargement; (2) International trade; (3) Economic and Monetary Union; (4) EU Budget / Budgetary Control / Discharge Procedure; (5) Environment; (6) Energy; (7) Regional Development (ERDF / Cohesion Fund); (8) Agriculture and Rural Development (Common Agricultural Policy (CAP) / European Agricultural Guarantee Fund (EAGF) / European Agricultural Fund for Rural Development (EAFRD)); (9) Transport / TEN-T; (10) Fisheries; (11) Internal Market; and (12) Employment and Social Affairs (ESF).
      Let us take, for example, a recent special report on “EuropeAid’s evaluation and results-oriented monitoring systems”.68xEuropean Court of Auditors, ‘Special Report No 18/2014: Europe Aid’s Evaluation and Results-Oriented Monitoring System’, 11 December 2014, available at: <www.eca.europa.eu/en/Pages/DocItem.aspx?did=30363>. First, the checklist provides the report number, title and date, together with hyperlinks to the full report and to its summary. It then lists the explicit questions asked by the auditors, and the observations and recommendations made by the ECA. Thereafter, in the following row of the table, the EPRS provides a direct link to the working document of the CONT committee meeting (30 March 2015) held soon after publication of the report, and lists the specific recommendations of the committee rapporteur. Next, it provides a link to related EP reports and resolutions of other committees, here a DEVE draft report (March 2016) awaiting committee decision. It also provides direct hyperlinks to, and summaries of, 10 EP resolutions relevant to the subject of the special report, dating from April 2012 to October 2015 (2 months before publication of the report). For each resolution, the legal base (individual legislation and relevant decisions) is clearly stated. Finally, it provides direct links to 19 oral and written questions from MEPs, indicating rule number, name, political group affiliation and question subject matter. The questions cover a 7-year period from 2009 to 2016.
      If we look more closely at the CONT committee hearing that followed the ECA report, we see that the rolling checklist reproduces the rapporteur’s recommendations word for word.69xEuropean Parliament, 2017. In its scrutiny, the committee can then draw attention to the inadequacies of evaluation at programme level (if programmes are not evaluated effectively, this limits the insights of both auditor and legislature). The committee shares the ECA’s concerns over the ‘insufficient reliability’ of EuropeAid evaluation systems, supervision and monitoring activities (2). It points out that ‘it is indispensable to provide Parliament and the budgetary control authority with a clear view of the real extent to which the Union’s main objectives have been achieved’ and asserts that ‘feedback on the performance of Commission aid projects and programmes should be provided as part of the Commission’s commitment to quality assurance’ (3-4). Furthermore, it considers that ‘outcomes of the evaluations are key-elements to feed into policy and political review’ (5) and that ‘the sharing of knowledge by all means and tools is crucial for developing a culture of evaluation and an effective culture of performance’ (6). Finally, it supports the ECA recommendations with regard to EuropeAid’s evaluation and results-oriented monitoring systems. In a word, the rolling checklist shows explicitly, by extracting from the CONT committee minutes, how the EP has formally considered the findings of the ECA and subsequently endorsed its position.
      In short, the wealth of information provided in the interactive rolling checklists constitutes a valuable asset for parliamentarians and should contribute to improving the work of committees as they carry out their scrutiny function. Scrutiny can only be effective if the scrutinizers are well informed and can understand the policy and legal context. The work of the EPRS in the field of ex post impact assessment should contribute to more effective parliamentary evaluation of the results of EU spending, not only in the CONT committee but also beyond.

      III The Use of ECA Special Reports by the Council

      The EP is engaged in lawmaking at various stages, from deliberating and amending in the early drafting process to deliberating on and scrutinizing policy in the later stages of evaluation. However, it is only a co-legislator. Like the EP with its CONT committee, the Council has a Budget Committee70x See Council of the European Union, ‘Budget Committee’, available at: <www.consilium.europa.eu/en/council-eu/preparatory-bodies/budget-committee/>. that deals with the annual budgetary discharge procedure. The ECA sends a copy of every special report to the Council and receives confirmation of receipt and substantive comments. Yet we know little about how the Council uses audit findings in its oversight, and no committee proceedings are available. The EP is eager to find ways to engage the Council more formally in the discharge procedure and is currently financing research that looks back at the Treaties to reconsider the Council’s legal obligations.71xEuropean Parliament, ‘The European Parliament’s Right to Grant Discharge to the Council’, Documentation of a Workshop Held on 27 September 2012, March 2013, available at: <www.europarl.europa.eu/RegData/etudes/workshop/join/2013/490667/IPOL-JOIN_AT(2013)490667_EN.pdf>. , 72xA. D’Alfonso, ‘Discharge to the Council and European Council’, 16 October 2014, available at: <www.europarl.europa.eu/EPRS/EPRS-AaG-538960-Discharge-to-the-Council-and-European-Council-FINAL.pdf>.
      A dispute arose in 2011 when the EP refused to grant discharge to the Council for its 2009 accounts, having already postponed discharge for the 2007 financial year in 2009 on the grounds of a lack of transparency and a refusal to engage in dialogue. There is persistent disagreement with the Council over the EP’s budgetary role.73xF. Chaltiel, ‘Scrutiny of Budget Implementation by the European Parliament’, in The European Parliament, The European Parliament’s Right to Grant Discharge to the Council – Documentation of a Workshop held on 27 September 2012, 2013, p. 77. As Chaltiel asserts, tension has been building for several years over the EP’s wish to examine the accounts of all institutions, not just the Commission. The Council has been reluctant to cooperate with the scrutiny of its own expenditure.74x Ibid.
      In short, amongst the three institutions of the ‘legislative’ triangle, we know most about how the EP uses ECA special reports in scrutiny (though there have been few close-up studies of this scrutiny process within the committees). We see evidence from the ECA itself that its reports are useful for Commission learning, in terms of improving systems management and policy design (though there are no studies bringing in Commission perspectives on the ECA’s work). The Council remains very much a ‘black box’, which is worrying in terms of democratic accountability, but it again underlines the tension inherent in a co-legislator having to evaluate its own output.

    • F The Emerging Role of the EPRS: Helping Audit Feed into Scrutiny

      The EP has sought to streamline its own processes and develop its own capacity, wary of relying too heavily on Commission evaluations alone. The creation of the EPRS would appear to be a major milestone in this process.
      The 2011 Niebler report called for the establishment of an integrated impact assessment process within the EP in support of parliamentary committees.75xEuropean Parliament, ‘Report of 18 April 2011 on Guaranteeing Independent Impact Assessments (2010/2016(INI))’, Committee on Legal Affairs, ‘Rapporteur: Angelika Niebler’, available at: <www.europarl.europa.eu/sides/getDoc.do?type=REPORT&reference=A7-2011-0159&language=EN>. A service was set up in January 2012 within the EP’s own administration, focused initially on ex ante impact assessment. In November 2013, the service became part of the newly created DG EPRS and broadened its remit to ex post evaluation. The 2011 report invited the EU institutions to adopt a holistic approach to impact assessment ‘throughout the whole policy cycle, from design to implementation, enforcement, evaluation and to the revision of legislation’ (point 2), stressing the need to ‘evaluate more accurately whether the objectives of a law have actually been achieved and whether a legal act should be amended or retained’ (point 25) (Anglmayer, September 2016, p. 7). The importance of both ex ante and ex post impact assessment for evidence-based policymaking in the context of better lawmaking was subsequently reiterated in the EP’s 2016 resolution on the state of play and outlook of REFIT.76xEuropean Parliament, ‘Report of 24 June 2015 on Regulatory Fitness and Performance Programme (REFIT): State of Play and Outlook (2014/2150 (INI))’, available at: <www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+REPORT+A8-2015-0208+0+DOC+XML+V0//EN/>.
      Parliamentary libraries have traditionally helped extend the oversight role of committees, offering informational resources to parliamentarians. Yet libraries have been increasingly challenged in meeting the needs of ‘information-rich, time-poor parliamentarians in a complex information environment’ and today play a ‘critical, active role in supporting democracy’.77xR. Missingham & S. Miskin, ‘An Informed Parliament: The Role of the Federal Parliamentary Library’, Australian Journal of Political Science, Vol. 46, No. 2, 2011, pp. 331-332. As the case of the Australian Parliament shows, a robust parliamentary library should be able to provide “independent, high-quality and impartial information, analysis and advice to members of parliament across the political spectrum” and constitutes a “primary means to secure accountability and transparency”.78x Ibid., p. 333. In short, although parliamentary libraries were initially sources of information, they have since developed research services that offer advice and analysis “based on professional expertise across subject areas relevant to the parliament”.79x Ibid. The creation by parliament of ‘an independent unit or body to assist parliamentary scrutiny and to undertake research’ can raise the quality of debate and promote transparency and accountability.80xACCA, p. 13.
      Given this context and the rationale for the establishment of the EPRS,81x See ‘European Parliamentary Research Service Blog’, available at <https://epthinktank.eu>. how does it relate to wider institutional developments within the EU? First, what we see when looking at EP-ECA relations is indicative of a wider transformation in the policymaking machinery at the supranational level. We might understand the establishment of the EPRS as an attempt to empower MEPs by improving the quantity and quality of informational resources. This does not necessarily mean that all information is generated from within, but rather that a service of qualified researchers monitors and oversees the production and publication of policy-relevant material – including briefs, reviews, reports and audits – produced by other key stakeholders, and, where relevant, engages with academic experts and professionals in the field. The more oversight provided by the EPRS, arguably, the more empowered the committees, the better the quality of deliberation and, potentially, the more effective their decision-making. In a word, encouraging more reflexive governance means greater scrutiny of past policies with a view to better future policies. The EPRS provides a permanent staff of researchers who can be relied on to bolster the work of the committees and their individual members. Compared with Commission evaluations, which are often highly technical, run to hundreds of pages and take up to 18 months to produce, the EPRS’s ‘European Implementation Assessments’ are tailored to parliamentarians’ needs, with a focus on ease of use. Recent topics have included the EU copyright framework, the Employment Equality Directive and the UN Convention on the Rights of Persons with Disabilities.82xAnglmayer, 2016, p. 7. In support of the EP’s scrutiny work in the area of ex post impact assessment, the research service provides succinct appraisals of the state of implementation of all legislative acts that the Commission has listed for revision in its annual work programme.
      These institutional developments imply that we should not think of scrutiny purely as an event that takes place within the confines of the committee room at a specific point in time, but rather as a continual process of monitoring and judging, made possible through the provision of high-quality information, i.e. the collection and interpretation of policy-relevant data by a permanent corps of EP researchers. We might even understand the emergence of the EPRS as the result of the weak evaluation capacity of the pre-existing committee system, and of the EU institutions in general, given the Commission’s traditional outsourcing of evaluation activity.

    • G Conclusion

      The literature has paid considerable attention to the central role of the Commission in rolling out the new ‘Better Regulation’ agenda. By contrast, the role of the EP is not well understood. This article contributes a first set of insights into how the EP helps achieve better lawmaking, especially through ex post impact assessment and the emerging role of the EPRS in fostering policy learning, in part by drawing on input from ECA performance audit reports. The closer attention paid to the conclusions and recommendations of ECA special reports – pertaining to policy performance alongside legal compliance – in addition to European Commission policy evaluations, should encourage more effective scrutiny of policies. Likewise, a better understanding of policy, particularly of the dynamics of implementation, gained through specific insights into the successes and shortcomings of spending programmes at street level, should better enable parliamentary co-decision makers to consider the choice and design of legislation as a policy instrument. Audit and evaluation that place greater emphasis on the achievement of policy objectives and on concerns of effectiveness should arguably encourage decision makers – through the process of parliamentary scrutiny – to think ahead, thereby establishing a more logical link between the earlier and later stages of the policy cycle.
      These institutional developments raise a key question: does ex post evaluation risk becoming politicized (see Smismans83xS. Smismans, ‘The Politicization of ex post Evaluation in the EU’, in this Special Issue, 2017. in this special issue, who considers the pros and cons of politicization)? Why might this be so? First, because it is taking place inside a political institution; and second, because it is taking place for political actors. There is of course a risk of the politicization of evaluation, since the EPRS was set up to serve the needs of the institution. However, we should remember that all scrutiny by MEPs is inherently political. We might expect all ex post evaluations conducted by objective external stakeholders to (have the potential to) feed into the process of political deliberation – though MEPs will essentially pick and choose those findings that support their political cause. It is unlikely, however, that reports drafted in-house, produced by what is effectively a highly qualified, non-political research arm of the legislature, will have a political agenda; the outputs of the EPRS, as with any research body, may at a fundamental level reflect the inherent beliefs and ideologies of the individual author or researcher, but otherwise demonstrate the high standards of professional researchers.

    Noten

    • 1 European Parliament Report of 18 April 2011 on guaranteeing independent impact assessments (2010/2016(INI)), Committee on Legal Affairs, ‘Rapporteur: Angelika Niebler’, available at: <www.europarl.europa.eu/sides/getDoc.do?type=REPORT&reference=A7-2011-0159&language=EN>.

    • 2 European Commission, ‘Better Regulation Agenda: Enhancing Transparency and Scrutiny for Better Law-Making’, 19 May 2015, available at: <http://europa.eu/rapid/press-release_MEMO-15-4989_en.htm>.

    • 3 M. Eliantonio & A. Spendzharova, ‘Introduction’, European Journal of Law Reform, in this Special Issue, 2017.

    • 4 See multiple authors in Special Issue on the Better Regulation Package: ‘How Much Better is Better Regulation?’, Vol. 6, No. 3, 2015.

    • 5 European Parliament, Council of the European Union and European Commission, ‘Inter-institutional Agreement on Better Law-Making’, 13 April 2016, OJ L 123/1, available at: <http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016Q0512(01)&from=en>.

    • 6 Ibid.

    • 7 In Part III, ‘Tools for Better Law-Making’, sub-section ‘Ex Post Evaluations of Existing Legislation’, Arts. 20-24 state as follows: “20. The three Institutions confirm the importance of the greatest possible consistency and coherence in organising their work to evaluate the performance of Union legislation, including related public and stakeholder consultations. 21. The Commission will inform the European Parliament and the Council of its multiannual planning of evaluations of existing legislation and will, to the extent possible, include in that planning their requests for in-depth evaluation of specific policy areas or legal acts. The Commission’s evaluation planning will respect the timing for reports and reviews set out in Union legislation. 22. In the context of the legislative cycle, evaluations of existing legislation and policy, based on efficiency, effectiveness, relevance, coherence and value added, should provide the basis for impact assessments of options for further action. To support these processes, the three Institutions agree to, as appropriate, establish reporting, monitoring and evaluation requirements in legislation, while avoiding overregulation and administrative burdens, in particular on Member States. Where appropriate, such requirements can include measurable indicators as a basis on which to collect evidence of the effects of legislation on the ground. 23. The three Institutions agree to systematically consider the use of review clauses in legislation and to take account of the time needed for implementation and for gathering evidence on results and impacts. The three Institutions will consider whether to limit the application of certain legislation to a fixed period of time (‘sunset clause’). 24. The three Institutions shall inform each other in good time before adopting or revising their guidelines concerning their tools for Better Law-Making (public and stakeholder consultations, impact assessments and ex-post evaluations).”

    • 8 E. Mastenbroek, S. van Voorst & A. Meuwese, ‘Closing the Regulatory Cycle? A Meta Evaluation of Ex-post Legislative Evaluations by the European Commission’, Journal of European Public Policy, Vol. 23, No. 9, 2016, pp. 1329-1348.

    • 9 I. Anglmayer, ‘Evaluation and Ex-post Impact Assessment at EU level’, European Parliament Research Service (Ex-Post Impact Assessment Unit), September 2016, available at: <www.europarl.europa.eu/RegData/etudes/BRIE/2016/581415/EPRS_BRI(2016)581415_EN.pdf>.

    • 10 Eliantonio & Spendzharova, in this Special Issue, 2017.

    • 11 Communication from the Commission, ‘A Citizen’s Agenda – Delivering Results from Europe’, 10 May 2006, COM(2006) 211 final.

    • 12 Communication from the Commission, ‘A Europe of Results – Applying Community Law’, 5 September 2007, COM(2007) 502 final.

    • 13 C. Mendez & J. Bachtler, ‘Administrative Reform and Unintended Consequences: An Assessment of the EU Cohesion Policy “Audit Explosion”’, Journal of European Public Policy, Vol. 18. No. 5, 2011, pp. 746-765.

    • 14 S. Højlund, ‘Evaluation in the European Commission – For Accountability or Learning?’, European Journal of Risk Regulation, Vol. 6, No. 1, 2015, pp. 35-46.

    • 15 N. Font & I.P. Durán, ‘The European Parliament Oversight of EU Agencies Through Written Questions’, Journal of European Public Policy, Vol. 23, No. 9, 2016, pp. 1349-1366, at 1350.

    • 16 M.D. McCubbins, R. Noll & B. Weingast, ‘Administrative Procedures as Instruments of Political Control’, Journal of Law, Economics and Organization, No. 3, 1987, pp. 242-279.

    • 17 D. Finke & T. Dannwolf, ‘Who Let the Dogs Out? The Effect of Parliamentary Scrutiny on Compliance with EU Law’, Journal of European Public Policy, Vol. 22, No. 8, 2015, pp. 1127-1147, at 1128.

    • 18 Font & Durán, 2016, pp. 1351-1352.

    • 19 T. Winzen, ‘Technical or Political? An Exploration of the Work of Officials in the Committees of the European Parliament’, The Journal of Legislative Studies, Vol. 17, No. 1, 2011, pp. 27-44, at 27.

    • 20 S. Martin, ‘Parliamentary Questions, the Behaviour of Legislators, and the Function of Legislatures: An Introduction’, The Journal of Legislative Studies, Vol. 17, No. 3, 2011, pp. 259-270, at 259.

    • 21 Winzen, 2011, p. 28; C. Neuhold, ‘The “Legislative Backbone” Keeping the Institution Upright? The Role of European Parliament Committees in the EU Policy-Making Process’, European Integration online Papers (EIoP), Vol. 5, No. 10, 2001.

    • 22 Winzen, 2011, p. 28.

    • 23 M. Egeberg et al., ‘Parliament Staff: Unpacking the Behavior of Officials in the European Parliament’, Journal of European Public Policy, Vol. 20, No. 4, 2013, p. 495.

    • 24 D. Monk, ‘A Framework for Evaluating the Performance of Committees in Westminster Parliaments’, The Journal of Legislative Studies, Vol. 16, No. 1, 2010, p. 2.

    • 25 Ibid., p. 5.

    • 26 Ibid.

    • 27 Ibid.

    • 28 P. Thomas, ‘Effectiveness of Parliamentary Committees’, Parliamentary Government, No. 44, 1993, pp. 10-11.

    • 29 V. Schmidt, ‘Democracy and Legitimacy in the European Union Revisited: Input, Output and “Throughput”’, Political Studies, Vol. 61, 2013, pp. 2-22.

    • 30 Schmidt, 2013, p. 5.

    • 31 ACCA (Association of Chartered Certified Accountants), ‘Parliamentary Financial Scrutiny in Hard Times’, 2011, available at: <www.accaglobal.com/content/dam/acca/global/PDF-technical/public-sector/tech-tp-pfs.pdf>.

    • 32 Ibid., p. 1.

    • 33 Ibid.

    • 34 Ibid., p. 5.

    • 35 Ibid., pp. 5-6.

    • 36 Ibid., p. 20.

    • 37 Ibid., p. 20.

    • 38 A. Kanis, ‘Ex-Post Budgetary Oversight in Europe’, European Court of Auditors Journal, No. 6, 2011, pp. 15-17.

    • 39 Ibid., p. 16.

    • 40 P. Stephenson, ‘Starting from Scratch? Analysing Early Institutionalization Processes: The Case of Audit Governance’, Journal of European Public Policy, Vol. 23, No. 10, 2016, pp. 1481-1501.

    • 41 European Parliament, ‘The Case for a European Audit Office: Introduction by Heinrich Aigner (Vice-Chairman of the Committee on Budgets)’, Secretariat Directorate-General for Research and Documentation, September 1973.

    • 42 B. Laffan, ‘Becoming a “Living Institution”: The Evolution of the European Court of Auditors’, Journal of Common Market Studies, Vol. 37, No. 2, 1999, pp. 251-268.

    • 43 Arts. 285-287 TFEU state inter alia that ‘The ECA provides the European Parliament and the Council with a statement of assurance as to the reliability of the accounts and the legality and regularity of the underlying transactions which is published in the Official Journal of the European Union. This statement may be supplemented by specific assessments for each major area of Union activity’ and that ‘The European Court of Auditors also assists the European Parliament and the Council in exercising their powers of control over the implementation of the budget.’

    • 44 P. Stephenson, ‘Sixty Five Years of Auditing Europe’, Journal of Contemporary European Research, Vol. 12, No. 1, 2016.

    • 45 European Court of Auditors, ‘Stuttgart Report’, 24 October 1983, OJ C 287.

    • 46 Stephenson, 2016, supra note 44.

    • 47 Art. 285 TFEU asserts “The Court of Auditors shall examine the accounts of all revenue and expenditure of the Community. It shall also examine the accounts of all revenue and expenditure of all bodies set up by the Community insofar as the relevant constituent instrument does not preclude such examination.”

    • 48 European Parliament, ‘Resolution of 4 February 2014 on the future role of the Court of Auditors (2012/2064 (INI))’, available at: <www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+TA+P7-TA-2014-0060+0+DOC+XML+V0//EN>. In the section ‘The Court’s new dimension and challenges’, para. 9, the European Parliament: “Acknowledges the historic, constructive role of the DAS exercise in focusing on legality and regularity as useful indices of good financial practices and management performance at all levels of Union spending and in showing the way that EU funds have been used in accordance with the decisions of Parliament, acting as legislator and budget authority; underlines, however, that at this point, and in the future, the Court should devote more resources to the examination of whether economy, effectiveness and efficiency have been achieved in the use of the public funds entrusted to the Commission; the results of the findings obtained in Special Reports should imply corresponding adjustments in EU programmes.”

    • 49 P. Stephenson, ‘Reconciling Audit and Evaluation? The Shift to Performance and Effectiveness at the European Court of Auditors’, European Journal of Risk Regulation, Vol. 6, No. 1, 2015, pp. 79-89.

    • 50 C. Mendez & J. Bachtler, ‘Administrative Reform and Unintended Consequences: An Assessment of the EU Cohesion Policy “Audit Explosion”’, Journal of European Public Policy, Vol. 18, No. 5, 2011, pp. 746-765.

    • 51 M. Barzelay, ‘Central Audit Institutions and Performance Auditing: A Comparative Analysis of Organizational Strategies in the OECD’, Governance, Vol. 10, No. 3, 1997, pp. 235-260.

    • 52 D. O’Keeffe, ‘The Court of Auditors’, in D.M. Curtin & T. Heukels (Eds.), Institutional Dynamics of European Integration: Essays in Honour of Henry G. Schermers, Vol. II, 1994, pp. 177-194.

    • 53 G. Karakatsanis & B. Laffan, ‘Financial Control: The Court of Auditors and OLAF’, in J. Peterson & M. Shackleton (Eds.), The Institutions of the European Union, Oxford, Oxford University Press, 2012, pp. 242-261.

    • 54 European Court of Auditors, Gaps, Overlaps and Challenges: A Landscape Review of EU Accountability and Public Audit Arrangements, 2014, p. 7, available at: <www.eca.europa.eu/Lists/ECADocuments/LR14_01/QJ0214776ENC.pdf>.

    • 55 Karakatsanis & Laffan, p. 249.

    • 56 B. Laffan, ‘Auditing and Accountability in the European Union’, Journal of European Public Policy, Vol. 10, No. 5, 2003, p. 772.

    • 57 O’Keeffe, 1994, p. 183.

    • 58 Interview with Klaus-Heiner Lehne in the European Court of Auditors Journal, May 2014. Available at: <www.eca.europa.eu/en/Pages/Journal.aspx>.

    • 59 European Parliament, 2014, see section ‘The Court’s New Dimensions and Challenges’, note 10.

    • 60 The 2014 report followed the 2012 public hearing of the Committee on Budgetary Control. The proceedings were titled ‘Future role of the European Court of Auditors: Challenges ahead and possible reform’. Rapporteur: Inés Ayala Sender, 30 May 2012.

    • 61 European Court of Auditors, International Peer Review of the European Court of Auditors, 2008, available at: <www.eca.europa.eu/Lists/ECADocuments/PEERREVIEW2008/PEERREVIEW2008_EN.PDF>.

    • 62 Bundesrechnungshof, International Peer Review of the European Court of Auditors, 2014, available at: <www.eca.europa.eu/Lists/ECADocuments/2013_PEER_REVIEW/2013_PEER_REVIEW_EN.pdf>.

    • 63 European Court of Auditors, Opinion No 4/2015 concerning the proposal for a Regulation of the European Parliament and of the Council on the European Fund for Strategic Investments and amending Regulations (EU) No 1291/2013 and (EU) No 1316/2013, 12 March 2015, available at: <www.eca.europa.eu/Lists/ECADocuments/OP15_04/OP15_04_EN.pdf>.

    • 64 Proposal for a Regulation of the European Parliament and of the Council on the European Fund for Strategic Investments and amending Regulations (EU) No 1291/2013 and (EU) No 1316/2013, 31 January 2015, COM(2015) 10 final.

    • 65 European Court of Auditors, ‘Special Report No 2/2016: 2014 report on the follow up of the European Court of Auditors’, 26 February 2016, available at: <www.eca.europa.eu/en/Pages/DocItem.aspx?did=35401>.

    • 66 European Parliament, ‘EPRS Scrutiny Toolbox’, European Parliamentary Research Service Blog, 2017, available at: <https://epthinktank.eu/scrutiny-toolbox/>.

    • 67 Ibid.

    • 68 European Court of Auditors, ‘Special Report No 18/2014: Europe Aid’s Evaluation and Results-Oriented Monitoring System’, 11 December 2014, available at: <www.eca.europa.eu/en/Pages/DocItem.aspx?did=30363>.

    • 69 European Parliament, 2017.

    • 70 See Council of the European Union, ‘Budget Committee’, available at: <www.consilium.europa.eu/en/council-eu/preparatory-bodies/budget-committee/>.

    • 71 European Parliament, ‘The European Parliament’s Right to Grant Discharge to the Council’, Documentation of a Workshop Held on 27 September 2012, March 2013, available at: <www.europarl.europa.eu/RegData/etudes/workshop/join/2013/490667/IPOL-JOIN_AT(2013)490667_EN.pdf>.

    • 72 A. D’Alfonso, ‘Discharge to the Council and European Council’, 16 October 2014, available at: <www.europarl.europa.eu/EPRS/EPRS-AaG-538960-Discharge-to-the-Council-and-European-Council-FINAL.pdf>.

    • 73 F. Chaltiel, ‘Scrutiny of Budget Implementation by the European Parliament’, in The European Parliament, The European Parliament’s Right to Grant Discharge to the Council – Documentation of a Workshop held on 27 September 2012, 2013, p. 77.

    • 74 Ibid.

    • 75 European Parliament, ‘Report of 18 April 2011 on Guaranteeing Independent Impact Assessments (2010/2016(INI))’, Committee on Legal Affairs, ‘Rapporteur: Angelika Niebler’, available at: <www.europarl.europa.eu/sides/getDoc.do?type=REPORT&reference=A7-2011-0159&language=EN>.

    • 76 European Parliament, ‘Report of 24 June 2015 on Regulatory Fitness and Performance Programme (REFIT): State of Play and Outlook (2014/2150 (INI))’, available at: <www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+REPORT+A8-2015-0208+0+DOC+XML+V0//EN/>.

    • 77 R. Missingham & S. Miskin, ‘An Informed Parliament: The Role of the Federal Parliamentary Library’, Australian Journal of Political Science, Vol. 46, No. 2, 2011, pp. 331-332.

    • 78 Ibid., p. 333.

    • 79 Ibid.

    • 80 ACCA, p. 13.

    • 81 See ‘European Parliamentary Research Service Blog’, available at <https://epthinktank.eu>.

    • 82 Anglmayer, 2016, p. 7.

    • 83 S. Smismans, ‘The Politicization of ex post Evaluation in the EU’, in this Special Issue, 2017.