Part 3: The current state of performance reporting

The problems, progress, and potential of performance reporting.

3.1
Although legislation, standards, and guidelines have been developed to better support performance reporting, criticisms of how public sector performance is analysed and reported on continue to be made.

3.2
Performance reporting is part of a public organisation's regular performance management cycle. Public organisations mainly report performance information in their annual reports. The Treasury describes annual reports as "one of the most important ways the department is accountable to members of Parliament and the public they represent".32

3.3
In total, the public sector prepares and publishes about 3400 annual reports each year. About 70% of these are from schools, which are not required to report on their service performance.

3.4
In this Part, we consider the state of performance reporting today and draw on information from people we interviewed, our analysis of annual reports, auditors' findings, and the findings of other researchers.

Public organisations take their performance reporting obligations seriously

3.5
Everyone we interviewed understood the importance of performance reporting and wanted to improve it. Some people expressed the need for different approaches to performance reporting. These included alternatives to hard-copy annual reports, reporting at different times, and more focus on sector-wide performance.

3.6
People in the public sector meet regularly to discuss performance reporting. For example, a group of central government public sector officials – called the Planning and Performance Network – share their knowledge and experience about public sector planning and reporting matters.

3.7
Our research indicated that people involved in performance reporting take their obligations seriously, and the many reports they prepare largely comply with the relevant rules and regulations.

3.8
Our review of 33 public organisations' annual reports found that the outcomes that public organisations report on are broadly appropriate for their sector. We also found that public organisations do try to use performance frameworks to explain how their activities contribute to the outcomes they seek.

3.9
Some public organisations try to go beyond the statutory requirements for reporting. They explore different ways of tailoring their performance reporting to meet the expectations of the people who most value the information.

3.10
For example, the Ministry of Housing and Urban Development publishes a dashboard of information that tracks progress on various aspects of the Government's housing programme. This includes:

  • how many families have bought their first home;
  • how many people need warm, safe, and sustainable housing; and
  • people's movement through the public housing system.33

3.11
The New Zealand Customs Service is another example. Its annual report includes information and indicators about its ongoing programme to safeguard integrity and prevent corruption, such as data on the outcomes of investigations into allegations of unacceptable behaviour and the type of corrective action taken.34

Performance reporting is not always meaningful or understandable

3.12
Our research found that performance reporting can still be confusing and difficult for readers to understand, despite the considerable time and resources public organisations spend each year collecting information about, describing, and reporting on their performance.

The way performance information is described can be incomplete, changeable, and overly complicated

3.13
Some of the people we interviewed told us that performance information was too complicated, technical, and prescribed. Others we interviewed said that performance information did not properly capture the overall performance of their public organisation and did not show what was really important or meaningful.

3.14
Our observations from reviewing annual reports are consistent with what people told us. Our analysis suggests that, at times, readers of annual reports must grapple with large amounts of narrowly focused performance information that can change significantly from year to year.

3.15
The larger data set we compiled includes annual report information collected from about 300 public organisations between 2016 and 2019. The annual reports we included ranged in length from eight pages to 358 pages, with an average length of about 110 pages.

3.16
Looking at output indicators in the annual reports of 33 organisations:

  • From 2016 to 2018, the annual reports had 50 output indicators on average.
  • In 2016, the annual reports had a total of 1849 output indicators. By 2017, 30% of them were new or had changed their description. By 2018, almost 50% of them were new or had changed their description. One entity reported 187 output indicators in 2016 but had removed or changed the description of about 119 (64%) of these by 2018.
  • In 2018, there were a total of 1650 output indicators. Of these, only about 3% related to fairness and integrity, about 1% incorporated a regional perspective, and about 4% related to Māori or Pasifika communities. Less than 1% related to concepts of collaboration, inclusion, or participation.

3.17
From our larger database of output indicators, we found that, from 2016 to 2019, 109 annual reports had more than 100 output indicators and 28 annual reports had more than 200 output indicators. One public organisation had 340 output indicators in a single annual report.

3.18
Looking at reporting about wider outcomes in 2018, we found that 15 of the 33 organisations used either qualitative or quantitative outcome indicators in their annual reports – but not both.

3.19
Over the years, our auditors have raised various issues with public organisations about the performance information in their annual reports. Our auditors have observed annual reports with:

  • too many measures, which makes it difficult to identify what is important;
  • too few measures and targets to properly understand performance; and
  • a lack of up-to-date information or information from previous years, which makes it difficult to understand performance over time.

3.20
These findings are consistent with other research. For example, in 2019, the Productivity Commission found that:

The current performance-reporting requirements on local authorities, including the financial and non-financial information disclosures, are excessively detailed, inappropriately focused and not fit for purpose.35

3.21
One reason why performance information does not always properly capture overall performance is that the model commonly used to describe performance is not well suited to describing more complex situations or what is important for Parliament and the public.

3.22
For example, Vitalis and Butler argue that seeking to understand and describe performance in terms of a linear and logical production-line process (see Figure 1) might work for some public sector activities (such as issuing passports). However, it does not work well for many other more complex activities (such as collaboratively managing complex social problems).36

The way performance information is presented can be difficult to follow and understand

3.23
The people we interviewed as part of our research told us about issues with how performance information is presented and reported. These included reporting that:

  • did not always meet the needs of its audiences;
  • was too rigidly structured; and
  • was at times adjusted to avoid explaining non-performance.

3.24
Our observations from reviewing 33 organisations' annual reports from 2016 to 2018 were consistent with what people told us. These observations suggest that, at times, readers of performance reports are faced with fragmented information that can be inadequate and confusing. For example:

  • For 14 of the organisations, we could not clearly identify how the indicators of outcomes aligned with the indicators of outputs in their annual reports.
  • For 23 of the organisations, we could not clearly identify the money spent on planned outcomes in their annual reports.
  • Seven public organisations did not include a performance framework, and there was no clear link between the outcomes they wanted to achieve, the impact they were trying to make, and their reported output performance indicators.
  • We found many instances of indicators being presented in an overly abbreviated or technical way, with little accompanying explanation. Some were very difficult to understand.
  • There was little analysis of overall performance. Only seven of the 33 organisations mentioned value for money, cost-effectiveness, or productivity when reporting against their outcomes in their annual reports.

3.25
Some of those interviewed also commented that they had little choice about what they could report and that the timing, scope, and format of reporting needed to be more flexible. One particular concern was the time and cost of the reporting exercise and the ever-increasing reporting requirements placed on the people who prepare the performance information.

3.26
Our auditors regularly identify improvements needed to the performance reporting of public organisations. From 2015 to 2018, these matters mainly related to the way public organisations set out and explained their performance framework and the systems and controls associated with performance reporting.

3.27
Our auditors have also made suggestions to public organisations about how they can link outputs, impacts, and outcomes together to tell a more coherent performance story. They have also suggested that performance frameworks and the associated performance measures can better reflect an organisation's strategic intentions or outcomes.

3.28
These findings are consistent with other research we have seen. For example, a 2020 study by the McGuinness Institute observes that New Zealand's public and private sector reporting framework is complex, fragmented, inefficient, and lacks a stewardship lens.

3.29
The McGuinness Institute study noted that private and public sector annual reports are not being used to their full potential, and that users' demands for reporting exceed preparers' provision of information. There were many differences in what users and preparers considered to be important performance information. On the whole, users believed that annual reports did not report information well.37

3.30
Understanding what services and outcomes public money is spent on at a whole-of-government level can be just as difficult. For example, in 2020, we published a report on government department spending on managing natural hazards.38 Our main finding was that it is extremely difficult to obtain a meaningful picture of what the government spends on managing natural hazards.

3.31
The Financial Statements of the Government provide a comprehensive picture of the financial health of the whole of the government (excluding local government). Accompanying these statements with a set of whole-of-government service and outcome information could provide a more integrated, comprehensive, and useful story of the government's performance.

3.32
It could also better inform people about what the government is spending money on, how much it is spending, and what they are getting in return.

3.33
Insights could be drawn from the way that the Treasury shows financial statement projections alongside wider living standards in He Tirohanga Mokopuna – 2016 Statement on the Long-Term Fiscal Position.

3.34
We acknowledge that there are challenges in bringing an integrated picture together. For example, the way public organisations capture financial and service performance information, and the way appropriations are designed, do not always align meaningfully with the services that an organisation provides.

Performance reporting is not always audited in a helpful way

3.35
Effective independent assurance is fundamental to good performance reporting. It provides public organisations, Parliament, and the public with the confidence they need to rely on that reported information.

3.36
A public organisation is responsible for preparing reports about its performance. An auditor provides an independent opinion on whether that reported information fairly reflects the public organisation's performance.

3.37
As we mentioned in paragraphs 3.26-3.27, auditors raise issues with public organisations about their performance reporting and the improvements it needs. A 2018 study found evidence that issues the Auditor-General had raised about local government reporting in New Zealand improved subsequent service performance reporting in local authorities' long-term plans.39

3.38
Figure 2 shows that, in the past five years, only a small number of public organisations have received "modified" audit opinions for their performance information. An auditor gives a modified opinion when:

  • the information is inappropriate for the purpose of the activity being reported on;
  • the information is not a fair reflection of what happened;
  • there is not enough supporting evidence for the reported performance; and/or
  • the information is not consistent with generally accepted accounting practice.

Figure 2
Number and percentage of modified audit opinions, from 2015 to 2019

The table shows the number and percentage of audit opinions that were modified for selected public organisations. Not all public organisations are required to report on service performance.

                                   2015    2016    2017    2018    2019
Selected public organisations*      637     607     561     530     532
Number of modified opinions          27      26      18       4      10
Percentage modified                4.2%    4.3%    3.2%    0.8%    1.9%

*Data includes local authorities, council-controlled organisations, government departments, Crown entities (excluding schools and Crown Research Institutes), and energy companies. Source: Office of the Auditor-General.

3.39
Despite the small number of modified opinions, there are clear signs that reported information is not properly explaining performance, as we discussed in paragraphs 3.12-3.34. This suggests that complying with minimum reporting and auditing standards is not enough to report performance effectively.40

3.40
However, public organisations can view the audit opinion as an indicator of success in performance reporting, or even of success in performance generally. This can lead to a situation where public organisations focus on meeting the minimum standard that they think will satisfy the auditor.

3.41
Auditors have sometimes been criticised for taking an approach to auditing that makes reporting, and innovation in reporting, difficult. Although it is dated, an example of this type of criticism appeared in a 2007 review of the engagement and decision-making provisions in the Local Government Act 2002.

3.42
As part of this review, the Local Government Commission considered the process involved in the first round of long-term council community plans. A comment from the review said:

there needs to be a balance found between Audit's desire for detailed control over the interpretation of legislative provisions and the form of the [long-term council community plans] and allowing local authorities room to exercise reasonable discretion.41

3.43
In practice, the nature of the audit process might not always incentivise public organisations to try new approaches or innovate in how they report performance information. However, in our view, public organisations should not think of audits as a barrier to improving the way they report performance information. Some organisations are already exploring different ways to improve their performance reporting (see paragraphs 3.9-3.11).

3.44
Public organisations should invest in the systems and processes needed to understand organisational performance. This not only helps them to manage their performance effectively but also enables them to be accountable to Parliament, the public, and the communities they serve.

Monitoring functions are not leading to significant improvements in performance reporting

3.45
New Zealand's public management and accountability systems rely heavily on regular monitoring and scrutiny of public organisations' reported performance information.

3.46
Monitoring can be carried out by teams in organisations that have many other functions (for example, the Treasury) or by more dedicated monitoring entities.

3.47
These monitors provide regular checks on the process of reporting performance. They assess whether that performance reporting supports Ministerial and Parliamentary scrutiny and assist where necessary.

3.48
Schick has observed that this reliance on organisational monitoring is a particular feature of our public management and accountability systems.42 He notes that some other countries do not rely as much on regularly monitoring and assessing a public organisation's performance. Instead, they place more emphasis on evaluating the performance of specific programmes of work.

3.49
Despite the importance of this monitoring function to the system of performance management, people we interviewed told us that there can be confusion about the respective roles and responsibilities of the public organisations that are monitored, the monitoring agencies or teams, and the users of the monitoring information.

3.50
People also talked about capability and capacity issues in monitoring teams, the need for a broader and more integrated picture of performance, and the desire for a more strategic and forward-looking approach to monitoring performance.

3.51
These responses suggest that some people are not clear about the purpose of monitoring an organisation's performance, the meaningfulness of the measures used, or the value that this type of monitoring has for Ministers, Parliament, the public, and public organisations.

3.52
Gill noted similar issues with the monitoring function in 2011.43 He reflected on why outcome-oriented performance monitoring was underdeveloped and observed that "it seems as if most New Zealand government departments do not understand the role and character of monitoring and evaluation".

3.53
In the context of regulatory (special purpose) reporting, the Productivity Commission observed in 2014 that:

Monitoring plays an important part in New Zealand's accountability framework for regulators. However, current practice and gaps in capability are limiting the potential of monitoring to assure ministers, Parliament and the public that regulators are effectively and efficiently pursuing their regime's objectives or that risks of failure are being appropriately managed.44

3.54
The effectiveness of the monitoring function can affect the usefulness of, and demand for, better performance reporting. However, confusion or tensions in the roles and responsibilities of the parties involved can mean that opportunities to improve performance reporting through the monitoring relationship might be missed.

3.55
In 2020/21, we began a performance audit to examine the effectiveness of statutory monitoring arrangements for Crown entities throughout central government. We expect to finish this audit in 2021/22.

Performance reporting is not always used appropriately

3.56
The way performance information is described, presented, audited, and monitored can be disjointed, confusing, and unhelpful. This might be why performance reporting practice has not progressed in the way that it should. It might also be why, in many cases, performance information is not valued or in demand.

3.57
As Dormer observed in 2018:

governments, and individual government agencies, often publish significant amounts of information that is neither read nor understood by those to whom they are accountable.45

Many New Zealanders want to understand public sector performance

3.58
Our recent research into the state of our public accountability system suggests that people are interested in learning more about the services public organisations deliver and how they spend public money to make a difference in communities.

3.59
However, many people we surveyed did not know why annual reports were prepared, and few read or used them. We asked how public organisations could better communicate what they do. Of the seven options we offered, a one-stop government website was the most preferred and annual reports were the least preferred.

3.60
The people we surveyed expected public organisations to provide services in an efficient, reliable, and responsive way. They also expected them to act, and be seen to act, with respect, integrity, openness, and honesty. They were clear that public organisations need to provide relevant, convenient, easily found, and accessible information.

3.61
Providing relevant and accessible information to New Zealanders and their communities is clearly important if performance information is to be understood and used. However, this does not always happen in practice.

3.62
For example, as part of a wider report about health services and outcomes, the Waitangi Tribunal sought views on the way health organisations are held to account and the importance of effective measures and reporting for reducing Māori health inequities.

3.63
Claimants to the Tribunal highlighted many issues, including:

that public information on the effectiveness of Government policies and programmes is insufficient, denying Māori communities any real opportunity to monitor the Crown's performance.46

3.64
The Tribunal also found that:

there are few measures in place that can be used to hold district health boards to account effectively for the persistence of Māori health inequity.47

3.65
The Tribunal considered that, to be useful, measures "… need to be visible and easily understood both by the sector and by the wider public".48

The public sector does not always use performance information well

3.66
Some have suggested that the public sector makes limited use of performance information. For example, in 2011, Gill found the following issues with the use of organisational performance management information:49

  • Legislatures do not want performance information, or do not use it directly or systematically.
  • Ministers' use of performance information is variable and limited.
  • Central agencies use performance information like a fire alarm – they ignore it until it goes off.
  • In public organisations, managers' use of performance information varies according to their function, level, and role.
  • Public organisations use performance information more for internal "control" purposes and less for seeking external legitimacy and organisational learning.

3.67
A report by the New Zealand Institute of Economic Research about the role and limits of performance measures in New Zealand also noted that:

empirical research provides a generally negative view of how parliamentarians, cabinets, councils and citizens use performance information and a mixed view on usage by managers and individual politicians.50

3.68
In 2012, the then Auditor-General observed that the information and processes that should help central government organisations understand how well they transform their financial resources into outcomes or results (referred to in the report as "value management") were not in place or not widely used.51

3.69
In 2017, the Productivity Commission found that, too often, "the wrong performance information was provided to central agencies or the information provided was used in the wrong way or not at all".52

3.70
In 2018, a Productivity Commission paper on improving state sector productivity noted that "the lack of demand for productivity-related performance information by agency leaders may also reflect the low importance placed on this information by Ministers".53

3.71
We do not expect public organisations to use performance information just to demonstrate performance externally. We also expect them to use it routinely to inform effective management and governance.

3.72
Using performance information internally might involve more detailed information and analysis, but it should also start with what is important to the organisation's external stakeholders. Understanding how well a public organisation is managing its financial resources, capability, and risks to achieve its overall strategic goals is as relevant to the organisation's leadership team as it is to Parliament and the public.

3.73
A clear understanding of how an organisation's activities and services deliver value and contribute to the outcomes that are important for New Zealand is needed for public resources to be directed and managed well. To be useful for public accountability, that story must also be told in a way that is meaningful and accessible for Parliament and the public.


32: The Treasury (2019), Year end reporting: Departmental annual reports and end-of-year performance information on appropriations, page 6.

33: See www.hud.govt.nz.

34: See New Zealand Customs Service, Annual report 2020, page 42.

35: The Productivity Commission (2019), Local Government funding and financing, page 112.

36: Vitalis, H and Butler, C (2019), "Organising for complex problems – beyond contracts, hierarchy and markets", presentation at the XXIII IRSPM Annual Conference.

37: McGuinness Institute (2020), Report 17 – ReportingNZ: Building a reporting framework fit for purpose, pages 41-47, 87, and 94.

38: Office of the Auditor-General (2020), Analysing government expenditure related to natural hazards.

39: Keerasuntonpong, P and Cordery, C (2018), "How might normative and mimetic pressures improve local government service performance reporting?", Accounting and Finance, Vol 58, page 1193.

40: See the appendix for a summary of these performance standards.

41: MWH and JHI consultancy (2007), Review of Local Government Act 2002 engagement and decision-making provisions, page 15.

42: Schick, A (1996), The spirit of reform: Managing the New Zealand State Sector in a time of change, page 80.

43: Gill, D, et al (2011), The Iron Cage Recreated – The performance management of state organisations in New Zealand, Institute of Policy Studies, Victoria University of Wellington, page 451.

44: The Productivity Commission (2014), Regulatory institutions and practices, page 372.

45: Dormer, R (2018), Accountability and public governance in New Zealand, unpublished research paper for the Office of the Auditor-General, pages 31-32.

46: Waitangi Tribunal (2019), Hauora: Report on Stage One of the health services and outcomes Kaupapa inquiry, page 64.

47: Waitangi Tribunal (2019), Hauora: Report on Stage One of the health services and outcomes Kaupapa inquiry, page 122.

48: Waitangi Tribunal (2019), Hauora: Report on Stage One of the health services and outcomes Kaupapa inquiry, page 125.

49: Summarised from Gill, D, et al (2011), The Iron Cage Recreated – The performance management of state organisations in New Zealand, Institute of Policy Studies, Victoria University of Wellington, page 474.

50: New Zealand Institute of Economic Research (2012), Role and limits of performance measures – Report of the Performance Measurement Research Project for the Technical Working Group, pages 6 and 35.

51: Office of the Auditor-General (2012), Reviewing financial management in central government, page 5.

52: The Productivity Commission (2017), Efficiency and performance in the New Zealand State Sector: Reflections of Senior State Sector Leaders, page 40.

53: The Productivity Commission (2018), Improving state sector productivity: Final report of the measuring and improving state sector productivity inquiry, volume 1, page 14.