Part 4: Performance reporting

4.1
In this Part, we discuss:

  • the integrity of the Service’s performance measurement information;
  • the measures used to report on the effectiveness and efficiency of customs revenue collection;
  • monitoring compliance; and
  • how the Service compared its performance with other customs agencies.

The integrity of performance information

4.2
We expected the data used to report performance measures in the Service’s accountability documents to be accurate and complete.

Our findings

4.3
The customs revenue collection performance measures were a mix of financial and non-financial measures, which we list in the Appendix. The Service obtained figures for some activity and financial measures from information held in CusMod and the FMIS (see rows 1, 2, and 4 to 8 of the Appendix). As we reported in Part 3, this information was accurate and complete.

4.4
For other quantitative measures (see rows 3 and 12 to 14 of the Appendix), such as the number of revenue audits carried out, the Service kept statistics. We randomly selected one of these measures, and the Service produced an audit trail showing how the statistics were collected.

Our conclusion

4.5
The data the Service used to report performance measures was accurate and complete.

Measuring the effectiveness and efficiency of customs revenue collection

4.6
The Public Finance Act 1989 requires a government department’s annual report to contain, among other things, a Statement of Service Performance that includes, for each class of outputs, the standard of delivery performance achieved compared with the forecast standards.

4.7
The purpose of measures and standards is to produce information to enable an informed assessment of a department’s performance during the year.

4.8
Annual reports should help answer the question “What difference did you make?” We have published Reporting Public Sector Performance1 to stimulate discussion about how public entities measure and report their performance. That report can help entities consider which measures to use when reporting performance.

4.9
We expected the Service’s performance measures in its annual report to provide assurance about the customs revenue collected and about the effectiveness and efficiency of customs revenue collection.

Our findings

4.10
The Service prepared an annual report that included a Statement of Service Performance reporting performance against measures and standards compared with the previous year.

4.11
The measures could mainly be characterised as “busyness” measures, which are often demand-driven. They described how many activities were carried out and, in some cases, how well, but they generally did not assess the Service’s performance in collecting revenue.

4.12
Although the voluntary compliance regime relies on individuals and traders understanding what they need to do to comply with customs legislation, there were no measures about information or education. For example, there were no measures about issuing accurate procedure statements for CCAs or about educational visits to CCA licensees or traders.

4.13
In the body of its annual report (but not in the Statement of Service Performance), the Service reported continuing improvements in the cost-efficiency of its revenue collection. In the year ended 30 June 2006, the Service collected about $3,000 for every dollar spent to collect revenue. However, this ratio did not include all the costs of collecting revenue. It was based on a narrow range of costs within the Revenue collection, accounting and debt management output class – mostly the costs of running the National Credit Control Unit and some costs from the finance department.
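
For illustration, the reported figure is a simple ratio of revenue collected to the costs counted in the calculation. A minimal worked example follows; the dollar amounts are hypothetical and are not drawn from the Service’s accounts:

\[
\text{cost-efficiency ratio} = \frac{\text{customs revenue collected}}{\text{collection costs included}}
\]

For example, $3 billion of revenue collected against $1 million of included costs gives a ratio of 3,000:1 – that is, about $3,000 collected for every dollar spent. Because the denominator covered only a narrow range of costs, the reported ratio will be higher than one calculated on the Service’s full costs of collecting revenue.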

4.14
The Service’s 2006/07 Annual Plan included a project to comprehensively review the output classes it used to report its performance, including its performance measures, and then to review its costing model. This project was in progress during our audit. We saw some of the Service’s draft documents, but we did not review them because their contents were subject to change.

Our conclusions

4.15
The Service’s measures did not enable a reader to assess its effectiveness or efficiency, and they did not cover the whole voluntary compliance regime. They provided information about some aspects of the regime but did not give assurance about the whole of the Service’s customs revenue arrangements.

Recommendation 5
We recommend that the New Zealand Customs Service enhance its performance reporting measures to illustrate the contribution of all of its activities – including education, intelligence, and audit – to the voluntary compliance regime.

Monitoring compliance

4.16
One of the difficult areas for revenue-collecting agencies is knowing whether all the revenue that should be collected is being collected. Nevertheless, we expected the Service to be able to demonstrate, to the greatest degree practicable, whether compliance with the Act was improving.

4.17
We know that not all the customs revenue due is collected. This is because, in any voluntary compliance regime, a balance is struck between the cost-effective collection of revenue and encouraging trade through the efficient flow of goods. The Government accepts that every transaction cannot be audited. However, there is an expectation that other measures will reduce, to the lowest degree practicable, the unpaid customs revenue.

Our findings

4.18
We could not assess from the information available whether compliance was improving. For example, there was no analysis of the reasons for customs revenue not being collected and no analysis of trends in reasons for non-compliance.

4.19
One of the Service’s strategic goals was to provide increased assurance about the integrity of customs revenue collection. To achieve this goal, in 2006/07 it intended to develop and trial a method for assessing the level of customs revenue collected. It then planned to use that understanding to target ways of reducing the customs revenue not collected. At the time of our audit, this work was in progress. It built on work the Service commissioned in 2004 to “estimate and target so-called revenue gaps”.
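
In broad terms, work of this kind estimates a “revenue gap”. A minimal sketch, assuming the standard formulation used in revenue administration (the Service’s own method was still being developed and is not described here):

\[
\text{revenue gap} = \text{customs revenue theoretically due under the Act} - \text{customs revenue actually collected}
\]

Targeting then means directing audit, intelligence, and education effort at the goods classes, transaction types, or traders estimated to contribute most to that gap.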

Our conclusions

4.20
We could not assess from the information available whether compliance with customs revenue requirements was improving, and we recognise that this can be a difficult area for revenue-collecting agencies to address. We are interested in the results of the Service’s project to develop and trial a method for assessing the level of customs revenue collected.

Comparing performance with other customs agencies

4.21
We expected the Service to compare its performance with that of customs agencies in other countries, to help it identify ways to improve the effectiveness and efficiency of customs revenue collection.

Our findings

4.22
The Service did not undertake formal benchmarking projects or compare key performance measures with other customs agencies. It told us that it had not yet found a customs agency similar enough to make comparisons feasible, but it continued to seek opportunities to undertake formal comparisons.

4.23
The Service kept an extensive calendar of international meetings and events, so staff could find out when meetings would be held and who, if anyone, the Service would send. As part of its continuing contact with other customs agencies, the Service was able to compare its performance with those agencies in various ways.

4.24
Customs agencies collaborated through the WCO to improve performance in all aspects of customs. The Service was able to use the WCO’s resources as needed:

  • In 2006, the WCO created WorldCap, an information management database holding data from member countries, to support capacity-building initiatives and benchmarking and to promote partnerships between members.
  • In 2005, the WCO published a Compendium of Integrity Best Practices that included two examples from New Zealand.

4.25
The WCO was directed by the full Council of 169 member countries and by the 24-member Policy Commission. The Service was a member of the Policy Commission, and it expected to be looked to increasingly for the contribution it could make to capacity building for less-developed member countries and to the WCO at a strategic and policy level.

4.26
As an example, in January 2007, because of the views the Service expressed at a WCO meeting in India and a book it commissioned from the Institute of Policy Studies,2 the WCO’s Secretary General asked the Service to produce a discussion paper giving an independent (that is, non-customs) view of:

  • what the border might look like in the 21st century; and
  • how customs agencies would interact with that border and with their main stakeholders.

Our conclusion

4.27
Through its international contacts and participation in joint activities, the Service had opportunities to compare its performance with that of other countries’ customs agencies. However, there are difficulties in undertaking formal benchmarking between different revenue collection systems.


1: Reporting Public Sector Performance, second edition, January 2002, www.oag.govt.nz.

2: Andrew Ladley and Nicola White (2006), Conceptualising the Border, Institute of Policy Studies, Wellington.
