Part 7: Is regional services planning delivering the intended effects?

7.1
In this Part, we discuss our findings about how well the Ministry knows whether regional services planning has been successful in delivering the intended effects.

7.2
The descriptions of the intended effects have shifted somewhat over time. However, to recap, they are to secure future improvements in clinical and financial sustainability by focusing on:

  • making vulnerable services more resilient;
  • reducing costs by service, compared with previous trends; and
  • improving the quality of patient care.

7.3
Three years on, the Ministry does not know whether regional services planning is working as intended. This is because:

  • the Ministry's evolutionary approach to regional services planning will take longer to show results;
  • the Ministry did not define the desired benefits expected from regional services planning in a measurable way (either quantitatively or qualitatively), outside the back-office work;
  • the Ministry does not monitor clinical and financial sustainability through regional services plans (instead, the Ministry monitors sustainability through other operational plans, activities to achieve the aims of those plans, and performance towards some national targets); and
  • there is little evidence of measurable change in clinical and financial sustainability – this is partly because the first regional services plans had no baselines to compare with.

The evolutionary and regulatory approaches

7.4
The Ministry's monitoring of regional services plans has changed since 2011. However, the Ministry's monitoring remains focused on activities, rather than the intended effects or outcomes of regional services planning. This means that it is difficult to find evidence of the extent to which regional services planning is helping to improve performance in the health and disability sector.

7.5
Figure 3 shows the main steps in the evolutionary approach the Ministry has taken to putting regional services planning into effect, and compares it to the approach implied by the Review Group's report, amendments to the Act, the regulations, and the Ministry's written guidance.

Figure 3
Putting regional services planning into effect

7.6
The main difference between the approaches is the stage at which it will be possible to see measurable changes resulting from regional services planning. The evolutionary approach will see full measurement of outcomes by June 2016 in three services, whereas the regulatory approach anticipated full benefits by June 2014.

7.7
The Ministry considers that progress on regional collaboration within the first few years was in line with expectations. In its view, the Review Group's expectation that full benefits would emerge in about three years was too optimistic. The NHB saw the relationships built during planning as more important than the specific content of the plans. The Ministry points to creating the right foundations to support links between regions, including building capacity and capability. It took a deliberately slower path to putting regional services plans fully into effect, to ensure consistency of approach and to secure the involvement of clinicians.

7.8
Although we do not disagree with the importance of these elements, we were looking for more objective evidence, even if that was qualitative rather than quantitative. In 2013, the Prime Minister's Chief Science Advisor stated that "without objective evidence, the options and the implications of various policy initiatives cannot be measured".9 He went on to say that, without objective evidence, judgement is often based on opinion or belief. He recommended planned evaluation to ensure that the desired effects of the policy are being realised, especially where complexity makes forming policy particularly challenging.

7.9
Without evaluation, we cannot say whether the Ministry's leadership is taking the health sector far or fast enough. In the remainder of this Part, we discuss the problems we had in trying to locate measurable results for the intended effects.

Vulnerable services and clinical sustainability

7.10
We expected to see evidence in regional services plans that regions were setting up sustainable solutions to strengthen vulnerable services. We expected to see that vulnerable services had been defined and identified. We then expected to see plans addressing those services. One expected outcome was the identification of services that needed to be planned and funded nationally or regionally.

7.11
DHBs are not required to use the Ministry's definition of vulnerable services. The four 2013/14 regional services plans address vulnerable services differently and have done so in each of the three rounds of regional services planning. The regional services plans for 2013/14 show some evidence that the Northern, Midland, and Central regions remain focused on clinical services that they consider vulnerable. The Midland region has a focus on information technology as a vulnerable service. The South Island region identifies the workforce in general as being vulnerable.

7.12
The Ministry's guidance for 2013/14 focuses on future financial and clinical viability of a safe, quality public health and disability service, rather than vulnerable services specifically. Noting that DHBs "have responded quickly to identify service vulnerabilities", the guidance mentions vulnerable services only as a subset of mental health services.

7.13
This mirrors what we found in our fieldwork and analysis of documents. The Ministry and the regions had moved on to thinking about vulnerable services as part of their "whole of system" approach to improve quality. This follows the New Zealand "Triple Aim" objectives (see Figure 4).10

Figure 4
New Zealand Triple Aim Initiative objectives

Sources: United States Institute for Healthcare Improvement Triple Aim Initiative, Ministry of Health

7.14
We found some good examples of a sustained focus on a vulnerable service, such as the Central region's continued work to strengthen its Women's Health Service. However, the approach to identifying and monitoring vulnerable services was so variable that we could not verify whether the Minister's intention of strengthening vulnerable services had been met.

7.15
Where regions include a reference to vulnerable services, the Ministry will provide feedback through monitoring. However, if a regional services plan is silent on vulnerable services, the Ministry does not challenge this. We could not consistently track reductions in the vulnerability of services in the 2012/13 or the 2013/14 plans.

7.16
Regions told us that services become vulnerable or are no longer vulnerable for many reasons. Although we understand this comment, we would expect to see a narrative on services that have moved in or out of vulnerability. This could be in the regional services plans or a regional risk register, if more appropriate. Although we make no specific recommendation, we encourage the Ministry to consider whether it has made enough progress in identifying those services that need to be planned nationally and regionally.

The changing rate of increase in health spending

7.17
We expected to find that regions were reducing the rate of increase in costs of health and disability services, compared with previous trends. We also expected that chief financial officers would be:

  • aware of this intended effect and have evidence of it; and
  • able to identify cost-benefits from delivering services regionally.

7.18
We were not looking exclusively for absolute cost reductions, although we thought we might have seen some of this – for example, as procurement savings filtered down into service delivery.

7.19
During our fieldwork, we asked for examples of this intended effect. We were given just one example arising from a regional services planning initiative (see Figure 5). The Ministry, regional offices, and DHBs were unable to provide other examples.

Figure 5
The Northern region's First Do No Harm programme

The Northern region launched the First Do No Harm programme in December 2011. Putting this programme into effect successfully is one of the main goals of the Northern Regional Health plan. The First Do No Harm website states that there is clear evidence that certain interventions, if systematically applied, will improve patient safety, reduce costs, and save patient lives. A 2009 study of 1998 hospital discharges in Otago found that 12.9% involved adverse events. Of those, 15% were permanent or fatal and 33% were significantly avoidable. At an average cost of $13,000 for each adverse event, the cost of preventable events is estimated to be $573 million a year.

First Do No Harm focuses on reducing harm from falls and pressure injuries in hospitals and residential aged care, reducing health-care-associated infections in acute care, improving medication safety, and improving safety during care transitions. The programme is planned, funded, and delivered through the Northern DHB support agency, working with primary health care as well as DHBs and aged residential care. The agency is in turn funded by contributions from the four DHBs.

The Northern region has clear targets related to improving quality of care and "return on investment". The region has calculated that, if it met the targets for the project (reducing harm and, therefore, improving quality of care), it would see a 1% reduction in expenditure in the four Northern region DHBs, "which would result in a payback of around 250% on the $0.9 million budget in 2012/13".
Did First Do No Harm contribute to the intended effects of regional services planning?
  • Plan, fund, deliver ✓
  • Quality of care ✓
  • Reduce costs ✓
  • Measure outcomes ✓
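
The estimates quoted in Figure 5 can be combined into a rough worked calculation. The sketch below is illustrative only: the national discharge volume is our assumption (about 1.035 million discharges a year), chosen so that the arithmetic can be followed, and is not a figure stated in the programme's material:

\[
\underbrace{1.035 \times 10^{6}}_{\text{assumed discharges/year}} \times \underbrace{0.129}_{\text{adverse events}} \times \underbrace{0.33}_{\text{avoidable}} \times \underbrace{\$13{,}000}_{\text{average cost}} \approx \$573 \text{ million a year}
\]

Similarly, on one reading of the "payback of around 250%", the expected savings would be about $0.9 million × 2.5 ≈ $2.25 million across the four Northern region DHBs in 2012/13.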

7.20
We saw no Ministry monitoring of changes in cost by service arising from regional services plans. DHB financial break-even is an objective (and measure) in the regional services planning guidance and is monitored through DHB annual plans. The Ministry told us that, because the starting point of DHBs for regional collaboration was so uneven, it was unrealistic to expect the first regional services plans to include a full range of quantitative measures, such as costs. However, the planning regulations required the plans to be fully costed from the start. This "implementation lag" is why we have had difficulty finding evidence that the intended effects have happened.

7.21
Some quantified savings are forecast in back-office support services, such as banking services, insurance, and information systems.11 These flow from the work of HBL. HBL reaches agreement with each DHB on the costs and benefits expected from HBL initiatives. By July 2013, HBL was reporting achievement of $213.4 million of savings in the first three years. The reporting of savings is based on (unaudited) returns that DHBs submit to HBL. We say more on this in Health sector: Results of the 2011/12 audits.12

7.22
In addition to the HBL savings, regional shared services agencies also use joint procurement and supply to drive down costs. Examples include joint purchasing of expensive radiology and information technology systems and equipment.

7.23
The Ministry and DHBs gave us the following main reasons for the lack of information on costs in health and disability services in regional services plans:

  • It is difficult to attribute changes in costs to any one thing, including regional services planning.
  • It is too early to see cost savings from regional services plans.
  • It is too difficult to get the data from information systems.
  • Costs are increasing as more interventions take place.
  • Although costs are actually increasing, productivity or throughput is increasing for the same resources (the Ministry and the DHBs did not provide any evidence of increasing productivity).

Improving patient care

7.24
We expected to see evidence of improvements in the quality of care that could be attributed to regional services planning. As quality can be interpreted differently, we looked specifically at improvements in timeliness and equity of access. We use equity of access to describe how people are able to access services, irrespective of where they live in the region. We did not audit clinical safety because the work of the Health Quality and Safety Commission was outside the scope of our performance audit. The Health Quality and Safety Commission works with the health sector, with the overriding aim of reducing preventable harm to patients and service users.

7.25
On timeliness, we looked for quantitative evidence of performance improvement from one year to the next. For example, we looked for increases in numbers or percentages of patients receiving timely, high-quality treatment. We did find some examples of changed targets in initiatives that had been running for some years (in workstreams such as cancer services, cardiac services, and stroke services). For example, the Northern region action plan for cardiovascular disease set a target of 90% of outpatient coronary angiograms to be seen within three months in 2013/14. This was up 5% on the previous year's achievement. However, we saw few measures outside well-established workstreams.
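
The "up 5%" comparison implies that the previous year's achievement was about 85%, whichever way the percentage is read. As a brief arithmetical note (ours, not the action plan's):

\[
90\% - 5 \text{ percentage points} = 85\% \qquad \text{or} \qquad 90\% \div 1.05 \approx 85.7\%
\]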

7.26
On equity of access, we found few examples of initiatives outside the cancer services workstream. For instance, we saw little evidence of new regional clinical protocols that would increase equity of access to care.

7.27
Where improvements were being achieved, they were often the result of other nationally led initiatives, many of which had further funding attached, such as:

  • the Better, Sooner, More Convenient policy aimed at treating people more quickly and closer to home – this includes integrated health centres, intended to provide a full range of services, including specialist assessments by general practitioners, minor surgery, walk-in access, chronic care, increased nursing, and selected social services;
  • targets to increase the number of elective operations, with financial incentives for those DHBs that meet them;
  • further resources for older people, specifically for dementia;
  • Better Public Services initiatives, particularly for vulnerable children; and
  • the Maternity Quality Initiative.

7.28
This is not an exhaustive list, but gives a flavour of the complicated policy landscape within health and disability services. This reflects the Review Group's observation that "funding for new national initiatives also tends to be 'layered' on top of existing DHB activity". It also shows that there are few direct incentives linked to regional services planning.

7.29
We tested our findings about equity of access with staff from regional offices, the Ministry, and DHB senior managers. Almost all said that it was too early to see evidence of regional services planning having a positive effect on quality of care.

7.30
We heard a lot about work in progress, particularly on information technology systems, that would help to speed up access to services and the flow of information between points in the health and disability system. These included:

  • GP2GP file transfer – so that medical records move swiftly between general practices if a patient changes their general practitioner (about 820 general practices are using this technology);
  • maternity clinical information system – due to be phased in towards the end of 2013;
  • patient portals, due by 2014, which enable patients, as well as those involved in their care, to see their medical records; and
  • the national shared-care planning programme.

7.31
Many of these initiatives are relatively new or not yet fully put into effect. A recent evaluation found that the national shared-care planning programme had been slow to take off. The evaluation highlighted factors beyond the information technology systems, such as workforce development, getting appropriate funding, and understanding the patient's point of view. However, some clear benefits are possible, and some earlier changes, such as making referrals electronically, are becoming well established.

7.32
Regions had some good ideas about how improvements in performance could be recorded more systematically for a range of initiatives and plans. Clinical leadership of networks is starting to produce a more evidence-based approach to auditing for improved outcomes. A common comment from many senior staff was that they would like the plans to evolve to have a longer-term view with fewer mandatory priorities. We consider that this is a good time for the Ministry and the regions to consider how they can show progress. In 2016, we will return to the topic of regional services planning.

Recommendation 7
We recommend that the Ministry of Health and district health boards work together to prepare an evaluation framework and use it to work out whether regional services planning is having the intended effects.

9: Gluckman, P. (2013), The Role of Evidence in Policy Formation and Implementation, Office of the Prime Minister's Science Advisory Committee, available at www.pmcsa.org.nz.

10: The United States Institute for Healthcare Improvement prepared the Triple Aim Initiative framework. The Ministry of Health is a partner in the Initiative.

11: We say more about how HBL has set up collective insurance arrangements in our June 2013 report, Insuring public assets, available at www.oag.govt.nz.

12: Controller and Auditor-General (2013), Health sector: Results of the 2011/12 audits, available at www.oag.govt.nz.
