Ontario's municipal performance measurement program: fostering innovation and accountability in local government.
Five years later, the MPMP has proven useful to municipalities and taxpayers alike. After playing a leading role in developing the program, municipalities are now poised to receive some payback. The measures are now well defined and widely understood, and the results are available to taxpayers, elected officials, and administrators. Municipalities can compare their performance year-over-year or with the results of their peers. The public can use the information to persuade their representatives to make cost-saving improvements to services. Data from the MPMP has been used to uncover dozens of best practices in municipal service delivery--practices that have documented and quantifiable benefits and that are made freely available to all municipalities to replicate. Similarly, for the provincial government that launched the program, long-term success is now tied to the improvements it fosters.
This article examines how the MPMP has helped improve municipal accountability and service delivery in Ontario. It explains why the province embarked on the initiative and describes what the program measures and how it works. It concludes by highlighting the program results to date and the challenges to continued success.
WHY ONTARIO DEVELOPED THE MPMP (1)
With 11 million residents, Ontario is Canada's most populous province and the engine that powers the Canadian economy. Larger in area than Texas and New Mexico combined, the province comprises urban regions and agricultural communities in the south, and sparsely populated, resource-based economies in the north. The 445 municipalities in the province are a mix of large cities such as Toronto and Ottawa, and regional and county municipalities with constituent cities, towns, townships, and villages.
In the 1990s, the Ontario government initiated a realignment of responsibilities and sweeping reforms in the provincial-municipal relationship. Municipalities assumed full financial responsibility and greater control of 12 services that were previously shared with the provincial government. About $3 billion in services was exchanged. At the same time, the provincial government introduced municipal financial reform that brought consistent province-wide market value assessment to the municipalities' property tax base. The same reform effort lifted 50 percent of the costs of public education from the municipal property tax base, allowing municipalities to redirect this revenue to meet their expanded service responsibilities.
Finally, the province enacted new legislation in 2001 that gave municipalities more flexibility in managing their operations, but also required greater accountability to local taxpayers. The MPMP was a key element in the accountability framework set out in the legislation.
At the same time, many municipalities were implementing their own cost saving or performance improvement initiatives. In 2000, municipal administrators from the largest municipalities formed the Ontario Municipal CAO's Benchmarking Initiative to identify exemplary practices in public service provision. Since then, OMBI has attracted a wider clientele among progressive municipalities that see the benefits of measurement and comparison. The OMBI experience, which is discussed below, is illustrative of the timeliness and readiness for the provincial performance measurement program.
THE MPMP AND HOW IT WORKS
The MPMP is a measurement and reporting system that provides high-level information to help local taxpayers, elected officials, and administrators evaluate municipal services. The program consists of performance measures, data collection guides, and reporting tools that allow municipalities to generate, disseminate, and use standardized performance data on core municipal services.
The Ontario Ministry of Municipal Affairs and Housing is the provincial body responsible for the MPMP, and acts in an administrative capacity to ensure the program runs smoothly and improves over time. In this capacity, the ministry is also responsible for the overall integrity of the MPMP data, and has an ongoing role in data verification and validation.
Municipal Involvement. Early on, the ministry recognized the need for municipal involvement in developing the program. Municipal administrators had recently formed the Ontario Municipal CAO's Benchmarking Initiative to identify and share best practices in municipal government. OMBI is a partnership of a progressive group of city administrators(2) and service area experts who are champions of performance measurement for municipal service improvement. OMBI had developed performance measures for municipal programs, services, and activities. For the initial group of MPMP measures, the ministry borrowed heavily from the OMBI experience. OMBI had spent considerable time and effort in developing and refining municipal performance measures, and using their measures was a logical starting point for the MPMP.
In 2001, the ministry formed the MPMP Advisory Committee in partnership with the Association of Municipalities of Ontario. AMO is the association of elected municipal officials in the province, and its support for the program was essential. The committee is made up of representatives from AMO, other municipal and professional associations, and small, medium and large municipalities. OMBI experts are also involved. Through technical working groups of subject matter experts, the committee reviews the measures and makes recommendations for improvements. It also helps facilitate the use of the measures by directing MPMP information to municipal associations, elected officials, and staff.
Performance Measures. The MPMP framework was developed taking a phased approach. At first, the program looked at 35 measures in nine municipal service areas; now it examines 54 measures in 12 areas. (3) The service areas are fire, police, roadways, transit, wastewater, storm water, drinking water, solid waste, land use planning, local government, libraries, and parks and recreation. These services represent major cost centers for Ontario municipalities, as shown in Exhibit 1.
The MPMP uses effectiveness and efficiency measures to assess performance in each of these areas. Effectiveness refers to the extent to which a service is achieving its intended results--for example, the percentage of garbage that is recycled. Efficiency refers to the amount of resources used to produce a given amount of service. The efficiency measures are based on operating costs only--for example, the operating costs per ton of garbage recycled. Although the MPMP excludes capital costs in calculating efficiency, the advisory committee is now looking at ways to include capital costs in the calculations. The program defines operating costs to exclude principal and interest payments on long-term debt so that the way a municipality funds its capital projects does not influence the performance measurement results.
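The distinction between the two measure types can be sketched in code. The functions and figures below are hypothetical illustrations, not the ministry's published formulas; only the general pattern (a results ratio for effectiveness, an operating-cost ratio that excludes long-term debt charges for efficiency) comes from the article.

```python
# Illustrative sketch of MPMP-style measures for solid waste recycling.
# All figures are hypothetical; the actual formulas are defined in the
# ministry's data collection guides.

def effectiveness_recycling(tonnes_recycled, tonnes_collected):
    """Effectiveness: percentage of collected garbage that is recycled."""
    return 100.0 * tonnes_recycled / tonnes_collected

def efficiency_recycling(operating_costs, debt_principal_and_interest,
                         tonnes_recycled):
    """Efficiency: operating cost per tonne recycled. Principal and
    interest on long-term debt are excluded so that a municipality's
    capital financing choices do not distort the result."""
    eligible_costs = operating_costs - debt_principal_and_interest
    return eligible_costs / tonnes_recycled

print(effectiveness_recycling(12_000, 48_000))           # 25.0 percent
print(efficiency_recycling(1_500_000, 300_000, 12_000))  # 100.0 $/tonne
```

The exclusion of debt charges in the second function is the point of the MPMP's operating-cost definition: two municipalities delivering identical service at identical cost should report the same efficiency result regardless of how they financed their capital assets.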
After the first year, the municipal stakeholders made recommendations for improvements to many of the definitions, instructions, and formulas for calculating the required measures. The recommended changes were implemented in the next reporting year. About two-thirds of the measures were modified or replaced between the first and second year. Once the measures in the nine service areas were stabilized in the third year, the program expanded to include three additional service areas, including hard-to-measure service areas such as parks and recreation.
Accounting Challenges. One challenge to ensuring fair comparisons among the municipalities was accounting for the impact of indirect costs on the various service areas. Indirect costs refer to the costs of internal services (such as payroll) used by all municipal departments. According to municipal treasurers, indirect costs represent between 5 and 15 percent of the costs of general government. Any lack of consistency in the treatment of indirect costs would undermine the comparisons of efficiency and value in municipal service delivery.
In 2001, the ministry asked OMBI experts to develop a model for addressing indirect costs. OMBI proposed dividing general government costs into three categories: governance, corporate management, and program support. Program support refers to services handled centrally, but attributable to particular service areas. Such services include payroll and printing, for example. Corporate management costs are general expenses, such as legal support and corporate communications, which are less easily allocated to service areas. Governance includes things like municipal council expenses, which are also not easily assigned to service areas.
OMBI recommended phasing in the standardized reporting of indirect costs over a two-year period. It suggested asking larger municipalities--those with populations exceeding 100,000--to use the OMBI method, but allowing the smaller ones to allocate indirect costs as set percentages across service areas. The ministry published the OMBI methods and phase-in plan with a cautionary note explaining the transition to analysts making comparisons among municipalities.
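The two-track allocation described above can be illustrated with a short sketch. The three-way cost classification, the 100,000-population threshold, and the flat-percentage fallback for smaller municipalities come from the article; the function name, data shapes, and all dollar figures are invented for illustration.

```python
# Sketch of the OMBI indirect-cost approach: general government costs
# are split into governance, corporate management, and program support,
# and program support (e.g., payroll, printing) is allocated back to
# individual service areas. All amounts here are hypothetical.

LARGE_POPULATION = 100_000  # threshold cited in the article

def allocate_program_support(population, program_support_costs,
                             usage_by_service, flat_shares):
    """Allocate program support costs across service areas.

    Larger municipalities allocate by measured usage; smaller ones
    apply set percentages across service areas."""
    if population >= LARGE_POPULATION:
        total = sum(usage_by_service.values())
        shares = {svc: use / total for svc, use in usage_by_service.items()}
    else:
        shares = flat_shares  # set percentages for small municipalities
    return {svc: program_support_costs * share
            for svc, share in shares.items()}

# Hypothetical example: a large municipality allocating $900,000 of
# program support by payroll transaction volume.
alloc = allocate_program_support(
    250_000, 900_000,
    usage_by_service={"fire": 300, "roads": 450, "libraries": 150},
    flat_shares=None)
print(alloc)
```

Whichever track is used, the allocated amounts sum back to the original program support total, so the split changes how costs are distributed among service areas without changing the total being compared.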
Municipal Reporting. Municipalities collect and submit the performance data to the province using the annual Financial Information Return, or FIR. To accommodate MPMP reporting, the ministry added a set of non-financial information schedules to the existing FIR schedules. Using the FIR made sense for three main reasons: (1) the municipalities were familiar with the program, (2) they were already inputting much of the cost and statistical data needed to calculate efficiency measures, and (3) the reporting routines and schedules were well established. The ministry could simply build on the existing platform and its instructional materials.
Using FIR schedules, the municipalities report performance data on service efficiency and effectiveness, as well as the various factors that help to explain the results. The municipalities also describe their responsibilities and contractual agreements for service delivery, and provide information to use in the numerator and denominator of some measures. The ministry encourages municipalities to provide explanatory information to help put results in perspective. For example, the unit cost of snow removal for a municipality is influenced by a range of factors, including the amount of snowfall and the frequency of winter storm events, road types, traffic volume, maintenance standards, and service level decisions made by the municipal council. Municipalities have been eager to explain their MPMP results with this type of information. The explanations have proven useful in refining the measures and identifying the factors affecting performance results.
Most municipalities reported their 2000 performance results to the province in the summer of 2001 and to their taxpayers in the fall. Though some delays occurred, the exercise ran as smoothly as anticipated. The municipalities used a number of different media to communicate the results to taxpayers, the most common of which were posting the results on the Internet, mailing them to households in their tax or water bills, and publishing them in local newspapers. Some municipalities reported the MPMP results with the results of other initiatives, such as the OMBI or their own citizen surveys. Media coverage across the province was generally balanced. News reports often compared the local results with those of neighbouring or similar-sized municipalities, in most cases allowing municipal leaders to comment on or provide explanations for any significant differences among jurisdictions. In fact, the media's support of the program was helpful in gaining the buy-in of municipal leaders.
Benefits to Municipalities. Ensuring tangible benefits to municipalities is central to the long-term success of the MPMP. The Ontario Centre for Municipal Best Practices is a joint provincial-municipal initiative that was created for this purpose. The centre analyzes performance data from MPMP, OMBI, and other benchmarking activities to identify and promote exemplary service practices. It classifies service practices and examines underlying factors to identify the best-by-class performers. It makes technical analyses of service practices in case study formats, and suggests approaches to incorporating these practices into municipal decision making. The centre also provides feedback on the MPMP standards and their use.
The Association of Municipalities of Ontario first proposed the centre in 2002 as a complementary institution to the MPMP and OMBI (see Exhibit 2). Funded by the province, the centre operates as a virtual (online) resource for municipalities. In its first two years, the centre published more than 40 studies on best practices in Ontario municipalities in four service areas. It is now polling AMO members to determine their interest in the case studies on best practices, and to ask what more the centre can do to help municipalities improve their service delivery.
Even though the ministry had been granted legislative authority to enact a performance measurement framework for municipalities in the province, the decision to make MPMP a mandatory program generated some resistance in the municipal sector, especially among municipalities that were not practicing performance measurement at the time. While the mandatory system jump started performance measurement in municipalities where it previously did not exist, the onus fell on the ministry to convince these municipalities that the benefits of performance measurement extend well beyond the provincial reporting requirements. The Ontario Centre for Municipal Best Practices was part of the negotiated settlement between the ministry and the municipal sector to ensure that all stakeholders were committed to performance improvement in service delivery.
Five years after its establishment, the Ontario Municipal Performance Measurement Program is still a work in progress. While the MPMP has achieved important results for Ontario municipalities and their taxpayers, there are challenges that must be overcome in the coming years.
Results to Date. The MPMP has increased awareness and knowledge of municipal performance among citizens, municipalities, and provincial authorities. It has strengthened public accountability by requiring that key performance information be made public. It has increased elected officials' and administrators' understanding of their own performance through training, measurement, and reporting activities. Municipalities are now able to compare their results year-over-year or with the results of other municipalities (see Exhibit 3). This has led to valuable information exchanges and, in some cases, to decisions for reviews of how a service is being delivered to the public.
The program has also been a catalyst for institutional development in the municipal sector. It has resulted in multi-stakeholder structures such as committees and working groups for reviewing performance measures, activities, and data. It has reinforced provincial-municipal relations through the creation of the Ontario Centre for Municipal Best Practices. It has helped foster a performance measurement culture in the province and stimulated the work of more intensive measurement initiatives such as OMBI. It has led to municipalities strengthening their internal management structures, and encouraged results-based planning and improvement.
The program has created conditions for improving the quality of municipal services. It has also created the tools and data to verify this claim. While anecdotal evidence suggests that municipalities are making small yet valuable improvements in their operations, the program's benefits become more apparent as comparative data are compiled each year. In the meantime, the ministry continues to develop tools and guides to help municipalities use the data to improve service delivery and financial management.
Challenges to Overcome.
With the growing pains of program start-up now subsiding, the ministry is focused on the goal of achieving lasting success. There are some challenges. A small number of municipalities continue to voice their opposition to the idea of measuring and reporting performance. Their voices will grow louder if they do not experience the benefits just described. Some municipal treasurers have complained that while they are increasingly being asked to perform analyses of their local MPMP results, a great deal of effort is required to do any sort of meaningful comparison with their peers. Most municipalities have overcome their initial apprehension, but not everyone embraces the program.
The ministry is responding by releasing summary data and plans to establish a Web-based tool that will allow users to take individual MPMP results and place them into a larger context using provincial averages. While there is a desire to allow comparison of individual results among municipalities, the ministry must also consider its earlier promise not to publicly rank municipalities using the MPMP results.
Another challenge is timeliness. Ensuring the accuracy of the data for comparative reporting takes time. On average, it takes most municipalities 12 months from year-end to provide the required data, and just as long for the ministry to examine and verify the data provided. The exercise is becoming more efficient, but data users expect further improvements.
The desire to move to higher-order efficiency measures in some service areas presents other challenges to the program. In these instances, the ministry would like to adopt measures that have denominators based on units of service (as opposed to per capita or per $1,000 of assessment), but the methodology for measurement is often complicated. For example, the new efficiency measure in library services is cost per library use. But more than 100 Ontario municipalities have more than one library board, which makes the apportionment of library use a complex and coordinated effort.
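The apportionment problem can be made concrete with a small sketch. The cost-per-use measure and the multiple-board scenario come from the article; the board names, figures, and data structure are invented for illustration.

```python
# Hypothetical cost-per-library-use calculation for a municipality
# served by more than one library board. Each board reports its own
# operating costs and use counts, which must be combined before the
# municipal-level measure can be computed.

boards = [
    {"name": "City Library Board",   "operating_costs": 4_200_000, "uses": 1_400_000},
    {"name": "County Library Board", "operating_costs": 1_800_000, "uses": 600_000},
]

total_costs = sum(b["operating_costs"] for b in boards)
total_uses = sum(b["uses"] for b in boards)
cost_per_use = total_costs / total_uses

print(f"Cost per library use: ${cost_per_use:.2f}")  # prints $3.00
```

The arithmetic is trivial; the coordination is not. Each board must count "uses" the same way and report on the same schedule before the totals can be combined, which is why the article describes the apportionment as a complex and coordinated effort.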
The MPMP has proven useful to municipalities and taxpayers alike. There are many reasons for its initial successes, though the quality of municipal engagement in the program and the consistency in data gathering are paramount. These strengths have, in turn, generated spin-offs: MPMP data are being used in creative and constructive ways, such as research into best practices in municipal service delivery. Municipalities have been central in defining the measures, classifying services, analyzing data, sharing results, and raising their citizens' awareness of the program and its purpose. Now, they are active in facilitating the use of the program for accountability and improvement purposes. If the first five years are any indication, the MPMP will enhance the performance and accountability of Ontario municipalities for many years to come.
Exhibit 1: Municipal Revenue Fund Expenditures in Ontario in 2003 (shares of total; reproduced from a pie chart)

* Local government: 8%
* Fire: 5%
* Police: 9%
* Roadways: 8%
* Transit: 8%
* Sewage: 6%
* Garbage: 3%
* Parks and recreation: 6%
* Libraries: 2%
* Planning: 2%
* Other: 37%
Exhibit 2: MPMP Alignment with Other Programs
* OCMBP Best Practices Review (Analyze, Verify, Communicate)
* OMBI Service & Activity Indicators (Performance Management)
* MPMP Indicators (Leadership & Accountability)
(1.) For a more complete discussion of the rationale for developing the Ontario Municipal Performance Measurement Program, see John Burke, "Why are Provinces and Territories Promoting Performance Measurement in Municipalities?" in Provincial-Territorial Charrette on Municipal Performance and its Measurement (May 2004), available at www.mah.gov.on.ca
(2.) The chief administrative officer, or CAO, position in Canadian municipalities is similar to the city manager position in the United States.
(3.) For more information about the Municipal Performance Measurement Program, including a listing of the measures themselves, visit www.mah.gov.on.ca
(4.) The Ontario Fire Service Performance Measurement & Benchmarking System is another measurement initiative to benefit from municipal-provincial partnerships. For more information about the project, visit www.ofspmbs.ca.
(5.) The MPMP Summary of 2001 Results (2003) respects the provincial promise to avoid naming and ranking municipalities. The report is available online at www.mah.gov.on.ca
JOHN BURKE is deputy minister of Ontario's Ministry of Municipal Affairs and Housing. Before this appointment, he served as deputy minister of the Ministry of Natural Resources and as the chief administrative officer of the City of Ottawa, the Regional Municipality of Halton, the City of Dartmouth, Nova Scotia, the City of Gloucester, and the City of North Bay. Mr. Burke holds a bachelor's degree in commerce from St. Mary's University and a diploma in public administration from the University of Western Ontario.
Publication: Government Finance Review, June 1, 2005