The measurement of police force efficiency: an assessment of U.K. Home Office policy.

I. INTRODUCTION

Since the introduction of efficiency measurement techniques for police forces in the 1960s, successive U.K. governments have tried to ensure that public funds were used in an "economic, efficient and effective" manner. This initial experiment in measuring the performance of police forces during the 1960s and 1970s led to many revisions, from input-output operations management techniques to scorecards, and to the Spottiswoode (2000) report (for a discussion of early performance reforms see, for example, Drake and Simper 2001; Stephens 1994; Sullivan 1998). (1) The Spottiswoode report advocated the use of data envelopment analysis (DEA) and stochastic frontier analysis (SFA) because these techniques allow for an interaction between inputs and outputs in the policing function, thereby bypassing many of the failings of the initial 1960s-1970s performance measurement program (which was mainly output based). These procedures allow for multiple input-output configurations in a cost or production model to obtain efficiency scores.

However, the U.K. government has elected not to follow the recommendations of Spottiswoode (2000), promoting instead the alternative performance radar approach (during 2001-2003) and, more recently, a table of best value performance indicators (BVPIs) (see, for example, National Policing Plans 2003/04 and 2004/05, Home Office 2002, 2003). These two techniques are wholly output (outcome) based measurement programs intended to allow the public to determine whether their local police force satisfies criteria in six performance domains specified by the Home Office. (2) The five domains used in the performance radar approach are: reducing crime, investigating crime, promoting public safety, citizen focus, and resource usage (discussed in detail in section II). The aim of the performance radar approach, therefore, is that these domains should show whether individual police forces fall below a specified Home Office target or below an average performance level obtained from a set of reference forces (known as most similar forces, MSFs). Hence, in line with the Police Reform Act of 2002, police forces will now be assessed on their performance with respect to these domains, although it is important to note that this performance assessment takes no account of the costs incurred or resources deployed in the attainment of these targets. (3)

Although these performance radar targets are consistent with the methodology introduced by the new Labour government in connection with BVPIs (see DETR 1999), they do not follow the stated public policy aim of value for money. In the Home Office report "What Price Policing?", for example, it was stated that "police managers need to work harder to ensure that value for money is achieved, for competitive pressure has to be created internally. The costing of activity with subsequent measurement and comparison of performance provide the means by which such encouragement is given" (HMIC Report 1998, para. 10). Furthermore, by not linking outcomes to resource utilization and costs, the Home Office is also not following the recommendation made in Spottiswoode (2000) that states, "Best Value is the central plank in the drive to improve police performance. A systematic measure of police efficiency--where efficiency is a measure of the polices' performance in meeting their overarching aims and objectives for the money spent--is crucial if Best Value is to work effectively" (p. 4) (for further discussion see Drake and Simper 2002). As alluded to previously, the report further advocated the use of techniques such as nonparametric and parametric efficiency analysis, stating that "this approach would also mean that 'efficiency' is about finding ways of improving the performance delivered for the money that each authority and force has" (p. 5).

Within this context, therefore, the aim of this article is to demonstrate that the current government's analysis of police force performance could result in policy and resourcing decisions based on inconsistent rankings of forces, especially when resource usage or costs and environmental factors are excluded from the analysis. More specifically, to present a legitimate critique of the performance radar approach, the authors adopt an alternative and innovative, two-stage, nonparametric efficiency measurement technique that can take account of the potential impact of environmental factors. Furthermore, to ensure comparability between the two alternative approaches, the authors specifically incorporate the nonsurvey-based indicators used by the Home Office as inputs or outputs (as appropriate) in the nonparametric efficiency analysis. As explained subsequently, the authors do not advocate the use of survey based data directly in the efficiency analysis. However, the authors do present some analysis of these survey based variables in subsequent second stage analysis. Finally, the authors explicitly analyze whether the inclusion of resource costs (specifically excluded in the Home Office analysis) materially affects the resultant efficiency scores and ranks.

The article is organized as follows. In section II the authors provide details on the performance radar approach adopted by the Home Office. Section III discusses the methodology and data used and provides a brief review of the most relevant literature. On the former, details are provided on the two-stage DEA methodology, based on Fried et al. (1999), which is used to account for potential environmental influences on police force efficiency. Section IV presents the two-stage DEA results. These are the results from an initial DEA analysis and the subsequent stage 2 results, which adjust the inputs to take account of the influence of environmental factors. To the authors' knowledge, this is the first article to apply this two-stage approach to the study of police efficiency. Furthermore, this approach is used in the context of both the cost and production methodologies. That is, the DEA relative efficiency analysis is conducted with and without the inclusion of resource costs (as measured by net budget revenue) as an input. These various DEA efficiency results are analyzed with respect to the results obtained from the alternative performance radar approach. In section V, the authors analyze the relationship between the two survey-based domains used by the Home Office and both the DEA pure technical efficiency scores and a range of environmental variables. Section VI concludes.

II. THE PERFORMANCE RADAR APPROACH TO MEASURING THE PERFORMANCE OF ENGLISH AND WELSH POLICE FORCES

The performance radar diagrams, or spidergrams as they are sometimes referred to, were first published by the Home Office in February 2002. Part of the motivation for their usage appears to be a desire to avoid the publication of national league tables of performance such as those published in the United Kingdom in respect to schools and hospitals (and which were advocated for police forces in the Spottiswoode 2000 report). John Denham, Home Office Minister, for example, stated that "the monitors ('performance radars') cannot be used to make national comparisons between police forces or to construct league tables" ("Police Spidergrams Provoke Confusion," The Guardian, 19 February 2003).

The initial Home Office performance data were collected over the 2001-2002 period, and a report was presented 10 months after their initial release. Within each of the five performance domains, the performance of each police force was assessed relative to the average performance of its peer forces, as measured by its MSFs. Table 1 presents the performance indicators used in the initial HMIC assessment with respect to the various performance domains. For example, Domain A, "level of public satisfaction with the police as measured by the British Crime Survey," is based on the question "how good a job do the police do?," with the quoted result being the percentage of respondents saying "good" or "excellent." (4)

In the resultant performance radar diagrams, as illustrated in Figure 1 for the Leicestershire police force, for example, the shaded pentagon area represents the average for the force's MSF, and the black line represents the Leicestershire force's score with respect to each domain, where a better (above average) performance is shown as the black line extending further outward than the shaded area. When the second set of performance radars were released in October 2003, however, bar graphs were added to show how well each force was doing compared to its peer group or MSF in respect of each domain. In terms of the bar graphs presented in Figure 1, for example, the performance of the force (darkest bar) is shown against their MSFs and the horizontal line shows the average of the MSFs. In this case, the performance of the Leicestershire force is very similar to the average performance of the MSFs, as indicated by the bar graphs. Hence, there is a very close correspondence between the dark line and the shaded grey pentagon.
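The geometry of these diagrams is straightforward to reproduce. The sketch below uses entirely hypothetical domain scores (not actual Home Office data) to show how a force's five domain scores can be overlaid on its MSF-average pentagon using matplotlib:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; write the figure to a file
import matplotlib.pyplot as plt

domains = ["Reducing crime", "Investigating crime", "Promoting public safety",
           "Citizen focus", "Resource usage"]
force = np.array([0.62, 0.55, 0.70, 0.58, 0.65])    # hypothetical force scores
msf_avg = np.array([0.60, 0.57, 0.66, 0.60, 0.63])  # hypothetical MSF averages

# One spoke per domain; repeat the first point to close each polygon.
angles = np.linspace(0, 2 * np.pi, len(domains), endpoint=False)
angles = np.r_[angles, angles[0]]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.fill(angles, np.r_[msf_avg, msf_avg[0]], alpha=0.3, label="MSF average")
ax.plot(angles, np.r_[force, force[0]], "k-", label="Force")
ax.set_xticks(angles[:-1])
ax.set_xticklabels(domains, fontsize=7)
ax.legend(loc="lower right")
fig.savefig("radar.png", dpi=100)
```

A force outperforming its peers on a domain appears as the solid line extending beyond the shaded pentagon on that spoke.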

[FIGURE 1 OMITTED]

The validity and robustness of the performance radar approach is an extremely important issue given that policy decisions may be enacted on the basis of the results this approach provides. A particular concern in this respect is that two of the performance domains, promoting public safety and citizen focus, are based on the use of survey data. It is well known, however, that survey results can be extremely sensitive to factors such as sample size, sample selection, and the impact of environmental factors, such as socioeconomic and demographic characteristics (see, for example, Waddington and Braddock 1991). In addition, the Spottiswoode (2000) report expressed strong reservations about the quality of survey data and the use of such data in police relative efficiency measurement.

As mentioned previously, given these reservations concerning the use of survey data, the authors elect not to use this data directly in the first-stage efficiency analysis. However, the authors undertake second-stage regression analysis to establish whether there is any relationship between police force efficiency and, for example, the public's fear of crime, to analyze any potential biases. For example, do forces that are more efficient in clearing up crimes have any impact on the public's fear of crime, and to what extent are these survey-based results influenced by environmental factors outside the control of individual police forces, rather than by the efficiency of the forces themselves?

The next section describes the nonparametric efficiency methodology and the data set used to determine the potential biases that can be induced into police performance measurement by: using purely output (outcome) based measures, as in the performance radar approach; using survey-based data; and failing to allow for the potential impact of environmental factors.

III. DATA AND METHODOLOGY

Although the Spottiswoode (2000) report advocated the use of both nonparametric techniques (such as DEA) and parametric techniques (such as cost function SFA) for the analysis of the relative efficiency of police forces, nonparametric estimation has been more widely used due to the difficulty in obtaining input price data for public service departments and sectors. A number of studies of police forces have utilized the standard Charnes et al. (1978) (CCR) program and analyzed the stability of the efficiency scores due to changes in the variable specifications and/or input weights (see, for example, Nyhan and Martin 1999 for a study using U.S. police force data; and Thanassoulis 1995 and Drake and Simper 2002, who both examined English and Welsh police force efficiency).

Other police force studies have considered the actual nonparametric program specification and whether this has had an effect on scores. The recent study by Diez-Ticio and Mancebon (2002) advanced a multiactivity model of policing in which two production functions share resources. The results show differences between the standard CCR model and the multiactivity DEA specification. Drake and Simper (2003a) also demonstrate differences in efficiency estimates across various techniques: the CCR specification (DEA), free disposable hull (FDH), Andersen and Petersen (1993) superefficiency, and the parametric stochastic input distance function (SIDF). However, they find that the rank correlations are positive (DEA and FDH equal to 0.39, DEA and SIDF equal to 0.67, and SIDF and FDH equal to 0.34).

Finally, researchers have also utilized the technical efficiency scores from BCC programs to determine whether external factors have had a direct influence on ranks or scores. For example, Sun (2002) regresses technical efficiency scores against location, jurisdiction area, population, and the proportion of young people living in the area for 14 police precincts in Taipei city, Taiwan. Surprisingly, he finds none of these external factors have a significant effect on police force efficiency. In addition, Carrington et al. (1997) regress the technical efficiency of New South Wales police patrols on the proportion of young people, government housing, and the location of a patrol, using a tobit specification. They again find, following Sun (2002), that technical efficiency is not influenced by socioeconomic, demographic, or geographic factors external to the police forces.

However, a recent study by Drake and Simper (2003a) finds significant environmental and geographical factors affecting English and Welsh police forces' technical efficiency. In this study, the exogenous variables utilized in a tobit specification included population, criminal offenses recorded, and six geographic dummies. The significance of the geographic dummy variables in this study is argued to be indicative of the influence of residual factors that are not being adequately accounted for in the basic DEA analysis. One possibility raised by the authors is that the police funding formula (PFF), which is supposed to ensure that the allocation of police force funding takes adequate account of the many demographic and socioeconomic differences across U.K. police forces, is fundamentally flawed. Hence, any use of DEA, without adequate adjustment for the impact of environmental factors, could lead to flawed and spurious relative efficiency scores.

This article extends the work of Drake and Simper (2003a) significantly by introducing a new data set of environmental, socioeconomic, and demographic variables that could have an impact on policing and on DEA relative efficiency scores and rankings. To the authors' knowledge this is the first such police force study undertaken using such a set of actual external environmental factors rather than a set of proxy variables. As mentioned previously, the authors further extend the nonparametric modeling methodology by using Fried et al.'s (1999) slacks-based environment adjusted input specification. These DEA results are contrasted with the relative performance results obtained using the performance radar approach advocated by the Home Office.

In addition to the problems associated with obtaining accurate input price data for U.K. police forces, a further reason for using a nonparametric specification such as DEA is that no knowledge of the underlying functional form for the cost or production function is required. All that is required is that some correspondence exists between the inputs and the outputs/outcomes across the decision-making units (DMUs). The nonparametric approach was originally developed by Farrell (1957) and later elaborated by Banker et al. (1984) and Fare et al. (1985). The constructed relative efficiency frontiers are nonstatistical or nonparametric in the sense that they are constructed through the envelopment of the DMUs, with the "best practice" DMUs forming the nonparametric frontier. This nonparametric technique was referred to as DEA by Charnes et al. (1978).

As mentioned previously, the authors use the two-stage DEA approach of Fried et al. (1999) in an attempt to purge the raw DEA scores of any impact from environmental or external factors outside the control of individual police forces. In this particular article, the authors restrict the DEA analysis to the analysis of pure technical efficiency (PTE) rather than examining such issues as scale efficiency. Furthermore, an input-orientated DEA program is used as one of the key inputs specified is offenses committed. Hence, it is seen as desirable that policing activity aims to reduce the incidence of crime while at the same time maintaining the detection rates relating to these crimes. The latter is specified as an output (outcome) in the DEA analysis.

A. Data

The literature on modeling the efficiency of police forces has led economists to posit two alternative methodologies, the cost and the production approach (see Drake and Simper 2003b). The former relates inputs/costs to possible outputs/outcomes, such as offenses cleared (see the early cost function estimation of U.S. policing by Darrough and Heineke 1979 and Gyapong and Gyimah-Brempong 1988; more recently, Nyhan and Martin 1999; and, for the United Kingdom, Cameron 1989 and Drake and Simper 2000). In contrast, the latter production methodology relates the number of offenses committed to the effectiveness of forces in clearing offenses (see Thanassoulis 1995 for a U.K. example; Sun 2002 for a Taiwanese example; and Diez-Ticio and Mancebon 2002 for an example of the production approach used to assess the efficiency of Spanish policing).

This article presents comparative results utilizing both the production (Model 1) and cost (Model 2) methodologies, providing a further innovation in the literature. The former methodology adheres more closely to the Home Office performance radar approach by utilizing only the number of offenses as inputs, whereas the latter has an additional input variable, police force costs. It is important to note that with respect to the production methodology, the authors have adopted the nonsurvey-based data used by the Home Office (in the performance radar analysis), as far as possible, as either inputs or outputs (outcomes) in the DEA analysis. Hence, although other potential variables are available, the aim is to ensure that the nonparametric efficiency analysis is as comparable as possible with the Home Office's preferred approach so that a valid critique can be presented. The summary statistics for the two models' inputs and outputs are presented in Table 2.

In Model 1, therefore, the inputs relate to offenses committed and, following the breakdown given by the Home Office (2002), these are the number of burglaries, the number of vehicle crimes, and the number of robberies. The choice of these variables is directly related to the domains given in the performance radar diagram already discussed. As can be seen from Table 2, there are wide variations across the police force areas in England and Wales, which highlights the diversity of the demographic, geographic, and socioeconomic characteristics of the populations they serve. For example, some forces cover rural areas (Dyfed Powys, North Wales, South Wales, etc.), whereas other police force areas are dominated by urban, highly populated cities. The largest police force is the Metropolitan force, which covers London.

In specifying the incidences of various categories of crimes as inputs in both these DEA models, the authors are ensuring that the relative efficiency analysis adequately captures the performance domain of reducing crime, as specified by the Home Office (2002). (5) In terms of outputs, this study again follows the methodology behind the Home Office (2002) performance radar diagram approach by specifying the outputs (outcomes) total offenses cleared and police and civilian days lost (suitably transformed into a more is better variable).

The output total offenses cleared has also been used by Nyhan and Martin (1999). Due to the small number of police forces in the sample (41), however, using separate offenses cleared for burglaries, violent crimes, and robberies would be likely to result in many forces being located on the efficient frontier. Hence, the authors prefer to specify the aggregate of offenses cleared. The use of days lost to illness follows the recommendations of the first official U.K. government-sponsored study into measuring police efficiency (Home Office 2001). This report argued that this variable can be thought of as a managerial activity variable in which it would be desirable to reduce the incidence of sickness absence in police forces. Indeed, it may well be that days lost are actually a symptom of an underlying morale or management problem and may therefore be related to poor performance in other key output (outcome) areas.

Finally, in Model 2, the authors also include the net budget revenue of each police force as an additional input variable. This variable is the yearly expenditure over which forces have direct control (hence, excluding specific grants and other income from operations). (6)

B. Two-Stage DEA

For each DMU, PTE is calculated by finding the smallest multiplicative factor, $\theta$, that can be applied to the DMU's observed input vector while leaving it within the input requirement set defined by the reference technology. Let $Y = [y_1, \ldots, y_I]$ denote the $M \times I$ matrix of outputs and $X = [x_1, \ldots, x_I]$ the $S \times I$ matrix of inputs of DMUs $i = 1, \ldots, I$. PTE for DMU $i$ is then obtained by choosing $\{\theta, \lambda\}$ to solve $\min \theta$ such that

(1) $\theta x_i \geq X\lambda$

$y_i \leq Y\lambda$

$\lambda \geq 0, \quad \textstyle\sum_i \lambda_i = 1.$
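Program (1) is a standard input-oriented, variable-returns-to-scale DEA problem and can be solved as one linear program per DMU. The following is a minimal sketch using scipy's `linprog`; the function name and matrix layout are conventions of this sketch, not of the article:

```python
import numpy as np
from scipy.optimize import linprog

def dea_pte(X, Y):
    """Input-oriented VRS DEA (pure technical efficiency).
    X: (S inputs x I DMUs), Y: (M outputs x I DMUs). Returns theta per DMU."""
    S, I = X.shape
    M, _ = Y.shape
    scores = np.empty(I)
    for o in range(I):
        # Decision vector: [theta, lambda_1, ..., lambda_I]; minimise theta.
        c = np.zeros(1 + I)
        c[0] = 1.0
        A_ub = np.zeros((S + M, 1 + I))
        b_ub = np.zeros(S + M)
        A_ub[:S, 0] = -X[:, o]      # X lambda - theta * x_o <= 0  (inputs)
        A_ub[:S, 1:] = X
        A_ub[S:, 1:] = -Y           # -Y lambda <= -y_o            (outputs)
        b_ub[S:] = -Y[:, o]
        A_eq = np.zeros((1, 1 + I))
        A_eq[0, 1:] = 1.0           # sum of lambdas = 1 (VRS convexity)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(None, None)] + [(0, None)] * I, method="highs")
        scores[o] = res.x[0]
    return scores
```

With one input and one output, a DMU producing y = 1 from x = 4 when a peer produces y = 1 from x = 2 receives a score of 0.5: it could halve its input and still be enveloped by the frontier.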

Following Fried et al. (1999), the input slacks from program (1) are obtained and regressed on a set of environmental factors that are likely to affect the technical efficiency of police forces. That is, the authors estimate,

(2) $IS_j^k = f_j(Z^k, \beta_j, \epsilon_j),$

where $IS_j^k$ is the total radial and nonradial slack of input $j$ for police force $k$; $Z^k$ is the vector of external factors likely to affect the efficiency of police force $k$ and hence its input slack $IS_j^k$; $\beta_j$ is a vector of parameters to be estimated; and $\epsilon_j$ is the disturbance term.

The vector of external variables relating to each police force area consists of a set of variables obtained for each police force from various sources, including the UK Census, national statistics, the PFF, and individual police force returns to various government departments. The daytime population consists of four elements--residential population, commuters (net inflow), overnight visitors, and daytime visitors--and aims to determine whether cities (or large towns) with greater inflows of people coming to work, or with large resident populations, might experience a greater fear of crime. In addition to the daytime population, the authors also include police force estimates of daytime population relative to resident population to take account of those cities (large towns) that might have a disproportionate influx of commuters relative to the resident urban population.

As a proxy for deprivation, the authors include the proportion of lone parent households, homes that have a single parent and at least one dependent child. The analysis also considers the measure of proportion of terraced housing, which relates primarily to inner-city areas in which residents live in close communities. These areas tend to have higher levels of deprivation and hence this variable could be linked positively with crime rates. Conversely, the relatively high level of housing density combined with the characteristic of close-knit communities could contribute to a lower fear of crime.

Finally, when the PFF was introduced in 1995/96, police forces argued that it could be more expensive to police sparsely populated rural areas. In 1996/97 the government introduced this element into the PFF, and the authors have also included this variable to test the possible impact of this environmental factor on police force efficiency and also to test whether such factors are adequately incorporated into funding via the PFF. (7)

Hence, based on the regressions of input slacks against the set of environmental variables, each input is adjusted using the difference between the maximum predicted slack across forces, $\max_k \hat{IS}_j^k$, and the force's own predicted slack, $\hat{IS}_j^k$. That is,

(3) $x_j^{k,\mathrm{adjusted}} = x_j^k + \left[ \max_k \hat{IS}_j^k - \hat{IS}_j^k \right].$

It is clear from equation (3) that the adjustments of the inputs are made to reflect the negative or positive impact of environmental factors on the slacks, such that these are purged from the reestimation. The DEA program (1) is then reestimated using the adjusted inputs and the first stage outputs to obtain new, stage 2, technical efficiency scores. These stage 2 DEA results are obtained with respect to both the production and cost approaches to modeling police force efficiency, and these results are contrasted with the results obtained from the Home Office's performance radar approach. The stage 2 DEA results are also used in the context of the subsequent second-stage tobit regression analysis outlined previously.
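The stage 2 adjustment in equations (2) and (3) can be sketched as follows. The article leaves the functional form $f_j$ general; here plain OLS stands in for the slack regressions purely for illustration, and the helper name is hypothetical:

```python
import numpy as np

def adjust_inputs(X, slacks, Z):
    """Fried et al. (1999)-style environmental adjustment (sketch; OLS used
    in place of whatever slack-regression specification is chosen).
    X: (S x I) inputs; slacks: (S x I) total radial + nonradial input
    slacks from stage 1 DEA; Z: (I x p) environmental variables."""
    I = X.shape[1]
    Zc = np.column_stack([np.ones(I), Z])   # add an intercept column
    X_adj = X.astype(float).copy()
    for j in range(X.shape[0]):             # one regression per input j
        beta, *_ = np.linalg.lstsq(Zc, slacks[j], rcond=None)
        pred = Zc @ beta                    # predicted slack IS_j^k
        X_adj[j] += pred.max() - pred       # equation (3)
    return X_adj
```

Forces facing the most adverse predicted environment receive no adjustment, while forces in favorable environments have their inputs inflated, so that the reestimated frontier no longer rewards a benign operating environment.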

To the authors' knowledge, little use has been made to date of the Fried et al. (1999) approach to adjusting DEA results for the potential impact of environmental factors. Such an adjustment is likely to be very important, however, in the case of the analysis of relative efficiency in public sector services such as policing. Furthermore, there have been relatively few examples of the employment of second-stage regression analysis, particularly with respect to the impact of environmental variables, relative to the plethora of DEA relative efficiency studies. One such study, however, is that of Chilingerian (1995), in which potential environmental effects on the overall and technical efficiency of physicians were examined using a second-stage tobit model that regressed efficiency on a vector of explanatory factors including age, size of caseload, and so on. In the second-stage model many of the external factors were found to be significant with the implication that the raw DEA scores may be biased due to their failure to incorporate these external factors. Similarly, Gillen and Lall (1997) analyzed airport productivity and found, utilizing a second-stage tobit approach, that factors such as number of airline hubs, number of gates, and whether an airport had a rotational runway also affected the DEA efficiency scores. Finally, in a very recent study, Linna et al. (2003) found that socioeconomic factors also had a significant impact on the technical efficiency of Finnish health centers.

C. Second-Stage Regression Analysis

The second stage regression analysis uses the tobit regression approach, which takes the form:

(4) $y_i^* = \beta' x_i + \epsilon_i, \qquad y_i = \max(0, y_i^*), \qquad \epsilon_i \sim N(0, \sigma^2),$

where $\beta$ is a $k \times 1$ vector of unknown parameters and $x_i$ is a $k \times 1$ vector of known environmental variables (discussed previously) and the stage 2 PTE scores. The first set of tobit regressions relates to the domain promoting public safety. Hence, the $y_i$'s relate to survey results pertaining to the public's fear of crime. Specifically, the dependent variables are the proportions of respondents worried about violent crime, worried about burglary, and worried about vehicle crime. The government's Public Service Agreement target 1 is that the level of fear of crime should be significantly lower (at the 10% critical level) by 2006 than that reported in 2002 in the categories of violent crime, burglary, and vehicle crime (Home Office 2003).
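A tobit of the form in equation (4), censored at zero, can be estimated by maximum likelihood. The following is a self-contained sketch; the function name and starting values are this sketch's own choices, not the article's estimation code:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_fit(y, X):
    """Left-censored-at-zero tobit by maximum likelihood (sketch).
    y: (n,) dependent variable; X: (n, k) regressors incl. intercept."""
    n, k = X.shape

    def negll(params):
        beta, log_sigma = params[:k], params[k]
        sigma = np.exp(log_sigma)           # keeps sigma strictly positive
        xb = X @ beta
        cens = y <= 0
        ll = np.where(
            cens,
            norm.logcdf(-xb / sigma),                    # Pr(y* <= 0)
            norm.logpdf((y - xb) / sigma) - log_sigma,   # density of y > 0
        )
        return -ll.sum()

    # Start from the OLS estimates, which are consistent when censoring is mild.
    start = np.r_[np.linalg.lstsq(X, y, rcond=None)[0], 0.0]
    res = minimize(negll, start, method="BFGS")
    return res.x[:k], np.exp(res.x[k])
```

When no observations are censored, the tobit likelihood collapses to that of OLS, which provides a quick sanity check of any implementation.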

The information on the public's fear of crime in these three categories is obtained from the British Crime Survey (BCS) (Home Office 2002). The BCS is an annual survey of around 40,000 people across the police force areas of England and Wales. The surveyed individuals are asked whether they have had a crime committed against them and are asked to gauge their level of fear of crime. The BCS differs from the police-recorded level of crime because not all crimes committed are reported to the police, for example, a burglary in which a small number of goods of little value are taken and the owner is not insured. This has led to differences between the BCS and the recorded crime statistics published by police forces (see McDonald and Pyle 2000). According to the BCS, for example, domestic burglary as of 2001 was down by 39% since 1997, whereas burglary recorded by the police was up by 28% in 2001 relative to the previous year.

The final second-stage tobit regression relates to the remaining survey-based performance domain specified by the Home Office, citizen focus. In this case the independent variables are the same as those specified for the promoting public safety domain, but the dependent variable is the survey response relating to residents thinking the police do a good job. This survey data is again taken from the BCS, which monitors the response to the question "how good a job do the police do?" In the tobit regression the dependent variable is the percentage of respondents indicating "good" or "excellent" in answer to this question.

In summary, therefore, the inputs and outputs specified in the DEA analysis allow the authors to capture the three nonsurvey-based performance domains used by the Home Office (investigating crime, reducing crime, and resource usage), whereas the second-stage regression analysis allows the authors to analyze the two remaining, survey-based domains (promoting public safety and citizen focus). The next section presents a discussion of the empirical results obtained across these performance domains.

IV. THE IMPACT OF ENVIRONMENTAL FACTORS ON POLICE FORCE EFFICIENCY

Table 3 provides details of the PTE results. As explained previously, these are provided for two model specifications with respect to policing activity. Model 2 differs from Model 1 in terms of specifying an additional input, net budget revenue. Furthermore, for both these models, two sets of results are provided. The stage 1 results represent what might be termed the raw PTE scores, whereas the stage 2 results represent an attempt to purge these raw PTE scores of any environmental influences. As detailed previously, this amounts to regressing the relevant input slacks on the specified set of environmental variables and using these regression results to produce adjusted input levels for each police force. These adjusted input levels are then used, in conjunction with the original output data, to produce the stage 2 PTE results.

Concentrating initially on the contrast between the Model 1 and 2 (stage 1) results, it is clear from Table 3 that with respect to those forces deemed to be inefficient, the PTE scores are generally much lower under the Model 1 specification and also exhibit much greater diversity. This is graphically illustrated by the cases of the Cleveland and Cambridgeshire forces, which exhibit Model 1 PTE scores of 14.40 and 16.10, respectively, but record Model 2 PTE scores of 67.26 and 74.33. For the sample as a whole, the mean PTE score under Model 1 is only 54.89 in contrast to the mean of 76.55 for Model 2, and the greater diversity of efficiency scores under Model 1 is evident from the comparative minimum scores (14.40, 40.22) and standard deviations (31.46, 17.31).

Despite the marked contrast in the relative efficiency scores across Models 1 and 2, however, it is frequently the case that the efficiency ranks are relatively unaffected. Furthermore, the majority of the forces that are technically efficient (and ranked joint first) under Model 1 are also technically efficient under Model 2. In fact, there are nine police forces that exhibit PTE under both model specifications. This strong correspondence between the two sets of relative efficiency scores in terms of rankings is also confirmed by the Spearman's rank correlation coefficient of 0.72 (significant at the 1% critical level).
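The reported rank correlation can be illustrated mechanically. The sketch below applies Spearman's test to five hypothetical score pairs; the numbers are illustrative only, not the paper's 41-force data:

```python
from scipy.stats import spearmanr

# Hypothetical stage-1 PTE scores for five forces under the two model
# specifications (illustrative numbers, not the paper's data).
pte_m1 = [100.0, 14.4, 16.1, 74.3, 48.2]
pte_m2 = [100.0, 67.3, 74.3, 85.8, 54.9]

# Ties (e.g., several forces all scoring 100) would receive average ranks.
rho, p_value = spearmanr(pte_m1, pte_m2)
# rho = 0.7 for these illustrative numbers
```

Because the statistic depends only on ranks, it is insensitive to the large level differences between the Model 1 and Model 2 scores, which is precisely why it is the appropriate check of ranking correspondence here.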

Notwithstanding this strong ranking correspondence, it is quite clear that certain forces do exhibit marked differences in both relative efficiency scores and ranks across Models 1 and 2. The most striking example of this is the case of the Warwickshire force, which exhibits a PTE score of only 25.60 (rank 32nd) under Model 1 but a score of 100 (rank 1st) under the Model 2 specification. Other less extreme examples include the Cambridgeshire force (Model 1 PTE = 16.10, rank 40th, Model 2 PTE = 74.33, rank = 19th) and the Wiltshire force (Model 1 PTE = 38.92, rank 23rd, Model 2 PTE = 81.77, rank = 15th).

These examples illustrate very clearly that a minority of police forces can be significantly disadvantaged by the failure to relate outcomes such as crime clear-ups to a measure of resource costs, as well as to inputs relating to other performance domains, such as the incidences of various categories of crime. As has been discussed, this can impact not only the performance ranking of the force but also the perceived degree of inefficiency relative to the best performing (efficient reference set) forces. Such discrepancies may be extremely important if relative performance across various domains is used to influence policy and funding decisions.

A possible explanation for the substantial diversity in performance across police forces in the stage 1 PTE results, particularly with respect to Model 1, is that there are environmental factors that may impact on police performance in the various domains but that are not adequately captured in the DEA analysis. As discussed previously, it is for this reason that the authors attempt to purge the DEA relative efficiency scores of any potential environmental influences using the two-stage process advocated by Fried et al. (1999).
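The Fried et al. (1999) procedure regresses the stage 1 input slacks on the environmental variables and uses the fitted values to place every force on an equal environmental footing. A simplified sketch follows, using OLS slack regressions (the original procedure could equally run these as tobits; the function and variable names are illustrative assumptions):

```python
import numpy as np

def adjust_inputs(X, slacks, Z):
    """Fried et al. (1999)-style stage-2 input adjustment (OLS sketch).
    X: (n, m) observed inputs; slacks: (n, m) stage-1 input slacks;
    Z: (n, k) environmental variables. Returns adjusted inputs."""
    n = X.shape[0]
    Zc = np.c_[np.ones(n), Z]                  # add an intercept
    X_adj = X.astype(float).copy()
    for i in range(X.shape[1]):                # one regression per input
        beta, *_ = np.linalg.lstsq(Zc, slacks[:, i], rcond=None)
        pred = Zc @ beta                       # environment-predicted slack
        # Raise the inputs of forces in favourable environments so that
        # every force faces the least favourable predicted environment.
        X_adj[:, i] = X[:, i] + (pred.max() - pred)
    return X_adj
```

Re-running the DEA on the adjusted inputs (with the original outputs) yields the stage 2 PTE scores; because inputs are only ever revised upward, the adjustment tends to raise the scores of forces operating in harsh environments relative to stage 1.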

The resultant stage 2 PTE scores are reported in Table 3 for both Models 1 and 2. As might be expected, the incorporation of environmental factors does appear to have an important influence on the relative efficiency of English and Welsh police forces. If the authors contrast the Model 1 results under Stage 1 and Stage 2, it is clear that the appropriate incorporation of environmental factors raises the mean PTE efficiency score, from 54.89 to 78.86, and also considerably reduces the degree of diversity in performance. With respect to the latter, the minimum PTE score increases from 14.40 to 40.22, and the standard deviation declines from 31.46 to 14.99.

As with the contrast between the Model 1 and Model 2 (stage 1) results, the incorporation of environmental factors does not have a marked impact on the efficiency rankings in many cases. However, given the intention of using relative performance indicators as the basis for constructing police efficiency groupings and for resourcing decisions, and so on, as stated in the Spottiswoode (2000) report, even relatively modest changes in efficiency rankings could have important implications. Furthermore, some changes in efficiency rankings are far from modest. The Cambridgeshire force, for example, would be ranked 40th (PTE = 16.10) under Model 1 stage 1 but ranked 29th (PTE = 88.65) under Model 1 stage 2. Conversely, the South Yorkshire force would be ranked 24th (PTE = 37.80) under Model 1 stage 1 but ranked 39th (PTE = 75.05) under Model 1 stage 2.

As outlined previously, the PFF is designed to relate the funding and resourcing of police forces to their perceived needs and hence to take account of environmental factors which may impact on criminal activity, and so on. Hence, it would be expected that if the PFF is an appropriate way of adjusting resourcing to reflect environmental factors, there would be little difference between the stage 1 and 2 PTE results for Model 2. It is clear from Table 3, however, that in moving from the stage 1 to stage 2 results, the mean PTE score increases from 76.55 to 95.60, and the minimum PTE score increases from 40.22 to 75.30. Furthermore, the diversity in PTE scores across police forces also declines, as is evident from the reduction in the standard deviation of the PTE scores from 17.31 under stage 1 to 13.00 under stage 2 (Model 2). Once again, the significant feature of these results is that the relative efficiency rankings of some individual police forces can be dramatically affected by the failure to adequately take account of environmental factors.

It is clear from this analysis, therefore, that the incorporation of environmental factors is of crucial importance in respect of any robust analysis of relative police force performance/efficiency. Furthermore, the apparent sensitivity of the Model 2 PTE results to the stage 1 and 2 analysis suggests that for some forces, the use of the PFF might not be adequate with respect to creating a level playing field from which to assess relative police force efficiency.

Having undertaken a rigorous analysis of police force efficiency using DEA under two alternative model specifications, and with and without the adjustment for environmental factors, it is potentially illuminating to contrast these results with those obtained from the performance radar analysis produced by the Home Office (2002). Clearly, given the input/output (outcome) configurations adopted in the DEA analysis, it is only appropriate to contrast police performance across three of the five domains (reducing crime, investigating crime, and resource usage). As emphasized previously, however, reservations can be expressed concerning the use of survey data as a basis on which to conduct police performance comparisons, as in the two remaining domains (citizen focus and promoting public safety). Furthermore, as will be seen from the comparisons discussed shortly, for the vast majority of police forces, these latter two performance domains tend to have very little discriminatory power. Specifically, performance across these two domains tends to be very similar to the average performance across the other comparator forces, as chosen by the Home Office. For completeness, however, in section V the authors do investigate the relationship between the DEA results (under both model specifications and under stages 1 and 2) and the performance measures in these two remaining domains.

It is clear from the performance radar analysis (Home Office, 2002), that the best performing forces across the three relevant domains are the Dyfed-Powys and Gwent forces (especially with respect to reducing and investigating crime) and the Northumbria and Suffolk forces. Their superior performances are clearly illustrated in Figure 2. It is interesting to note, therefore, that these forces are also found to be consistently efficient (ranked first) across all the DEA permutations reported in Table 3. A further force that displays performance well above average in respect of reducing and investigating crime is the Hampshire force (Figure 2). Once again, it is clear from Table 3 that this force is also ranked as consistently efficient under Models 1 and 2 (stage 1). However, it is clear that when environmental factors are taken into account, this force is marginally less efficient (stage 2).

At the other end of the performance spectrum, it is clear from Table 3 that the Avon and Somerset force is consistently one of the worst performing forces. It is ranked 41st out of 41 forces in three out of the four DEA specifications, and is ranked 39th according to the Model 1 stage 1 analysis. In Figure 2, this very poor performance across the relevant three domains is also evident in the performance radar diagram analysis and clearly represents the worst performance of any police force across these domains.

Hence, it is evident that in the most extreme cases of exceptionally good or poor performance, there is a very strong correspondence between the DEA results and the performance radar evidence, notwithstanding the methodological reservations concerning the latter. However, a further comparison of the two sets of performance results indicates that the performance radar approach can produce highly misleading relative performance results. The Bedfordshire and Lincolnshire forces, for example, represent cases in point. It is clear from Figure 2 that both forces exhibit performances that are reasonably close to the average (relative to the MSFs) across the three relevant domains. However, the Bedfordshire force is ranked first under all the various DEA approaches, whereas the Lincolnshire force records scores ranging from 30.26 to 100.00 depending on which particular DEA model is estimated. Even with respect to the latter force, however, the relative efficiency scores are 99.74 and 100.00 (under Models 1 and 2, respectively) once environmental factors are adequately taken into account (stage 2). Hence, in both these cases the forces would be deemed close to the average according to the Home Office's evaluation. According to the stage 2 DEA analysis, however, these forces should be considered two of the most efficient forces in the country.

[FIGURE 2 OMITTED]

In summary, these results suggest that the performance radar approach adopted by the Home Office in the United Kingdom could produce misleading relative performance measures for some police forces. This is potentially serious if these results are used to inform target setting and relative resourcing decisions, and most especially if these decisions are made relative to other police forces that are considered (possibly erroneously) to be the MSFs. Particular shortcomings in the performance radar approach relate to the failure to relate desired outcomes to the resources used in attaining these outcomes, the failure to incorporate the potential impact of environmental factors, the ambiguity regarding the set of comparator forces selected for individual forces, and the consequent inability to contrast the results of forces on a general pairwise basis and to produce national performance rankings. Finally, the use of performance radars also presents difficulties with respect to how to appropriately weight the outcomes across the various domains to produce a relative measure of overall performance, especially when performance in some domains is above average and in others below.

[FIGURE 2 CONTINUED OMITTED]

In contrast, the DEA approach represents a more analytically robust methodology that can address all these shortcomings inherent in the performance radar approach. Furthermore, the various DEA results reported in Table 3 indicate very clearly that it could be extremely important in some cases to adequately account for both environmental influences and resource cost/usage to produce a fair and unbiased measure of police performance or efficiency. Hence, as the nonparametric DEA approach can potentially address all the shortcomings of the Home Office's preferred approach, and given that this approach tended to produce rankings of the best and worst performers that were consistent with the performance radar approach, the authors would advocate that the U.K. Home Office evaluates police force performance across the various, nonsurvey-based, domains using a relative efficiency methodology such as DEA. This would also bring Home Office policy into line with the recommendations of the Spottiswoode (2000) report.

V. PUBLIC CONFIDENCE AND FORCE EFFICIENCY

Due to reservations concerning the use of survey data, and also due to potential size-related biases associated with the use of percentage figures as outputs (outcomes) in DEA, the authors elected not to use the domains citizen focus and promoting public safety in the initial comparison of DEA and the performance radar approach. Notwithstanding the reservations with respect to these two domains, however, they are employed by the Home Office in the relative assessment of police force performance. The purpose of this second-stage regression analysis is to establish whether there is any relationship between the two sets of survey results and police efficiency. Specifically, are variables relating to the public's fear of crime and their perception of whether the police are doing a good job related to the intrinsic economic efficiency of police forces, as measured by the DEA scores pertaining to the three domains of resource usage, investigating crime, and reducing crime? In addition, environmental (demographic and socioeconomic) and incidence of crime variables are used to determine whether these factors could have a significant influence on the survey-based domains.

To ensure that the authors have adequately accounted for any potential environmental influences on relative police force efficiency, they use only the stage 2 DEA (PTE) results. Because these have already been purged of any environmental influences in the two-stage Fried et al. (1999) DEA analysis, it is legitimate to use these PTE scores as an independent variable, alongside the various environmental factors, in the second-stage tobit regression analysis. Given that the inclusion or exclusion of resource costs may have an impact on the second stage regression results, however, the authors report these both with respect to Model 1 and Model 2.
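The second-stage regressions can be sketched as follows. The estimator below is a self-contained maximum likelihood implementation of a left-censored tobit; it is an illustration of the technique, not the authors' estimation code, and the function signature is an assumption:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit(y, X, lower=0.0):
    """Left-censored tobit by maximum likelihood (illustrative sketch).
    y: (n,) dependent variable; X: (n, k) regressors incl. a constant.
    Returns (beta, sigma)."""
    cens = y <= lower
    def negll(params):
        beta, log_s = params[:-1], params[-1]
        s = np.exp(log_s)                      # keep sigma positive
        xb = X @ beta
        ll = np.where(
            cens,
            norm.logcdf((lower - xb) / s),             # censored obs
            norm.logpdf((y - xb) / s) - np.log(s),     # uncensored obs
        )
        return -ll.sum()
    b0 = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting values
    res = minimize(negll, np.r_[b0, 0.0], method="BFGS")
    return res.x[:-1], np.exp(res.x[-1])
```

With no censored observations the estimator collapses to the normal regression MLE, so the stage 2 PTE scores and environmental variables can enter as ordinary regressors exactly as described in the text.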

It is clear from Table 4 that with respect to the fear of various types of crime, the variable PTE is not a significant explanatory variable under either the Model 1 or Model 2 specifications. This is a significant result because it raises questions over the inclusion of this type of survey-based variable in any assessment of police performance, given that there appears to be little that the police can do to influence these fears via improvements in their underlying efficiency. In other words, these fear of crime results appear to be essentially outside the influence of individual police forces. This view is reinforced by the finding that there is only a limited influence of the incidence of certain crimes on the recorded fear of these crimes. Although the number of burglaries and the number of vehicle crimes do appear to significantly influence the fear of the corresponding crimes, there is no such evidence with respect to the number of violent crimes under either Model 1 or 2.

Hence, the lack of significance of PTE in the fear of crime regressions, combined with the lack of any strong and consistent influence from the incidence of crime to the fear of crime, clearly raises questions over both the veracity of the survey data and the logic of including these types of fear of crime variables as a separate domain in the performance radar analysis. This argument is reinforced by the fact that in both the DEA and performance radar approaches, the incidence of crime is formally included in the analysis. One would suspect that the general levels of various types of crime would inevitably have a very powerful and large influence on the public's fear of crime. Hence, the fact that this relationship is not showing up clearly in the tobit regression results tends to confirm the authors' reservations over the use of this type of survey-based data.

It is interesting to note, however, that a number of environmental variables appear to be significantly related to the public's fear of crime, and these relationships generally conform to a priori expectations. The fact that the fear of violent crime is negatively and significantly related to the variable TERRACED HOUSING, for example, may relate to the fact that such communities tend to be relatively close-knit and more densely populated. The converse of this may explain why the fear of burglary is positively and significantly related (at the 10% level) to the variable SPARCITY. Finally, fear of vehicle crime seems to be positively related to LONE PARENT HOUSEHOLDS and SPARCITY under both Models 1 and 2. The former may relate to the fact that both the incidence (reported and unreported), and fear of vehicle crime may be higher in areas of economic and social deprivation. The latter relationship, however, may relate to the fact that the probability of detection may be lower in more sparsely populated areas. Hence, both the incidence and fear of vehicle crime may also be higher in these areas.

The fact that such socioeconomic variables are outside the control of individual police forces, however, serves to reinforce the argument that survey responses, such as fear of crime measures, should not form a domain with respect to police performance measurement. It could be argued, moreover, that the responsibility for influencing measures relating to the fear of crimes, which are themselves strongly influenced by socioeconomic factors, should rest with the government rather than with individual police forces.

Finally, with respect to the remaining domain, citizen focus, it is clear from Table 4 that PTE is a positive and significant explanatory variable in respect of the public's perception of whether the police are doing a good or excellent job. Interestingly, however, this is only the case for Model 1 and not Model 2. This strongly suggests that the public has a reasonably accurate view of the efficiency of police forces as it relates to domains such as reducing crime and investigating crime (which are captured by Model 1). It is clearly less easy for the public to assess the efficiency of police forces in terms of value for money. Hence, this may explain why PTE is not significant under Model 2 (which includes resource costs as an additional input). Once again, however, it is clear that the public's perception of whether the police are doing a good job is also influenced by environmental variables. The variable INCOME SUPPORT (a government insurance scheme for the unemployed and those on low income), for example, is positive and significant at the 10% level in both Models 1 and 2. This is an interesting and somewhat surprising result as it suggests that the public's perception of police performance tends to be better in less affluent areas. Finally, the public's perception of police performance appears to be negatively and significantly related (at the 10% level) to daytime population, at least according to Model 1.

VI. CONCLUSIONS

This article provides a comparative analysis of the performance radar approach introduced by the U.K. Home Office to assess the relative performance of police forces. Specifically, the relative efficiency of U.K. police forces is analyzed across the three nonsurvey-based performance domains specified by the Home Office using the well-established technique of DEA. To ensure that external or environmental factors (which may impact on relative efficiency levels but are typically outside the control of individual police forces) are adequately accounted for, the authors employ the two-stage approach of Fried et al. (1999). Furthermore, both the production and cost approaches are specified to establish whether the omission of resource usage costs as an input can result in biases in relative efficiency measurement in respect of policing.

The present results confirm that the latter is indeed the case. The authors also establish that it is extremely important to adequately incorporate the impact of environmental variables, because failure to do so can produce misleading and unfair relative performance measures in respect of some police forces.

More significantly, the comparison of the DEA results with the performance radar relative performance measures suggests that although there is often a great deal of consistency at the extremes of the performance spectrum, in many cases the latter approach can produce misleading assessments of the performance of individual police forces. A central weakness of the performance radar approach is the failure to relate the outputs or outcomes of individual police forces directly to the resources (costs) incurred in their attainment. This serious omission is rectified by the cost specification of the DEA approach. From a policy perspective, the impact of the inclusion of resource costs on relative efficiency measures (using DEA), combined with the frequent marked differences between the DEA results and ranks and those of the performance radar approach, suggests that it would be inadvisable to use the latter results in target setting or resource allocation decisions.

Notwithstanding reservations concerning the use of survey data in the assessment of policing performance, the second-stage tobit regressions indicate that PTE appears to have no impact on the various fear of crime indicators specified in the promoting public safety domain. Furthermore, there is only limited evidence of a link between the incidences of crime and the public's fear of crime. Clear evidence does emerge, however, of the influence of socioeconomic variables on the public's fear of crime. Because these factors are outside the control of individual police forces, however, this result, combined with the other evidence, strongly suggests that this type of survey data should not be used in the context of police performance domains.

Finally, with respect to the citizen focus domain, it appears that the technical efficiency of the police under the production approach does have a significant and positive impact on the public's perception of whether the police are doing a good job. There is no such relationship, however, when PTE is measured using the cost approach. This strongly suggests that the public has an accurate perception of relative police performance with respect to crime levels and crime fighting (clearing up crimes), but not with respect to value for money. Because the latter has tended to be the main objective of governments in recent years, this result again calls into question the legitimacy of using this type of survey based data in the context of police performance domains.

REFERENCES

Andersen, P., and N. C. Petersen. "A Procedure for Ranking Efficient Units in Data Envelopment Analysis." Management Science, 39, 1993, 1261-64.

Banker, R. D., A. Charnes, and W. W. Cooper. "Some Models for Estimating Technical and Scale Inefficiencies in Data Envelopment Analysis." Management Science, 30, 1984, 1078-92.

Cameron, S. "Police Cost Function Estimates for England and Wales." Applied Economics, 21, 1989, 1279-89.

Carrington, R., N. Puthucheary, R. Deirdre, and S. Yaisawarng. "Performance Measurement in Government Service Provision: The Case of Police Services in New South Wales." Journal of Productivity Analysis, 8, 1997, 415-30.

Charnes, A., W. W. Cooper, and E. Rhodes. "Measuring the Efficiency of Decision Making Units." European Journal of Operational Research, 2, 1978, 429-44.

Chilingerian, J. A. "Evaluating Physician Efficiency in Hospitals: A Multivariate Analysis of Best Practices." European Journal of Operational Research, 80, 1995, 548-74.

Darrough, M. N., and J. M. Heineke. "Law Enforcement Agencies as Multiproduct Firms: An Econometric Investigation of Production Cost." Public Finance 34, 1979, 176-95.

Department of the Environment, Transport and the Regions (DETR). Performance Indicators for 2000/2001. Her Majesty's Stationery Office, London, 1999.

Diez-Ticio, A., and M.-J. Mancebon. "The Efficiency of the Spanish Police Service: An Application of the Multiactivity DEA Model." Applied Economics, 34, 2002, 351-62.

Drake, L. M., and R. Simper. "Productivity Estimation and the Size-Efficiency Relationship in English and Welsh Police Forces: An Application of DEA and Multiple Discriminant Analysis." International Review of Law and Economics, 20, 2000, 53-73.

______. "The Economic Evaluation of Policing Activity: An Application of a Hybrid Methodology." European Journal of Law and Economics, 12, 2001, 181-200.

______. "The Measurement of English and Welsh Police Force Efficiency: A Comparison of Distance Function Models." European Journal of Operational Research, 147, 2003a, 165-86.

______. "An Economic Evaluation of Inputs and Outputs in Policing: Problems in Efficiency Measurement." Journal of Socio-Economics, 32, 2003b, 701-10.

Fare, R., S. Grosskopf, and C. A. K. Lovell. The Measurement of Efficiency of Production. Boston: Kluwer-Nijhoff, 1985.

Farrell, M. J. "The Measurement of Productive Efficiency." Journal of the Royal Statistical Society, Series A, 120, 1957, 253-81.

Fried, H. O., S. S. Schmidt, and S. Yaisawarng. "Incorporating the Operating Environment into a Nonparametric Measure of Technical Efficiency." Journal of Productivity Analysis, 12, 1999, 249-67.

Gillen, D., and A. Lall. "Developing Measures of Airport Productivity and Performance: An Application of Data Envelopment Analysis." Transportation Research Part E, 33, 1997, 261-73.

Gyapong, A. O., and K. Gyimah-Brempong. "Factor Substitution, Price Elasticity of Factor Demand and Returns to Scale in Police Production: Evidence from Michigan." Southern Economic Journal, 54, 1988, 863-78.

Her Majesty's Inspectorate of Constabulary (HMIC). What Price Policing. Her Majesty's Stationery Office, London, 1998.

Home Office. Demonstration Project. Her Majesty's Stationery Office, London, 2001.

______. Home Office Annual Report: The Government's Expenditure Plans 2002-03 and Main Estimates 2002-03 for the Home Office. Her Majesty's Stationery Office, London, 2002.

______. SR2002 Public Service Agreement Technical Notes. Her Majesty's Stationery Office, London, 2003.

Linna, M., A. Nordblad, and M. Koivu. "Technical and Cost Efficiency of Oral Health Care Provision in Finnish Health Centres." Social Science and Medicine, 56, 2003, 343-53.

McDonald, Z., and D. Pyle. Illicit Activity: The Economics of Crime, Drugs and Tax Fraud. Dartmouth, U.K.: Ashgate, 2000.

Nyhan, R. C., and L. M. Martin. "Assessing the Performance of Municipal Police Service Using Data Envelopment Analysis: An Exploratory Study." State and Local Government Review, 31, 1999, 18-30.

Spottiswoode, C. Improving Police Performance. Public Services Productivity Panel, H. M. Treasury, London, 2000.

Stephens, M. "Care and Control: The Future of British Policing." Policing and Society, 4, 1994, 237-51.

Sullivan, R. R. "The Politics of British Policing in the Thatcher/Major State." The Howard Journal, 37, 1998, 306-18.

Sun, S. "Measuring the Relative Efficiency of Police Precincts Using Data Envelopment Analysis." Socio-Economic Planning Sciences, 36, 2002, 51-71.

Thanassoulis, E. "Assessing Police Forces in England and Wales Using Data Envelopment Analysis." European Journal of Operational Research, 87, 1995, 641-57.

Waddington, P. A. J., and Q. Braddock. "'Guardians' or 'Bullies'? Perceptions of the Police amongst Adolescent Black, White and Asian Boys." Policing and Society, 2, 1991, 31-45.

LEIGH M. DRAKE and RICHARD SIMPER*

Drake: Professor, Nottingham University Business School, Jubilee Campus, Nottingham, United Kingdom, NG8 1BB. Phone 44-(0)-115-8466602, Fax 44-(0)-115-8466667, E-mail leigh.drake@nottingham.ac.uk

Simper: Lecturer in Economics, Department of Economics, Loughborough University, Loughborough, United Kingdom, LE11 3TU. Phone 44-(0)-1509-222701, Fax 44-(0)-1509-223910, E-mail r.simper@lboro.ac.uk

1. Indeed the Police Act 1964, s. 4.(1) states that "it shall be the duty of the Police Authority for every police area for which a police force is required to be maintained by section 1 of this Act to secure the maintenance of adequate and efficient police force for the area, and to exercise for that purpose the powers conferred on a police authority by this Act" (emphasis added). The managerialism of the police service, coming from the Home Office Circular 114/83 also introduced the three E's: economy, efficiency, and effectiveness. Subsequent legislation, the Police and Magistrates Court Act 1994 replaced adequate and efficient with efficient and effective, and finally the Local Government Act 1999 introduced Best Value, such that "a Best Value authority must make arrangements to secure continuous improvement in the way in which its functions are exercised, having regard to a combination of economy, efficiency and effectiveness" (emphasis added).

2. U.K. policing is split into three distinct political jurisdictions: the Police Service of Northern Ireland, the police forces of Scotland, and those stationed in England and Wales. This article is concerned with the latter group, for whose economy, efficiency, and effectiveness in policing matters the Home Office is directly responsible.

3. Indeed, instead of differential targets as proposed in Spottiswoode (2000), the Home Office criteria of forces "meeting an annual target of savings/efficiency gains equivalent to 2% of their annual budget" is to continue (National Policing Plan 2003/04, p. 39).

4. It is interesting to note, as already discussed, that these performance radars were not linked to police efficiency and the funds spent. However, HMIC noted that "the framework will require the development of additional performance indicators to capture the breadth of policing responsibilities, as well as the collection of activity-based costing data to establish the link between the resources used and the outcomes delivered" (HMIC Report of Her Majesty's Chief Inspector of Constabulary 2002-2003, December 2003).

5. For example, the Home Office's Public Service Agreement 1 is to reduce recorded vehicle crime by 30% by 31 March 2004; domestic burglary by 25% by March 2005; and robbery in principal cities by 14% by 2005 (Home Office 2002).

6. For example, Thames Valley police force's gross expenditure equaled [pounds sterling]315.344 million (2003/04), which, less the specific grant ([pounds sterling]19.311 million) and less other income ([pounds sterling]6.186 million), gave a net budget revenue equal to [pounds sterling]289.847 million.

7. At present there are few local statistics available for the United Kingdom that closely follow police force boundaries and, hence, this limits the choice of data available in this study. In the United Kingdom, statistics can be based on boundaries under different settings, for example: local authority wards, fire service boundaries, criminal court boundaries, etc.

ABBREVIATIONS

BCS: British Crime Survey

BVPI: Best Value Performance Indicator

CCR: Charnes, Cooper, and Rhodes (1978)

DEA: Data Envelopment Analysis

DMU: Decision Making Unit

FDH: Free Disposal Hull

HMIC: Her Majesty's Inspectorate of Constabulary

MSF: Most Similar Force

PFF: Police Funding Formula

PTE: Pure Technical Efficiency

SFA: Stochastic Frontier Analysis

SIDF: Stochastic Input Distance Function
TABLE 1 Initial PPAF Performance Indicators for Performance Radar Analysis

Citizen focus (Domain A)
 Level of public satisfaction with the police force as measured by the
 BCS.
Reducing crimes (Domain B)
 Number of burglaries per 1,000 households as measured by recorded
 crime.
 Number of robberies per 1,000 resident population as measured by
 recorded crime.
 Number of vehicle crimes per 1,000 resident population as measured by
 recorded crime.
Investigating crime (Domain C)
 Percentage of sanction detections.
 Percentage of offences brought to justice.
 Number of offenders brought to justice for the supply of class A
 drugs, as measured by recorded crime statistics.
Promoting public safety (Domain D)
 Level of fear of crime as measured by the BCS.
 Level of feeling of public safety as measured by the BCS.
Resource usage (Domain E)
 Number of working days lost through sickness as measured by returns
 submitted to HMIC.

Source: HMIC (2003).

TABLE 2 English and Welsh Police Force Summary Statistics: 2001-2002

 Minimum Maximum Mean

Inputs: Models 1 and 2
 Number of burglaries 714 39,081 8,661.10
 Number of vehicle crimes 2,052 74,775 19,711.49
 Number of robberies 26 13,322 1,653.22
 Additional input for Model 2
 Net budget revenue 60,474 432,769 151,136.17
  ([pounds sterling]000's)
Outputs: Models 1 and 2
 Total offenses cleared 10,553 107,955 27,720.07
 Total days lost to sickness 28,586 159,180 130,748.17

TABLE 3 English and Welsh Police Force PTE Results

Police Force Area and Region PTE (M1) Stage 1 PTE (M2) Stage 1

Avon and Somerset 16.33 (39) 40.22 (41)
Bedfordshire 100.00 (1) 100.00 (1)
Cambridgeshire 16.10 (40) 74.33 (19)
Cheshire 26.20 (31) 56.06 (35)
Cleveland 14.40 (41) 67.26 (28)
Cumbria 45.70 (21) 87.89 (12)
Derbyshire 33.27 (26) 60.86 (33)
Devon and Cornwall 92.92 (10) 92.92 (11)
Dorset 20.92 (37) 73.10 (23)
Durham 34.24 (25) 73.63 (21)
Dyfed Powys 100.00 (1) 100.00 (1)
Essex 48.21 (20) 54.89 (36)
Gloucestershire 25.21 (33) 87.81 (13)
Greater Manchester 74.08 (13) 74.56 (18)
Gwent 100.00 (1) 100.00 (1)
Hampshire 100.00 (1) 100.00 (1)
Hertfordshire 17.00 (38) 49.15 (40)
Humberside 27.80 (29) 59.07 (34)
Kent 68.24 (16) 68.56 (27)
Lancashire 69.24 (14) 74.17 (20)
Leicestershire 40.00 (22) 68.80 (26)
Lincolnshire 30.26 (28) 81.16 (16)
Merseyside 48.40 (19) 49.60 (39)
Norfolk 20.93 (36) 63.30 (32)
North Wales 26.61 (30) 69.83 (25)
North Yorkshire 25.20 (34) 65.88 (30)
Northamptonshire 100.00 (1) 100.00 (1)
Northumbria 100.00 (1) 100.00 (1)
Nottinghamshire 31.49 (27) 64.67 (31)
South Wales 100.00 (1) 100.00 (1)
South Yorkshire 37.80 (24) 53.15 (37)
Staffordshire 48.62 (18) 67.26 (28)
Suffolk 100.00 (1) 100.00 (1)
Surrey 22.42 (35) 51.43 (38)
Sussex 61.95 (17) 72.73 (24)
Thames Valley 74.30 (12) 85.79 (14)
Warwickshire 25.60 (32) 100.00 (1)
West Mercia 75.80 (11) 79.66 (17)
West Midlands 100.00 (1) 100.00 (1)
West Yorkshire 68.74 (15) 73.40 (22)
Wiltshire 38.92 (23) 81.77 (15)
Minimum 14.40 40.22
Mean 54.89 76.55
SD 31.46 17.31

Police Force Area and Region PTE (M1) Stage 2 PTE (M2) Stage 2

Avon and Somerset 55.88 (41) 75.30 (41)
Bedfordshire 100.00 (1) 100.00 (1)
Cambridgeshire 88.65 (29) 96.10 (27)
Cheshire 97.02 (16) 100.00 (1)
Cleveland 82.98 (38) 100.00 (1)
Cumbria 92.33 (22) 100.00 (1)
Derbyshire 84.40 (36) 92.10 (32)
Devon and Cornwall 100.00 (1) 100.00 (1)
Dorset 95.05 (18) 88.60 (35)
Durham 89.11 (28) 97.20 (24)
Dyfed Powys 100.00 (1) 100.00 (1)
Essex 90.13 (27) 98.10 (22)
Gloucestershire 95.99 (17) 100.00 (1)
Greater Manchester 88.22 (31) 95.50 (28)
Gwent 100.00 (1) 100.00 (1)
Hampshire 97.27 (15) 95.30 (29)
Hertfordshire 86.43 (33) 85.80 (37)
Humberside 85.23 (34) 85.40 (39)
Kent 91.38 (25) 91.60 (33)
Lancashire 90.87 (26) 100.00 (1)
Leicestershire 84.57 (35) 97.00 (26)
Lincolnshire 99.74 (12) 100.00 (1)
Merseyside 83.82 (37) 99.00 (19)
Norfolk 93.75 (21) 93.70 (31)
North Wales 100.00 (1) 94.50 (30)
North Yorkshire 93.98 (20) 88.70 (34)
Northamptonshire 100.00 (1) 100.00 (1)
Northumbria 100.00 (1) 100.00 (1)
Nottinghamshire 71.82 (40) 98.50 (20)
South Wales 99.39 (13) 97.60 (23)
South Yorkshire 75.05 (39) 81.20 (40)
Staffordshire 87.93 (32) 98.20 (21)
Suffolk 100.00 (1) 100.00 (1)
Surrey 92.05 (23) 87.00 (36)
Sussex 95.03 (19) 97.10 (25)
Thames Valley 100.00 (1) 100.00 (1)
Warwickshire 91.62 (24) 99.40 (18)
West Mercia 100.00 (1) 100.00 (1)
West Midlands 100.00 (1) 100.00 (1)
West Yorkshire 88.51 (30) 85.50 (38)
Wiltshire 97.78 (14) 100.00 (1)
Minimum 55.88 75.30
Mean 78.86 95.60
SD 14.99 13.00

Note: M1 and M2 denote Model 1 and Model 2, which respectively exclude
and include net expenditure as an extra input.
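The pure technical efficiency (PTE) scores reported in Table 3 come from a variable-returns-to-scale DEA model. A minimal sketch of an input-oriented BCC formulation using SciPy's linear programming solver is given below; the function name and the two-DMU toy data are illustrative assumptions, not the article's police force figures or exact specification:

```python
import numpy as np
from scipy.optimize import linprog

def dea_pte(X, Y, k):
    """Input-oriented BCC (VRS) DEA score for decision-making unit k.

    X: (n_dmus, n_inputs) input matrix; Y: (n_dmus, n_outputs) output matrix.
    Solves: min theta  s.t.  sum_j lam_j x_j <= theta * x_k,
                             sum_j lam_j y_j >= y_k,
                             sum_j lam_j = 1,  lam_j >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                    # decision vars: [theta, lam_1..lam_n]
    A_ub, b_ub = [], []
    for i in range(m):                            # input constraints
        A_ub.append(np.concatenate(([-X[k, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):                            # output constraints
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[k, r])
    A_eq = [np.concatenate(([0.0], np.ones(n)))]  # convexity constraint (VRS)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=np.array(A_eq), b_eq=[1.0],
                  bounds=[(None, None)] + [(0.0, None)] * n)
    return res.fun

# Toy data: DMU 0 produces the same output as DMU 1 with half the input,
# so DMU 0 is efficient (score 1.0) and DMU 1 scores 0.5.
X = np.array([[1.0], [2.0]])
Y = np.array([[1.0], [1.0]])
scores = [dea_pte(X, Y, k) for k in range(2)]
```

Scores of 100 in Table 3 correspond to theta = 1 here, i.e., forces on the best-practice frontier; Model 2 simply augments the input set with net budget revenue.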

TABLE 4 Public Confidence and Force Efficiency Regressions

                               Model 1               Model 2
                            Estimate      SE      Estimate      SE

Dependent variable: Worried about violent crime
 Constant                   12.9200    9.1765     16.9508   10.9479
 PTE                         0.0435    0.0693     -0.0111    0.1081
 Number of violent crimes    0.0001    0.0001     -0.0001    0.0002
 Income support              0.0001    0.0001     0.00007   0.00005
 Daytime population        -0.00003   0.00003   -0.000003  0.000003
 Terrace housing           -20.1793*  11.8982   -20.1290*  12.0544
 Lone parent households     49.1410  102.2460     50.0341  105.4870
 Sparsity                   19.7799   47.5262     25.6173   48.2537
 Sigma                       3.6801**  0.4064      3.6972**   0.4082
Dependent variable: Worried about burglary
 Constant                    6.3007    6.2008      7.8868    7.2474
 PTE                         0.0039    0.0545     -0.0196    0.0749
 Number of burglaries        0.0001** 0.00007      0.0001** 0.00007
 Income support             0.00007** 0.00003     0.00007** 0.00003
 Daytime population        -0.00001** 0.000002  -0.000006** 0.000002
 Terrace housing            -8.3845    7.8635     -8.7247    7.9684
 Lone parent households     -5.5937   66.9099     -1.1788   68.7617
 Sparsity                   56.3620*  31.6124     58.7097*  31.9154
 Sigma                      2.43806**  0.2692      2.4362**   0.2690
Dependent variable: Worried about vehicle crime
 Constant                    2.1170    7.0498      1.0902    8.0899
 PTE                        -0.0247    0.0523     -0.0121    0.0758
 Number of vehicle crimes    0.0009**  0.0003      0.0009**  0.0003
 Income support            -0.000006  0.00004    -0.00001   0.00003
 Daytime population        -0.000001  0.000002  -0.000001  0.000002
 Terrace housing            -7.5137    8.9019     -7.8234    9.0176
 Lone parent households    126.9490*  75.0130    131.2090*  75.7708
 Sparsity                   67.6690** 36.0222     66.6374*  36.5915
 Sigma                       2.7589**  0.3047      2.7655**   0.3054
Dependent variable: Police doing an excellent or good job
 Constant                   52.3886**  7.75054    54.9212**  9.3266
 PTE                         0.1089**  0.0579      0.0762    0.0880
 Income support             0.00007*  0.00004     0.00007*  0.00004
 Daytime population        -0.000004* 0.000002   -0.000004  0.000003
 Terrace housing            -4.3269   10.0257     -2.646    10.6670
 Lone parent households   -116.3610   84.2987   -136.7670   87.8322
 Sparsity                  -39.1271   40.1376    -34.5144   41.8774
 Sigma                       3.1085**  0.3438      3.2106**   0.3545

Note: ** and * denote significance at the 5% and 10% critical levels,
respectively.
COPYRIGHT 2005 Western Economic Association International

Article Details
Author: Drake, Leigh M.; Simper, Richard
Publication: Contemporary Economic Policy
Date: Oct 1, 2005