
Change in the Medicare case-mix index in the 1980s and the effect of the prospective payment system.

Persistent increases in the Medicare case-mix index over the 1980s have been ascribed to changes both in medical treatment (real changes) and in the way medical information is recorded ("coding changes") in hospitals. These changes have been attributed, in the absence of appropriate data and analyses, to the incentives of the Medicare prospective payment system (PPS). Using data for 1980-1986 from 235 hospitals, we estimate the effect on the Medicare case-mix index of a series of variables that reflect medical treatments and coding practices. Each of these underlying real or coding variables was changing prior to PPS and would likely have continued to change even in the absence of PPS. Furthermore, PPS may have had a distinct effect on these variables. These underlying trends and the PPS effects must each be estimated. Thus, the analysis begins by developing separate estimates for each of these real and coding variables (1) in the absence of PPS (autonomous effects) and (2) as a result of PPS (induced effects). Then, changes in the case-mix index are regressed against all of these variables to determine the degree to which specific autonomous real or coding variables or induced real or coding variables actually influenced measured case mix. Results show that real and coding changes each accounted for about half of the change in the Medicare case-mix index between 1980 and 1986, with the influence of coding starting to wane by 1986. PPS-induced factors explain about 80 percent of the change in measured case mix over time, autonomous factors about 20 percent. Especially powerful determinants of case-mix change included PPS-induced substitution of surgical for medical care and PPS-induced improvements in the accuracy of coding that led to assignment of patients to higher-weighted DRGs. Also, stringent Medicare peer review organizations appeared to restrain rises in case-mix indexes for their hospitals.
Outpatient substitution for inpatient treatment, which others attributed to PPS, was well underway before PPS was announced.

The Medicare prospective payment system (PPS), implemented in October 1983 for many hospitals, has had a number of important effects on hospitals. One is the unprecedented rise in the average Medicare case-mix index (MCI) of hospitals. The average MCI rose 6 percent over fiscal years 1981-1984 and 9.9 percent over FY 1984-1988.(1) The MCI is the distribution of a hospital's Medicare patients across more than 470 diagnosis-related groups (DRGs) weighted by the relative average charge of treating the typical U.S. Medicare patient in the DRG. The MCI reflects the increased intensity (sometimes called severity) or hospital resource requirements of treating Medicare patients over time.
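As a concrete illustration of this definition, the MCI computation (the summed cross-product of each DRG's share of a hospital's Medicare discharges and that DRG's relative cost-weight) can be sketched in Python. The DRG labels, weights, and discharge counts below are invented for illustration; they are not actual HCFA values.

```python
# Sketch of the Medicare case-mix index (MCI): a discharge-share-weighted
# average of the relative cost-weights of the DRGs a hospital billed.
# All DRG numbers, weights, and counts here are hypothetical.

def case_mix_index(discharges_by_drg, drg_weights):
    """Weighted average DRG cost-weight across a hospital's discharges."""
    total = sum(discharges_by_drg.values())
    return sum(
        (count / total) * drg_weights[drg]
        for drg, count in discharges_by_drg.items()
    )

# Hypothetical hospital quarter with 3 DRGs instead of the real 470+.
weights = {"DRG_39": 0.57, "DRG_107": 3.9, "DRG_127": 1.0}
discharges = {"DRG_39": 50, "DRG_107": 10, "DRG_127": 40}

mci = case_mix_index(discharges, weights)
```

A shift of discharges from low-weighted DRGs (such as lens procedures) toward high-weighted surgical DRGs raises this index even if total discharges are unchanged, which is the mechanism the study investigates.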

For Medicare policy decisions it is important to understand how much of any increase in measured case mix represents a real increase in the average sickness and the consequent increase in the resources needed to treat the patient, and how much is due to coding change, that is, statistical artifacts caused by changes in the way that hospitals record and provide information on their patients. The annual "PPS update factor," recommended by the U.S. Department of Health and Human Services and the Prospective Payment Assessment Commission (PROPAC), and ultimately set by Congress, allows for "real" changes, among other things, in the average Medicare case-mix index (see Steinwald and Dummit 1989 for an overview).

Estimates of the real annual change in the MCI are substantially below the 2 percent average yearly change in the hospital case-mix index from FY 1981 to 1984 and the 2.5 percent average change from FY 1984 to 1988. The average yearly real change in the patient-weighted MCI was estimated by PROPAC (1990, 61) as 1.0 and 2.1 percent for fiscal years 1981-1983 and 1984-1988, respectively. The increase allowed by Congress for real changes in the MCI was actually about 2.0 percent for FY 1984-1988 on average.

The intent of this research is to increase our understanding of the change in the Medicare case-mix index by (1) estimating the effect of PPS on the MCI, (2) separating coding changes from real changes in the MCI, and (3) estimating the degree to which coding and real changes in the MCI over the 1980s can reasonably be attributed to PPS.

Previous Research

Several studies have examined the sources of change in the Medicare case-mix index. One set of studies was undertaken by Ginsburg and Carter.(2) They attempted to decompose the MCI change into (1) PPS-related changes in outpatient substitution and coding changes, (2) pre-PPS trends in medical practice, (3) the aging of the hospital population, and (4) data source inconsistencies. Using Health Care Financing Administration Medicare claims abstracts, they computed a 9.2 percent increase in the MCI from 1981 to 1984. Using data from both HCFA and other sources and a variety of statistical techniques, they attributed 3.5 of the 9.2 percentage points to PPS (0.6 to outpatient substitution and 2.8 to coding), 1.4 percentage points to pre-PPS medical practice trends, no effect to aging of the population over time, and 4.0 percentage points to measurement error within the Medicare claims database.(3) This measurement error is due primarily to the absence of diagnostic and treatment detail on Medicare data files before PPS.(4)

Two other sets of studies have attempted to estimate the extent of real case-mix change by reabstracting patient medical records. The Office of the Inspector General of the U.S. Department of Health and Human Services estimated the effect of coding changes on the case-mix index for 1985 by reabstracting 7,050 medical records from 239 hospitals, then comparing the reabstracted DRGs with the DRGs for which the hospitals were actually paid (Hsia, Krushat, Mark, et al. 1988). The study concluded that about 21 percent of claims were based on an incorrect DRG. While hospitals were underpaid in 38.3 percent of the cases where the original DRG was incorrect, hospitals were overpaid in 61.7 percent of the cases. The net effect of correcting the DRG assignment was to lower the case-mix index computed from these 7,050 patients by 0.0158 points, or 1.9 percent of the case-mix index.(5)

In a more recent pair of studies, the RAND Corporation analyzed data for 1987 and 1988 that had been reabstracted by the SuperPRO (the organization responsible for evaluating individual Peer Review Organizations [PROs]). Results from one study suggested that about one-third of the 3.6 percent change in the patient-weighted Medicare case-mix index between 1987 and 1988 was due to real case-mix change (Prospective Payment Assessment Commission 1990, 68). In a similar study using data from one year earlier, RAND concluded that one-half to three-quarters of the 2.1 percent increase observed in the sample of cases was real (Prospective Payment Assessment Commission 1990, 39). These two studies suggest that the real increase in the patient-weighted MCI has been on the order of 1.2-1.5 percent per year in recent years.

The studies just cited vary in their estimates of the amount by which coding changes and real changes explain the change in the MCI over time, but the studies are consistent in concluding that both explanations have been important. To provide further insights into the trends in the real and coding-practice portions of the change in the MCI, we have developed, first, a methodology different from those used previously to decompose the change in the MCI into its component parts. This methodology is closer to the work of Ginsburg and Carter than to the reabstracting studies, but it differs from Ginsburg and Carter's work in a number of important ways. It uses a unified system of equations to help us understand the effects on the MCI of the underlying factors; nearly all of these factors are allowed to be induced partially by PPS and to change independently of PPS. Second, our study uses data that have been consistently collected by discharge abstracting companies for hospitals over the entire period of the study. These differ from the HCFA data, which were inconsistent in the items collected over the study period. Third, this study uses data that have been processed so that data inaccuracies can be assessed and coding changes can be measured directly rather than estimated as a residual, as in the Ginsburg and Carter work. The description of our approach and the presentation of our estimates are given in the next four sections - conceptual framework, statistical methods, results, and concluding comments.

Conceptual Framework

Our method distinguishes between real changes in medical treatment patterns and in coding practices, as have others; further, it separates trends that were already in place before the imposition of PPS from changes that may have occurred because of PPS. Table 1 summarizes conceptually the possible sources of change in the Medicare case-mix index by whether they might have been induced by PPS or not. The major hypothesis is that a variety of forces already emerging in the early 1980s (and therefore exogenous to PPS) were causing the Medicare case-mix index to rise. That is, even if there had never been a prospective payment system, the nation would have observed in the mid-1980s some increase in measured Medicare case mix that resulted from, for example, changes in the practice of medicine, aging of the population, and collection and coding of information. On the other hand, the impact of prospective payment might have been to significantly strengthen these trends. Also, PPS may have raised the age of hospitalized patients by stimulating outpatient treatment for less severely ill younger patients. Hence, to fully understand the sources of change in the MCI, it is necessary to decompose the various factors defining practice pattern changes and coding changes into their exogenous (autonomous, not influenced by PPS) and endogenous (induced by PPS) components.
Table 1: Framework for Studying Sources of Change in the MCI, 1980-1986

                                              Influences on Case Mix
                                            Preexisting   Induced by PPS
Changes in practice patterns
  Outpatient substitution                        X               X
  Substitution of surgical for medical care      X               X
  Technology diffusion                           X               X
Aging of the hospitalized population             X               X
Changes in coding practices
  Completeness                                   X               X
  Accuracy                                       X               X
  Gaming                                                         X
PRO activities                                                   X
Age distribution of population                   X
Hospital structural changes                      X
Medicare program changes                         X
Source: Division of Provider Studies, Center for General Health Services Intramural Research, Agency for Health Care Policy and Research.
Certain other factors that cannot be attributed directly to PPS may also have affected the MCI: aging of the Medicare population; structural changes in the hospital industry, such as a greater need to market services to HMOs; and non-PPS changes in the Medicare program, for example, the Tax Equity and Fiscal Responsibility Act (TEFRA). These must be controlled to avoid attributing their influences to PPS. Finally, two factors are a result of PPS and have no pre-PPS equivalents: (1) coding changes induced by "gaming" to maximize reimbursement; and (2) the impact on hospitals of the Peer Review Organizations (PROs), established to control unnecessary admissions and to monitor the quality of care for Medicare beneficiaries. These factors are included to capture, where possible, the direct effects of PPS mechanisms. Some of the concepts listed in Table 1 are discussed in more detail later, when their measurement is described.

Statistical Methods

Statistical Decomposition of the MCI

Using the conceptual framework of Table 1, we specified a model for estimating the effect of those factors on the Medicare case-mix index ([MCI.sub.t]) over 28 quarters from 1980 through 1986. The model below is also described schematically in the Appendix.

[MCI.sub.t] = f(A[X.sub.i,t], PPSI[X.sub.i,t], OTHPP[S.sub.j], SEASON, CMR0*t, CMR1*t, CMR2*t) (1)

where:

 A[X.sub.i,t] = i estimated preexisting factors (specified in Table 1), which also have induced counterparts;
 PPSI[X.sub.i,t] = i estimated PPS-induced factors (specified in Table 1), which have preexisting counterparts;
 OTHPP[S.sub.j] = j variables related only to PPS (see Table 1) and for which there are no pre-PPS equivalents;
 SEASON = dummy variables for fall, winter, spring;
 t = 1,..., 28, quarters from January 1980 through December 1986;
 CMR0 = 1 for t = 1,..., 11, 0 otherwise;
 CMR1 = 1 for t = 12,..., [t.sub.PPS] - 1, 0 otherwise;
 CMR2 = 1 for t = [t.sub.PPS],..., 28, 0 otherwise; and
 [t.sub.PPS] = the quarter the hospital first entered onto PPS.(6)

According to this specification, the MCI depends on i factors of which some portion predates PPS and is likely to have continued (the A[X.sub.i] variables) and of which the remaining portion is influenced by PPS (the PPSI[X.sub.i] variables). The MCI also depends on PPS-related changes for which there are no cost-reimbursement era equivalents (OTHPP[S.sub.j]: PRO activities and gaming), on seasonal factors, and on other effects unspecified by the previous variables. If unspecified effects exist, they will be captured by the pre-TEFRA, TEFRA, and PPS era dummy variables (CMR0*t, CMR1*t, and CMR2*t, respectively). If these unspecified effects are statistically significant, then the Table 1 conceptual framework may be judged as incomplete.
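A toy version of this specification may help fix ideas. The sketch below assembles a design matrix with the Equation 1 regressors (autonomous factors, induced factors, a PPS-only variable, season dummies, and the three era-specific trends) and fits it by ordinary least squares. All data are synthetic, the number of factors is reduced to two, and [t.sub.PPS] is assumed to be quarter 16; the paper's actual estimation pools 235 hospitals and includes additional controls.

```python
# Minimal sketch of the Equation 1 regression with synthetic data.
import numpy as np

rng = np.random.default_rng(0)
T = 28                                     # quarters, 1980 Q1 - 1986 Q4
t = np.arange(1, T + 1)

ax = rng.normal(size=(T, 2))               # two illustrative autonomous factors
ppsix = rng.normal(size=(T, 2))            # their PPS-induced counterparts
othpps = rng.normal(size=(T, 1))           # a PPS-only variable (e.g., leniency)
season = np.eye(4)[(t - 1) % 4][:, :3]     # fall/winter/spring dummies
cmr0 = (t <= 11) * t                       # pre-TEFRA era trend
cmr1 = ((t >= 12) & (t <= 15)) * t         # TEFRA era trend (t_PPS = 16 assumed)
cmr2 = (t >= 16) * t                       # PPS era trend

X = np.column_stack([np.ones(T), ax, ppsix, othpps, season, cmr0, cmr1, cmr2])
mci = 1.0 + 0.002 * t + rng.normal(scale=0.01, size=T)  # fake outcome series

beta, *_ = np.linalg.lstsq(X, mci, rcond=None)          # OLS coefficients
```

In the paper's terms, significant coefficients on the era trend terms would signal that the Table 1 framework had left something out.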



The autonomous and PPS-induced underlying variables must be estimated. For each of the i autonomous determinants of the MCI, we estimated:

A[X.sub.i,t1] = [f.sub.1i](t1, CMR*t1, SEASON, H) (2)

where:

 t1 = 1,..., 15, quarters January 1980 to September 1983, the pre-PPS period;
 CMR = 1 for t = 12,..., 28, the Medicare case-mix reimbursement era, 0 otherwise; and
 H = hospital and community characteristics.

This equation generally uses the trend of the 15 quarters prior to PPS and hospital-specific variables to estimate what might have happened without PPS. In Equation 2, TEFRA both is part of the pre-PPS time trend (the t1 variable) and is allowed to shift the A[X.sub.i]s for quarters 12 through 15, through the CMR*t1 interaction term.

The parameters estimated in Equation 2 are then used to project autonomous trends of the A[X.sub.i]s into the PPS period and to smooth estimates for the pre-PPS period to be consistent with PPS-period projections. Assuming no PPS for quarters 16 through 28, cost-based reimbursement would have been in effect, so the case-mix reimbursement term is omitted.(7) Estimated autonomous variables are:

A[X.sub.i,t] = [f.sub.1i](t, SEASON, H) (3)

where:

 t = 1,..., 28.

A fourth equation estimates the effect of PPS on the same 15 underlying factors modeled in Equation 2, using information from the entire 1980-1986 time period. This equation includes both an overall time trend variable and two interaction terms that allow average quarterly responses to differ among the pre-TEFRA, TEFRA, and PPS periods.

[X.sub.i,t] = [f.sub.2i](t, CMR1*t, CMR2*t, SEASON, H) (4)

where:

 t = 1,..., 28.

With parameters from Equation 4, we generate estimates of the [X.sub.i]s (including the intercept) that incorporate the effects of case mix-based and prospective reimbursement:

[X.sub.i,t2] = [f.sub.2i](t2, CMR1*t, CMR2*t, SEASON, H) (5)

where:

 t2 = 12,..., 28.

The distinction between Equation 3 and Equation 5 is important. Equation 3 forecasts what the values of the [X.sub.i] variables would have been had there been no case mix-based or prospective reimbursement nor any anticipation of prospective payments, whereas Equation 5 predicts values of these [X.sub.i] variables given the prospective payment system. As a result, the pure effect of PPS on the values of the various X variables can be calculated as:
 PPSI[X.sub.i,t] = 0 for t = 1,..., 11, and
                 = [X.sub.i,t2] - A[X.sub.i,t] for t2 = 12,..., 28. (6)

That is, Equation 6 generates the values of the PPS-induced variables for quarters 12-28 by subtracting the forecasted autonomous trends computed in Equation 3 from the (smoothed) trends estimated in Equation 5. Since there could be no PPS effect in quarters 1-11, the values of the induced variables are set to zero in these quarters. Estimation for each equation was based on the linear functional form.(8)
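The two-stage logic of Equations 3, 5, and 6 can be illustrated on a single synthetic factor: fit the pre-PPS trend and project it forward (the autonomous series), fit the full 1980-1986 series with a PPS-era interaction (the smoothed series), and take the induced series as their difference, zeroed out before quarter 12. The series below is contrived so the fits are exact; the paper's actual equations also include season dummies and hospital characteristics.

```python
# Toy decomposition of one underlying factor X into autonomous (AX) and
# PPS-induced (PPSIX) components, following Equations 3, 5, and 6.
import numpy as np

T, t_pps = 28, 16                      # assumed PPS entry at quarter 16
t = np.arange(1, T + 1)
x = 10 + 0.1 * t + 0.2 * (t >= t_pps) * t   # synthetic factor with a PPS boost

# Equation 3 analogue: linear trend fit on pre-PPS quarters, projected to 28.
pre = t < t_pps
b1, b0 = np.polyfit(t[pre], x[pre], 1)
ax = b0 + b1 * t                       # autonomous series AX

# Equation 5 analogue: trend plus a PPS-era interaction, fit on all quarters.
X = np.column_stack([np.ones(T), t, (t >= t_pps) * t])
beta, *_ = np.linalg.lstsq(X, x, rcond=None)
x_hat = X @ beta                       # smoothed series incorporating PPS

# Equation 6: induced component is zero before quarter 12, else the difference.
ppsix = np.where(t >= 12, x_hat - ax, 0.0)
```

By construction, the induced series here is zero until the PPS boost appears, mirroring the paper's identification: whatever the full-period fit captures beyond the projected pre-PPS trend is attributed to PPS.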


The primary source of data is hospital discharge abstracts from the Hospital Cost and Utilization Project for years 1980-1986 (HCUP-2). Confidential discharge and financial data were obtained under legal agreements and protections for a national sample of short-term, general, nonfederal hospitals. Our analysis is based on all Medicare discharges in 235 hospitals that provided complete data for each of the 28 quarters between January 1, 1980 and December 31, 1986;(9) these 235 hospitals were represented by three discharge abstracting companies.(10) The patient discharge abstracts were supplemented with data from the American Hospital Association's Annual Survey of Hospitals and the Bureau of Health Professions' Area Resource File of county statistics. Classification of teaching hospitals was supplemented by information from the Association of American Medical Colleges. Information on PPS and Peer Review Organizations was obtained from the Health Care Financing Administration.

The 235 hospitals are more or less representative of short-term general, nonfederal hospitals in the United States, but with a few important exceptions, as Table 2 shows. One major exception is a shortage of investor-owned hospitals. Also, the sample of hospitals has a relatively higher proportion of hospitals affiliated with a medical school than is true for the universe.(11) The sample also differs somewhat from U.S. hospitals as a whole with respect to bed size, average length of stay, and regional distribution.
Table 2: Comparison of Study Hospitals (i.e., 235 HCUP-2 Hospitals with Continuous Discharge Data for 1980-1986) with All U.S. Short-term, General, Nonfederal Hospitals (AHA Annual Survey, 1982)

                                                 HCUP-2 Study    U.S.
Characteristics                                  Hospitals       Hospitals
Percent by type of control
  Private, not-for-profit                            75.3           55.9
  Public                                             23.0           27.0
  Investor owned                                      1.7           12.4
Percent urban (Standard Metropolitan                 58.3           54.8
  Statistical Area)
Percent affiliated with a medical school             25.1           15.9
Average number of short-term beds per hospital      215.8          194.0
Average occupancy rate                               70.5           69.2
Average full-time equivalent employees per bed        2.9            2.7
Average total expenses per admission                  $26            $26
  in hundreds of dollars
Average length of stay in days                        7.6            8.6
Percent by census region
  Northeast                                          16.2           15.2
  North Central                                      36.6           28.6
  South                                              27.2           37.8
  West                                               20.0           18.4
Number of hospitals                                   235          5,502
Source: Division of Provider Studies, Center for General Health Services Intramural Research, Agency for Health Care Policy and Research.

Table 3 provides definitions and descriptive statistics for the Medicare case-mix index (MCI) and underlying factors estimated in this study. Mean values are shown across all 28 quarters for all "autonomous" (A) and "induced" (I) components of the underlying variables. Recall that the induced variable represents the "correction" to the prediction of the X variables when PPS is taken into account. Thus, the means and standard deviations are always smaller for the I[X.sub.i]s than for the A[X.sub.i]s.
Table 3: Variable Definitions and Descriptive Statistics, MCI and Underlying Factors

Medicare case-mix index (MCI: mean 1.10, s.d. 0.13). Summed cross-product of the proportion of discharges in each of 471 DRGs for the hospital quarter, multiplied by the published HCFA relative cost-weight for that DRG. The DRG classifications and relative weights were those in effect from October 1, 1985 to September 30, 1986 (Federal Register).

Percent inpatients in medical DRGs w/outpatient potential (AOTHMDOP: mean 11.28, s.d. 2.94; IOTHMDOP: mean -1.10, s.d. 1.36). Percent of Medicare discharges per quarter in this category, based on a list of DRGs used by Ginsburg and Carter (1986). The average cost-weight for this group was 0.5781, compared with 0.8431 for medical DRGs without outpatient potential.

Percent inpatient lens procedures (ADRG39: mean 3.14, s.d. 2.10; IDRG39: mean -0.95, s.d. 1.29). Percent of Medicare discharges per quarter in DRG 39, lens procedures. The cost-weight is 0.5721.

Percent inpatients in other surgical DRGs w/outpatient potential (AOTHSGOP: mean 5.91, s.d. 1.87; IOTHSGOP: mean 0.01, s.d. 0.26). Percent of Medicare discharges per quarter in procedural DRGs other than lens procedures that have outpatient substitution possibilities. The average cost-weight for the DRGs in this group was 0.7677, compared with 1.8074 for procedural DRGs without outpatient potential.

Ratio of outpatient to all procedures (AOPSURG: mean 22.36, s.d. 7.32; IOPSURG: mean 5.10, s.d. 6.00). Proportion of all procedures performed at a hospital that were performed in the outpatient department. Computed from yearly AHA data, it does not vary by quarter, and it includes both Medicare and non-Medicare patients.

Percent inpatients in surgical DRGs (AINSURG: mean 14.74, s.d. 5.69; IINSURG: mean 0.92, s.d. 1.11). Medicare discharges per quarter in procedural DRGs as a percent of discharges in all DRGs that cannot be treated on an outpatient basis.

Percent inpatients in six "new" technologies (ATECH: mean 1.87, s.d. 1.82; ITECH: mean 0.14, s.d. 0.38). Percent of Medicare discharges per quarter treated with at least one of the following procedures: cardiac catheterization, percutaneous transluminal coronary angioplasty, pacemaker insertion, coronary artery bypass graft surgery, lithotripsy, or magnetic resonance imaging.

Mean age of inpatients (AMEANAGE: mean 73.45, s.d. 1.61; IMEANAGE: mean -0.07, s.d. 0.22). Average age of Medicare discharges per quarter.

Number of diagnoses of inpatients (ANUMDX: mean 3.81, s.d. 0.37; INUMDX: mean 0.21, s.d. 0.26). Average number of diagnoses reported on Medicare discharge abstracts per hospital quarter.

Percent missing or invalid (APSLOPPY: mean 9.01, s.d. 6.86; IPSLOPPY: mean 1.10, s.d. 2.21). Percent of Medicare discharge records per quarter with at least one missing or invalid entry for zip code, race, or day of principal procedure.

Percent inpatients in DRG 468 (ADRG468: mean 1.60, s.d. 0.53; IDRG468: mean -0.22, s.d. 0.26). Percent of Medicare records per quarter classified into DRG 468: principal procedure is unrelated to the principal diagnosis.

Percent inpatients in DRG 470 (ADRG470: mean 0.25, s.d. 0.42; IDRG470: mean -0.07, s.d. 0.19). Percent of Medicare discharges per quarter with a principal diagnosis that is not a valid ICD-9-CM code.

Percent inpatients in young DRGs (AYNGDRG: mean 0.13, s.d. 0.11; IYNGDRG: mean 0.03, s.d. 0.06). Percent of Medicare discharges per quarter assigned to DRGs defined for patients aged 36 or younger.

Percent inpatients in "false low" DRGs (AFLOW: mean 11.53, s.d. 1.22; IFLOW: mean -0.66, s.d. 0.79). Percent of Medicare discharges per quarter assigned to a class of 271 DRGs we call "false low" DRGs. "False low" DRGs are those DRGs, based on unpublished HCFA data, that were often erroneously assigned to patients for whom the correct assignments would have had a higher cost-weight. There is no direct way to determine what the correct DRG assignment for an HCUP-2 discharge would actually be with these secondary data.

Percent inpatients in "false high" DRGs (AFHIGH: mean 10.64, s.d. 0.93; IFHIGH: mean 0.15, s.d. 0.28). Percent of Medicare discharges per quarter assigned to a class of 221 DRGs we call "false high" DRGs - those DRGs often misassigned where a correct assignment would have been to a lower-cost-weight DRG. Notes under "false low" DRGs (AFLOW/IFLOW) also apply here.

Game gap (AGAMEGAP: mean 0.26, s.d. 0.07; IGAMEGAP: mean 0.01, s.d. 0.04). The difference between the maximum potential MCI and the hospital's actual MCI each quarter. The maximum potential MCI is calculated by finding for each discharge the diagnosis, among all recorded for that discharge, that maximized DRG reimbursement.

Denial leniency (DENYLEN: mean 2.38, s.d. 4.30). The percent of cases reviewed during the cycle (generally a year) by the PRO and denied Medicare payment, subtracted from the percent of cases re-reviewed by the independent (and generally stricter) SuperPRO and recommended for denied payment. This is a state-level measure, constant for all hospitals belonging to the same PRO. The various PROs began operations between quarters 19-21; the first quarter in which every PRO that covered hospitals in our data set was subject to SuperPRO oversight was quarter 22.

DRG assignment leniency (DRGLEN: mean 1.04, s.d. 4.97). The percent of cases reviewed during the cycle for which DRG reassignment was made by the PRO, subtracted from the SuperPRO DRG reassignment rate. For other details on this measure see "Denial leniency" (DENYLEN).

Source: Division of Provider Studies, Center for General Health Services Intramural Research, Agency for Health Care Policy and Research.

Outpatient substitution is measured for each hospital quarter by variables that reflect the degree to which patients have been assigned to DRGs that include diagnoses or treatments that can be done on an outpatient basis (see variables OTHMDOP, DRG39, OTHSGOP in Table 3). DRGs with outpatient substitution potential were defined in the work by Ginsburg and Carter (1986). For our purposes we isolated DRG 39, lens procedures (because of its especially dynamic behavior), and divided the remaining DRGs into medical and surgical DRGs with outpatient potential. As another indicator of trends toward outpatient treatment we included the relative volume of outpatient to total surgeries performed at the hospital (OPSURG).

Substitution of surgical for medical care among Medicare patients is measured by inpatients in surgical DRGs relative to those in all DRGs that should be treated in an inpatient setting. This variable (INSURG) attempts to capture the hospital's style of practice with regard to surgery, which should have an effect on its measured MCI.

Technology diffusion with implications for DRG assignments is measured by TECH, the percent of a hospital's Medicare discharges per quarter treated with at least one of the following procedures: cardiac catheterization, percutaneous transluminal coronary angioplasty, pacemaker insertion, coronary artery bypass graft surgery, lithotripsy, or magnetic resonance imaging. These were emerging technologies during the early 1980s, and their use altered the DRG to which the patient was assigned, generally in a way that raised the DRG payment and that could be identified with ICD-9-CM data.

Aging of the hospitalized population is measured by the average age of Medicare patients in this sample of hospitals (MEANAGE). Although it reflects somewhat the aging of the U.S. population, it reflects more the increasing average age of hospitalized patients as younger, less seriously ill patients are treated outside the hospital.

Changes in coding practices can be measured with HCUP-2 discharge abstract data more readily than with the HCFA data used by Ginsburg and Carter and others. From 1980 through 1984 we suspect that hospitals provided more accurate data to their discharge abstracting vendors, from whom they received summaries for hospital planning activities, than they provided to HCFA for Medicare claims. Claims data inaccuracies did not impinge on Medicare reimbursements before the enactment of PPS. Furthermore, Medicare inpatient claims requested only one diagnosis and one procedure through 1981. Discharge abstracting forms have captured at least three and often many more diagnoses and procedures as far back as 1980, according to the HCUP-2 records.

Changes in coding practice are represented in this study by several variables, some of them direct and some indirect measures of coding. Coding completeness is measured by the average number of diagnoses reported on the discharge abstract per hospital quarter (NUMDX). Coding accuracy was measured with five variables:

* Percent of discharge records with at least one missing or invalid entry for zip code, race, or day of principal procedure (PSLOPPY);

* Percent of records classified into a DRG for which the principal procedure is unrelated to the principal diagnosis (DRG468);

* Percent of patients with an invalid principal diagnosis (DRG470);

* Percent of Medicare patients assigned to DRGs defined only for patients 36 years of age or younger (YNGDRG);

* Estimated percent of Medicare discharges "at risk" of reassignment to a higher-weighted DRG because the hospital has assigned them to a class of DRGs often incorrectly assigned to a low-weighted DRG (FLOW).

The three indirect measures are PSLOPPY, YNGDRG, and FLOW. Zip code, race, or day of principal procedure were often missing or invalid in discharge abstracts of the early 1980s. Such errors do not affect the DRG assignment of the patient, but they do add information to the model on general trends in quality of coding.

Since Medicare covers individuals who are young (the disabled, "survivors," and others), assignment to a "young DRG" may be legitimate. Assignment to these DRGs is in error, however, if the patient's age is recorded incorrectly in the hospital's database. Since the proportion of such patients declined steadily (by 3.25 percent per quarter) during the pre-PPS period, we include this variable as a measure of coding accuracy.

"False low" DRGs are a class of 271 DRGs that could be misassigned where a correct assignment would have been to a higher-cost-weight DRG, according to an unpublished SuperPRO evaluation of hospital coding practices. The variable is computed as the proportion of patients who are in the low-weighted DRG times the probability that the patient would be reassigned from the low-weighted DRG to a higher-weighted DRG. While the DRG proportions are specific to the hospital quarter, the reassignment probabilities are based on national data provided to us by HCFA. Such DRG miscoding would adversely affect hospital revenues.

This is an indirect measure because a higher proportion of patients in these "false low" DRGs may indicate that a hospital under-invested in the supervision or training of its medical records department or that the hospital specialized in these DRGs. There is no direct way to determine the correct DRG assignment for an HCUP-2 discharge, given the confidentiality protections and the secondary nature of the data.
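The FLOW construction just described is a simple expected-value calculation, sketched below. The DRG labels, shares, and reassignment probabilities are invented for illustration; the paper's actual probabilities were national figures supplied by HCFA.

```python
# Sketch of the "false low" (FLOW) measure: the share of a hospital quarter's
# discharges "at risk" of upward DRG reassignment, computed as the sum over
# false-low DRGs of (share in DRG) x (probability of reassignment upward).
# All shares and probabilities below are hypothetical.

def false_low_share(drg_shares, reassign_prob):
    """Expected share of discharges that belong in a higher-weighted DRG."""
    return sum(share * reassign_prob.get(drg, 0.0)
               for drg, share in drg_shares.items())

shares = {"DRG_182": 0.08, "DRG_96": 0.05, "DRG_140": 0.04}  # hospital-quarter
p_up = {"DRG_182": 0.30, "DRG_96": 0.20, "DRG_140": 0.10}    # national rates

flow = false_low_share(shares, p_up)
```

Because the shares are hospital-quarter specific while the probabilities are national, the measure varies across hospitals only through their DRG mix, which is the paper's reason for calling it an indirect indicator.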

Gaming. Two variables measure the degree to which the hospital might have engaged in gaming. In addition to identifying DRGs that were often false lows, HCFA has also identified 221 DRGs that were "false highs" - DRGs sometimes misassigned where a correct assignment would have been to a lower-cost-weight DRG.(12) Hospitals with high proportions of discharges in these false high DRGs do not necessarily game the reimbursement system to maximize reimbursements, but the odds that they might have engaged in such practices would seem to be higher than the odds for hospitals with comparatively few patients in these suspect DRGs or with relatively few patients in false low DRGs.

We also attempted to measure the degree to which hospitals resequenced diagnosis codes in order to maximize reimbursement. That is, the maximum potential MCI for each hospital quarter was calculated by finding for each patient the diagnosis that maximized DRG reimbursement from among all of the diagnoses listed on the discharge abstract. Then GAMEGAP, the difference between the maximum potential MCI and the hospital's actual MCI each quarter, was calculated. All coding changes would likely raise both the actual and the maximum potential MCIs; however, resequencing should narrow the gap between the two rising MCIs.
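The GAMEGAP calculation can be sketched as follows. The sketch is a deliberate simplification: real DRG assignment (the "grouper") depends on far more than the principal diagnosis, and the diagnosis-to-weight mapping below is hypothetical.

```python
# Toy GAMEGAP: for each discharge, find the listed diagnosis whose DRG weight
# is highest, average those weights into a maximum potential MCI, and subtract
# the actual MCI (based on the principal diagnosis as sequenced).
# The diagnosis-to-weight map is hypothetical.

DX_WEIGHT = {"dx_a": 0.8, "dx_b": 1.3, "dx_c": 0.6}

def gamegap(discharges):
    """discharges: list of (principal_dx, [all listed diagnoses]) tuples."""
    n = len(discharges)
    actual = sum(DX_WEIGHT[principal] for principal, _ in discharges) / n
    max_potential = sum(max(DX_WEIGHT[dx] for dx in listed)
                        for _, listed in discharges) / n
    return max_potential - actual

cases = [("dx_a", ["dx_a", "dx_b"]),   # resequencing would raise the weight
         ("dx_c", ["dx_c"])]           # no alternative diagnosis listed

gap = gamegap(cases)
```

A hospital that resequences diagnoses to maximize reimbursement raises its actual MCI toward the maximum potential MCI, so sustained gaming should shrink this gap over time, which is how the variable is used in the model.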

PRO activities. While the PROs were designed partly to constrain hospitals' gaming activities, there were differences among the PROs in their methods of monitoring hospitals. Since the SuperPRO used consistent methods in re-reviewing cases, we used information from both sources to compare the behavior of PROs in two variables. "Denial leniency" is the leniency of the PRO relative to the SuperPRO in denying payments to hospitals for Medicare patients. The more lenient the PRO, the more often the PRO permits patients to be treated in the hospital as opposed to an outpatient setting. "DRG assignment leniency" is the leniency of the PRO in accepting the DRG assignment of the hospital relative to that of the SuperPRO.

More detail on the above variables is available in Table 3. In addition, Table 4 defines the hospital-specific characteristics that were controlled in the equations used to predict the autonomous and PPS-induced variables.
Table 4: Hospital Characteristics and Definitions Used in
Instrumental Variables Equations
 Concept Variable Definition
Type of control Public
over hospital Investor owned
 Private not-for-profit (reference group)
Teaching Five class variables, and those class
activity variables times the number
 of interns and residents per bed (based on
 HCFA counts). The five classes are:
 * Medical school-based hospitals. Ownership
 is the same as the medical school or
 whose chiefs of service and medical
 school department heads are the same
 individuals or appointed by the same
 person(s), according to Association of
 American Medical Colleges (AAMC).
 * Major teaching hospitals. Members of the
 Council of Teaching Hospitals (COTH) but
 not one of the AAMC medical school-based
 hospitals. These hospitals have major
 affiliations with medical schools for
 training purposes according to the
 American Medical Association (AMA)
 * Minor teaching hospitals. Have
 affiliations with medical schools
 (according to the AMA 1985), but are not
 members of COTH.
 * Osteopathic teaching hospitals. Have
 residency training programs approved by
 the American Osteopathic Hospital
 Association (AOHA), but not by the AMA.
 * "Nonteaching" hospitals. Hospitals not in
 the categories above (reference group).
Bed size Number of hospital beds.
 * < 100 beds (reference group)
 * 100-199 beds
 * 200-299 beds
 * 300-399 beds
 * 400-499 beds
 * ≥ 500 beds
 Also number of intensive care unit beds
 set up.
Management * Whether the hospital operates under a
structure and management contract.
autonomy * Whether it is part of a multihospital
 system.
 * Whether it is a public hospital in one of
 the country's 100 largest cities, making
 it a potential target for patient dumping.
HMO Whether the hospital participates in a
 health maintenance organization.
Medicare share Medicare discharges as a percentage of all
 inpatient admissions.
Market Dummy variables for ranges of the Herfindahl
concentration index based on the number of hospital beds
 in the county:
 * Competitive: < .20 (reference group)
 * Oligopolistic: .20-.69
 * Monopolistic: .70-.97
 * Single hospital: ≥ .98
Region U.S. census regions, excluding states
 waivered from PPS:
 * New England (ME, VT, RI, NH, CT)
 * Mid-Atlantic (PA)
 * South Atlantic (DE, DC, VA, WV, NC, SC,
 GA, FL)
 * East North Central (OH, IN, IL, MI, WI)
 (reference group)
 * East South Central (KY, TN, AL, MS)
 * West North Central (MN, IA, MO, ND, NB,
 SD, KS)
 * West South Central (AR, LA, OK, TX)
 * Mountain (MT, ID, WY, CO, NM, AZ, UT, NV)
 * Pacific (WA, OR, CA, AK, HI)
Location Non-SMSA (reference group)
 * SMSA population < 100,000
 * SMSA population 100,000-250,000
 * SMSA population 250,000-500,000
 * SMSA population 500,000-1 million
 * SMSA population 1-2.5 million
 * SMSA population ≥ 2.5 million
Community * Percent of county population aged 65-74
age distribution * Percent of county population aged ≥ 75
Source: Division of Provider Studies, Center for General Health
Services Intramural Research, Agency for Health Care Policy and Research.
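
The Herfindahl index underlying the market-concentration classes in Table 4 is the sum of squared market shares, here shares of county hospital beds. A minimal sketch, with hypothetical bed counts and cutoffs taken from the table:

```python
# Herfindahl index for county hospital-bed concentration and the
# market-concentration classes of Table 4. Bed counts are hypothetical.

def herfindahl(beds):
    """beds: list of bed counts for each hospital in the county."""
    total = sum(beds)
    return sum((b / total) ** 2 for b in beds)

def market_class(h):
    # Cutoffs from Table 4.
    if h < 0.20:
        return "Competitive"
    if h < 0.70:
        return "Oligopolistic"
    if h < 0.98:
        return "Monopolistic"
    return "Single hospital"

# A county with four hospitals of 300, 300, 200, and 200 beds:
print(market_class(herfindahl([300, 300, 200, 200])))  # -> Oligopolistic
```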


We applied ordinary least squares estimation techniques to equations 1, 2, and 4 shown earlier. The equations performed satisfactorily; all F-values were highly significant. Right-hand-side variables were statistically significant most of the time, and nearly always with the expected sign, if a particular sign was expected.(13) Predicted values for the dependent variables were reasonable.

Consideration of excluded-variables bias is important in this study because we do not have a formal underlying theoretical model to guide specification. We took three precautions to minimize this problem. First, we reviewed the literature to identify as many factors as possible that affect the MCI. Second, by correlating the residuals of equations 1, 2, and 4 with over 100 additional variables available in the AHA Annual Survey or calculated from discharge abstract data, we determined that no further statistically significant gains could be made by adding explanatory variables. Third, we included the pre-PPS, TEFRA, and PPS era interaction terms in Equation 1 to capture any systematic time-series variation beyond that accounted for by the underlying determinants. Finally, Equation 1 estimates were neither autocorrelated nor heteroskedastic.
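
The residual-screening precaution can be sketched as follows. This is an illustrative reconstruction, not the study's actual procedure: the significance test shown (a Fisher z-test on the correlation) and all data are assumptions made for the example.

```python
# Sketch of residual screening: correlate regression residuals with each
# candidate variable and flag any correlation statistically distinguishable
# from zero, which would suggest the variable belongs in the model.
# Assumes nondegenerate data (no constant series). Data are hypothetical.
import math

def flag_candidates(residuals, candidates, z_crit=1.96):
    """candidates: dict name -> list of values, same length as residuals.
    Returns names whose correlation with the residuals is significant
    at roughly the 5 percent level."""
    n = len(residuals)
    mr = sum(residuals) / n
    sr = math.sqrt(sum((v - mr) ** 2 for v in residuals))
    flagged = []
    for name, x in candidates.items():
        mx = sum(x) / n
        sx = math.sqrt(sum((v - mx) ** 2 for v in x))
        r = sum((a - mx) * (b - mr) for a, b in zip(x, residuals)) / (sx * sr)
        # Fisher z-transform; |z| > z_crit marks a significant correlation.
        z = 0.5 * math.log((1 + r) / (1 - r)) * math.sqrt(n - 3)
        if abs(z) > z_crit:
            flagged.append(name)
    return flagged
```

Applied to the study's residuals and the 100-plus AHA and discharge-abstract variables, an empty flagged list would correspond to the authors' finding of no further statistically significant gains.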



Tables 5 and 6 present the results of estimating Equation 1, the effects of underlying factors on the MCI. The results reported in these tables differ from the conceptualization of Equation 1 in one significant way: we dropped the resequencing variable GAMEGAP. Because the dependent variable, the Medicare case-mix index, is part of the definition of GAMEGAP, including the latter in the model is econometrically suspect, and its coefficient is unstable and difficult to interpret.
Table 5: Directional Effects on the Medicare Case Mix Index
and Actual Changes in the Underlying Variables from January
1980 through December 1986
 Variable Coefficient Actual Change
ADRG39 -0.0060(**) 1.73
AOTHSGOP -0.0141(**) -1.23
AOTHMDOP -0.0110(**) -1.23
AOPSURG -0.0001 11.91
AINSURG 0.0136(**) 0.35
ATECH 0.0113(**) 0.58
AMEANAGE 0.0031(*) 1.14
ANUMDX -0.0123(**) 0.61
APSLOPPY 0.0001 -10.97
ADRG468 0.0127(**) 0.10
ADRG470 0.0020 0.06
AYNGDRG -0.0316(*) -0.12
AFLOW -0.0094(*) -1.95
AFHIGH 0.0107(**) -1.58
AGAMEGAP deleted
IDRG39 -0.0019 -2.86
IOTHSGOP 0.0125 0.001
IOTHMDOP -0.0018 -3.15
IOPSURG -0.0011 14.43
IINSURG 0.0191(**) 2.75
ITECH 0.0182(**) 0.48
IMEANAGE -0.0008 -0.17
INUMDX 0.0166 0.61
IPSLOPPY 0.0008 3.15
IDRG468 0.0373(*) -0.61
IDRG470 0.0073 -0.13
IYNGDRG -0.1418(**) 0.07
IFLOW -0.0300(**) -1.87
IFHIGH 0.0266(**) 0.47
IGAMEGAP deleted
DENYLEN -0.0004 7.67(dagger)
DRGLEN 0.0021(**) 11.01(double dagger)
FALL 0.0033 1
WINTER 0.0108(**) -1
SPRING 0.0018 0
PRETEFRA*Q -0.0001 -1
ONTEFRA*Q -0.0001 0
ONPPS*Q -0.0009 28
INTERCEPT 0.8601(**)
R2 = 0.7088;
observations = 6,580
Source: Division of Provider Studies, Center for General Health
Services Intramural Research, Agency for Health Care Policy and
(*) p < .05.
(**) p < .01.
(dagger)(double dagger) For PRO factors, the change is measured from quarter t = 22 through t = 28 rather than from t = 1, since PRO monitoring was not evaluated before quarter 22.
Table 6: Contributions of Underlying Influences to Average
Growth in the MCI, January 1980 through December 1986
 Percentage Contribution
 Preexisting Induced Total
Changes in Practice Patterns 25.01 46.26 71.27
Outpatient substitution 15.80 -3.57 12.23
 Percent inpatients in lens -8.49 4.50 -3.99
 procedure DRG
 Percent inpatients in other 14.13 0.01 14.14
 surgical DRGs with
 outpatient potential
 Percent inpatients in 11.01 4.66 15.67
 medical DRGs with
 outpatient potential
 Ratio of outpatient to all -0.85 -12.74 -13.59
 surgery
Substitution of Surgical for
Medical Care
 Percent inpatients in 3.88 42.71 46.59
 surgical DRGs
Technology Diffusion
 Percent inpatients in six 5.33 7.12 12.45
 "new" technologies
Aging of the Hospitalized Population
 Mean Age of Inpatients 2.90 0.11 3.02
Changes in Coding Practices -2.05 38.28 36.23
 Number of diagnoses -6.12 8.22 2.10
Accuracy 17.88 19.87 37.75
 Percent missing or invalid -1.25 2.14 0.89
 codes
 Percent in DRG 468 or 1.14 -19.40 -18.26
 percent in DRG 470
 Percent inpatients in young- 3.07 -8.50 -5.43
 patient DRGs
 Percent inpatients in 14.92 45.63 60.55
 "false low" DRGs
 Percent inpatients in -13.81 10.19 -3.62
 "false high" DRGs
 Game gap - - - not included - - -
PRO Activities n.a. 15.99 15.99
Denial Leniency n.a. -2.52 -2.52
DRG Assignment Leniency n.a. 18.51 18.51
Totals 25.86 100.64 126.52
Season/Time Adjustments -26.49
Cumulative Effect 100.00
Source: Division of Provider Studies, Center for General Health
Services Intramural Research, Agency for Health Care Policy and Research.

The results are in two formats - regression coefficients (Table 5), and a measure of the effect on the MCI of the actual change in the underlying variable over the quarters from January 1980 through December 1986 (Table 6). The regression coefficients show the average change in the MCI from a one-unit change in the underlying variable (for example, the effect of one additional diagnosis code for NUMDX).(14) The coefficients are useful for getting a sense of the direction in which underlying variables move the MCI.

The percentage impact on the MCI is the effect from the coefficient times the actual change in the underlying variable from quarter 1 through quarter 28 (also shown in Table 5), divided by the actual change in the MCI for the same period. It shows the relative importance of the effects of underlying variables over this time period. Even though a coefficient can be quite small, the underlying variable can change dramatically over the seven-year period, resulting in a major contribution to the change in the MCI that could not be inferred from the coefficients alone.
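
The computation described above can be sketched directly. The coefficient and actual change for AINSURG are taken from Table 5; the overall MCI change is a hypothetical placeholder, since the actual value is not stated in the text.

```python
# Percentage contribution of one underlying variable to MCI growth:
# the coefficient times the variable's actual change over quarters 1-28,
# divided by the actual change in the MCI, expressed as a percentage.

def pct_contribution(coef, delta_x, delta_mci):
    return 100.0 * coef * delta_x / delta_mci

# AINSURG from Table 5: coefficient 0.0136, actual change 0.35.
# The overall MCI change of 0.123 index points is an assumed value.
print(round(pct_contribution(0.0136, 0.35, 0.123), 2))  # -> 3.87
```

This illustrates the point in the text: a small coefficient paired with a large change in the underlying variable can still contribute substantially to MCI growth.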


The direction of the effects of underlying variables on the MCI is best understood in terms of how each variable relates to the share of patients in high- or low-weighted DRGs, which can raise or lower the MCI. For example, since the lens procedure DRG (39) has a low cost weight (0.4722), an increase in the proportion of inpatients classified into DRG 39 would decrease the average MCI of the hospital. The negative coefficient on this variable in Table 5 reflects this relationship. For the same reason, hospitals that admit larger percentages of patients who could be treated on an outpatient basis (OTHSGOP and OTHMDOP) would be expected to have lower MCIs.

Generally, those variables with statistically significant effects on the MCI are in the expected directions regardless of whether they measure autonomous or induced components:

* INSURG. The higher the proportion of discharges receiving major procedures that cannot be performed on an outpatient basis (high-cost-weight DRGs), the higher the MCI.

* TECH. The higher the share of patients receiving procedures that may place them in "high-tech" DRGs (also high-cost-weight DRGs), the higher the MCI.

* MEANAGE. The greater the average age of the hospital's patients (which results in assignment to higher-cost-weight DRGs), the greater the MCI.

* DRG468. The larger the percentage of discharges classified into DRG 468 (cost weight = 2.4542), the larger the MCI.

* YNGDRG. The greater the proportion of Medicare discharges perhaps misclassified into DRGs for younger patients (lower-cost-weight DRGs), the lower the MCI.

* FLOW. The greater the share of a hospital's discharges in low-cost-weight DRGs that are often misclassified, the lower the MCI.

* FHIGH. The more discharges in high-cost-weight DRGs that are often misclassified, the higher the MCI.

* DRGLEN. The more lenient the PRO in accepting the hospital's DRG assignment (which, if motivated by reimbursements, should be in higher-cost-weight DRGs than otherwise), the higher the MCI.

One exception is the number of diagnoses (NUMDX): as the number of diagnoses on discharge records increases, there should be more opportunity to classify patients into DRGs with comorbidities, which are higher-cost-weight DRGs. The coefficient on ANUMDX is negative, however.

The absence of other systematic effects during the pre-PPS, transition, and post-PPS periods (insignificant results for PRETEFRA*Q, ONTEFRA*Q, and ONPPS*Q) suggests that this specification captures most of the factors influencing MCI growth. The significant increase in the MCI in the winter quarter relative to summer suggests, however, that some underlying factor, perhaps the seasonal incidence of illness, is still not directly reflected in our model.

Another result from Table 5 is that the autonomous trends are statistically more powerful predictors of MCI change than are the inducements associated with PPS. Notably absent among the statistically significant effects on the MCI are the PPS-induced outpatient substitution variables. Evidently, the movement of treatments to the outpatient setting would have continued even without the introduction of prospective payment.

General improvements in coding practices that have no direct bearing on the reimbursement for a case (PSLOPPY) do not influence the MCI; neither do the early attempts by PROs to deny payments for Medicare claims.

Since the induced components are measured as deviations from the autonomous levels, the scales of the two variables are different and consequently the sizes of the coefficients differ. These differences should not be used to assess the relative importance of specific factors. For that, the percentage contributions are computed and shown in Table 6.


The results in Table 6 clearly show that coding practices (including leniencies by the Peer Review Organizations) account for 52 percent of the total change in the MCI over the 28 quarters from January 1980 through December 1986. The "real" changes (including medical practice, aging of the hospitalized population, and seasonal factors) account for 48 percent of the total change in the MCI.

Most of the rise in the average MCI over time was due to factors induced by PPS. Of the specific measured factors that affected the change in the MCI over the 28 quarters, 80 percent(15) of the change was due to induced variables, and 20 percent was due to the autonomous variables (10 percent predating PPS and 10 percent following). These autonomous changes would have occurred even if PPS had never been established. Thus, the majority of the change is attributable to the incentives of prospective payment under Medicare. This refutes our major hypothesis that the autonomous, preexisting trends were a substantial cause of the change in the MCI over the 1980s.

Induced coding changes and PRO "leniencies" account for just over 54 percent of the induced change in the MCI, while what we might call induced "real" changes - medical practice and aging of the hospitalized population - account for 46 percent. Of the induced coding changes, the net effect of improved accuracy and the degree of DRG assignment leniency are similar in size. Of the induced "real" changes, the leading factors are increases in the use of surgery and growth in new technologies. The aging of patients is an unimportant contributor, judging both from the coefficient and from the change in the average induced age of hospitalized patients. Also notable for its insignificance among the PPS-induced effects is the growth of outpatient substitution for inpatient treatments. Although others have argued that the increase in outpatient treatment was stimulated by PPS and strongly influenced the case-mix index, we find no evidence that PPS affected the case-mix index by influencing outpatient substitution.

Among autonomous changes, outpatient substitution had the largest effect, accounting for over 60 percent of the secular changes, followed by use of new technologies (21 percent), the growth of surgical treatments (15 percent), and aging of hospitalized patients (11 percent). Coding practices had an imperceptible effect on the MCI prior to PPS.

Coding changes, combining secular and PPS-induced coding trends, raised the MCI because increased accuracy in coding affected DRG assignments. Our indirect measures of the probability of misassignment to low-weighted DRGs showed a decline in these types of assignments, which had a large effect in raising the MCI; the induced portion of this effect was especially high. The results for potential misassignments to higher-weighted DRGs show that the declining secular trend (which reduces the case-mix index) is nearly offset by the induced trend (which raises the case-mix index). This suggests that opportunistic DRG assignments explain about 10 percent of the rise in the case-mix index during the PPS period. Other coding conventions, such as the frequency of using category 468 (operating room procedure unrelated to principal diagnosis), showed that more effort was made to get diagnoses and procedures to coincide, even though the consequence was nearly always to lower the cost weight assigned to that patient.

One of the surprisingly strong results is the impact of the PROs' monitoring of DRG assignments. The more lenient the PRO, the higher the average MCI of that PRO's hospitals; 16 percent of the rise in the MCI over the 28 quarters was due to the leniency in monitoring activities even though these activities did not start being evaluated until quarter 22.(16)


For a view of the trend in factors affecting the MCI in HCUP hospitals, Figure 1 displays the effect of coding and "real" changes on the MCI year by year. Coding (which comprises changes in coding practices and PRO DRG assignment leniency) was increasingly responsible for the rise in the MCI from FY 1982 through FY 1983, both in terms of the absolute rise in MCI index points and in terms of the percentage of the MCI increase attributable to coding. Over FY 1984, the MCI rose by nearly 3.5 percentage points, the largest jump of the six-year period. While coding inflation in 1984 was even stronger than the year before, real factors also strengthened, with the result that coding contributions were a slightly lower percentage of the MCI increase. In the subsequent year, real increases were markedly smaller, again making the continuing coding effects a larger share of the MCI increase. The dramatic differences in the fiscal 1986 figures suggest that the major efforts of hospitals to improve the accuracy of their medical record keeping may have been accomplished by 1986. The pattern of the yearly coding changes supports this hypothesis: the MCI increase attributable directly to coding rose annually through 1984, leveled off in 1985, and then slowed in 1986. This is the pattern one would expect: increasing adjustments to a new reimbursement system as knowledge and information about how to maximize revenues under the system emerge, and then diminishing effects as adjustments in management information systems are made.

By contrast, the pattern of real changes in the Medicare case-mix index is less clear with respect to the incentives of Medicare prospective payment. There is no sustained increase from real factors between FY 1982 and FY 1984, the start of PPS. Compared with the effects of coding, the effects of real factors are smaller in magnitude and irregular over the six-year period.

Concluding Comment

This article describes an attempt to estimate the relative importance of a variety of influences on the Medicare case-mix index over the period 1980-1986. Real and coding changes between the years 1980 and 1986 each account for about one-half of the change in the Medicare case-mix index for HCUP hospitals.

These results are close to those of PROPAC (1985, 1986, 1987) for comparable years. PROPAC estimated that one-half of the MCI change from 1981 to 1984 was real; our estimate is 63 percent. For FY 1985, PROPAC estimated that real factors explain 39 percent of the change in the MCI; we estimate 27 percent. For FY 1986, PROPAC estimated 56 percent for real factors; we estimate 58 percent.(17)

Our data indicate that coding changes were a significant part of the reason for the increase in the Medicare case-mix index and that the effects of coding, while significant through 1986, were waning by that year. Among other findings, these results also suggest that hospitals were responsive to strong PROs that reviewed and challenged their DRG classifications.

We have used a methodology that not only disaggregates the dimensions of real and coding changes as they affect the Medicare case-mix index, but also disentangles the effects of prospective payment from the effects of preexisting trends. Our major hypothesis, that a variety of forces already emerging in the 1980s were causing the Medicare case mix to rise, is not supported, since the autonomous trends explain only 20 percent of the change in the MCI. Our empirical analysis shows a strong statistical association between the autonomous factors and case-mix changes, but in terms of actual impact on the case-mix index, particularly after implementation of PPS, the induced factors had by far the largest quantitative effect.




The Statistical Model

The ultimate goal is to run a regression in which the dependent variable, the MCI, is regressed against a series of potential explanations for the change in the MCI (the autonomous variables and the induced variables), and a few purely exogenous variables (season and time variables). The difficulty is that because we want to isolate the impact of PPS on the values of the potential explanatory variables from the effects of general trends, both the general trends and the pure PPS effects on these determinants must also each be estimated. Figure A shows the sequence of the estimation process.

Step 1. For all of the preexisting variables listed in Table 1 that reflect changes in practice patterns, aging of the hospitalized population, or changes in coding practices, we estimated Equation 2, the best-fit regression equations for the pre-PPS period only. The best-fit regression incorporates exogenous community and hospital characteristics such as the age distribution of the population, hospital structural changes, and other variables listed in Table 4. These same exogenous variables are also used in subsequent steps 2 through 4.

Step 2. We then used the results of Step 1 to smooth the values for the first 11 periods and to project the A[X.sub.i,t1] variables into the TEFRA and PPS periods, quarters 12 through 28. This yields Equation 3, the A[X.sub.i,t] variables. These values assume that neither case-mix reimbursement (TEFRA) nor prospective reimbursement (PPS) affected the trends in any of these underlying variables.

Step 3. Here we begin to estimate the effect that PPS itself may have had on the underlying determinants. In Equation 4, we estimate a best-fit equation across all 28 quarters for each underlying determinant.

Step 4. We use the results of Equation 4 to compute smoothed estimates for the TEFRA and PPS periods for each underlying determinant, which incorporates the effect that PPS (and TEFRA) may have had on the determinant. Equation 5 summarizes the computation.

Step 5. We now compute the pure effect of the prospective payment system (including anticipatory effects during TEFRA) on the values of the underlying determinants by subtracting from [X.sub.i,t2] (computed in step 4) the values of A[X.sub.i,t] (computed in step 2), starting with quarter 12. This yields the PPSI[X.sub.i,t] variables of Equation 6. That is, the induced PPS effects are computed as the smoothed trend based on actual data (the [X.sub.i,t]s) minus the smoothed trend based on the assumption that PPS (and TEFRA) never happened (the A[X.sub.i,t]s). These PPS variables must be set to zero for the pre-PPS period.
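
The differencing in Step 5 can be sketched as follows. The function name and data are illustrative assumptions; the quarter indexing follows the text, with induced effects set to zero before the TEFRA transition at quarter 12.

```python
# Sketch of Step 5: the induced PPS effect on an underlying determinant is
# the smoothed all-quarters trend minus the smoothed projection that assumes
# PPS (and TEFRA) never happened, zeroed out for the pre-PPS quarters.

def induced_effect(x_smoothed_all, x_autonomous, first_tefra_quarter=12):
    """Both inputs are quarterly series (quarter 1 first). Returns the
    PPSI[X] series: total-trend minus autonomous-trend from the TEFRA
    transition onward, and zero before it."""
    induced = []
    for t, (total, auto) in enumerate(zip(x_smoothed_all, x_autonomous), start=1):
        induced.append(total - auto if t >= first_tefra_quarter else 0.0)
    return induced
```

With 28-quarter series for each determinant, the output corresponds to the PPSI[X.sub.i,t] variables of Equation 6 that enter the final regression alongside their autonomous counterparts.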

Step 6. We now have available estimates of the trend in all underlying determinants under the assumption that PPS did not occur, as well as estimates of the trend in all underlying determinants due strictly to PPS. In addition, PRO activities represent another set of PPS-induced variables for which there are no autonomous counterparts. We also include the purely exogenous variables: season and time periods. All of these variables enter into the final equation, which relates each of these variables to the Medicare case-mix index. (See Equation 1 in the text.)

Relation of the Statistical Model to the Conceptual Model

The conceptual framework of Table 1 is related directly to the equations of the statistical model. In terms of preexisting (autonomous) influences on case mix, the direct influences (changes in practice patterns, aging of the hospitalized population, changes in coding practices) are estimated in equations 2 and 3. The impact of the remaining preexisting influences (the H variables, age distribution of the population, hospital structural changes, Medicare program changes) affect case mix only indirectly through their effects on the direct influences.

Except for PRO activities, the "induced-by-PPS" influences are calculated as differences between total and preexisting trends in Equation 6, using equations 5 and 3 as the inputs. The final induced influence, PRO activities, does not need to be estimated since PROs did not exist in the pre-PPS era, and so it appears in the statistical model only once, in Equation 1.


(1.) These numbers, which represent the hospital-based MCI, were obtained from the Prospective Payment Assessment Commission. The average patient-based cost weight rose 7.0 percent and 14.0 percent over the periods 1981-1984 and 1984-1988, respectively. PROPAC uses patient-weighted index numbers to represent case-mix change, since the overall cost of the PPS program is directly proportional to the patient-weighted average. But the focus in our study is on the hospital-weighted case-mix index, since we are interested in studying how changes in the behavior of hospitals with respect to practice patterns, coding, and so on affect the measured resource intensity or severity of patients in that hospital, and hence the hospital's reimbursements.

(2.) See Ginsburg and Carter (1986) and Carter and Ginsburg (1986).

(3.) These percentage effects are as reported by Carter and Ginsburg (1986).

(4.) That is, the 1981 estimates are based on MEDPAR data (one diagnosis, one procedure, and presence or absence of additional diagnoses and procedures). The 1984 estimates are based on PATBILL data, which provide specific codes for up to five diagnoses and up to three major procedures. By contrast, our discharge abstract data provide up to 11 diagnoses and 8 procedures. By simulating "MEDPAR-equivalent" and "PATBILL-equivalent" data from our data, we concluded that database inconsistencies explained about 0.6 percent - not 4.0 percent - of the increase in the MCI over time. This analysis assumes that only the number of diagnoses changed in the PATBILL and that the content of diagnoses did not change.

(5.) While the OIG study indicates that errors in coding practices had a measurable inflationary effect on the MCI, the results do not separate inflationary coding practices from real case-mix change over time.

(6.) The period of case-mix reimbursement includes not only the quarters during which hospitals were subject to PPS (CMR2) but also the quarters during which they were reimbursed under TEFRA (CMR1). All hospitals started TEFRA at quarter 12; however, hospitals did not come under PPS until the beginning of their next fiscal year, which varied between quarters 16 and 19. Therefore, since part of the TEFRA period was also a transition to PPS, hospitals were already changing their behavior in anticipation of the new incentives of PPS.

(7.) Hospitals were subject to the rules of TEFRA until they officially came under the rules of PPS. While TEFRA may have had its own effect on the MCI, there was a pronounced rise in the MCI during the quarters that hospitals were subject to TEFRA that appears to reflect hospitals' anticipation of the incentives of PPS. Hospitals may have begun to prepare for the new incentives of PPS as early as quarters 12-15. Therefore, since in practice the values of the MCI during the TEFRA period are tied to the anticipation of PPS, TEFRA's effect on the MCI must be controlled and omitted from projections that are based strictly on cost-based reimbursement.

(8.) To obtain the best prediction equation for the underlying variables, we tested several functional forms, including linear, quadratic, and logarithmic. Generally speaking, the various functional forms yielded similar results in terms of percent of variance explained and reasonableness of predicted values, so for simplicity and consistency we used linear forms throughout.

(9.) To compute stable values for the MCI and other patient-level variables, hospitals were included only if they discharged at least 50 Medicare patients in each of quarters 1-28; this requirement eliminated about 15 very small hospitals. In all, 235 hospitals comprising 4,094,031 Medicare discharges met the inclusion requirements.

(10.) The Commission on Professional and Hospital Activities (CPHA), Ann Arbor, Michigan, requested identification as a discharge abstract data supplier in publications based on CPHA data. Data were supplied only upon the request and authorization of individual hospitals. CPHA disclaims responsibility for any analyses, interpretations, or conclusions from their discharge abstract data.

(11.) The sample of 235 hospitals has not been weighted to make it nationally representative. Also, hospitals from any of the four states that were originally waivered from PPS during the period (Maryland, Massachusetts, New Jersey, New York) were excluded from this analysis.

(12.) There were 162 DRGs that overlapped both the "false high" and "false low" categories; they could be reassigned either to a lower-cost-weight DRG or to a higher-cost-weight DRG.

(13.) Detailed regression results for the underlying equations are available from the authors on request.

(14.) For categorical variables, the coefficient shows the effect of a category on the MCI beyond the effect of the reference group.

(15.) The 80 percent figure (100.64/126.52) ignores the seasonal/time effects in the model.

(16.) The relatively weak effect of the PROs' denial of DRG payments is due to multicollinearity between these two measures; the average changes in both activities were similar during the study period.

(17.) Note that the PROPAC analyses are based on the universe of hospitals and are patient weighted, while ours are based on a sample and are not patient weighted.


American Medical Association. 1985-1986 Directory of Residency Training Programs. Chicago: AMA, 1985.

Carter, G. M., and P. B. Ginsburg. "The Medicare Hospital Case Mix Index: Preliminary Results for 1985." Working Draft. Publication no. WD-3035-1-HCFA. Santa Monica, CA: The RAND Corporation, July 1986.

Ginsburg, P. B., and G. M. Carter. "Medicare Casemix Index Increase." Health Care Financing Review 7, no. 4 (Summer 1986): 51-65.

Hsia, D. C., W. M. Krushat, A. B. Fagan, J. A. Tebbutt, and R. P. Kusserow. "Accuracy of Diagnostic Coding for Medicare Patients under the Prospective Payment System." New England Journal of Medicine 318 (11 February 1988): 352-55.

Prospective Payment Assessment Commission. "Report and Recommendations to the Secretary, United States Department of Health and Human Services, March 1, 1990." Washington, DC, 1990.

_____. "Technical Appendixes to the Report and Recommendations to the Secretary, United States Department of Health and Human Services, April 1, 1985." Washington, DC, 1985.

_____. "Technical Appendixes to the Report and Recommendations to the Secretary, United States Department of Health and Human Services, April 1, 1986." Washington, DC, 1986.

_____. "Technical Appendixes to the Report and Recommendations to the Secretary, United States Department of Health and Human Services, April 1, 1987." Washington, DC, 1987.

Steinwald, B., and L. A. Dummit. "Hospital Case-mix Change: Sicker Patients or DRG Creep?" Health Affairs (Summer 1989): 35-47.

U.S. Government Printing Office. Federal Register 50, no. 170 (3 September 1985): 35722-735.
COPYRIGHT 1992 Health Research and Educational Trust
No portion of this article can be reproduced without the express written permission from the copyright holder.

Article Details
Author:Goldfarb, Marsha G.; Coffey, Rosanna M.
Publication:Health Services Research
Date:Aug 1, 1992
