Perils of Pioneering: Monitoring Medicaid Managed Care.

Several States have used section 1115 demonstrations to introduce statewide, mandatory Medicaid managed care. When HCFA approves section 1115 demonstrations, it requires States to monitor managed care plans on a variety of measures and to report findings to HCFA. HCFA also monitors the demonstrations to ensure that the financial incentives resulting from capitated payment do not result in inappropriate underservice to the vulnerable Medicaid population (Health Care Financing Administration, 1995).

This article reviews the evolution and the adequacy of Federal and State monitoring of four section 1115 demonstrations between 1994 and 1998: (1) QUEST, in Hawaii; (2) SoonerCare, in Oklahoma; (3) Rite Care, in Rhode Island; and (4) TennCare, in Tennessee (Wooldridge and Hoag, 1999). Our close observation of these four States in a larger evaluation offered an opportunity for assessing monitoring and for drawing valuable lessons for other States. In this review, we seek to answer the following questions:

* Through what structures and processes does the Federal Government monitor the States?

* Through what structures and processes do the States monitor the managed care plans?

* Is monitoring adequate?

We assess State monitoring in five domains: (1) financial, (2) encounter data, (3) quality assurance and quality improvement, (4) access and provider networks, and (5) grievance systems.

We define adequacy to mean that the monitoring agency sets performance standards, checks that the monitored entity meets the standards, provides regular feedback, and develops data and studies to review outcomes. Adequate monitoring also implies timely implementation of monitoring, allocation of sufficient resources to monitoring, consistently applied processes and standards, and improvements over time. Managed care offers a greater potential for improving quality and access to care than does fee-for-service (FFS) care because managed care monitoring goes beyond any that occurs under FFS.

We review the structure of Federal and State monitoring and analyze how States monitored finances, encounter data, quality assurance and quality improvement, access and provider networks, and grievance procedures between 1994 and 1998. Next, we discuss Federal monitoring efforts, and finish by discussing lessons learned from the State and Federal monitoring efforts.

DATA

Findings are based on data describing the structures and processes the States and HCFA use in monitoring, collected from documents and interviews with key people in each State and the Federal Government. We interviewed State officials, staff of managed care plans, providers, legislators, and advocacy organizations during site visits in 1995, 1996, and 1998, as part of case studies conducted for a HCFA-funded evaluation. We also drew on ongoing discussions with HCFA staff.(1)

BACKGROUND

The four demonstrations covered 1.7 million Medicaid enrollees as of July 1999. Key features of these demonstrations, such as start date, eligibility groups included, types of participating plans, and enrollment, are shown in Table 1.
Table 1
Key Features of the Four Demonstration Programs

Hawaii: QUEST, implemented 8/1/94
  Key design elements at implementation: Eligibility expansion to uninsured up to 300 percent of the FPL. Mandatory managed care design using MCOs for AFDC, poverty-related, and expansion beneficiaries; MCOs cover medical, acute behavioral, and dental care.
  Managed care plans as of 1999(1): 6 total; 5 commercial, 1 Medicaid-dominant.
  Demonstration enrollment as of July 1999: 118,112

Oklahoma: SoonerCare, implemented 4/1/96(2)
  Key design elements at implementation: No expansion initially. Mandatory managed care design for AFDC and poverty-related beneficiaries; MCOs in urban areas covering medical, dental, and behavioral care; PCCM used in rural areas.
  Managed care plans as of 1999(1): 4 total; 2 commercial, 2 Medicaid-dominant.
  Demonstration enrollment as of July 1999: 201,737

Rhode Island: Rite Care, implemented 8/1/94
  Key design elements at implementation: Eligibility expansion to pregnant women and children up to age 6 under 250 percent of FPL. Mandatory managed care design for AFDC, poverty-related, and expansion beneficiaries; MCOs cover medical, acute behavioral, and dental care; extended family planning program for postpartum women.
  Managed care plans as of 1999(1): 4 total; 3 commercial, 1 Medicaid-dominant.
  Demonstration enrollment as of July 1999: 87,717

Tennessee: TennCare, implemented 1/1/94
  Key design elements at implementation: Eligibility expansion to uninsured and uninsurable, with subsidies up to 400 percent of FPL. Mandatory managed care design for all Medicaid-eligibles (except QMBs and SLMBs); MCOs cover medical, acute behavioral, and dental care.(3)
  Managed care plans as of 1999(1): 9 total; 5 commercial, 4 Medicaid-dominant.
  Demonstration enrollment as of July 1999: 1,284,264

(1) Commercial plans serve mostly non-Medicaid members in that State, while Medicaid-dominant plans are those with only or mostly Medicaid members in that State.

(2) Under a 1915(b) waiver, Oklahoma implemented HMOs in urban areas in July 1995.

(3) In the first 3 years of TennCare, MCOs could be health maintenance organizations or preferred provider organizations.

NOTES: FPL is Federal poverty level. MCO is managed care organization. AFDC is Aid to Families with Dependent Children. PCCM is primary care case management. QMB is qualified Medicare beneficiary. SLMB is specified low-income Medicare beneficiary.

SOURCES: (Wooldridge et al., 1996; Ku and Wall, 1997; Ku and Hoag, 1998; State of Hawaii, 1997; Ku et al., 2000.)


STRUCTURE OF FEDERAL AND STATE MONITORING

When it approves a section 1115 demonstration, HCFA documents which Medicaid statutes are waived and issues "Special Terms and Conditions" (henceforth called "terms") that set the standards it expects the States to meet to secure Federal financial participation. The terms cover a broad range of areas but are not uniform across the States. Terms differed for various reasons, often because States requested different waivers or were implementing different types of programs.

States are authorized to monitor managed care plans through State laws and contracts with the plans, which must conform with Federal law and regulations and demonstration terms (Rosenbaum et al., 1997). Several different departments share responsibility for plan oversight, including those responsible for Medicaid, public health, and insurance (Horvath and Snow, 1996). Some departments monitor contractually specified performance standards, and others monitor regulatory standards specified under law. Figure 1 summarizes how Federal and State laws and regulations, contracts with managed care plans, and self-regulation interact to promote access, quality, and financial stability.

[ILLUSTRATION OMITTED]

As expected when implementing such a large new venture, it took the States time to develop mature oversight structures. Moreover, initial contract performance standards, reporting requirements, and sanctions varied widely in specificity and consistency with Federal terms. All four States modified their initial plan contracts to improve plan performance standards and to comply with Federal requirements.

STATE MONITORING PROCESSES

The agencies operating the demonstrations play a key role in monitoring in all five domains we reviewed.(2) Other State agencies also have overlapping responsibilities for monitoring. For example, State insurance agencies monitor health plan finances for regulatory reasons unrelated to the demonstrations. States also use their external quality review organizations (EQROs) to monitor plans, as HCFA requires.(3) Our review indicates which agencies are responsible for monitoring each domain.

States use various approaches to monitor plan performance, including regular reviews of documents and data, onsite reviews of processes, and audits or special studies. They provide written feedback to plans and may require corrective action plans.

Monitoring Finances

States set standards intended to ensure that plans have the financial strength to accept the risk inherent to capitation, and that they have provisions for continuity in case of insolvency. Financial monitoring is intended to ensure plan stability and adequate financial reserves through reviews of plans' compliance with the standards.

All the States set financial standards for plans in the licensure regulations. These States also enhance the regulatory standards by including terms in their plan contracts:

* Rhode Island includes financial performance standards.

* Hawaii, Oklahoma, and Rhode Island require performance bonds.

* Tennessee requires plans to submit quarterly and annual TennCare-only income statements.

Table 2 summarizes who monitors plan finances and their monitoring methods.
Table 2
Financial Monitoring, by State

Hawaii
  Regulatory-related monitoring:
    Which regulatory agency monitors plan finances? Division of Insurance
    Offsite reviews of quarterly and annual financial statements by the regulatory agency? Yes
  Demonstration-related monitoring:
    Who monitors finances for the demonstration? Demonstration Agency(1)
    Additional standards required in contracts? Yes; performance bond required
    Line of business monitored for demonstration? Demonstration only
    Offsite reviews for demonstration? Yes; uses own formats
    Onsite reviews conducted? No
    Audits conducted? No
    Feedback to plans? No

Oklahoma
  Regulatory-related monitoring:
    Which regulatory agency monitors plan finances? Department of Health
    Offsite reviews of quarterly and annual financial statements by the regulatory agency? Yes
  Demonstration-related monitoring:
    Who monitors finances for the demonstration? Demonstration Agency(1)
    Additional standards required in contracts? Yes; performance bond required
    Line of business monitored for demonstration? Demonstration and total business
    Offsite reviews for demonstration? Yes; uses the NAIC format reports
    Onsite reviews conducted? Yes; every 6 months
    Audits conducted? No
    Feedback to plans? Ongoing informal feedback

Rhode Island
  Regulatory-related monitoring:
    Which regulatory agency monitors plan finances? Department of Business Regulation
    Offsite reviews of quarterly and annual financial statements by the regulatory agency? Yes
  Demonstration-related monitoring:
    Who monitors finances for the demonstration? Demonstration Agency(1)
    Additional standards required in contracts? Yes; performance bond required; State sets benchmarks on profitability, liquidity, capital structure, and expense analysis based on total business; quarterly reviews
    Line of business monitored for demonstration? Total business
    Offsite reviews for demonstration? Yes; uses the NAIC format reports
    Onsite reviews conducted? Yes; at least annually since 1996
    Audits conducted? No
    Feedback to plans? Yes

Tennessee
  Regulatory-related monitoring:
    Which regulatory agency monitors plan finances? Department of Commerce and Insurance and State Comptroller of the Treasury's Audit Division(1)
    Offsite reviews of quarterly and annual financial statements by the regulatory agency? Yes
  Demonstration-related monitoring:
    Who monitors finances for the demonstration? Department of Commerce and Insurance and State Comptroller of the Treasury's Audit Division(1)
    Additional standards required in contracts? Yes; plans must submit quarterly and annual TennCare-only statements
    Line of business monitored for demonstration? Demonstration and total business
    Offsite reviews for demonstration? Yes; uses adapted NAIC format showing Medicaid and total business
    Onsite reviews conducted? Yes; at least annually since 1995
    Audits conducted? Yes
    Feedback to plans? Yes

(1) The demonstration agencies are: the Med-QUEST Division in Hawaii; the Oklahoma Health Care Authority in Oklahoma; the Center for Child and Family Health in Rhode Island; and the TennCare Bureau in Tennessee.

NOTE: NAIC is National Association of Insurance Commissioners.

SOURCE: (Wooldridge, J., and Hoag, S., 1999.)


The States have taken actions ranging from establishing informal requirements to mandating the correction of deficient plan financial performance. Sometimes, the demonstration agencies have required the plans to increase deposits to address solvency or other financial concerns. However, these States have not used the ultimate sanction of withdrawing licensure.

Hawaii's procedures are the least formal of the four States. For example, as shown in Table 2, Hawaii does not conduct onsite plan reviews, does not audit plans, and does not provide plans with feedback. When demonstration agency staff are concerned about a plan's financial footing, the plan is required to submit financial reports monthly, rather than quarterly. (Two of the plans have been reporting monthly since QUEST began.) Oklahoma can require, and has required, formal corrective actions (in 1997, one plan had to correct deficiencies through a formal corrective action plan). Rhode Island required all four plans to negotiate written corrective action plans after its 1996 reviews showed that none of its plans met all the financial benchmarks, and that two did not meet other financial management requirements. It reviewed the plans 6 months later to assess compliance. Tennessee has the most formal financial monitoring process among the four States. It is the only one that audits plans, and it has taken the most formal actions to correct financial performance problems. Due to problems of program underfunding, some plans in Tennessee have had severe financial problems, and one was taken over temporarily by the State. However, the issue of underfunding is distinct from the adequacy of financial monitoring processes, which is the subject of this section.

Although none of these States has had a plan close as a result of financial problems, some of the diverse approaches to monitoring are inadequate, and most States lie near the least rigorous end of a continuum of financial oversight methods. Hawaii's process, which relies only on offsite review, seems inadequate to ensure that plans can fulfill their financial obligations and, therefore, may not ensure financial stability. Oklahoma works cooperatively with the plans and provides timely informal feedback, while holding them to State standards. Rhode Island has chosen to prop up its weakest and least compliant plan with additional State funds and support, rather than lose it. Tennessee, a strong enforcer, has continually strengthened the structure and process of its financial monitoring. Tennessee also appears willing to lose a plan that cannot meet its performance standards. It took the States time to develop and fully implement these processes, and the adequacy of their financial monitoring has improved over time.

Monitoring Encounter Data

Under managed care, States have to change their data role from processing claims data to acquiring, validating, and using encounter data for monitoring. HCFA's terms required that the States develop plans to implement and monitor encounter data collection and collect encounter data for use in monitoring and demonstration evaluations. The States included the same requirements in the plan contracts: plans had to collect and submit encounter data regularly. In some States, plans that did not do so were subject to financial penalties (such as payment withholds). Table 3 summarizes some of the steps the States took to monitor encounter data.
Table 3
Encounter Data Monitoring

Hawaii
  Who monitors encounter data? Demonstration Agency
  Encounter data plan prepared? Yes
  Received technical assistance from HCFA for encounter data system development? Yes
  When were regular review and feedback implemented? 1995, then stopped from 1996 to 1998
  Encounter data validated against medical records? Planned
  Does State use encounter data to report on quality? No
  Can encounter data be used for evaluation?(2) No

Oklahoma
  Who monitors encounter data? Demonstration Agency and EQRO
  Encounter data plan prepared? Yes
  Received technical assistance from HCFA for encounter data system development? Yes
  When were regular review and feedback implemented? 1996
  Encounter data validated against medical records? Partial validation done in 1998
  Does State use encounter data to report on quality? No
  Can encounter data be used for evaluation?(2) No

Rhode Island
  Who monitors encounter data? Demonstration Agency
  Encounter data plan prepared? Yes
  Received technical assistance from HCFA for encounter data system development? No
  When were regular review and feedback implemented? 1997
  Encounter data validated against medical records? Planned
  Does State use encounter data to report on quality? No
  Can encounter data be used for evaluation?(2) Maybe(3)

Tennessee
  Who monitors encounter data? Demonstration Agency and EQRO
  Encounter data plan prepared? Yes
  Received technical assistance from HCFA for encounter data system development? No
  When were regular review and feedback implemented? 1994
  Encounter data validated against medical records? Occasionally(1)
  Does State use encounter data to report on quality? Yes
  Can encounter data be used for evaluation?(2) Yes; 1996

(1) The State validates encounter data against medical records extracted in the course of its outcome studies.

(2) Hawaii and Oklahoma have not yet approved their own data. We have conducted face validity checks of TennCare encounters for 1995 and 1996 and plan to use the 1996 data.

(3) Rhode Island has stated that its encounter data are reasonably adequate, but we have not reviewed them to assess their suitability for evaluation purposes (as we have in Tennessee, the other State where data have become available).

NOTES: EQRO is External Quality Review Organization. HCFA is Health Care Financing Administration.

SOURCE: (Wooldridge, J., and Hoag, S., 1999.)


States focused initially on resolving problems the plans had in submitting encounter data. In the worst cases, States placed staff at the plans to help overcome problems or had the EQRO work closely with the plan. Only later did States implement regular encounter data review with feedback and corrective action plans. Moreover, the States scaled back and delayed ambitious initial plans for validating encounter data.

The approaches and timing of State actions to improve the quality of encounter data varied widely. Some delays were due to insufficient resources being applied to this activity, some to lack of knowledge about how to establish functioning encounter data systems (such as how to set up the system or what elements needed to be included). Tennessee began providing feedback to the plans early in the demonstration and applied sanctions when plans failed to meet contractual standards. It also sent technical assistance staff to help some plans overcome serious difficulties with provider payment related to encounter data processing problems. Rhode Island deferred review and feedback until plans began submitting data regularly.

The adequacy of State encounter data collection and monitoring can be measured by whether a State:

* Implemented its monitoring plans and collects and reviews data regularly.

* Validates its data.

* Uses the data to conduct outcome studies.

* Provides the data in a format that can be used in HCFA's evaluation.

Only Tennessee meets all four of these adequacy measures. It was the first State to develop a full data review process and is the only one that uses encounter data in its quality monitoring program and that publishes reports using the data. In an independent face validation of Tennessee's encounter data for 1995 and 1996, we determined that the 1996 data appear to be usable for some measures (in particular, for inpatient measures) and for some plans. By the end of 1998, Rhode Island had fulfilled the first two of the four steps and declared that its encounter data were reasonably accurate. Oklahoma regularly reviewed plan data, but had not determined that they had reached a sufficiently high standard for use in monitoring quality. Hawaii had only recently begun to review its encounter data regularly.
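
To make the face-validity idea concrete, the sketch below (in Python) shows the kind of check we describe: computing a plan-level inpatient discharge rate from submitted encounters and flagging rates that fall outside a plausible range. The plan names, counts, and benchmark range are hypothetical illustrations, not actual TennCare values.

# Illustrative face-validity check on encounter data (hypothetical plans,
# counts, and benchmark range): compare plan-level inpatient discharge
# rates computed from submitted encounters against a plausible range and
# flag outliers for follow-up with the plan.
from collections import Counter

inpatient_discharges = Counter({"Plan A": 900, "Plan B": 40})   # submitted encounters
member_years = {"Plan A": 10_000, "Plan B": 8_000}              # enrollment exposure

LOW, HIGH = 50, 150   # plausible discharges per 1,000 member-years (illustrative only)

for plan, years in member_years.items():
    rate = 1_000 * inpatient_discharges[plan] / years
    status = "plausible" if LOW <= rate <= HIGH else "flag: possible under- or over-reporting"
    print(f"{plan}: {rate:.0f} discharges per 1,000 member-years -> {status}")

A rate far below the benchmark range (Plan B in this sketch) would suggest that encounters are not being submitted completely, rather than that members are receiving no inpatient care.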

It is of considerable concern for monitoring quality and for evaluations of Medicaid managed care that, 5 years into the demonstrations (3 years, in the case of Oklahoma), Tennessee was the only State to have produced usable data early enough for us to assess their adequacy and analyze them.

Monitoring Quality of Care

The primary goal of quality-of-care monitoring is ensuring that beneficiaries receive appropriate care and that quality of care is improving, given that plans' financial incentives may undercut quality and access. The Federal terms required each State to conduct external audits to monitor the plans' performance. All have contracted with an EQRO to conduct special studies; some have used the EQRO for additional activities (which are described in this and later sections). HCFA's terms require the States to develop internal and external audits and outcome studies to monitor plans' quality assurance and quality improvement activities (Health Care Financing Administration, 1993). The terms also require States to include in their plan contracts requirements for internal quality assurance programs, as established by Federal law (42 CFR 434).

All four States require participating plans to operate quality assurance programs and specify what these programs should include (primarily, written descriptions of the program's goals, scope, standards, and activities to be conducted). The States also incorporated in their plan contracts standards for the plans' quality assurance programs. Every State mandated that plans report some type of outcome, although the specific elements varied. Finally, Rhode Island also imposes an external quality standard, under which all State-licensed HMOs must be certified by the National Committee for Quality Assurance (NCQA) or must have applied for NCQA certification within 2 years of startup. Table 4 summarizes State monitoring processes and their timing.
Table 4
Quality Assurance Program Monitoring, by State

Hawaii
  Who monitors quality assurance programs? Demonstration Agency
  When did onsite quality assurance program reviews begin? First year
  Is written feedback provided to plans? Yes
  Are corrective action plans required? Yes
  Are plans required to conduct population studies? Yes(1)
  Have State outcome studies been conducted? By State or EQRO? Yes; EQRO
  When did the State outcome studies begin? Third year

Oklahoma
  Who monitors quality assurance programs? EQRO
  When did onsite quality assurance program reviews begin? First year for urban program; second year for rural program
  Is written feedback provided to plans? Yes
  Are corrective action plans required? Yes
  Are plans required to conduct population studies? Yes
  Have State outcome studies been conducted? By State or EQRO? Yes; EQRO
  When did the State outcome studies begin? First year for urban program; second year for rural program

Rhode Island
  Who monitors quality assurance programs? Demonstration Agency
  When did onsite quality assurance program reviews begin? Third year
  Is written feedback provided to plans? Yes
  Are corrective action plans required? Yes; negotiated
  Are plans required to conduct population studies? Yes
  Have State outcome studies been conducted? By State or EQRO? Yes; EQRO and others
  When did the State outcome studies begin? Second year

Tennessee
  Who monitors quality assurance programs? EQRO
  When did onsite quality assurance program reviews begin? First year
  Is written feedback provided to plans? Yes
  Are corrective action plans required? Yes
  Are plans required to conduct population studies? Yes
  Have State outcome studies been conducted? By State or EQRO? Yes; EQRO and others
  When did the State outcome studies begin? Second year

(1) Hawaii requires "focused" studies but does not mandate that they be population studies. For example, some plans submitted reports that documented the process of generating Medicaid HEDIS data.

NOTES: EQRO is External Quality Review Organization. HEDIS is Health Plan Employer Data and Information Set.

SOURCE: (Wooldridge, J., and Hoag, S., 1999.)


States initially provided technical assistance to plans to establish adequate quality assurance programs. They then began to take the following steps: review whether quality assurance programs were being implemented; provide feedback and quality improvement recommendations; and require corrective action plans.

The States introduced clinical appropriateness monitoring after they had begun monitoring the plans' quality assurance programs. Over time, States have become more sophisticated about this type of monitoring. They now require the plans to conduct studies focused on various clinical areas and targeted populations, and to develop clinically appropriate practice guidelines. Moreover, the States themselves began conducting their own special studies.

All four States require plans to submit corrective action plans when their quality assurance programs are out of compliance with State standards. Oklahoma, Rhode Island, and Tennessee documented the comprehensiveness of both their quality assurance program reviews and their feedback to the plans. The feedback was thorough, focused, and had as its goal future quality improvement.

The States have conducted studies to assess outcomes for given populations across plans. Each State recognized that focused studies based on medical record reviews, encounter data, and other data sources are important for assessing clinically appropriate care provision, as these studies can identify important delivery or quality problems (for example, underservice or inappropriate service). (If medical record reviews uncovered problems, the penalties could be quite severe. For example, Tennessee's contracts provide for suspension if medical records indicate quality problems [State of Tennessee, 1995].) For example, Hawaii's study of Early and Periodic Screening, Diagnosis, and Treatment services uncovered a need to improve lead risk/lead levels, dental screens/fluoride assessments, developmental/behavioral assessments, tuberculosis skin tests, and vision assessment (FMH, Inc., 1997). Oklahoma's EQRO studied birth outcomes and found that plans improved rates of prenatal care in the first trimester between 1995 and 1996, but that women still were not receiving the number of visits recommended by the American College of Obstetricians and Gynecologists (State of Oklahoma, 1997). Oklahoma's study counted only visits paid for by a SoonerCare plan; thus, the smaller-than-recommended number of visits may be a result of women receiving visits before they became eligible for SoonerCare, as those visits were not counted in the study.

By late 1997, the States were actively monitoring quality assurance programs, requiring plans to conduct focused studies, conducting their own focused studies, providing feedback, and requiring actions to correct identified deficiencies. However, they were slow to initiate these activities. Delays were due to a focus on startup activities, lack of resources, staff turnover, and problems with encounter data. Initially, some contracts did not require plans to conduct their own studies. Rhode Island delayed onsite quality assurance program review until 1996. Hawaii and Rhode Island waited until 2 years into their demonstrations to select EQRO contractors to conduct focused studies, even though it was a Federal requirement to have an external auditor. Although Tennessee selected its EQRO in the first year, the EQRO initially provided technical assistance to help the plans develop adequate quality assurance programs. As a result of these delays, State-sponsored monitoring studies did not take place until at least 2, and sometimes 3, years into the demonstrations (except in Oklahoma). Moreover, only Tennessee has published monitoring studies based on encounter data. Because States should be publicly accountable for the quality of care delivered under the demonstrations, we believe they should publish these studies rather than use them only internally.

Monitoring Access and Provider Networks

One concern that HCFA and the States have about implementing Medicaid managed care programs is how access to providers (and choice of providers) will be affected. Thus, States establish and monitor access performance and provider network standards. Here, we review State monitoring standards for access (such as appointment waiting times and travel times) and primary care networks.

The Federal terms for Oklahoma and Tennessee require them to comply with standards for appointment waiting times and travel times set by HCFA. In addition, all the States require the plans to meet State-specified access standards for appointment waiting times and travel times. The plans, in turn, are to incorporate these and other access standards into their quality assurance programs and to monitor themselves against these standards.

Because of the importance of the provider network to adequate access, the Federal terms set provider network standards. The States, in turn, included provider network standards in their plan contracts. Rhode Island and Oklahoma incorporated the strictest standards, which specified provider-to-population ratios for both primary care and other practitioners. Rhode Island also specified mandatory mainstreaming.(4) Oklahoma specified that providers could not intentionally segregate Medicaid patients from other patients they serve. Network adequacy standards in Tennessee and Hawaii were less specific. Table 5 summarizes the methods that the States used to monitor access and primary care provider networks.
Table 5
Access and Primary Care Provider Network Monitoring, by State

Hawaii
  Who monitors access and provider networks? Demonstration agency
  What methods are used to monitor access? Reviews plan reports
  Is access monitored at the plan or program level? Plan level
  What actions have been taken? None
  What methods are used to monitor provider networks? Reviews of regular reports; monthly checks of provider-to-population ratios
  What are the primary care provider-to-population ratio standards? None specified in the contracts
  Does the demonstration agency require plans to include other patients in provider-to-population ratios? No
  Does the demonstration agency check whether providers accept new patients? Yes
  Does the demonstration agency conduct additional audits and/or surveys? No

Oklahoma
  Who monitors access and provider networks? Demonstration agency
  What methods are used to monitor access? Reviews satisfaction survey results; will review plan reports in the future
  Is access monitored at the plan or program level? Both
  What actions have been taken? None
  What methods are used to monitor provider networks? Uses GeoAccess(TM) software to assess plans' monthly network reports(1); in the PCCM program, GeoAccess(TM) is used quarterly to assess the network, and the demonstration agency checks whether providers accept new patients
  What are the primary care provider-to-population ratio standards? Urban program: 1:1,750 members; rural (PCCM) program: 1:2,500 members(2) (ratio to be prorated according to the amount of time the provider is available to that plan)
  Does the demonstration agency require plans to include other patients in provider-to-population ratios? Yes
  Does the demonstration agency check whether providers accept new patients? No
  Does the demonstration agency conduct additional audits and/or surveys? Checks at plan site visits

Rhode Island
  Who monitors access and provider networks? Demonstration agency and a separate State regulatory agency, the Division of Health Services Regulation
  What methods are used to monitor access? Reviews plan reports; special study
  Is access monitored at the plan or program level? Both
  What actions have been taken? Fines (by the regulatory agency); contract modifications and corrective action plans (by the demonstration agency)
  What methods are used to monitor provider networks? Regular checks of provider-to-population ratios
  What are the primary care provider-to-population ratio standards? 1:1,500 members; 1:1,000 for PCP teams and sites (medical residents)
  Does the demonstration agency require plans to include other patients in provider-to-population ratios? No
  Does the demonstration agency check whether providers accept new patients? Yes
  Does the demonstration agency conduct additional audits and/or surveys? Checks at plan site visits

Tennessee
  Who monitors access and provider networks? Demonstration agency
  What methods are used to monitor access? Reviews plan reports; reviews satisfaction survey results (at the program level); uses a ZIP-Code-based program to assess travel times
  Is access monitored at the plan or program level? Both
  What actions have been taken? None
  What methods are used to monitor provider networks? Uses GeoAccess(TM) software to assess plans' monthly network reports
  What are the primary care provider-to-population ratio standards? 1:2,500 members
  Does the demonstration agency require plans to include other patients in provider-to-population ratios? No
  Does the demonstration agency check whether providers accept new patients? Checked once, in a special survey
  Does the demonstration agency conduct additional audits and/or surveys? Conducted once, in a special survey

(1) GeoAccess(TM) is a computer software program that can compare and assess the geographic adequacy of the provider network (for example, whether or not there is a provider located within 30 miles of an enrollee's home).

(2) In addition, in both urban and rural programs, up to 875 members for each nurse practitioner, physician assistant, or medical resident affiliated with the PCP is allowed; if the physician assistant or nurse practitioner has a separate PCP contract, these standards do not apply.

NOTES: PCCM is primary care case management. PCP is primary care provider.

SOURCE: (Wooldridge, J., and Hoag, S., 1999.)


States were actively monitoring access by the second demonstration year. Hawaii set Health Plan Employer Data and Information Set (HEDIS) access performance standards for its plans and undertook its first onsite performance review in spring 1995.(5) By that fall (1 year after startup), it had prepared a report on plan performance based on plan reports to the State. These reports showed that the plans were monitoring their performance against both State and plan access standards, and that they had made changes to improve access to care and access to member services. Hawaii set a same-day requirement for seeing patients with emergency and urgent care needs and a 3-week requirement for non-urgent care. Plans reported setting and meeting the State emergency standards. Some plans reported a lower performance standard for urgent care (up to 48 hours, instead of within 24 hours) but reported that the majority of patients were seen within 24 hours. One plan set a 6-week standard for seeing non-urgent cases, compared with the State's 3-week standard, and reported that 25 percent of its patients were seen more than 1 month after they had tried to make an appointment. HCFA reported that communication between Hawaii and the plans appeared to be good, but that the State was not spending enough time reviewing plan compliance with the contract (Health Care Financing Administration, 1997). Hawaii is the only one of these four States that has not required a plan to take corrective action on its provider networks.

Oklahoma reported in its second annual report to HCFA that it monitors access in both the capitated and partially-capitated programs (State of Oklahoma, Oklahoma Health Care Authority, 1997). However, that report focused on how the State monitors plan networks, rather than on access to the networks. Oklahoma also monitors access through analysis of member incident and complaint reports (Pasternik-Ikard, 1999). Oklahoma's 1997 survey found that 65 percent of those surveyed in urban areas believed appointment wait times were too long, an increase from 23 percent during the previous year (State of Oklahoma, 1997). Because it was disturbed by these survey results, the State planned to monitor plans' compliance against contract-specified timeframes in 1998.

Within 14 months of startup, Rhode Island had embarked on a study of access and network composition (Birch and Davis Health Management Corporation, 1996a). It identified deficiencies such as plan performance standards that did not comply with State standards. As a result, it modified the plan contracts in 1996 and drew up a detailed plan for monitoring plan performance in all areas, including access (Birch and Davis Health Management Corporation, 1996b). In both 1996 and 1997, it again reviewed deficient plans and negotiated corrective action plans (covering a variety of areas, not just access) with all of them.

Rhode Island is the only State among these four to have two agencies monitoring access and provider networks. Rhode Island's regulatory agency, the Division of Health Services Regulation, cited a plan for access violations in 1996. The agency also received numerous complaints about access problems in another plan, including problems with the plan's physician capacity and provision of mental health care; lack of documentation about complaints, denials, and appeals; and inadequate infrastructure. The agency's investigation culminated in a consent decree with the plan, although representatives from the State regulatory agency indicated that the plan took 18 months to comply with identified deficiencies. The Division of Health Services Regulation believes that this delay compromised its standing with other plans in the State; its apparent failure to hold this plan to the State's standards may prevent the other plans from taking the Division seriously. This issue caused friction between the Division of Health Services Regulation and the Rite Care demonstration agency.

Tennessee appears to use the plans' routine reports as its only source for routine reviewing of wait time measures and does no independent verification. (Although TennCare uses its annual member satisfaction survey to review the adequacy of wait times to appointments and wait times in the office, the resulting data are at the program level, not the plan level.) Finally, Tennessee has applied financial sanctions to plans that have not conformed to its primary care provider network standards.

The States' standards for monitoring access and provider networks vary greatly, as do their monitoring methods. In some States, access monitoring does not include assessments of whether plans meet the patient travel and wait time standards (to appointments and in the office); for example, Oklahoma relies on its satisfaction survey results, rather than assessing the plans on these standards. Thus, although the Federal Government set standards for wait times and travel times, it is unclear whether the demonstrations are in compliance.

Provider network standards are relatively weak in most of the States and thus do not rule out access problems even when plans meet the standards. We share the concerns of the U.S. General Accounting Office about the States' approaches to monitoring network provider-to-population ratios (U.S. General Accounting Office, 1997). Providers usually see many out-of-plan patients, but Oklahoma is the only State to take this factor into account by including those patients in the provider-to-population ratio. Thus, except in Oklahoma, States' measured provider availability overstates actual availability.

Furthermore, only in Hawaii and Rhode Island do the provider-to-population ratios take into account whether providers accept new patients. Oklahoma and Tennessee do not monitor the proportion of network providers that accept new patients. Tennessee recognized this problem and undertook two network studies to monitor providers' acceptance of new patients. However, it does not conduct this monitoring on an ongoing basis, as Hawaii and Rhode Island do.
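
The sketch below (in Python, with hypothetical provider names, panel data, and a 1:2,500 threshold) illustrates why these adjustments matter: a plain headcount ratio can look acceptable while an effective ratio, prorated for the share of time providers devote to the plan and limited to providers accepting new patients, does not. It is not any State's actual system; the proration mirrors Oklahoma's contract language, and the open-panel filter mirrors the checks Hawaii and Rhode Island perform.

# Illustrative sketch: headcount provider-to-member ratio versus an
# effective ratio that prorates providers by the share of their time
# available to the plan and drops panels closed to new patients.
# All names, data, and the threshold below are hypothetical examples.
from dataclasses import dataclass

@dataclass
class PrimaryCareProvider:
    name: str
    time_share_for_plan: float      # fraction of practice time available to this plan
    accepting_new_patients: bool

def effective_ratio(members: int, network: list[PrimaryCareProvider]) -> float:
    """Members per open-panel, full-time-equivalent primary care provider."""
    fte = sum(p.time_share_for_plan for p in network if p.accepting_new_patients)
    return float("inf") if fte == 0 else members / fte

network = [
    PrimaryCareProvider("Provider A", 1.00, True),
    PrimaryCareProvider("Provider B", 0.25, True),    # sees mostly out-of-plan patients
    PrimaryCareProvider("Provider C", 1.00, False),   # panel closed to new patients
]
members = 5_000

print(f"headcount ratio: 1:{members / len(network):,.0f}")              # 1:1,667
print(f"effective ratio: 1:{effective_ratio(members, network):,.0f}")   # 1:4,000
print("meets a 1:2,500 standard?", effective_ratio(members, network) <= 2_500)

In this hypothetical network, the headcount ratio meets the standard while the effective ratio does not, which is the gap the text describes between measured and actual provider availability.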

Although numeric network standards are designed to protect access adequacy, they do not take into account quality changes that can result from providers leaving the network. There is considerable potential for care disruption when providers leave a plan. Some States have limited the ability of plans to drop providers to protect continuity of care.(6)

Monitoring Grievances

Managed care plans operate under financial incentives that might cause them to limit, deny, or delay members' care. Therefore, the States require that the plans establish formal procedures members can use to complain, grieve, or appeal a decision about care or coverage (Raymond, 1995). The intent of these procedures, collectively referred to as the "grievance system," is to protect members against harm and to hold the plans accountable for their actions. Each of the four States has a different formal definition of the elements of the grievance process. A "complaint" usually refers to the lowest level of member dissatisfaction, often expressed through a telephone call to the plan (by either a member or his or her provider). A "grievance" is a formal written complaint that, once sent, requires review within a certain period, with provisions for expedited reviews if the member's physician believes a decision is urgently needed (again, filed by a member or his or her provider). An "appeal" is a member's last chance to have another hearing of the issue by the State, if the grievance has not been decided in the member's favor. Although not formally a part of the grievance process, the plans' member education efforts must include informing members about their right to file grievances.

HCFA requires the States to ensure that the plans institute functioning and timely grievance systems, and to send HCFA the plan grievance reports. (Hawaii is the exception, because State reporting was not specified in the initial terms.) Process standards were explicitly specified only in Oklahoma's terms. The States set comparable standards for plans. Grievance standards typically limit the time plans have to review and resolve a complaint or grievance; they also specify procedures for resolution. All States require plans to report grievances periodically, and the States, in turn, report them to HCFA.
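
As a simple illustration of the timeliness checks these standards imply, the sketch below (in Python) flags grievances resolved after an allowed deadline. The 30-day and 72-hour deadlines and the sample log entries are hypothetical, not any State's actual rules; the remedy noted for late cases mirrors the Tennessee contract provision discussed below.

# Illustrative timeliness check on a plan grievance log (hypothetical
# deadlines and entries): flag grievances resolved after the allowed time.
# The remedy printed for late cases mirrors Tennessee's requirement that
# the plan cover the denied service when it misses the deadline.
from datetime import date, timedelta

STANDARD_DEADLINE = timedelta(days=30)    # hypothetical
EXPEDITED_DEADLINE = timedelta(hours=72)  # hypothetical

grievances = [
    {"id": "G-101", "filed": date(1998, 3, 2), "resolved": date(1998, 3, 20), "expedited": False},
    {"id": "G-102", "filed": date(1998, 3, 5), "resolved": date(1998, 4, 22), "expedited": False},
    {"id": "G-103", "filed": date(1998, 3, 9), "resolved": date(1998, 3, 10), "expedited": True},
]

for g in grievances:
    deadline = EXPEDITED_DEADLINE if g["expedited"] else STANDARD_DEADLINE
    elapsed = g["resolved"] - g["filed"]
    if elapsed > deadline:
        print(f"{g['id']}: resolved in {elapsed.days} days; deadline missed, plan covers the denied service")
    else:
        print(f"{g['id']}: resolved within the deadline")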

Tennessee's grievance procedures underwent a major change in 1996, when a Federal district court ruled that the TennCare grievance and appeals process was inadequate. The Federal ruling required the State to adopt a new grievance procedure that gave members greater access and stronger rights, and that strengthened monitoring. As a result, Tennessee is the only State that includes sanctions for non-compliance in its plan contracts, and the only State in which the demonstration agency does not monitor plans on grievances. It requires plans that do not respond to grievances within the specified time to cover the denied service. Table 6 summarizes grievance-monitoring responsibilities and methods, by State.
Table 6
Grievance System Monitoring, by State

Hawaii
  Who monitors plans? Demonstration agency
  State reviews plan grievance logs and quarterly reports? Yes
  State monitors plans' grievance systems, including onsite review? Planned
  State sanctions for missing grievance-response deadlines? No

Oklahoma
  Who monitors plans? Demonstration agency
  State reviews plan grievance logs and quarterly reports? Yes
  State monitors plans' grievance systems, including onsite review? Yes
  State sanctions for missing grievance-response deadlines? No

Rhode Island
  Who monitors plans? Demonstration agency
  State reviews plan grievance logs and quarterly reports? Yes
  State monitors plans' grievance systems, including onsite review? Yes (since 1996)
  State sanctions for missing grievance-response deadlines? No

Tennessee
  Who monitors plans? Agency outside the demonstration agency (since 1997)
  State reviews plan grievance logs and quarterly reports? Yes
  State monitors plans' grievance systems, including onsite review? Yes
  State sanctions for missing grievance-response deadlines? Yes

SOURCE: (Wooldridge, J., and Hoag, S., 1999.)


Two States have never found their plans' grievance procedures to be deficient and two have taken action in response to deficiencies. Hawaii has neither taken action against a plan based on its reviews of plan grievance reports nor provided feedback to the plans in this area. Oklahoma has never requested that a plan take corrective actions because of grievances (MacCauley, 1998). In contrast, Rhode Island, in its 1996 plan reviews, found all four plans to be deficient in some aspect of the grievance procedures. Deficiencies identified ranged from serious (one plan not implementing the grievance procedures as specified in the State contract) to minor (one plan requiring some minor wording changes in its grievance policies to reflect the State's exact 1996 contract language). Rhode Island required the plans to correct these deficiencies. Tennessee uses its annual EQRO reviews to give plans feedback on their grievance systems. The State "automatically" takes action against non-compliant plans by requiring them to cover any denied service if they fail to respond to the grievance within the prescribed time.

States did not monitor grievance systems initially, not only because available resources were being devoted to operations when the demonstrations began, but also because they did not recognize the importance of effective grievance systems in offering protection to a vulnerable population that lacked experience with managed care. Over time, they have made it easier for enrollees to complain. Furthermore, most States make the final decision about the validity of a complaint. Except in Hawaii, States are now actively monitoring grievance structures and processes, and giving the plans feedback. Thus, Hawaii's grievance monitoring process is inadequate. HCFA found that some QUEST plans were not reporting complaints or grievances that had been resolved to the member's satisfaction (Health Care Financing Administration, 1997). It also found that the State had not investigated or verified one plan's report of no complaints. HCFA reported that Hawaii does not know whether plans have informed their members about their rights to submit grievances. Rhode Island and Tennessee have established systems that give beneficiaries greater access to the grievance system than HCFA requires.(7) Oklahoma meets the minimum HCFA standard.

Onsite reviews of the plans' grievance systems are necessary to ensure that all the required elements of the grievance system are in place, as is feedback about those processes. Tennessee and Oklahoma have conducted these reviews from the beginning of their demonstrations, while Rhode Island conducted its first onsite reviews of plans' implementation of grievance procedures in 1996.

FEDERAL MONITORING OF STATES

HCFA must monitor the demonstrations to verify that the States comply with the terms, and that quality of care and access to care do not diminish under managed care. Although HCFA had considerable experience (primarily at the regional offices) monitoring section 1915(b) Medicaid managed care programs, the scale and number of the new section 1115 demonstrations were unprecedented, and HCFA accordingly had to make changes in its approaches to monitoring. The responsibility for monitoring section 1915(b) demonstrations lay primarily with the regional office staff, following a central office protocol. The monitoring elements included document review, site visits, and ongoing communication with State staff, but there was wide variation in regional office monitoring practices. Major changes for the section 1115 demonstrations were the development of joint central and regional office monitoring teams, headed by project officers in the central office, and a more systematic approach to monitoring.

The project officer, who is based in HCFA's central office in Baltimore, leads this team and is a State's chief point of contact at HCFA. However, the appropriate HCFA regional office usually has the primary obligation for overseeing a State's compliance with the terms and conditions, although there is considerable variation across States in how the roles are distributed across central and regional offices. HCFA monitoring begins with an onsite readiness review before implementation. After implementation, HCFA conducts onsite reviews, offsite document reviews, and conference calls with State staff.

HCFA can modify a State's demonstration terms, as it has done in two of the four States reviewed here. There is little other formal feedback to the States, unless unusual situations or problems arise. HCFA could modify the terms of these demonstrations annually or at waiver renewal (after the demonstrations had been running for 5 years). According to HCFA representatives, HCFA is contemplating dropping the annual renewal process, during which the terms could be modified, in favor of a more collaborative approach. Furthermore, once demonstrations approved prior to August 1997 are extended beyond the initial 5 years, the Balanced Budget Act of 1997 prohibits HCFA from modifying the terms (Federal Register, 1998).

Rather than formal feedback, HCFA provides feedback through more informal channels, such as teleconferences and personal telephone calls. The emphasis on informal communications allows HCFA to convey any concerns to the States quickly. However, some monitoring team members suggested that HCFA could improve its feedback to States by documenting findings more often and sharing them with the States.

HCFA's monitoring processes and oversight procedures have matured, but resource and political constraints and staff turnover have hampered the monitoring process. Early Federal monitoring was focused on operational problems, but as the demonstrations have matured, HCFA has focused on monitoring compliance with the terms and has been particularly attentive to monitoring quality issues. Still, some areas of the Federal monitoring process need improvement. For example, despite substantial informal feedback, there is a dearth of formal feedback on findings; when there is formal feedback, it is often late, making it difficult for the States to take HCFA seriously.

DISCUSSION

These four States implemented their demonstrations at a time when Medicaid managed care and Medicaid managed care monitoring processes were in their infancy. It took them time to implement the basic demonstrations (especially getting enrollment working smoothly) and, initially, these implementation issues took the time and attention that might have been devoted to monitoring in a more mature Medicaid managed care world. Thus, monitoring in these demonstrations was slow to start, and all of them were operating for months or years before thorough monitoring of managed care plans became routine. Over time, States were able to develop, implement, and strengthen monitoring structures, thus coming mostly into compliance with their Federal terms and conditions for operating the demonstrations. Most of the limitations in monitoring we found were procedural rather than structural. In some States, monitoring has been hampered by a lack of resources. Federal monitoring has similarly suffered from a lack of resources, as well as political pressure. After 4 years of operations, we concluded that monitoring was not adequate to ensure access and quality, even though monitoring is critically important for protecting vulnerable Medicaid populations from harm.

Full-Scale Monitoring After a Learning Period

Monitoring was slow to begin. The States had little experience in monitoring managed care, and although HCFA had monitored numerous section 1915(b) demonstrations, it had not refined its approach to address the monitoring needs of the comprehensive section 1115 demonstrations. The States initially focused on overcoming their own and the managed care plans' operational problems, which limited the resources available for monitoring and delayed its implementation. Some managed care plans, especially new ones, had no experience monitoring quality assurance and quality improvement. States sometimes provided technical assistance to overcome plan problems and delayed implementing regular monitoring review and feedback until plans had made progress. Finally, the lack at startup of Medicaid-specific tools, such as a Medicaid HEDIS or a Medicaid version of the Consumer Assessment of Health Plans Survey, slowed implementation of some aspects of monitoring.

More Resources to Monitor Managed Care

Adequate financial resources and well-trained staff are necessary to support monitoring. However, building a staff with appropriate skills and knowledge of managed care has been a challenge for the States. Some reported that they had to work with and re-train staff from their traditional Medicaid programs, who usually lacked monitoring experience.

States devoted inadequate resources to monitor some of the five domains we reviewed. For example, Hawaii lagged in devoting sufficient resources to monitoring plan finances and plan quality assurance processes. A report by HCFA (1997) criticized the State for insufficient quality improvement resources, a situation that worsened in 1998. Except in Tennessee, which has collected and used encounter data for outcome studies, the resources allocated to collecting and reviewing encounter data were sparse until late in the 5-year demonstration period. Thus, early HCFA review of resources available for implementation does not ensure that States have adequate staff to fulfill all their monitoring commitments.

All four States have increased their Medicaid administrative expenditures since the demonstrations began, but Rhode Island was the only one to do so substantially (Table 7). Both Oklahoma and Rhode Island increased their administrative expenditures modestly as a percentage of total Medicaid expenditures. Oklahoma's expenditures are the highest among the four States. However, Oklahoma's high level of spending was not solely a consequence of managed care, which was not implemented until 1995, as spending historically is high in that State. Rhode Island authorized the biggest absolute (and relative) increase, nearly doubling its administrative budget the year the demonstration was implemented. In Rhode Island, two events contributed to the large administrative increases between 1993 and 1994 and may be distorting the 1993-1996 comparison: (1) Rhode Island established its Medicaid Management Information System in this period, leading to a significant increase in administrative expenditures between 1993 and 1994; and (2) Rhode Island hired Birch and Davis as a consultant in 1994, which also contributed to the increase seen in this period. Using 1994 as the baseline period, administrative expenditures in Rhode Island decreased by 0.8 percent between 1994 and 1996. Hawaii's and Tennessee's expenses fell as a percentage of total Medicaid expenditures.
Table 7
Administrative Expenditures, in Millions of Dollars and as a Percentage of Total Medicaid Expenditures: 1993-1996

                     1993              1994              1995              1996        Percent Change
State           Dollars Percent   Dollars Percent   Dollars Percent   Dollars Percent    1993-1996

Hawaii           $18.7    4.3      $17.0    3.4      $38.8    5.0      $24.6    3.8         31.7
Oklahoma          92.0    7.4       92.7    7.8       95.4    7.8      104.7    8.3         13.8
Rhode Island      19.8    2.2       37.2    4.4       36.6    3.5       36.9    4.7         85.9
Tennessee         90.9    3.1       81.4    2.8      129.8    3.7       92.7    2.9          2.0

NOTES: Hawaii, Rhode Island, and Tennessee implemented section 1115 demonstrations in 1994. Oklahoma implemented its section 1115 demonstration in 1996.

SOURCE: Urban Institute (1998) analysis of HCFA-64 data. Presented in 1996 dollars.


We have not assessed the States' administrative budgets sufficiently closely to determine whether current levels are adequate for conducting all the types of monitoring required, as well as other administrative duties (such as operations).

States are Using Performance Standards

Performance standards play different roles in quality improvement approaches to monitoring than in quality assurance approaches. In the former, the standards are goals toward which the plans must move (and which may be changed after the standards have been met); in the latter, attention focuses on meeting the standards.

Every State's contracts specified performance standards for all five domains reviewed, although the standards varied across States. All four States adopted a similar approach to setting performance standards, choosing to combine some external standards (such as HEDIS or the Quality Assurance Reform Initiative) with some newly developed ones.

State Monitoring--Improvements Must be Maintained

Over time, the States have implemented and improved monitoring of managed care plans. States have introduced stronger requirements in their plan contracts, hired staff to monitor the plans, and implemented more thorough reviews of plan performance. However, some States are more advanced than others. Furthermore, the States face the ongoing challenge of encouraging the plans to continue to provide quality services, in part by ensuring that plans do not backtrack after meeting a given standard. Staff turnover, both at the plan and the State levels, also caused some periodic discontinuity in monitoring, although as State and plan monitoring procedures improved and became more institutionalized over time, they were easier to transfer to new staff.

Feedback and Corrective Actions Vary

Monitoring is pointless unless it includes feedback and requires improvements in deficient performance. The extent of feedback and corrective action varied. All four States give plans feedback on encounter data and quality assurance programs. All the States except Hawaii give feedback and require corrective action in the domains of finances, grievances, and access and provider networks. Thus, Hawaii seems to be lacking in an important dimension of monitoring.

States Have Different Plan Monitoring Styles

Some States adopted a partnership approach to monitoring, based on a quality improvement philosophy in which plans were not always held to performance standards and sanctions were rare, but improvement was required and monitored. However, not holding the plans to standards causes equity problems. In contrast, some States adopted a regulatory approach to monitoring, based on a quality assurance philosophy in which sanctions were applied when plans did not meet the standards. A drawback to this approach is that some performance standards may be unreasonable, even if they are useful as goals. Thus, applying sanctions may be more punitive than quality-enhancing.

In reality, the States apply a mix of both the regulatory and partnership approaches to monitoring, although some States are closer to one model than the other. Rhode Island embraces the partnership model, in which continuous feedback, assistance, and negotiation of issues are the chief monitoring methods and sanctions are rare. Tennessee comes closest to the regulatory model, in which enforcement is the norm, and penalties are imposed when plans do not meet performance standards. Tennessee does provide assistance and feedback to the plans but negotiates with them less often than does Rhode Island. Oklahoma incorporates elements of both approaches: it negotiates with its plans but also enforces performance standards. Hawaii rarely enforces contract terms and rarely provides feedback and assistance to the plans.

Under the partnership approach, frequent communication limits problems and improves relationships with the plans. Rhode Island, in particular, has committed itself--and its resources--to using monitoring to improve plan performance. However, this approach can have drawbacks: States may have to commit substantial resources to supporting weaker plans, and interplan equity may be undermined. For example, Rhode Island has jeopardized its credibility with some plans by working closely with just one plan, Neighborhood Health Plan. Rhode Island gave that plan financial assistance and a risk-sharing arrangement that no other plan has received; at the same time, the State's actions clearly communicate that it does not intend to let any participating plan fail.

The benefits of the regulatory approach to monitoring are that plans have strong incentives to comply with State standards, and the States have some assurance that plans that follow the rules will not fail or perform poorly. Penalties--usually financial ones--are imposed if the standards are not met. Moreover, in a State with many plans (like Tennessee, which originally had more than twice as many plans as did the other States), a regulatory approach may have been the better choice; the partnership approach may be more feasible when there are fewer plans. The disadvantages of the regulatory approach stem from the more distant relationship between the State and the plans: there is less communication between them, and the plans may lack incentives to exceed the minimum standards. Moreover, regulation and sanctions do not always produce improvements. As the number of plans in Tennessee dropped from 12 to 9 between 1994 and 1997, the State adopted more elements of the partnership approach.

HCFA's Monitoring

HCFA faces two contradictory forces in monitoring section 1115 demonstrations. Like all Federal agencies, it must be publicly accountable to taxpayers, yet it is under pressure to be more flexible in its relationships with the States. We find that this situation makes it very difficult for the agency to fulfill its duties. HCFA has legal responsibilities to monitor the demonstrations, and a partnership approach was envisioned. The reality is that HCFA is under political and resource pressures to take a laissez-faire approach, in which the States operate the demonstrations as they see fit, rather than take a true partnership approach. HCFA intervenes when it believes that a major problem is occurring, but a partnership implies a two-way dialogue. Thus, the States must share responsibility for ensuring successful partnering.

HCFA's oversight and monitoring of the State managed care demonstrations have evolved. HCFA has learned from these demonstrations, and the forthcoming final Balanced Budget Act regulations on Medicaid managed care will provide more specificity for Medicaid managed care monitoring by HCFA (Sachs, 1999). The demonstrations' terms and conditions have changed over time, with greater clarification of what HCFA wants. In the future, HCFA could improve its monitoring and promote partnering with the States by providing more formal feedback (either by issuing revised terms in a timely way or by working collaboratively with States to solve problems of compliance with terms) and by spending more time in face-to-face meetings with the States to work through issues. However, as of late 1998, more resources were needed to monitor the demonstrations and to provide constructive feedback to the States.

CONCLUSIONS

Based on the four States we reviewed, monitoring by HCFA and the States is not yet at the point of ensuring access and quality. Initial State monitoring of plans was clearly inadequate; in response, HCFA and the States have improved the monitoring structure (regulatory and contractual standards), monitoring processes, and the resources devoted to monitoring in the five domains examined. Some States have stressed the enforcement of performance standards (a quality assurance approach), while others have focused on quality improvement toward standards. To some extent, all four States (and HCFA) implemented elements of both approaches. However, an approach in which frequent communication and feedback are wedded to assurance of standards seems most likely to ensure access and quality. Regardless of the approach taken, and recognizing that resource and political constraints will be limiting factors, we have shown that HCFA and the States can take many steps to improve their monitoring.

These findings raise two questions about implementing Medicaid managed care monitoring:

* Should these government entities have been more diligent about establishing the right monitoring structures and processes from the inception of the demonstrations?

* How sophisticated should we expect Federal and State monitoring to be, especially relative to the commercial sector?

HCFA and State monitoring of Medicaid managed care in the section 1115 demonstrations took time to implement fully because of both avoidable and unavoidable problems. Avoidable problems are ones that could have been overcome with additional resources (although, as a practical matter, resources were necessarily allocated initially to critical demonstration activities, such as enrolling beneficiaries). Lack of experience in these States led to some probably unavoidable delays in full implementation of monitoring, although we would argue that some of the delays were too long. When the demonstrations began, few tools were available for monitoring Medicaid managed care--a signal that such monitoring was in its infancy. In the commercial sector, which provides the only standard of comparison, monitoring of managed care plans has been limited to a few large purchasers of care.

In conclusion, we recommend that as States expand their capitated Medicaid managed care programs, they build on the experiences of the States that have pioneered Medicaid managed care monitoring, by identifying what methods are successful in a given social, economic, and political climate, and what level of resources will be required. All States face the challenge of convincing legislators that they need more staff to monitor managed care than they did to run the traditional Medicaid insurance program. We hope that this article, by presenting details of different approaches and methods used in "pioneer" States, will help States to develop effective monitoring in their Medicaid managed care programs.

ACKNOWLEDGMENTS

The authors appreciate the insights of demonstration and regulatory agencies' staff, the four HCFA project officers--Theresa Sachs, Daniel McCarthy, Deborah Van Hoven, and Rose Hatten--and HCFA's regional office staff--Mary Rydell, Arthur Pagan, Richard Pecorella, and Patsy Evans. Our project officer, Penny Pine of HCFA, made important contributions to this report. We also thank several external reviewers, including Edward Hutton, Dr. Harold Luft, Rebecca Pasternik-Ikard, and Tricia Leddy. Marilyn Ellwood of Mathematica and Leighton Ku of the Urban Institute helped collect data for this report, reviewed an earlier draft of the paper, and provided many useful suggestions. Suzanne Felt-Lisk of Mathematica also provided helpful comments on an earlier draft. Anonymous reviewers also helped us shorten and clarify the paper.

(1) The same interview protocols, focusing on each of the monitoring areas, were used across States, and the same types of respondents were interviewed in each State. We also drew on monitoring reports prepared by the States and their contractors.

(2) In Rhode Island, many of the responsibilities of the demonstration agency are contracted out to Birch and Davis Health Management Corporation, whose staff essentially act as State staff.

(3) States must contract with an entity that is external to and independent of the State and the plans it contracts with to perform an annual review of the quality of services furnished by plans (Office of the Assistant Secretary for Planning and Evaluation, 1999).

(4) Mainstreaming is the requirement that providers in the plan's network see all patients.

(5) HEDIS is a set of standardized performance measures developed by NCQA (1998).

(6) Tennessee recently modified its plan contracts to limit the plans' ability to drop network providers.

(7) In the case of Tennessee, the legal action was prompted by the legal aid society, which has had an important role in shaping the TennCare program.

REFERENCES

Birch and Davis Health Management Corporation: Plan for Monitoring Rite Care Health Plans. Prepared for the Rhode Island Office of Managed Care, Department of Human Services. Birch and Davis, Cranston, RI. May 1996a.

Birch and Davis Health Management Corporation: Rite Care 1996 Access Study Final Report. Prepared for the Rhode Island Office of Managed Care, Department of Human Services. Birch and Davis, Cranston, RI. March 1996b.

Federal Register: Medicaid Program; Medicaid Managed Care; Proposed Rule. Federal Register 63(188):52022, September 29, 1998.

FMH, Inc.: Hawaii Health QUEST Medical Early Periodic Screening, Diagnosis, and Treatment Program Study Report. Prepared for Med-QUEST Division, Hawaii Department of Human Services. October 1997.

Health Care Financing Administration: Draft Monitoring Response to Hawaii. Baltimore, MD. August 1997.

Health Care Financing Administration: Draft Medicaid Pre-Implementation Review Guide for Section 1115 Demonstration Waivers. Baltimore, MD. August 20, 1995.

Health Care Financing Administration: Special Terms and Conditions-Tennessee. Washington, DC. November 18, 1993.

Horvath, J. and Snow, K. I.: Emerging Challenges in State Regulation of Managed Care: Report on a Survey of Agency Regulations of Prepaid Managed Care Entities. National Academy for State Health Policy, Portland, ME. August 1996.

Ku, L. and Hoag, S.: Medicaid Managed Care and the Marketplace. Inquiry 35:332-345, Fall 1998.

Ku, L. and Wall, S.: The Implementation of Oklahoma's Medicaid Reform Program: SoonerCare. The Urban Institute. Washington, DC. October 23, 1997.

Ku, L., Ellwood, M., Hoag, S., et al.: The Evolution of Medicaid Managed Care Systems and Expansions in Section 1115 Projects. The Urban Institute. Washington, DC. May 2000.

MacCauley, Darendia: Oklahoma Health Care Authority, personal communication. Oklahoma. March 26, 1998.

National Committee for Quality Assurance: NCQA's Health Plan Employer Data and Information Set (HEDIS 3.0). Internet address: http://www.ncqa.org/hedis.htm. February 16, 1998.

Office of the Assistant Secretary for Planning and Evaluation, Department of Health and Human Services: Managed Care Terminology. Internet address: http://aspe.os.dhhs.gov/Progsys/Forum/mcobib.htm. September 10, 1999.

Pasternak-Ikard, Rebecca: Oklahoma Health Care Authority, personal communication. Oklahoma. July 12, 1999.

Raymond, A. G.: Do Members Have Adequate Rights Within HMOs? Yes. Your Money and Your Life: America's Managed Care Revolution. Funded by the Robert Wood Johnson Foundation. Internet address: http://www.wnet.org/archive/mhc/viewpoints/procon7.html. 1995.

Rosenbaum, S., Shin, P., Smith, B. M., et al.: Negotiating the New Health System: A Nationwide Survey of Medicaid Managed Care Contracts. The George Washington University Medical Center, Center for Health Policy Research. Washington, DC. February 1997.

Sachs, Theresa: Health Care Financing Administration, personal communication. Baltimore, MD. November 22, 1999.

State of Hawaii: Hawaii QUEST Quarterly Monitoring Report for the Period October 1, 1997, through December 31, 1997. Department of Human Services, Honolulu, HI. 1998.

State of Oklahoma: 1115(a) Research and Demonstration Waiver Annual Report/Continuation Application, July 1, 1996 to June 30, 1997 (Year Two). Oklahoma Health Care Authority. Oklahoma City, OK. October 17, 1997.

State of Tennessee: A Contractor Risk Agreement (Year II contract). Department of Finance and Administration. Nashville, TN. 1995.

U.S. General Accounting Office: Medicaid Managed Care: Challenge of Holding Plans Accountable Requires Greater State Effort. GAO/HEHS97-86. U.S. General Accounting Office. Washington, DC. May 1997.

Urban Institute: Edited HCFA-64 Data. Washington, DC. Urban Institute, June 18, 1998.

Wooldridge, J., and Hoag, S.: The Perils of Pioneering: Medicaid Managed Care Monitoring. Mathematica Policy Research, Inc. Princeton, NJ. December 12, 1999.

Wooldridge, J., Ku, L., Coughlin, T., et al.: Implementing State Health Care Reform: What Have We Learned from the First Year? Mathematica Policy Research, Inc. Princeton, NJ. December 18, 1996.

Reprint Requests: Judith Wooldridge, Mathematica Policy Research, Inc., P.O. Box 2393, Princeton, NJ 08543-2393. E-mail: jwooldridge@Mathematica-MPR.com

The authors are with Mathematica Policy Research, Inc. The research presented in this article was supported by Health Care Financing Administration (HCFA) Contract Number 500-94-0047. The views expressed in this article are those of the authors and do not necessarily reflect the views of Mathematica Policy Research, Inc. or HCFA.
