Will Patients Use a Computer to Give a Medical History?
* OBJECTIVE To determine whether patients would self-administer a computerized medical history and find it an acceptable experience.
* DESIGN A survey questionnaire was given to 100 patients selected from the emergency department walk-in clinic waiting area.
* SETTING Charity Hospital emergency room walk-in clinic
* METHODS One hundred patients older than 18 years were selected to give a random sample of the population using the Charity Hospital emergency room walk-in clinic for care. The patients received a questionnaire for rating their experience with the computer. Demographics were collected for all patients, including the 13 who declined participation. The main outcome was the patient's perception of the acceptability of using the computerized medical history. A second important outcome measure was patient refusal to participate in the study.
* RESULTS Our analysis of the acceptability ratings revealed adequate internal consistency (Cronbach α=0.75), so a single total score was created from these ratings. The participants' scores ranged from 2.0 to 4.0, with a mean of 3.3 (standard error of the mean=0.04). We observed an 83% positive experience in the participating population.
* CONCLUSIONS The patients were able to use the computer to enter their medical information. They responded favorably to the experience and appeared to be capable and willing to provide medical information through use of this technology.
* KEYWORDS History; medical indigency; computerized patient interview [non-MESH]. (J Fam Pract 2000; 49:921-923)
The Charity Hospital system in Louisiana provides care for a population of medically indigent patients. The legislature has authorized Louisiana State University Medical Center-New Orleans (LSUMC-NO) to form a health maintenance organization to care for Medicaid patients and those who constitute the medically indigent population. Compiling a medical data set to manage such a population is difficult. In 1997 virtually all the medical data at LSUMC was buried in paper charts that were mostly handwritten by physicians and medical students. When our group found software for a self-administered computerized medical history, we thought this might be the methodology needed to gather essential medical data from our large population of patients. We decided to test this technology by seeking an answer to the question: Can and will a patient at Charity Hospital use a computer to give a self-administered medical history?
The walk-in clinic associated with the emergency room at Charity Hospital in New Orleans was the site of our study. It is open 15 hours daily during the week and 10 hours daily on weekends, and approximately 5500 patients visit this clinic each month. Patients were sampled during January, February, and March 1998 by calling the name of the patient whose record was at the bottom of the stack of charts of patients waiting to be seen. These patients were invited to participate; for those who refused, demographic data were still collected and a reason for refusal was noted when possible.
The computer system used was an IBM-compatible 486 PC with a color monitor and standard keyboard. The interview software was Instant Medical History (Primetime Software, Columbia, SC). The participants completed the brief version of the history algorithm to expedite the interview process, since our goal was not to test the software but to determine the patient's ability to use and opinion of using a computer to give the information.
In the testing area the patient was given a consent form and a questionnaire. The patient completed the consent form and a first set of questionnaire items before using the computer. The software gave the patient instructions on how to enter answers; the investigator typed in only the patient's age and sex. When asked by the software for a reason for the visit, the patient was encouraged to select "New problem or illness" or "Follow up of previous illness." The patient was then offered more specific choices, such as "Bone or muscle problem," "Cough, cold," and "Diabetes." The investigators supported the patient's use of the computer until a system for the chief complaint had been selected; at that point the patients generally understood how to indicate the desired answer. Finding the letters on the keyboard appeared to be more of a problem than understanding the software. By design, interaction between the patient and the investigator was kept to none or a minimum while the patient was using the computer. After completing the history, the patient answered a second set of questionnaire items, was given the computer printout to share with his or her physician, and returned to the waiting area. The general atmosphere was purposely cordial for participants and nonparticipants alike.
We obtained demographic data for all patients approached for our study. Refusal rates were one of the main outcomes of the study. When a patient declined to participate, a reason for refusal was noted if possible. Although literacy level was not assessed directly, illiteracy was noted when the information was volunteered.
Another major outcome was the acceptability rating of using the computer to obtain a medical history. Acceptability was determined using a questionnaire consisting of 12 items. Each item was rated on a 4-point Likert scale with higher scores indicating more acceptability.
We analyzed the data with Microsoft Access (Microsoft Corporation, Redmond, Wash), Microsoft Excel, and the Statistical Package for the Social Sciences, version 7 (SPSS Inc, Chicago, Ill). Continuous variables (eg, age, years of education) were described with means and standard errors of the means (SEM). We analyzed the internal consistency of the acceptability questionnaire using Cronbach α and item-total correlations. The effects of categorical variables (eg, sex, race) on the continuous acceptability rating were analyzed using analysis of variance, and the relations among continuous variables were analyzed with Pearson correlation coefficients (r). Significance for all analyses was set at P < .05.
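For readers who want to reproduce this kind of reliability analysis outside SPSS, a minimal sketch of Cronbach's α and corrected item-total correlations is shown below. The response matrix here is hypothetical, simulated around a single latent "acceptability" factor; it stands in for the study's unpublished raw data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the total score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Correlation of each item with the sum of the remaining items."""
    totals = items.sum(axis=1)
    return np.array([
        np.corrcoef(items[:, j], totals - items[:, j])[0, 1]
        for j in range(items.shape[1])
    ])

# Hypothetical data: 87 respondents x 12 items on a 1-to-4 Likert scale
# (not the study's actual responses).
rng = np.random.default_rng(0)
latent = rng.normal(3.3, 0.4, size=(87, 1))
responses = np.clip(np.round(latent + rng.normal(0, 0.5, size=(87, 12))), 1, 4)

alpha = cronbach_alpha(responses)
item_totals = corrected_item_total(responses)
```

With 12 items, even modest item-total correlations can yield an α in the 0.7 range, which is why a single averaged score was a reasonable summary in this study.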
The sample demographics are presented in the Table. Consistent with the clinic population, the sample was approximately 55% women. African Americans constituted 79%. Approximately a third had not completed high school, another third completed high school, and a little less than a third had continued their education beyond high school. The majority (70%) had been to this clinic previously. Ninety-nine percent spoke English as their primary language. More than half of the participants reported that they had not previously used a personal computer. Thirteen patients (13%) refused to participate. The reasons for refusal varied: 4 patients were unable to read well enough to do the study, 1 refused because of dislike for computers, and 3 felt too sick. The other 5 patients gave no reason for refusal.
TABLE Demographics and Description of the Sample

Variable                                                    N
Sex
  Men                                                      45
  Women                                                    55
Age, years
  18 to 20                                                  5
  21 to 30                                                 24
  31 to 40                                                 29
  41 to 50                                                 22
  51 to 60                                                  8
  61 to 70                                                  6
  ≥71                                                       3
Education, years
  1 to 4                                                    0
  4 to 8                                                    9
  9                                                         6
  10                                                        5
  11                                                       14
  12                                                       34
  13                                                       10
  14                                                       10
  15                                                        4
  16                                                        4
  17+                                                       2
Ethnic group
  African American                                         79
  White                                                    17
  Asian                                                     0
  Mixed                                                     1
  Other                                                     3
Is English your first (primary) language?
  Yes                                                      99
  No                                                        1
State in which you were born
  Louisiana                                                82
  Other                                                    18
Country in which you were born
  USA                                                      99
  Other                                                     1
Have you ever given a medical history using a computer?
  Yes                                                       1
  No                                                       85
Have you been to this clinic before?
  Yes                                                      18
  No                                                       68
Have you used a personal computer before?
  Yes                                                      51
  No                                                       35
The acceptability rating items were interrelated: item-total correlations ranged from 0.11 to 0.58, yielding a Cronbach α of 0.75. This represents adequate internal consistency and suggests that a single total score (the acceptability rating) is an adequate summary of the items. We averaged the items to obtain acceptability ratings that ranged from 2.0 to 4.0, with a mean of 3.27 (SEM=0.044). Since the maximum possible score was 4.0, this mean corresponds to very positive ratings. Only 4 participants rated the experience below 2.5. Combined with the 13 who refused to participate, only 17% of the subjects rejected the experience or were only slightly positive about it.
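The summary figures above follow from simple arithmetic over the 100 patients approached. A short sketch shows the calculations for the mean, the standard error of the mean, and the 83% figure; the per-patient scores here are hypothetical stand-ins, since the raw data are not published.

```python
import math

# Hypothetical per-patient acceptability scores (each the mean of 12 Likert
# items on a 1-to-4 scale); the study's actual scores spanned 2.0 to 4.0.
scores = [3.0, 3.5, 2.5, 4.0, 3.25, 3.75, 3.0, 2.0]

n = len(scores)
mean = sum(scores) / n
sd = math.sqrt(sum((s - mean) ** 2 for s in scores) / (n - 1))
sem = sd / math.sqrt(n)  # standard error of the mean, as reported in the paper

# The 83% figure: of 100 patients approached, 13 refused and 4 participants
# rated the experience below 2.5, leaving 83 with a clearly positive response.
positive_pct = 100 - (13 + 4)
```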
We examined the effects of demographics and selected patient experiences on the acceptability rating. Sex and race did not have a significant influence. Age had a statistically significant impact (r=-0.27), with increasing age associated with decreasing acceptability. Education (r=0.02) and previous visits to the clinic had no impact on the acceptability ratings. Patients who had used computers before rated the experience slightly more positively (mean=3.8) than patients who reported no previous use (mean=3.6). However, the 0.2 difference may not represent a clinically meaningful increase in acceptance of completing a computerized medical history.
We feel that the positive reaction of 83% of all patients approached about the study is a strong argument for further examination of software for computerized medical interviewing. We have shown that a representative sample of the Charity Hospital patient population was capable of using a computer for a self-administered medical history, and our subjects thought it a good idea. Further study of this clinical tool should not be halted by fear of patient resistance or refusal. We encourage our colleagues to consider the computerized self-administered patient history as a timesaving, cost-effective adjunct to the traditional oral history.
Colonel Gordon Black, MHA, Department of Preventive Medicine and Public Health, LSUMC-NO, championed the vision that all our patients could use a computer and challenged us to prove it. His death in November 1997 prevented him from contributing to the writing of this article.
John E. Dugaw, Jr, MD; Kenneth Civello, MD, MPH; Christopher Chuinard, MD, MPH; and Glenn N. Jones, PhD New Orleans, Louisiana
* Submitted, revised, May 9, 2000.
From the Department of Family Medicine, Louisiana State University Medical Center. Reprint requests should be addressed to John E. Dugaw, MD, 14743 Channel Drive, LaConner, WA 98257. E-mail: email@example.com.