The Journal of Rheumatology
Research Article

Discordance Between Self-report of Physician Diagnosis and Administrative Database Diagnosis of Arthritis and Its Predictors

JASVINDER A. SINGH
The Journal of Rheumatology September 2009, 36 (9) 2000-2008; DOI: https://doi.org/10.3899/jrheum.090041
For correspondence: Jasvinder.md@gmail.com

Abstract

Objective. To study predictors of discordance between self-reported physician diagnosis and administrative database diagnosis of arthritis.

Methods. A cohort of all veterans who utilized Veterans Integrated Service Network (VISN)-13 medical facilities were mailed a questionnaire that included patient self-report of physician diagnosis of arthritis and questions regarding demographics, functional limitation, and SF-36V (a validated version of the Medical Outcomes Study Short-Form 36). Kappa coefficient was used to assess the extent of agreement between self-report of physician diagnosis and administrative database definitions that incorporated International Classification of Diseases (ICD) codes and use of medications for arthritis. We identified predictors of overall discordance between self-report and administrative database diagnosis using multivariable logistic regression analyses.

Results. Among 70,334 eligible veterans surveyed, 19,749 subjects had an ICD diagnosis of arthritis in the administrative database in the year prior to the survey; 34,440 answered the arthritis question and 18,464 self-reported a physician diagnosis of arthritis. Kappa coefficient showed slight to fair agreement of 0.19–0.32 between self-report and administrative database definitions of arthritis. We found significantly higher overall discordance among veterans with more comorbidities, greater age, worse functional status, lower use of outpatient and inpatient services, lower education level, and among single medical-site users.

Conclusion. Low level of agreement between self-report and database diagnosis of arthritis and its significant association with patient demographic, clinical, and functional characteristics highlights the limitation of use of these strategies for identification of patients with arthritis in epidemiological studies.

  • ARTHRITIS
  • SELF-REPORT
  • DATABASE DIAGNOSIS
  • DISCORDANCE
  • PREDICTORS
  • VETERANS

Arthritis constitutes a significant public health problem in the US. In 2001, arthritis and chronic joint symptoms affected an estimated 69.9 million people in the US1 and arthritis was among the 5 most prevalent conditions in the 3.4 million patients receiving healthcare at Veterans Affairs (VA) healthcare facilities2. The 2 most common methods of identifying patients with “arthritis” in large epidemiological studies are patient self-report, as in the Behavioral Risk Factor Surveillance System (BRFSS)1 and National Health Interview Survey (NHIS)3, or a diagnosis from administrative/clinical databases. The accuracy of these methods has been debated. Self-reported arthritis has a sensitivity of 75%, specificity of 66%, and kappa statistic of 0.27–0.48, compared to medical records4–6. Self-report of specific types of arthritis was slightly better, with specificity of 66%–90%, but positive predictive value was 21%–22%, sensitivity 50%–100%, and kappa 0.08–0.46, compared to medical records or American College of Rheumatology criteria6–10.

A few studies have examined the factors associated with discordance between these 2 most common methods of identifying patients with arthritis in epidemiologic studies, self-report and databases5,6,9,11. These studies reported conflicting results and none were done in a veteran population. VA constitutes the largest integrated healthcare system in the US, serving approximately 4.9 million subjects with a budget of $25 billion12, providing healthcare to veterans who are socioeconomically disadvantaged and medically underserved2,13,14. VA also has one of the most sophisticated national clinical datasets, which can be used to answer important clinical questions. Therefore, we wanted to examine the extent of and factors associated with discordance between database diagnosis of and self-reported physician-diagnosed arthritis in a veteran cohort15. We hypothesized (1) that the agreement between patient self-report of physician diagnosis of arthritis and the diagnosis of arthritis as recorded in the VA administrative databases would be poor; (2) that database definitions that combine the use of arthritis medications with ICD-9 codes may be associated with better agreement with patient self-report of physician diagnosis of arthritis; and (3) that higher comorbidity load, lower healthcare utilization rate, and poor functional status would be associated with greater discordance between self-report of physician diagnosis of arthritis and administrative database diagnosis of arthritis.

MATERIALS AND METHODS

Prior Veterans’ Quality of Life (Vet-QOL) study

The original Vet-QOL study was a cohort study with a survey of the VISN-13 patient population, as described15. VISN-13 was a regional network providing healthcare to veterans from all of Minnesota, North Dakota, and South Dakota, and selected counties in Iowa, Nebraska, Wisconsin, and Wyoming. The cohort consisted of all veterans in VISN-13 who had at least 1 outpatient encounter or inpatient stay between October 1, 1996, and March 31, 1998, at a VISN-13 facility and a valid mailing address. A self-administered questionnaire was mailed to each eligible veteran in August 1998, with a second mailing to nonresponders 10 weeks later. Many previous studies have used this dataset for studies of quality of life, healthcare utilization, and mortality outcomes15–18.

The self-administered survey questionnaire consisted of questions about: (1) demographics: sex, education level, and race/ethnicity; (2) self-report of physician diagnosis of arthritis, chronic obstructive pulmonary disease (COPD)/asthma, heart disease, hypertension, diabetes, and depression; (3) generic measure of activity limitation, Katz’s scale of difficulty with 6 basic activities of daily living (ADL): bathing, dressing, eating, getting in or out of a chair, walking, and toileting19; (4) current use of cigarettes; and (5) the SF-36V, a validated version of the Medical Outcomes Study Short-Form 36, adapted for use in the VA outpatient population20–22. The SF-36 has been found to be valid, reliable, and responsive to clinical change in patients with arthritis23–28. The SF-36V differs from the SF-36 only in the dichotomous items assessing role limitation due to physical and emotional problems; these were changed to use a 5-level ordinal scale in the SF-36V21,29. Physical and mental component summary (PCS and MCS) scores were generated in the standard fashion from the 8 subscales of the SF-36V and standardized to the US population: norm-based, with a scoring range of 0–100 (higher score = better health), a mean of 50, and a standard deviation of 10. Generic measures of activity limitation and quality of life were used, since this was a population-based study.

Retrospective and prospective cohort data were obtained for 1 year before and 1 year after the survey from the VA Patient Treatment File (PTF) and the Outpatient Clinic (OPC) datasets (containing data from inpatient and outpatient encounters, respectively). These datasets have been found to be reliable for demographics and most common diagnoses30 and valid for specific diagnoses31,32. Pre-survey data included demographics (age, marital status, and employment status), pre-survey healthcare utilization (inpatient hospitalizations and the number of outpatient encounters in primary care, specialty medical care, surgical care, and mental health), and percentage service connection. Veterans receive service connection for disability that resulted from, or was incurred during, active military service (due to an injury or a disease condition); it ranges from 0 to 100%, with a higher percentage reflecting more disability, and ≥ 50% service connection confers priority in access to VA healthcare. Post-survey data were used only for the diagnostic codes of arthritis and use of arthritis-related medications, to test 4 additional database definitions of arthritis (for sensitivity analyses) allowing for delay in diagnosis documentation (see below).

Data collected for our study

The self-report question for arthritis was “Has your doctor ever told you that you have arthritis (including rheumatoid or osteoarthritis)?”, very similar to the question “Have you ever been told by a doctor that you have arthritis?” used in the BRFSS, the largest survey of US households to date1. In addition to the data collected for the Vet-QOL study, for our study, ICD 9th revision (ICD-9) codes for arthritis and prescription data were extracted from the VA in-/outpatient and pharmacy datasets, respectively.

The ICD-9 codes for arthritis consisted of the codes for arthritis listed by the National Arthritis Data Workgroup (http://www.cdc.gov/arthritis/data_statistics/pdf/arthritis_codes_2004.pdf), supplemented by an additional search through the ICD-9 coding book: osteoarthrosis (715), rheumatoid arthritis (714), spondyloarthropathy (720, 696, 711.1, 099.3), arthritis not otherwise specified (716.9), spondylosis (721), gout (274), crystal-induced arthritis (712, 275.49), infectious arthritis (711, 098.5, 036.82, 056.71, 040.2, 390, 421), arthritis due to endocrine-metabolic disorders (713), hemochromatosis-associated arthritis (275), and unspecified arthritis (716). Pharmacy data were extracted from the prescription file located in the local Veterans Health Information Systems and Technology Architecture (VISTA) data system at each of the VISN-13 facilities. The pharmacy data were searched for prescriptions with 2 or more refills (i.e., initial prescription + 2 refills) of nonsteroidal antiinflammatory drugs (NSAID) and disease-modifying antirheumatic drugs (DMARD) during the 2-year study period, regardless of the presence of arthritis diagnoses. These medications were chosen because they are the most commonly used medications for treatment of arthritis; a 3-month or longer prescription was chosen because this is the maximum and the most common “days supply” that can be dispensed at the VA and probably represents longterm medication use.

Study participants and case definitions of arthritis and discordance

Veterans were included in our study if they either self-reported a physician-made diagnosis of arthritis or had an administrative database diagnosis of arthritis. The 5 database case definitions, based on various combinations of ICD code and medication use, were the following. Definition 1: presence of an ICD code for arthritis in the year before the survey; Definition 2: presence of an ICD code for arthritis in the year after the survey; Definition 3: presence of an ICD code for arthritis in a 2-year period (a year before and a year after the survey); Definition 4: presence of either an ICD code for arthritis or use of arthritis medication (NSAID or DMARD) in the 2-year period (most inclusive definition); and Definition 5: presence of both an ICD code for arthritis and use of arthritis medication in the 2-year period (most strict definition).
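The 5 case definitions reduce to boolean combinations of 3 per-patient indicators. A minimal sketch (the function and field names are illustrative, not the study's actual variables):

```python
def arthritis_definitions(icd_year_before, icd_year_after, arthritis_med_2yr):
    """Evaluate the 5 database case definitions for one patient.

    icd_year_before / icd_year_after: ICD-9 arthritis code recorded in the
    year before / after the survey; arthritis_med_2yr: an NSAID or DMARD
    prescription with >= 2 refills during the 2-year window.
    """
    icd_2yr = icd_year_before or icd_year_after
    return {
        1: icd_year_before,                # ICD code in year before survey
        2: icd_year_after,                 # ICD code in year after survey
        3: icd_2yr,                        # ICD code in the 2-year period
        4: icd_2yr or arthritis_med_2yr,   # code OR medication (most inclusive)
        5: icd_2yr and arthritis_med_2yr,  # code AND medication (most strict)
    }
```

A patient on long-term NSAID therapy without any coded diagnosis, for example, meets only Definition 4, which is why that definition is the most inclusive.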

Subjects were classified into 2 groups based on the concordance or discordance between self-report and administrative case definition of presence of ICD code in the year prior to the survey (Definition 1).

Statistical analyses

The kappa statistic33 was used to assess the degree of agreement between the administrative and self-report case definitions of arthritis, and univariate analyses examined the effect of demographic, clinical, utilization, health, and functional status characteristics on kappa agreement. Results are presented for administrative case definitions 1, 3, and 4 above, since definitions 2 and 5 gave results very similar to those of definitions 1 and 4, respectively.
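For a 2×2 comparison of self-report against a database definition, the kappa computation is standard; a self-contained sketch (cell names are illustrative):

```python
def cohens_kappa(both_yes, self_only, db_only, both_no):
    """Cohen's kappa for a 2x2 agreement table.

    both_yes: self-report yes & database yes; self_only: yes/no;
    db_only: no/yes; both_no: no/no.
    """
    n = both_yes + self_only + db_only + both_no
    p_observed = (both_yes + both_no) / n
    # Chance-expected agreement from each source's marginal "yes" rate
    p_self_yes = (both_yes + self_only) / n
    p_db_yes = (both_yes + db_only) / n
    p_expected = p_self_yes * p_db_yes + (1 - p_self_yes) * (1 - p_db_yes)
    return (p_observed - p_expected) / (1 - p_expected)
```

Kappa corrects raw agreement for chance, which is why two definitions can agree on most subjects yet still yield only "slight to fair" kappa when both say "yes" at similar marginal rates.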

Characteristics of the patients in the discordant and concordant groups were compared using chi-squared tests for categorical variables and Student’s t-tests for continuous variables. A multiple logistic regression analysis modeled the odds of discordance using demographic, clinical, healthcare utilization, health status, and functional status characteristics as predictors or explanatory variables: (1) demographics: age in years, sex, race (Caucasian or non-Caucasian), marital status (married or not married), employment status (employed, unemployed, or retired), and education level (< 8th grade, some high school, high school graduate, or beyond high school); (2) self-reported comorbidity: number of self-reported physician diagnoses of the comorbidities COPD/asthma, diabetes, depression, hypertension, and heart disease (expressed in categories of none, 1, 2, or 3–5 comorbidities) and current smoking status (yes/no); (3) healthcare utilization and access measures: inpatient admission in the year prior to the survey (any or none), total number of outpatient visits in the year prior to the survey (aggregate of all primary care, specialty medical care, surgical care, and mental health visits), medical center site use (multiple-site vs single-site user), and percentage service connection; (4) health and functional status: PCS and MCS scores of the SF-36V and ADL limitation (no limitation, or a limitation of 1, 2, or 3–6 ADL). For the analysis, outpatient visits, PCS scores, and MCS scores were divided into tertiles for ease of interpretation, since PCS and MCS were normally distributed. All subjects were included in the main logistic regression analysis.
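The tertile grouping used for outpatient visits and the PCS/MCS scores can be sketched as a rank-based split (illustrative only; the study's exact cutpoint and tie handling is not specified):

```python
def tertile_labels(values):
    """Label each value 1 (lowest third), 2, or 3 (highest third) by rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    n = len(values)
    labels = [0] * n
    for rank, i in enumerate(order):
        labels[i] = 1 + min(rank * 3 // n, 2)  # rank 0..n-1 -> tertile 1..3
    return labels
```

The resulting 3-level categorical variables then enter the logistic model as ordinary predictors, trading some statistical information for odds ratios that are easy to read across thirds of the cohort.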

Additional multivariable logistic regression analyses examined factors associated with potential underdocumentation/overreporting and overdocumentation/underreporting of the diagnosis of arthritis, controlling for covariates listed above. Specifically, these outcomes were defined as: (1) an absence of an ICD-9 code for arthritis in the year prior to the survey in those who self-reported a physician diagnosis of arthritis, i.e., potential underdocumentation/overreporting; and (2) an absence of self-report of physician diagnosis of arthritis in those who had an ICD code for arthritis in the year prior to the survey, i.e., potential overdocumentation/underreporting. A p < 0.05 was considered significant.
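The two discordance subtypes defined above amount to a simple classification against Definition 1; a sketch with illustrative names:

```python
def discordance_category(self_report, icd_prior_year):
    """Classify one subject against the Definition 1 database diagnosis."""
    if self_report and not icd_prior_year:
        return "underdocumentation/overreporting"  # reported but not coded
    if icd_prior_year and not self_report:
        return "overdocumentation/underreporting"  # coded but not reported
    return "concordant"                            # both yes, or both no
```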

RESULTS

The survey response rate was 58% (40,508/70,334). The question regarding arthritis was answered by 34,440 (49%) respondents, of whom 18,464 (54%) self-reported a physician-made diagnosis of arthritis. Clinical characteristics of these patients are summarized in Table 1. The administrative database search identified 19,749 (28%) subjects with an ICD diagnosis of arthritis in the year prior to the survey.
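The percentages quoted above follow directly from the reported counts; a quick arithmetic check (all numbers taken from the Results text):

```python
# Counts reported in the Results
eligible = 70334
responders = 40508
answered = 34440          # answered the arthritis question
self_reported = 18464     # self-reported physician diagnosis of arthritis
icd_diagnosed = 19749     # ICD arthritis code in the year before the survey

response_rate = responders / eligible         # ~0.58
answered_share = answered / eligible          # ~0.49
self_report_share = self_reported / answered  # ~0.54
icd_share = icd_diagnosed / eligible          # ~0.28
```

Note the denominators differ: 54% is a share of question answerers, while 49% and 28% are shares of all eligible veterans.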

Table 1.

Clinical characteristics of all survey responders and those with self-reported arthritis.

Level of agreement between self-report of physician diagnosis and administrative database case definitions of arthritis

Kappa statistic for agreement between self-report of physician diagnosis and the 5 administrative database case definitions identifying an arthritis diagnosis ranged from 0.19 to 0.32, being highest for the case definition of ICD code or use of arthritis medication (Table 2). Table 3 shows the variation in kappa agreement across clinical and demographic variables. The largest range in kappa statistics was observed across the tertiles of outpatient visits, ranging from 0.11 in the veterans in the lowest tertile to 0.27 in the highest tertile of outpatient visits (Table 3).

Table 2.

Kappa agreement between self-reported arthritis and 5 database definitions including International Classification of Diseases (ICD)-9 codes.

Table 3.

Effect of key variables on the kappa statistic between self-report of physician diagnosis and ICD-9 code for arthritis in the administrative database in the year prior to the survey.

Discordant groups and predictors of discordance

Univariate comparisons showed that subjects in the discordant group were slightly older, less educated, less likely to be employed or current smokers, had more comorbidities, had higher functional limitation and lower PCS scores, had less inpatient and outpatient healthcare utilization, and were less likely to be a multi-site user than subjects in the concordant group (Table 4).

Table 4.

Characteristics of concordant and discordant groups (and their subgroups) and comparison of concordant and discordant groups.

In the multivariable logistic regression analysis, veterans with greater self-reported comorbidity or older age had higher odds of discordance (Table 5). In addition, veterans with more ADL limitations or worse physical quality of life had higher odds of discordance. Single-site use, no recent inpatient admission, and fewer outpatient visits were also associated with higher odds of discordance.

Table 5.

Factors significantly associated with overall discordance, underdocumentation/overreporting and overdocumentation/underreporting of arthritis in multivariate logistic regression analyses.

Predictors of underdocumentation/overreporting

Among those with self-reported arthritis, the factors associated with absence of an administrative database diagnosis, i.e., underdocumentation/overreporting, were the same as those identified in the main analysis of overall discordance, and exhibited the same direction of association. One additional predictor was identified: a lower percentage of service connection was associated with higher odds of underdocumentation (Table 5). A lower number of outpatient visits had a much stronger association with the odds of underdocumentation than with the odds of overall discordance.

Predictors of overdocumentation/underreporting

Factors associated with no self-report among those with an administrative database diagnosis, i.e., overdocumentation/underreporting, were somewhat different from those identified in the main analysis of overall discordance. Older age, worse PCS scores, more comorbidities, and no recent inpatient visit were each associated with lower odds of overdocumentation/underreporting; i.e., the direction of association was opposite to overall discordance (Table 5). In addition, more ADL limitation, single-site use, higher percentage service connection, higher education level, being unmarried, and lower MCS scores were associated with higher odds of overdocumentation/underreporting.

DISCUSSION

Self-report of arthritis had slight to fair agreement with VA administrative database case definitions of arthritis. This range of agreement is similar to the kappa of 0.27 reported for arthritis6 and 0.37 for musculoskeletal disease11, but lower than the kappa of 0.48 reported for hip or knee arthritis4. The greater agreement for hip or knee arthritis may be due to specification of the site of involvement4. This study extends previous observations of low agreement between self-reported and database/clinical definitions, made in non-veteran populations, to a veteran population.

The study has many useful implications. First, the finding that self-reported physician diagnosis of arthritis has little agreement with 5 database definitions (incorporating ICD codes and medication data for the first time) indicates that there may be limitations to use of either definition of “arthritis.” The case definition of arthritis that required presence of either an ICD code or use of arthritis medication had the highest agreement with self-reported physician diagnosis of arthritis, but kappa was only 0.32. This implies that studies utilizing ICD codes in VA databases for studies of patients with arthritis may be identifying a subset of patients with arthritis.

Second, the association of lower healthcare utilization, poor physical health status, and poor functional status with overall discordance and underdocumentation implies that sicker patients or those with less physician contact are less likely to have their arthritis recognized, diagnosed, and documented by their physician. Incidentally, this group of patients also constitutes a high-risk group with regard to their health in general. It is possible that discordance is a surrogate marker for another prognostic characteristic that is associated with poor health and yet less physician contact.

Predictors of discordance between patient-report and database diagnosis

The overall discordance between self-report and administrative diagnosis was significantly higher among less frequent users of outpatient or inpatient services, single-site users, older veterans, and those with more comorbidities, more ADL limitations, or worse physical health, i.e., frail and elderly patients with fewer health encounters had higher discordance. Two previous studies6,11 reported a significant association of increasing age with agreement between self-report and examination11 and of Caucasian race with higher agreement between self-report and medical records6. In contrast to earlier studies, in our study increasing age was significantly associated with higher odds of discordance. This study extends the findings on correlates of discordance between 2 methods of case identification by including utilization, demographic, and clinical variables simultaneously. These relationships do not imply causality, only that significant associations exist. Our study differed from the earlier studies in patient population and setting (95% male veterans receiving care at VA facilities in the Upper Midwest vs a population-based survey in Finland11 vs patients with chronic lymphocytic leukemia in Baltimore6) and in the comparison standard: database ICD diagnoses in this study versus physical examination11 or medical records6 in earlier studies. The strong association of outpatient visits with discordance indicates that more health encounters may lead to better physician-patient communication and thus higher concordance between clinical databases and patient self-report.

Predictors of underdocumentation/overreporting

The previously described association of lower outpatient use34 with more underdocumentation/overreporting in patients with chronic conditions was confirmed in this larger sample of patients with arthritis. Some predictors of underdocumentation differ from those of a previous study by Kehoe, et al, in a cohort of ophthalmology patients from the Boston area5. They found that fewer annual physician visits and higher education levels were associated with higher specificity of self-reported arthritis5, i.e., less overreporting. In contrast, in our study education level was not associated with underdocumentation/overreporting, while fewer outpatient visits were associated with more underdocumentation/overreporting. The previous study obtained diagnoses from primary care physician charts rather than administrative database records and did not adjust analyses for a variety of important explanatory measures, as our study did, which may explain the differences in findings.

Many explanations exist for underdocumentation, including low rate of physician documentation of diagnosis of arthritis35, substantial proportion (22%) of US adults never seeing a healthcare provider for their joint symptoms36, and veterans receiving arthritis diagnosis in non-VA healthcare settings37, which may not be documented in VA records. Patient overreporting of arthritis due to self-diagnosis of any musculoskeletal symptom as arthritis may also be partially responsible for instances of underdocumentation and overall discordance.

Predictors of overdocumentation/underreporting

Two small studies (< 1000 patients) of predictors of underreporting of arthritis diagnosis reported either that the type of arthritis diagnosis and presence of ADL difficulties predicted underreporting9 or that none of the examined factors (age, sex, race, education, and physician visits) was associated with underreporting5. The previously published studies included patients with cataract5 or those being followed in rheumatology outpatient clinics9, as compared to a population-based cohort in our study. Our study confirmed the association between increasing ADL difficulties and increasing odds of underreporting, extending this finding from rheumatology clinic outpatients9 to a general population cohort. Race was not associated with underreporting in our study, similar to the earlier study in a cataract population5, thus extending this observation to a general population cohort. Lack of physician communication of the arthritis diagnosis, patient trivialization of the arthritis diagnosis, and/or interpretation of arthritis (especially osteoarthritis) as an age-related phenomenon rather than a chronic disease may contribute to overdocumentation/underreporting of arthritis. The finding that veterans with lower MCS scores, higher PCS scores, higher education level, lower use of medical resources/access (lower multisite use and lower percentage service connection), and who were unmarried had higher odds of overdocumentation/underreporting adds to the literature.

The prevalence of self-reported physician-diagnosed arthritis of 54% among respondent veterans using VA healthcare may seem high, but it is similar to the 43% reported in the BRFSS survey, which had a slightly younger cohort of veterans using the VA healthcare system38.

Limitations

Our study has several limitations, including nonresponse bias, an inability to examine agreement by type of arthritis (since we did not have self-report of the type of arthritis), and confounding by unmeasured variables. Given the features of the study population (veterans who are largely elderly and male) and nonresponse bias, the findings may not be generalizable to younger populations or women. The smaller number of women in our sample makes our observations regarding female sex less robust. However, since the sample was a population-based sample of veterans receiving healthcare at VA settings, these findings are at least generalizable to these veteran cohorts. Despite these differences, many results of our study agree with those of other studies in non-veteran, community samples; thus these results are likely applicable to other elderly populations. The response rate of 58% in our study, although not optimal, is above the average of 54% for such large surveys39. Nonresponders were slightly younger and less likely to be married compared to responders, and we are unsure how this may have affected the discordance. It is possible that attitudes and knowledge regarding arthritis have changed in the decade since the study was completed and that concordance is better now, a hypothesis that needs to be tested. Another limitation, not just of our study but of this field in general, is that there is no “gold standard” definition for diagnosis of arthritis. Therefore, a comparison was performed of the 2 non-gold standard definitions of arthritis commonly used in most large epidemiological studies and surveys. Despite the low concordance between self-report and billing codes, each may be more valuable depending on the particular study. For example, self-report may be the most relevant definition of arthritis if one is interested in assessing the effect of arthritis and arthritis-related symptoms. In contrast, administrative databases may be more appropriate for studying healthcare utilization. Each has limitations and may identify only a subset of patients with arthritis. Since we extracted ICD codes and medication data for a 2-year period, some patients with arthritis who were not seen or treated for arthritis may have been missed. This is unlikely, since these patients had at least 1 healthcare encounter in the 18 months before the survey. Although the discordance was subdivided into useful categories such as underreporting and underdocumentation, the sources of error cannot be determined accurately.

A low level of agreement was noted between self-report of physician diagnosis of arthritis and the administrative database definitions of arthritis. Age, comorbidity, functional limitation, quality of life, and access to and use of medical services influenced the overall discordance between self-report of physician diagnosis and administrative database definition of arthritis using ICD codes. The high degree of disagreement between these 2 methods commonly used in epidemiological studies and the influence of various demographic and clinical factors underscores the limitations that exist with use of these methods in assessing the presence of arthritis in large epidemiological studies.

Acknowledgments

I thank Dr. David T. Felson of Boston University for a very helpful critique of this work.

Footnotes

  • Supported by VA Upper Midwest Veterans Network, VISN-13.

    • Accepted for publication April 6, 2009.
