Keywords:

  • Canine;
  • Infarction;
  • Meningoencephalitis;
  • Tumor

Abstract

Background

The reliability and validity of magnetic resonance imaging (MRI) for detecting neoplastic, inflammatory, and cerebrovascular brain lesions in dogs are unknown.

Objectives

To estimate sensitivity, specificity, and inter-rater agreement of MRI for classifying histologically confirmed neoplastic, inflammatory, and cerebrovascular brain disease in dogs.

Animals

One hundred and twenty-one client-owned dogs diagnosed with brain disease (n = 77) or idiopathic epilepsy (n = 44).

Methods

Retrospective, multi-institutional case series; 3 investigators analyzed MR images for the presence of a brain lesion with and without knowledge of case clinical data. Investigators recorded most likely etiologic category (neoplastic, inflammatory, cerebrovascular) and most likely specific disease for all brain lesions. Sensitivity, specificity, and inter-rater agreement were calculated to estimate diagnostic performance.

Results

MRI was 94.4% sensitive (95% confidence interval [CI] = 88.7, 97.4) and 95.5% specific (95% CI = 89.9, 98.1) for detecting a brain lesion with similarly high performance for classifying neoplastic and inflammatory disease, but was only 38.9% sensitive for classifying cerebrovascular disease (95% CI = 16.1, 67.0). In general, high specificity but not sensitivity was retained for MR diagnosis of specific brain diseases. Inter-rater agreement was very good for overall detection of structural brain lesions (κ = 0.895, 95% CI = 0.792, 0.998, P < .001) and neoplastic lesions, but was only fair for cerebrovascular lesions (κ = 0.299, 95% CI = 0, 0.761, P = .21).

Conclusions and Clinical Importance

MRI is sensitive and specific for identifying brain lesions and classifying disease as inflammatory or neoplastic in dogs. Cerebrovascular disease in general and specific inflammatory, neoplastic, and cerebrovascular brain diseases were frequently misclassified.

Abbreviations

CI: confidence interval
CSF: cerebrospinal fluid
FLAIR: fluid attenuated inversion recovery
GME: granulomatous meningoencephalitis
MR(I): magnetic resonance (imaging)
NE: necrotizing encephalitis
NME: necrotizing meningoencephalitis
T1W: T1-weighted
T2W: T2-weighted
TE: echo time
TR: repetition time

Magnetic resonance (MR) imaging has become widely accepted as the best means to noninvasively evaluate nervous system structures, as it provides outstanding soft tissue contrast and resolution.[1] Over the past 2 decades, numerous investigations have been performed to define the MR imaging (MRI) features of various neoplastic, inflammatory, and cerebrovascular brain diseases in veterinary patients.[2-12] Data from these reports are often used by clinicians to make presumptive diagnoses and effect treatment strategies.[13]

Several studies have demonstrated significant overlap in MR signal characteristics and lesion morphology between divergent intracranial etiologies in dogs. In a recent study using conventional, high-field MR to compare gliomas and cerebral infarcts in dogs, as many as 12% of histologically confirmed gliomas were incorrectly classified as infarcts.[14] In a population of 41 dogs with histologically confirmed intracranial neoplasia, MR was approximately 90% sensitive for detecting lesions.[4] In the same report, MR was only 70% sensitive in determining tumor type in dogs with primary brain neoplasia.

Currently, there is insufficient veterinary research to estimate the reliability of MR in the diagnosis of brain disorders of dogs. In one case series of dogs with necrotizing meningoencephalitis (NME),[8] substantial inter-rater agreement was identified in certain aspects of MR lesion detection; however, the study population was small. In a recent abstract, there was strong inter-rater agreement with respect to MR diagnosis in 44 dogs with various brain diseases.1 Although these findings are encouraging and supported by results of human studies, veterinary reports have included small populations and assessed agreement using heterogeneous methodology.

The aims of this study were (1) to estimate sensitivity and specificity of routine, high-field MR to broadly group canine brain diseases as neoplastic, inflammatory, or cerebrovascular; (2) to estimate sensitivity and specificity of MR to diagnose a subset of specific diseases within broad etiologic categories; (3) to investigate the effect of clinical data on the sensitivity and specificity of brain MR in dogs; and (4) to calculate inter-rater agreement for classification of brain disease in dogs. We hypothesized that sensitivity and specificity of brain MR as well as inter-rater agreement would be high for classifying diseases into general etiologic categories. Given limited veterinary data as well as information from human studies,[15] we hypothesized there would be poorer sensitivity for predicting specific diseases, but that the availability of clinical data would enhance diagnostic performance.

Materials and Methods

Case Selection

This retrospective case series was a multi-institutional collaboration. Cases were obtained by review of medical records from 2005 to 2011 at 3 veterinary medical teaching hospitals: Texas A&M University (TAMU), the University of Georgia (UGA), and Washington State University (WSU). Dogs admitted at these institutions with neurologic signs consistent with intracranial disease were included if the following criteria were met: (1) antemortem brain MRI available for review and either, (2) histologic diagnosis of inflammatory (immune-mediated, infectious, or unknown etiology), neoplastic (primary or secondary), or cerebrovascular (ischemic or hemorrhagic) brain disease obtained by either biopsy or necropsy, or (3) clinical diagnosis of idiopathic epilepsy based on age at seizure onset (within 1–7 years of age), history of recurrent seizures (≥2 events separated by at least 1 week), lack of interictal neurologic deficits, normal brain MRI and cerebrospinal fluid (CSF) analysis, and unremarkable complete blood count and serum biochemistry.

Clinical Data Extraction

Standard clinical data obtained from medical records of all dogs included admitting university, age (in years) at admission, body weight, breed, sex, clinical course of neurologic disease (progressive, nonprogressive, spontaneous improvement), and number of days between the onset of intracranial neurologic signs and brain MR acquisition. Neurologic disease was defined as progressive if the dog's clinical signs worsened between onset and MR, nonprogressive if clinical signs remained static between onset and MR, or showing spontaneous improvement if clinical signs abated or appeared to have resolved fully between onset and MR. All data were entered into a standardized database using commercial spreadsheet software.2
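As a minimal sketch of how such a standardized record could be represented, the Python snippet below defines one case entry; the field names and the clinical course categories follow the list above, but the structure itself is illustrative and is not the authors' actual database schema.

    from dataclasses import dataclass
    from enum import Enum

    class ClinicalCourse(Enum):
        PROGRESSIVE = "progressive"
        NONPROGRESSIVE = "nonprogressive"
        SPONTANEOUS_IMPROVEMENT = "spontaneous improvement"

    @dataclass
    class CaseRecord:
        """One row of the standardized clinical database (illustrative field names)."""
        institution: str              # admitting university (TAMU, UGA, or WSU)
        age_years: float              # age at admission
        body_weight_kg: float
        breed: str
        sex: str
        clinical_course: ClinicalCourse
        days_signs_to_mri: float      # days between onset of intracranial signs and MRI

    # Example entry (hypothetical dog)
    example = CaseRecord("TAMU", 9.0, 26.0, "Golden Retriever", "male castrated",
                         ClinicalCourse.PROGRESSIVE, 34.0)
    print(example)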

Histologic Confirmation

Neurohistopathology at each institution was performed by a board-certified pathologist or anatomic pathology resident under their supervision. Tissue sections were stained routinely with hematoxylin and eosin for histologic examination. In a select number of cases with neoplasia, immunohistochemical staining for glial fibrillary acidic protein, vimentin, cytokeratin, and CD34 were used to further characterize the histologic diagnosis. All tumors were typed in accordance with World Health Organization recommendations.[16, 17] For dogs with infectious encephalitis, standard bacteriology, viral culture, polymerase chain reaction testing, or immunohistochemical studies were used to determine etiology. Cases histologically diagnosed with more than one neoplastic, inflammatory, or cerebrovascular brain disease were also included.

Image Analysis

Brain MR studies were individually assessed for specific requirements to ensure standardization during image analysis. MR study requirements included the following: (1) images available for review in digital imaging and communications in medicine format, (2) minimum MRI field strength of 1.0T, (3) transverse and sagittal plane of spin echo or fast spin echo T2-weighted (T2W) images, (4) transverse plane of T2 fluid attenuated inversion recovery (T2-FLAIR) images, and (5) transverse plane of precontrast and postcontrast spin echo T1-weighted (T1W) images. A 1.0T magnet3 was utilized for all TAMU cases (T2W: repetition time [TR] 2,470–4,385 ms, echo time [TE] 45–99 ms, slice thickness 2–5 mm; T1W: TR 350–850 ms, TE 10–15 ms, slice thickness 2–5 mm; T2-FLAIR: TR 7,500–9,000 ms, TE 119 ms, slice thickness 3–5 mm). A 1.0T magnet4 was utilized for all WSU cases (T2W: TR 1,945–3,900 ms, TE 120 ms, slice thickness 3–3.5 mm; T1W: TR 400–1,424 ms, TE 14–20 ms, slice thickness 3–6 mm; T2-FLAIR: TR 11,000 ms, TE 140 ms, slice thickness 3–4 mm). A 3.0T magnet5 was utilized for all UGA cases (T2W: TR 3,000–4,000 ms, TE 100–116 ms, slice thickness 2–3 mm; T1W: TR 266–950 ms, TE 10–17 ms, slice thickness 3 mm; T2-FLAIR: TR 9,502 ms, TE 120–128 ms, slice thickness 3 mm). Postcontrast sequences were acquired after IV administration of gadolinium-based contrast agent.6 To evaluate contrast enhancement on MR studies acquired with the 3.0T magnet at UGA, transverse precontrast and postcontrast T1-FLAIR sequences (TR 2,470–2,900 ms, TE 8–10 ms, slice thickness 3 mm) were substituted in place of spin echo T1W images. In dogs with serial MR studies, the MR performed nearest the time of histopathologic evaluation was utilized for image analysis. Any additional image planes or sequences were excluded from analysis. Before the MR analysis, 1 investigator (C.W.) anonymized all images by removing case identifiers using imaging software7 and assigned sequential case numbers to each study in a randomized fashion using random card selection.

Two board-certified radiologists (B.Y., S.H.) and 1 board-certified neurologist (J.L.) independently reviewed and analyzed MR images using digital imaging software.8 These investigators were not involved in case selection, review of medical records or MRI studies, or medical record abstraction. The 3 reviewers were asked first to record the presence or absence of an intracranial lesion, defined as an abnormality in brain morphology or tissue signal characteristics. Then, they were asked to evaluate lesions for previously reported pathologic brain MRI characteristics, including lesion topography, signal patterns, shape, number, invasiveness, and association with features such as mass effect, parenchymal or meningeal enhancement,[18] brain herniation, dural or ventricular contact, presence of a dural tail sign,[19] and bone changes.[20] Reviewers prioritized the most likely disease category by following published imaging criteria to aid differentiation of inflammatory, neoplastic, and cerebrovascular brain diseases of dogs. Reviewers were then asked to specify the most likely brain disease represented by the MR abnormalities. Reviewers were permitted to diagnose inflammatory lesions as granulomatous meningoencephalitis (GME),[21] necrotizing encephalitis (NE),[8] bacterial encephalitis,[22] canine distemper virus encephalitis,[23] Neospora caninum,[24] fungal encephalitis,[25, 26] or other/unknown. Neoplastic lesions could be classified by reviewers as meningioma,[5] glioma (oligodendroglial, astrocytic, or mixed-glial origin),[7] choroid plexus tumor (papilloma or carcinoma),[27, 28] ependymoma,[29] histiocytic sarcoma,[30] lymphoma,[4] hemangiosarcoma,[31] pituitary tumor,[3] metastatic neoplasia,[4, 20] or other/unknown. Lastly, cerebrovascular lesions could be classified as ischemic, hemorrhagic, or other/unknown.[9-11]

After evaluating all images, reviewers sent a copy of their completed responses to 1 investigator (C.W.). Reviewers were subsequently given the clinical data obtained from medical records and asked if they wanted to modify their initial response to the following: (1) normal versus abnormal MRI study, (2) most likely etiologic category (neoplasia, inflammatory, or cerebrovascular), and (3) most likely specific diagnosis. Reviewers recorded their new response if they elected to modify their initial response based on the available clinical data.

Statistical Analysis

All clinical and imaging data were entered into a spreadsheet program.2 Clinical data were summarized using frequencies and descriptive statistics. Sensitivity of detecting a brain lesion was estimated as the proportion of nonepilepsy cases correctly identified as having a true lesion. Specificity of detecting a lesion was estimated as the proportion of epilepsy cases correctly identified as not having a lesion. Category-specific sensitivity was estimated as the proportion of histologically confirmed cases within each category correctly identified as having that lesion type. Specificity was estimated as the proportion of cases within the other categories (excluding epilepsy cases) correctly identified as not having that lesion type. For example, neoplasia-specific sensitivity was estimated as the proportion of cases with confirmed neoplasia correctly identified as having a neoplastic lesion. Specificity was estimated as the proportion of cases confirmed as not having a neoplastic lesion that were correctly identified. Diagnosis-specific measures of MR performance were calculated for conditions in which more than a single case was identified. Sensitivity was estimated as the proportion of cases correctly identified and specificity was estimated as the proportion of other etiologies within the same broad diagnostic category correctly identified as not having that disease. The design effect[32] was estimated to account for the dependency among repeated observations and used to adjust confidence intervals (CIs). Performance measures were compared with and without clinical data using McNemar's test while accounting for the repeated observations on the same dog.[33] Inter-rater agreement was estimated by calculating the kappa statistic with its associated P-value and CI using standard formulas.[34] Strength of agreement was determined based on the following kappa values: ≤0.20 poor agreement, 0.21–0.40 fair agreement, 0.41–0.60 moderate agreement, 0.61–0.80 good agreement, and 0.81–1.00 very good agreement.[35] Statistical analysis was interpreted at the 5% level of significance and performed by manually entering formulae into the spreadsheet program. Ninety-five percent CIs for estimates of sensitivity and specificity were calculated using available software.9
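To make these calculations explicit, the short sketch below (a minimal Python illustration, not the authors' spreadsheet formulas) computes a sensitivity estimate with a Wilson score 95% CI and applies a simple design-effect adjustment for the dependence created by 3 reviewers reading the same dogs. The counts (218 abnormal calls out of 231 readings) and the design-effect value of 1.8 are assumptions chosen only to mirror the magnitude of the reported 94.4% sensitivity.

    import math

    def wilson_ci(successes, n, z=1.96):
        """Wilson score 95% CI for a binomial proportion such as sensitivity."""
        p = successes / n
        denom = 1 + z ** 2 / n
        centre = (p + z ** 2 / (2 * n)) / denom
        half_width = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
        return centre - half_width, centre + half_width

    def design_adjusted_ci(successes, n, deff):
        """Shrink the effective sample size by the design effect to account for
        repeated readings of the same dog by 3 reviewers (deff is assumed here)."""
        n_eff = max(1, round(n / deff))
        successes_eff = round(successes * n_eff / n)
        return wilson_ci(successes_eff, n_eff)

    # Assumed counts: 77 affected dogs x 3 reviewers = 231 readings, 218 called abnormal,
    # which reproduces the reported 94.4% sensitivity; deff = 1.8 is purely illustrative.
    sensitivity = 218 / 231
    low, high = design_adjusted_ci(218, 231, deff=1.8)
    print(f"Sensitivity {sensitivity:.1%} (95% CI {low:.1%}, {high:.1%})")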

Results

Study Population

The medical records search performed at each of the 3 collaborating veterinary medical teaching hospitals identified 176 cases. Fifty-five cases were excluded because of a deficient medical record or incomplete MR study, resulting in 121 dogs that met the inclusion criteria (27 from WSU, 47 from UGA, 47 from TAMU). Thirty-six percent (44/121) were clinically diagnosed with idiopathic epilepsy. Median age and body weight for dogs with idiopathic epilepsy were 4 years (range, 0.3–13 years) and 25 kg (range, 3–53 kg). There were 23 male castrated, 9 male intact, 10 spayed female, and 2 female intact dogs. Breeds represented included Labrador Retriever (n = 5), mixed breed (n = 5), Bulldog (n = 3), and 26 other breeds with ≤ 2 dogs each. Twenty-five dogs were described as having a progressive and 19 were described as having a nonprogressive clinical course. Median duration between the onset of clinical signs and MR was 90 days (range, 0.5–1,800 days). Two dogs diagnosed with idiopathic epilepsy were euthanized for unknown reasons following the MR scan, and no evidence of gross or histologic brain disease was identified.

Dogs with histopathologically confirmed brain disease comprised 64% (77/121) of the study population. Sixteen percent (12/77) of these cases were diagnosed antemortem using a brain biopsy specimen. Diagnoses were obtained at necropsy in the remaining 84% (65/77) of cases. In total, there were 53 cases of brain neoplasia, 18 cases of inflammatory brain disease, and 6 cases of cerebrovascular brain disease. Median age and body weight were 9 years (range, 3–14 years) and 26 kg (range, 3–46 kg) for the neoplasia group, 5 years (range, 1–10 years) and 9.5 kg (range, 2–60 kg) for the inflammatory group, and 11 years (range, 1–13 years) and 26.5 kg (range, 3–41 kg) for the cerebrovascular group. In the neoplasia group, there were 27 male castrated, 2 male intact, 23 female spayed, and 1 female intact dogs. Within the inflammatory group, there were 8 male castrated, 2 male intact, 5 female spayed, and 3 female intact dogs. The cerebrovascular group consisted of 2 male castrated, 2 male intact, 1 female spayed, and 1 female intact dogs. Breeds represented among the groups with histologically confirmed brain disease included Golden Retriever (n = 10), Boxer (n = 7), mixed breed (n = 7), Boston Terrier (n = 6), Labrador Retriever (n = 6), American Staffordshire Terrier (n = 3), Australian Shepherd (n = 3), Bulldog (n = 3), Pug (n = 3), and 24 other breeds with ≤ 2 dogs each. A progressive clinical course was recognized in 34 dogs with neoplasia, 15 with inflammatory disease, and 5 with cerebrovascular disease. Nonprogressive signs were identified in 17 dogs with neoplasia, 3 with inflammatory disease, and 1 with a cerebrovascular etiology. Clinical improvement was reported in 3 dogs with neoplasia before MRI, but none in the other groups. Median duration between the onset of clinical signs and MR was 34 days (range, 1–280 days) for neoplastic disease, 5.5 days (range, 0.5–90 days) for inflammatory disease, and 3.5 days (range, 1–14 days) for cerebrovascular disease.

Histopathologic diagnoses in the neoplasia group included meningioma (n = 19), glioma (oligodendroglioma: n = 11; astrocytoma: n = 2; unspecified: n = 2; mixed: n = 1), pituitary adenocarcinoma/carcinoma (n = 4), choroid plexus carcinoma (n = 3), invasive nasal adenocarcinoma (n = 2), lymphoma (n = 2), nerve sheath tumor (ganglioneuroma: n = 1; perineurioma: n = 1), ependymoma (n = 1), hemangiosarcoma (n = 1), medulloblastoma (n = 1), metastatic apocrine gland anal sac adenocarcinoma (n = 1), and multilobulated bone tumor (n = 1). Histopathologic diagnoses in the inflammatory group included GME (n = 8), NE (NME: n = 4; necrotizing leukoencephalitis: n = 1), meningoencephalitis of unknown etiology (n = 3), and 2 cases of infectious meningoencephalitis (fungal: n = 1; bacterial: n = 1). Histopathologic diagnoses in the cerebrovascular disease group included hemorrhagic infarct (n = 3) and ischemic infarct (n = 3). All but 3 cases were diagnosed with a single brain disease. An incidental pituitary adenoma was discovered in 1 dog with meningioma and 1 with an unspecified glioma. The 3rd dog, diagnosed with primary nasal adenocarcinoma, developed secondary bacterial meningoencephalitis associated with tumor invasion of the cribriform plate.

MR Detection of Broad Etiologic Categories

Reviewers had a sensitivity of 94.4% (95% CI = 88.7, 97.4) and specificity of 95.5% (95% CI = 89.9, 98.1) for differentiating MR images of dogs with histologically confirmed intracranial disease from those of dogs with idiopathic epilepsy (Table 1). Providing clinical data did not significantly affect sensitivity (P = .25) or specificity (P = 1.0). Without the availability of clinical data, the sensitivity and specificity were, respectively, 87.4 and 91.7% for classifying brain diseases as neoplastic and 86.0 and 93.1% for classifying brain diseases as inflammatory. Cerebrovascular diseases were detected with a sensitivity of 38.9% and a specificity of 97.7% without medical record data. Without provision of clinical data, 39 of 231 total imaging diagnoses reported by the 3 reviewers for dogs with histologically confirmed brain disease were false negative misclassifications. Specifically, the incorrect responses according to disease category included no abnormalities (n = 10), unknown (n = 6), bacterial infection (n = 2), and GME (n = 2) among the neoplastic group; ischemic infarct (n = 4), nasal adenocarcinoma (n = 2), histiocytic sarcoma (n = 1), and unknown (n = 1) among the inflammatory group; and glioma (n = 5), no abnormalities (n = 3), fungal infection (n = 2), and unknown (n = 1) among the cerebrovascular group.

Table 1. Comparison of the sensitivity and specificity of routine, high-field MRI in overall lesion detection and the categorical differentiation of neoplastic, inflammatory, and cerebrovascular etiologies in dogs with histologically confirmed brain disease with and without reviewer knowledge of clinical data.

Disease Category        Test Property  No. of Dogs  Without Clinical Data, % (95% CI)  With Clinical Data, % (95% CI)  P-Value
All lesions             SEN            77           94.4 (88.7, 97.4)                  95.7 (90.6, 98.2)               .25
All lesions             SP             44           95.5 (89.9, 98.1)                  96.2 (90.9, 98.6)               1.0
Neoplastic lesion       SEN            53           87.4 (78.5, 93.1)                  90.6 (82.2, 95.4)               .074
Neoplastic lesion       SP             24           91.7 (81.6, 96.7)                  81.9 (70.5, 89.8)               .023
Inflammatory lesion     SEN            19           86.0 (70.1, 94.5)                  80.7 (67.7, 89.5)               .37
Inflammatory lesion     SP             58           93.1 (87.0, 96.6)                  95.4 (90.4, 98.0)               .29
Cerebrovascular lesion  SEN            6            38.9 (16.1, 67.0)                  38.9 (18.3, 63.9)               .48
Cerebrovascular lesion  SP             71           97.7 (93.8, 99.2)                  98.1 (94.9, 99.4)               1.0

CI, confidence interval; SEN, sensitivity; SP, specificity; MRI, magnetic resonance imaging.

The specificity for detecting neoplastic lesions was significantly lower when medical record information was provided (81.9%, 95% CI = 70.5, 89.8) compared with assessment blinded to case details (91.7%, 95% CI = 81.6, 96.7) (P = .023). The provision of clinical data did not significantly change any other determinations of sensitivity or specificity within disease categories, nor did it dramatically change the proportion (37/231) or type of reviewer misclassification.

MR Detection of Specific Etiologies

The estimated sensitivity and specificity for detecting specific diseases varied greatly (Table 2). In general, the specificity of MR testing was consistently high across all tumor types examined, ranging from 93.7% (95% CI = 86.0, 97.2) for glioma to 99.3% (95% CI = 95.8, 100) for nerve sheath tumor. Sensitivity, however, was consistently much lower for all tumor types. The highest sensitivity estimates were associated with glioma at 84.4% (95% CI = 66.4, 94.2) and pituitary tumor at 83.3% (95% CI = 50.9, 97.1). All other neoplastic diseases had sensitivity for detection in the 50–70% range, except lymphoma with a sensitivity estimate of 0% (95% CI = 0, 48.3). Qualitatively, the effects of clinical data on sensitivity and specificity of MR tumor typing appeared small (Table 2). MR diagnostic performance among the inflammatory and cerebrovascular brain diseases showed similar trends to the neoplastic diseases in sensitivity and specificity estimates when clinical data were unknown. Knowledge of clinical data failed to result in a significant difference in MR performance measures when diagnosing specific neoplastic, inflammatory, and cerebrovascular brain diseases in dogs.

Table 2. Comparison of the sensitivity and specificity of routine, high-field MRI in the diagnosis of various histologically confirmed neoplastic, inflammatory, and cerebrovascular brain diseases in dogs based on the neuroimaging diagnoses of reviewers with and without knowledge of clinical data.

Disease Category  Confirmed Diagnosis  Test Property  No. of Dogs  Without Clinical Data, % (95% CI)  With Clinical Data, % (95% CI)
Neoplasia         Meningioma           SEN            19           59.6 (42.6, 74.8)                  64.9 (47.3, 79.4)
                  Meningioma           SP             33           94.9 (88.1, 98.1)                  93.9 (86.8, 97.5)
                  Glioma(a)            SEN            15           84.4 (66.4, 94.2)                  91.1 (77.9, 97.1)
                  Glioma(a)            SP             37           93.7 (86.9, 97.2)                  93.7 (86.0, 97.5)
                  CPT                  SEN            3            66.7 (30.9, 91.0)                  66.7 (30.9, 91.0)
                  CPT                  SP             49           93.9 (85.9, 97.7)                  94.6 (86.4, 98.2)
                  Pituitary tumor(b)   SEN            4            83.3 (50.9, 97.1)                  83.3 (50.9, 97.1)
                  Pituitary tumor(b)   SP             48           97.2 (90.4, 99.4)                  97.2 (90.4, 99.4)
                  Lymphoma             SEN            2            0 (0, 48.3)                        0 (0, 48.3)
                  Lymphoma             SP             50           98.7 (93.7, 99.8)                  98.7 (94.8, 99.8)
                  Nasal ACA            SEN            2            66.7 (24.1, 94.0)                  66.7 (24.1, 94.0)
                  Nasal ACA            SP             50           98.7 (93.7, 99.8)                  98.7 (93.7, 99.8)
                  NST                  SEN            2            50.0 (1.00, 99.0)                  50.0 (1.00, 99.0)
                  NST                  SP             50           99.3 (95.8, 100)                   99.3 (95.8, 100)
Inflammatory      GME                  SEN            8            50.0 (29.6, 70.4)                  50.0 (29.6, 70.4)
                  GME                  SP             11           87.9 (70.9, 96.0)                  84.8 (67.3, 94.3)
                  NE(c)                SEN            5            53.3 (21.6, 82.9)                  66.7 (32.6, 90.1)
                  NE(c)                SP             14           92.9 (78.3, 98.3)                  92.9 (79.4, 98.1)
Cerebrovascular   Hemo. infarct        SEN            3            44.4 (7.6, 87.5)                   33.3 (9.00, 69.1)
                  Hemo. infarct        SP             3            88.9 (50.7, 99.4)                  88.9 (50.7, 99.4)
                  Isch. infarct        SEN            3            22.2 (1.7, 70.6)                   66.7 (30.9, 91.0)
                  Isch. infarct        SP             3            100 (62.9, 100)                    100 (62.9, 100)

CI, confidence interval; SEN, sensitivity; SP, specificity; MRI, magnetic resonance imaging; CPT, choroid plexus tumor; ACA, adenocarcinoma; NST, nerve sheath tumor; GME, granulomatous meningoencephalitis; NE, necrotizing encephalitis; Hemo., hemorrhagic; Isch., ischemic.

(a) Includes oligodendroglioma, astrocytoma, and unspecified glioma subtypes.
(b) Does not include incidental pituitary tumors diagnosed with concurrent brain neoplasia.
(c) Includes necrotizing meningoencephalitis and necrotizing leukoencephalitis.

Without provision of clinical data, meningiomas were misclassified in 23 of 57 total responses. Incorrect responses included various other primary and secondary brain neoplasms (n = 10), no abnormalities (n = 7), unknown (n = 5), and bacterial infection (n = 1). Gliomas were misclassified in 7 of 45 reviewer responses given without knowledge of clinical data. Glioma misclassifications included unknown (n = 3), meningioma (n = 1), ependymoma (n = 1), GME (n = 1), and bacterial infection (n = 1). Errors in the classification of inflammatory and cerebrovascular brain disease occurred more frequently than for brain neoplasms. Though GME and NE were often correctly reported to be inflammatory in nature, reviewers consistently failed to identify the specific underlying etiology. When clinical data were unknown by reviewers, 12 of 24 responses for GME cases were incorrect (4 ischemic infarct, 2 unknown, 2 protozoal infection, 2 bacterial infection, 2 NME), as were 7 of 15 responses for NE cases (3 unknown, 3 GME, 1 fungal infection). In contrast, cerebrovascular disease, including both ischemic and hemorrhagic infarcts, was often misclassified as an inflammatory or neoplastic brain disease by reviewers. Hemorrhagic infarcts were misclassified in 5 of 9 responses (3 glioma, 2 fungal infection) and ischemic infarcts in 7 of 9 responses (3 no abnormalities, 2 glioma, 1 hemorrhagic infarct, 1 unknown) by reviewers without knowledge of clinical data.

Inter-Rater Agreement

Inter-rater agreements, both with and without clinical data, were good to very good for overall detection of structural brain lesions (without clinical data: κ = 0.895, 95% CI = 0.792, 0.998, P < .001; with clinical data: κ = 0.906, 95% CI = 0.803, 1.0, P < .001) and for lesions classified as neoplastic (without clinical data: κ = 0.771, 95% CI = 0.616, 0.927, P < .001; with clinical data: κ = 0.779, 95% CI = 0.624, 0.935, P < .001) (Table 3). When clinical data were unknown by reviewers, inter-rater agreement was moderate for inflammatory lesions (κ = 0.564, 95% CI = 0.304, 0.823, P < .001) and fair for cerebrovascular lesions (κ = 0.299, 95% CI = 0, 0.761, P = .21). The provision of clinical data reduced inter-rater agreement for inflammatory lesions (κ = 0.211, 95% CI = 0, 0.471, P = .11) but did not substantially change agreement for other categories.

Table 3. Comparison of inter-rater agreement among 3 reviewers, with and without knowledge of clinical data, in the detection and etiologic classification of brain lesions on routine, high-field MRI in a population of 121 dogs with histologically confirmed brain disease or idiopathic epilepsy.

Reader Assessment       No. of Dogs (Replicates)  Kappa without Clinical Data (95% CI)  P-Value  Kappa with Clinical Data (95% CI)  P-Value
Any brain lesion        121 (363)                 0.895 (0.792, 0.998)                  <.001    0.906 (0.803, 1.0)                 <.001
Neoplastic lesion       53 (159)                  0.771 (0.616, 0.927)                  <.001    0.779 (0.624, 0.935)               <.001
Inflammatory lesion     19 (57)                   0.564 (0.304, 0.823)                  <.001    0.211 (0, 0.471)                   .110
Cerebrovascular lesion  6 (18)                    0.299 (0, 0.761)                      .205     0.065 (0, 0.527)                   .783

CI, confidence interval; MRI, magnetic resonance imaging.
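
The study calculated kappa using standard formulas;[34] with 3 reviewers rating every dog, a Fleiss-style multi-rater kappa is one common choice, and the sketch below (a minimal illustration on made-up ratings, not the authors' implementation) shows that calculation together with the strength-of-agreement bands used in this study.[35]

    from collections import Counter

    def fleiss_kappa(ratings):
        """Fleiss' kappa; ratings[i] lists the category each reviewer assigned to dog i."""
        n_raters = len(ratings[0])
        categories = sorted({c for row in ratings for c in row})
        # counts[i][j] = number of reviewers assigning category j to dog i
        counts = [[Counter(row)[c] for c in categories] for row in ratings]
        n_subjects = len(counts)
        p_j = [sum(row[j] for row in counts) / (n_subjects * n_raters)
               for j in range(len(categories))]
        p_i = [(sum(x * x for x in row) - n_raters) / (n_raters * (n_raters - 1))
               for row in counts]
        p_bar = sum(p_i) / n_subjects
        p_e = sum(p * p for p in p_j)
        return (p_bar - p_e) / (1 - p_e)  # undefined if every call is the same single category

    def interpret(kappa):
        """Strength-of-agreement bands used in this study."""
        bands = [(0.20, "poor"), (0.40, "fair"), (0.60, "moderate"),
                 (0.80, "good"), (1.00, "very good")]
        return next(label for cutoff, label in bands if kappa <= cutoff)

    # Hypothetical calls by 3 reviewers on 5 dogs (A = abnormal, N = normal)
    calls = [["A", "A", "A"], ["A", "A", "N"], ["N", "N", "N"],
             ["A", "A", "A"], ["N", "N", "N"]]
    k = fleiss_kappa(calls)
    print(f"kappa = {k:.3f} ({interpret(k)} agreement)")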

Discussion

The present study showed that routine, high-field brain MR was a highly sensitive and specific test with very good inter-rater agreement for overall detection of brain lesions, particularly in the detection of neoplastic brain disease in dogs. However, when classifying brain lesions into etiologic categories of inflammatory and cerebrovascular disease, the sensitivity of MR and inter-rater agreement both appeared to decrease. These findings are in agreement with 1 study that reported significant variability in MR characteristics of nonneoplastic disease, yet identified 7 distinct MR signs that were significantly associated with neoplastic brain disease.[20] In another study comparing MRI findings of neoplastic and nonneoplastic canine brain disease, as many as 47% of presumed cerebrovascular accidents were misdiagnosed as gliomas by reviewers who retrospectively reviewed MR images without knowledge of basic case information such as signalment and clinical history.[14] Those authors postulated that failure to provide clinical data may have affected the reviewers' interpretations and implied that reviewers might otherwise have made fewer misdiagnoses.

The study reported here helped clarify the potential impact of clinical data on reviewers' MR interpretations with the finding that provision of clinical data did not appear to significantly improve the sensitivity or specificity of MR in the detection or etiologic classification of canine brain disease. In some instances, there was an apparent association between reviewer knowledge of clinical data and a greater proportion of false positive and inconsistent responses among the reviewers. For example, specificity for the detection of neoplastic disease with knowledge of clinical data was significantly lower than specificity blinded to clinical data (P = .023). Additionally, the provision of clinical data reduced inter-rater agreement for the identification of inflammatory disease from 0.564 (95% CI = 0.304, 0.823, P < .001) to 0.211 (95% CI = 0, 0.471, P = .11). Clinical data have previously been associated with reviewer bias in MR interpretations, including false negative diagnoses of glioma[4] and histiocytic sarcoma[30] based on inflammatory CSF results in 2 dogs with ambiguous MR lesions. Although the reasons for this effect remain unclear, it seems that clinical data can confound reviewer assessments by introducing information that changes presumptions about underlying etiology. For example, rapid clinical onset is probably more common with cerebrovascular disease than brain neoplasia, but can be seen in both disease processes.

We also estimated the diagnostic performance of MR in the etiologic subclassification of several types of neoplastic, inflammatory, and cerebrovascular brain diseases in dogs. The specificity estimates were consistent with those of the broader etiologic categories and remained relatively high for diagnosing various neoplastic, inflammatory, and cerebrovascular brain diseases. In contrast, the sensitivity estimates for diagnosing specific types of canine brain disease varied greatly both within and among the 3 etiologic categories and revealed that certain diseases in each category were associated with a relatively large proportion of false negative responses by reviewers. Although the sensitivity of MR for detecting neoplastic brain lesions approached 90% in this study, and sensitivity for specific tumor types was highest for glioma at approximately 90%, brain lymphoma was associated with the lowest sensitivity (0%). Although the use of MR to broadly differentiate brain lesions of different etiologies, such as neoplastic versus nonneoplastic disease, can aid clinical decision making, clinicians should recognize the limitations of MR even in diagnosing common brain diseases in dogs. In this study, MR was only 59.6% (95% CI = 42.6, 74.8) sensitive for detecting meningioma and 50.0% (95% CI = 29.6, 70.4) sensitive for detecting GME.

Assessment of inter-rater agreement is essential in determining the reliability of a diagnostic test. If results are not repeatable between reviewers, it is challenging to interpret results even if sensitivity and specificity are high. In this study population, inter-rater agreement was very good (κ = 0.895; 95% CI = 0.792, 0.998) for the detection of any brain lesion. This result closely parallels agreement data in a study that utilized 5 radiologists to detect MR lesions in dogs with histologically confirmed brain disease (0.67 ≤ κ < 0.86).1 In this report, inter-rater agreement for the detection of neoplastic and inflammatory diseases was significant, but kappa values ranged from 0.211 to 0.779 depending on the provision of clinical data. For cerebrovascular brain lesions, inter-rater agreement was not significant, but the sample size was small (n = 6) and the 95% CI was wide (0, 0.761). It is uncertain whether cerebrovascular brain diseases are less reliably assessed on standard, high-field MR than those that are neoplastic or inflammatory, especially considering the overlap in the 95% CIs for the kappa values.

A quality assessment tool for diagnostic accuracy studies has been developed for use in human medical research.[36] This system evaluates reports for methodological weaknesses such as reviewer bias, lack of a true reference standard, unavailability of clinical information, incorporation bias, and use of an inappropriate patient spectrum. We utilized reviewer blinding, dogs without lesions (epilepsy cases), and dogs with histologically confirmed brain disease to reduce potential limitations.

The importance of judging the validity of a diagnostic test against the gold standard cannot be overstated. Unfortunately 2 limitations inherent to using histopathologic diagnoses obtained on necropsy or biopsy following abnormal MRI include selection bias and use of an inappropriate patient spectrum.[37] Selection bias can occur when the gold standard diagnosis is not obtained independent of the diagnostic test under evaluation. Cases that have undergone surgical biopsy or necropsy to satisfy the histopathologic requirement could represent an inappropriate patient spectrum if their inclusion falsely increases the incidence of abnormal MR images and selects for cases affected with more severe forms of disease. Estimated specificity for lesion detection could have been inflated in the present study because the diagnosis of epilepsy required lack of a structural lesion on MR examination. Another aspect of this study methodology that might be susceptible to selection bias and use of an inappropriate patient spectrum was the evaluation of MR for overall and broad etiologic lesion detection, which raises the possibility of inflated estimates of sensitivity and specificity. Other measures of performance in this study, however, should not be impacted by these elements. In retrospective veterinary and human MR studies,[38] it is inherently challenging to avoid selection bias and use of an inappropriate patient spectrum because of the need to select cases in which MR is commonly used before obtaining histopathologic confirmation. Still, our results recapitulate the approximate 90% sensitivity for MR diagnosis of neoplastic brain disease reported in a recent study of 40 dogs with histologically confirmed brain tumors.[4]

Avoidance of reviewer bias by blinding is critical for the assessment of diagnostic performance. Previous studies evaluating the diagnostic performance of MR in dogs with intracranial disease are sometimes ambiguous in describing their methodology, thus making it difficult to assess the effects of reviewer bias. It might be expected, however, that the greater the reviewer's knowledge about what they are assessing on MR, the greater the potential for reviewer bias to affect reported estimates of diagnostic performance. For example, reviewers in Thomas et al[6] were aware that all cases had primary brain tumors. This test review bias could be the reason for the likely inflated 100% sensitivity reported by the authors, making it difficult to interpret the clinical relevance of estimates of MR diagnostic performance in dogs with neoplastic brain disease. Other studies have limited the potential for reviewer bias by including a broad spectrum of intracranial disease etiologies.1 Reviewers in the present study were aware of the inclusion of dogs with epilepsy in addition to dogs with either neoplastic, inflammatory, or cerebrovascular brain disease. We attempted to mitigate the potential for test review bias to alter the estimation of MR diagnostic performance by using a heterogeneous group of intracranial diseases, anonymizing and randomly ordering cases, and selecting reviewers from multiple institutions.

In summary, routine, high-field brain MR is a relatively sensitive and specific test with very good inter-rater agreement for overall, neoplastic, and inflammatory brain lesion detection in dogs presented with intracranial disease. The relative decreases in the sensitivity of MR and in inter-rater agreement for detecting cerebrovascular brain lesions are concerning and may indicate the need to perform additional sequences, such as diffusion-weighted imaging with apparent diffusion coefficient maps, to enhance reviewer detection of these lesions.[14] Within the inflammatory and neoplastic disease groups, the sensitivity and specificity for identifying specific etiologies were variable. This finding highlights the need to obtain biopsy samples to definitively determine tumor type or identify certain inflammatory disease processes. Our findings regarding the effects of clinical knowledge on reviewers' interpretations of brain MR are concerning and may indicate that initial MR interpretations should be performed blinded to clinical information. The authors postulate that although clinical data may further support a reviewer's interpretation when the level of confidence in the MRI diagnosis is high, such information may not enhance a reviewer's ability to correctly diagnose MR lesions when the reviewer is relatively uncertain about the imaging diagnosis. Finally, given the challenges inherent to the study of brain disease in dogs, including variable MR protocols, difficulty in obtaining histopathologic diagnoses, and erroneous or incomplete medical record-keeping, the authors invite researchers in private referral hospitals and academic institutions to consider contributing to a mutually accessible national or international multicenter database[15, 39] to better enable evidence-based research in this still-developing area of veterinary research.

Acknowledgments

The authors thank Alisha Selix and Matthew Nobles (Texas A&M University, College Station, TX) for assistance in database entry and image transfer. No grants or other financial support have been provided for this study.

Footnotes
  1. Leclerc MK, D'Anjou MA, Blond L, et al. Interobserver agreement and accuracy in the interpretation of brain magnetic resonance imaging in dogs. In: Research Abstract Program of the 2010 American College of Veterinary Internal Medicine (ACVIM) Forum. J Vet Intern Med 2010;24:660–795.
  2. Excel 2010, Microsoft Corp, Redmond, WA
  3. Magnetom Expert Plus, Siemens Medical USA, Malvern, PA
  4. Gyroscan NT Intera, Philips Medical Systems, Best, the Netherlands
  5. Signa HDx, General Electric Healthcare, Milwaukee, WI
  6. Magnevist, 469.01 mg gadopentetate dimeglumine/mL, 0.1 mmol/kg dose, Bayer Healthcare Pharmaceuticals, Wayne, NJ
  7. ClearCanvas Workstation 2.0 SP1, ClearCanvas Inc, Toronto, Ontario, Canada
  8. eFilm 2.1 Veterinary, MERGE Healthcare, Cleveland, OH
  9. Epi Info, version 6.04, CDC, Atlanta, GA

References
