Keywords:

  • Colorectal cancer
  • audit
  • kappa statistic

Abstract

Aim  Human involvement in the collection and entry of information into a database introduces a degree of error. The aim of this study was to assess the concordance between two individuals, blinded to each other, who independently collected information on the same set of patients and entered it into a colorectal neoplasia database.

Method  A colorectal research nurse and a surgeon independently maintained an electronic database of all new patients admitted with colorectal neoplasia under the surgeon over a 5-year period. Twenty-three key endpoints were selected from the database to assess agreement between the two observers. The κ statistic (for nominal and ordinal data) and the concordance correlation coefficient (for interval data) were used to determine the level of agreement between the two data sets.
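The two agreement measures named above can be sketched briefly. The following is an illustrative Python implementation, not code from the study: Cohen's κ compares observed agreement with the agreement expected by chance from each rater's marginal frequencies, and Lin's concordance correlation coefficient penalizes both poor correlation and systematic shifts between two interval-valued measurements. The example ratings are hypothetical.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical labels."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: proportion of cases on which the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the raters' marginal proportions per category.
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient for two interval-scale series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((xi - mx) ** 2 for xi in x) / n
    vy = sum((yi - my) ** 2 for yi in y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
    # Perfect agreement (y == x) gives 1; location or scale shifts reduce it.
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical example: two observers grading the same ten cases.
a = ["I", "II", "II", "III", "I", "II", "III", "I", "II", "II"]
b = ["I", "II", "III", "III", "I", "II", "II", "I", "II", "II"]
print(round(cohens_kappa(a, b), 2))  # → 0.68
```

A κ of 1 indicates perfect agreement and 0 indicates agreement no better than chance, which is why values such as κ = 0.36 in the results below flag endpoints needing attention.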

Results  Both observers recorded 432 new referrals during this period. Complete concordance between the two databases was achieved only for the number of new patients and for returns to theatre within 30 days. Nonetheless, agreement between the two data sets was almost perfect for a majority of the endpoints. The most important areas of variance were length of stay (κ = 0.78), American Society of Anesthesiologists grade (κ = 0.41), emergency surgery (κ = 0.36), nodal staging (κ = 0.54) and time to recurrence (κ = 0.77).

Conclusion  This study highlights a number of important areas of data inaccuracy in a prospective colorectal database. The inaccuracies were due to observer bias, problems of data interpretation or difficulty in collecting the information accurately.