Keywords:

  • monitors;
  • review environment;
  • veterinary;
  • viewing software;
  • workstation

Abstract


Digital imaging systems are imaging chains in which the diagnostic utility of the whole system is limited by its weakest link. Digital image display has had a major impact on that chain, and radiologist performance can be greatly affected by monitor and software choices. Many factors should be considered when purchasing monitors or an entire workstation, and the rank or importance of the workstation is of primary concern. The workstation can be likened to the studio, or perhaps more appropriately the surgical suite, of the radiologist: it should be comfortable and useful, and it should enable radiologists to practice their trade to the best of their ability.


Image Display


Use of digital imaging modalities in veterinary medicine is increasing, and soft copy (electronic) review of images is becoming more important. Workstation utility, image display, and consequently radiologist performance are affected by hardware, software, and connectivity concerns. This article focuses on image display hardware and software, leaving connectivity for the Digital Imaging and Communications in Medicine (DICOM) and picture archiving and communication system (PACS) segments of this supplemental issue.1,2

Radiologist performance can be judged by two important factors: diagnostic accuracy and productivity. The role of the radiologist is much more active in the soft copy review environment than it has been in analog radiograph review. The use of digital imaging tools at the workstation enables radiologists to enhance their own performance through improved accuracy, increased productivity, or both.

Historically, soft copy review was hampered by lower display quality compared with light boxes and film. More sophisticated computer monitors and tremendous advances in image review software have made this concern nearly obsolete. Although little veterinary-specific information about image display exists, information from human medical imaging can be applied to our field. The American College of Radiology (ACR) has recommendations for soft copy review and teleradiology.3 Current regulations for display hardware and software reflect a desire to maintain a standard of practice that is “as good as” analog film review. Although meeting the ACR recommendations is a minimum, it is important to improve the standard of practice by implementing technological advances as they develop.

Hardware


Recommendations for workstation monitors are based on experiences reported in human medicine.4–15 Monitor quality is evaluated using three parameters: resolution, luminance, and contrast ratio. Monitor recommendations vary based on the importance of the viewing workstation and the modalities displayed. Primary review stations, where radiographs are first reviewed and a diagnosis is made, differ from secondary or tertiary review stations located in examination rooms or surgery suites. Primary review stations should have the best possible monitor quality and be housed in a room with controlled ambient lighting.

The other important consideration is which modalities will be evaluated at a given workstation. The monitor must be of sufficient quality to support evaluation of the modality with the largest image matrix. For example, a monitor dedicated to MRI evaluation could be of lesser quality than a monitor used for diagnostic radiograph interpretation (Fig. 1). Recommendations for monitor quality exist3; however, it is important to realize that advances in monitor and graphics card technology greatly outpace any regulatory body.

Figure 1.  Dedicated MRI review station using two side-by-side color monitors in landscape mode. These are 1900 × 1250 pixel LCD monitors (SyncMaster 243T, Samsung Electronics Co. Ltd., Ridgefield Park, NJ). These monitors are not used for primary review of large pixel matrix images such as digital radiographs.

Active matrix liquid-crystal display (LCD) monitors have largely replaced cathode ray tube (CRT) technology in diagnostic imaging. Historically, CRT and early LCD monitors were too dim, lost brightness over time, and had inadequate resolution to display large pixel matrices. Technical advances, particularly in LCD monitors, addressed these shortcomings, resulting in a wide variety of readily available medical grade monitors. Compared with CRT monitors, LCD monitors have similar or superior performance in diagnostic accuracy and read time.5–7,11,12,16 Liquid crystals are molecules whose physical properties (transparency) vary in the presence of an external electric field. LCD monitors have a backlight and operate by controlling the transparency of each pixel through modification of the electric field around its liquid crystal component. These technologies, and the variety of available monitor quality indices, have created a large cost range across consumer grade and medical grade monitors, making monitor specifications an important point of investigation in purchasing decisions.5

Monochrome, or gray scale, LCD monitors are considered the standard of practice in medical imaging. By comparison, consumer grade monitors are generally color monitors with lower brightness, smaller pixel matrices, and less sophisticated graphics cards, and they are not calibrated to the DICOM grayscale standard display function (GSDF). The GSDF is a quality assurance measure that ensures uniformity of image display among calibrated monitors. Medical grade monochrome monitors are more expensive than consumer grade color monitors, prompting interest in comparing radiologist performance (accuracy and productivity) on the two monitor types. It has been reported that diagnostic accuracy can be maintained using lower resolution color monitors if software tools such as magnify, window, and level are used.8,9 Reading time and operator fatigue, however, are increased.9,17

Medical grade gray scale monitors have increased resolution, bit depth, and luminance compared with color monitors. A single shade of gray on a color CRT monitor is composed of the three primary colors and uses three picture elements, decreasing the overall screen resolution. Active matrix LCD monitors display color through sub-pixel modification, in which each displayed pixel is divided into three sub-pixels, each with an added color filter (red, green, and blue). This decreases light transmission from the backlight to the viewing surface. Instead of using this color modification, medical grade gray scale monitors use the same sub-pixel technology to increase the bit depth of the gray scale display: the displayed gray shade is actually a combination of different gray shades at the sub-pixel level, increasing the number of gray shades that each pixel can display.
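The gain in displayable gray shades from this sub-pixel scheme can be illustrated with simple arithmetic. The sketch below assumes three equally weighted 8-bit sub-pixels per pixel, a simplification of the actual panel driving electronics rather than a description of any particular monitor:

  # One 8-bit channel gives 256 gray levels, but the sum of three equally
  # weighted 8-bit sub-pixels can take any value from 0 to 3 * 255,
  # giving 766 distinct combined luminance steps per pixel.
  single_channel_levels = 2 ** 8          # 256
  combined_subpixel_levels = 3 * 255 + 1  # 766
  print(single_channel_levels, combined_subpixel_levels)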

Conventional view boxes are about 10 times brighter than a high luminance gray scale monitor, and gray scale monitors are roughly twice as bright as color monitors.18 Monitor brightness is termed luminance and is measured in foot-lamberts (ft-L) or the SI unit candela per square meter (cd/m2), where 3.42 cd/m2 = 1 ft-L. Luminance should be a minimum of 50 ft-L (approximately 171 cd/m2) per the ACR/NEMA technical standard.
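Expressed in code, the unit conversion for the minimum luminance looks as follows; the only constant is the 3.42 cd/m2 per ft-L factor quoted above, and the function name is merely illustrative:

  # Convert the ACR/NEMA minimum luminance from foot-lamberts to cd/m^2,
  # using the 3.42 cd/m^2 per ft-L factor quoted in the text.
  FT_L_TO_CD_PER_M2 = 3.42

  def ft_lamberts_to_cd_per_m2(ft_l):
      return ft_l * FT_L_TO_CD_PER_M2

  print(ft_lamberts_to_cd_per_m2(50))  # 171.0 cd/m^2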

When using lower brightness monitors, reviewers showed a trend toward decreased diagnostic accuracy, increased study read time, and increased time to reach an imaging conclusion.15,19,20 All monitors degrade over time, and quality assurance steps are recommended to ensure that GSDF calibration is maintained.21,22 Medical grade monitors are calibrated to the GSDF to ensure uniformity among monitors. When compared with a nonperceptually linearized display calibration (e.g., the Society of Motion Picture and Television Engineers test pattern), DICOM GSDF calibrated systems improved observer performance, as determined by decreased image search time.23 Quality assurance procedures can be set to run automatically with little input from the end user. Although LCD monitors degrade over time, a recent study showed that the changes expected in three megapixel (MP) monochrome, medical grade LCD monitors ≤2.5 years old did not decrease diagnostic accuracy or increase lesion search or decision dwell times for radiologists.24

Spatial resolution (pixel matrix) is a parameter that can be used to compare monitors and is most commonly described in MP. Personal computer monitors range from 0.75 to 2 MP (e.g., 1024 × 768–1600 × 1200), whereas medical grade monitors range from 2 to 5 MP (e.g., 1600 × 1200–2560 × 2048). Comparisons of diagnostic performance between low-resolution monitors, high-resolution monitors, and printed hard copy have had variable results. Older research indicated that hard copy was more accurate than both types of soft copy display and that a significant difference did not exist between the two types of soft copy display.25 Several studies have supported the latter finding, particularly when the zoom function is used on lower resolution monitors8–10; however, some evidence exists that higher resolution monitors are better for differentiating low contrast details.4,13 Although 5 MP monitors are currently available, they are considerably more expensive and their use is generally restricted to mammography.
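Megapixel figures follow directly from the pixel matrix. The sketch below computes them for the matrices quoted above; the 2048 × 1536 entry is a typical 3 MP matrix added for illustration and is not taken from the text:

  # Megapixels = (horizontal pixels x vertical pixels) / 1,000,000
  matrices = {
      "consumer, low end": (1024, 768),
      "consumer, high end": (1600, 1200),
      "medical grade, 3 MP class": (2048, 1536),  # illustrative 3 MP matrix
      "medical grade, 5 MP class": (2560, 2048),
  }
  for name, (w, h) in matrices.items():
      print(f"{name}: {w} x {h} = {w * h / 1e6:.2f} MP")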

Dynamic range, or contrast ratio, is the ratio of luminance between the brightest white shade and the darkest black shade a monitor can display; a higher ratio is preferable. Incorporated into this ratio is the overall luminance of the monitor, and a brighter monitor will often have a higher contrast ratio (wider dynamic range). Medical grade monitors have contrast ratios from 600:1 to 1000:1. Contrast ratio, and therefore overall monitor performance, is effectively reduced by high ambient lighting and by off-angle viewing of LCD monitors.5,26
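The effect of reflected ambient light on dynamic range can be expressed as an effective contrast ratio, in which the reflected luminance adds to both the brightest and darkest displayed shades. The luminance values in the sketch below are illustrative assumptions, not measurements from the cited studies:

  # Reflected ambient luminance raises the black level proportionally far more
  # than the white level, collapsing the effective contrast ratio.
  def effective_contrast_ratio(l_max, l_min, l_reflected):
      return (l_max + l_reflected) / (l_min + l_reflected)

  print(effective_contrast_ratio(400.0, 0.5, 0.0))  # 800:1 in a darkened room
  print(effective_contrast_ratio(400.0, 0.5, 2.0))  # ~161:1 with reflected room light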

The number of monitors used depends on the primary use of the workstation. One must consider the types of studies being evaluated and the number of views being evaluated at one time. For example, a two-view feline thoracic examination would be amply examined on a two-monitor system, but a multiple-joint equine prepurchase examination may be better evaluated on a four- or even eight-monitor system (Fig. 2). Factors to consider include the number of previous examinations that will be evaluated at one sitting. It may be important to have an additional color monitor for computer programs such as the hospital information system (HIS), radiology information system (RIS), or voice recognition software. As the number of monitors increases, the faces of the LCD monitors must be oriented toward the viewer to avoid off-angle decreases in luminance.27

Figure 2.  Primary large animal radiograph review station. Four 3 MP, medical grade, monochrome monitors (GSDF calibrated) are positioned in landscape mode in a semicircular configuration. An additional color monitor enables simultaneous viewing of other programs, such as the web-based hospital information system (HIS). To run five monitors, two additional graphics cards must be added to the computer. GSDF, grayscale standard display function.

Viewing Environment


The radiograph review environment should always be discussed in concert with monitor quality and radiologist performance, and it is even more important for soft copy review. In comparison with conventional view boxes, computer screens have lower luminance, decreased spatial resolution, and decreased dynamic range, and they are more affected by viewing angle and reflected light. Ambient lighting should be low, adjustable, and indirect. High ambient lighting effectively decreases the luminance of the monitor through reflection, which in turn decreases the overall contrast ratio of the monitor,18,28 making low contrast details (e.g., small pulmonary nodules) more difficult to see. Primary read workstations should not be housed in rooms with unshaded windows. It has been suggested that white lab coats not be worn while interpreting radiographs, to reduce glare.7,19 Additionally, the clear plastic cover over the front of many monochrome monitors, designed to protect the screen, can increase glare.29 Off-angle viewing of LCD screens results in lower luminance and therefore a lower contrast ratio. This becomes important in workstations where multiple monitors are used or where multiple radiologists view the same screen. Off-angle viewing has been shown to cause a significant decrease in the diagnostic accuracy of mammogram interpretation.30

The full effect of the soft copy review environment on radiologists is not known. Computer vision syndrome symptoms include headaches, blurred vision, neck pain, fatigue, eye strain, dry and irritated eyes, and difficulty refocusing the eyes. These symptoms can be further aggravated by improper lighting conditions (e.g., bright overhead lighting or glare) or air moving past the eyes (e.g., overhead vents).31 It has been suggested that reading both hard and soft copy images produces more operator fatigue than reading one format alone.29 There is a correlation between increased reading time and symptoms of computer vision syndrome or radiologist fatigue.29 Situations known to increase review time include lower resolution monitors, lower luminance monitors, and monitors not calibrated to the GSDF.9,10,20,23,32

Image Display Software


Radiology viewing software must be used to view digital radiographs effectively. There are many different Windows- and Mac-based programs, ranging in price from downloadable freeware to expensive programs sold on a per-license, per-annum basis. Most medical viewing programs are DICOM viewers capable of handling all of the different digital imaging modalities. Some of the important advantages of digital imaging are made possible through software display tools that optimize the evaluation of digital radiographs. The ACR recommends a minimum standard for DICOM viewer software, which includes simple controls for window and level (analogous to contrast and brightness), pan and zoom, flip and rotate, and measuring tools. Almost all of the commonly used viewing programs exceed this minimum standard, offering many additional features to increase diagnostic utility. These features vary with the software program and are best explored by using the software. A complete discussion of DICOM viewing software is beyond the scope of this article; however, it is in the best interest of the purchaser or end user to compare different software programs to determine the utility of each for their own specific needs or the needs of their hospital.
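Window and level map the stored pixel values onto the displayable gray scale. The sketch below shows a simple linear window/level transform of the kind such viewers apply; it assumes NumPy is available and is a generic illustration rather than the algorithm of any particular DICOM viewer:

  import numpy as np

  def window_level(pixels, window, level):
      """Linearly map raw pixel values to 8-bit display values.

      Values below (level - window/2) display as black, values above
      (level + window/2) display as white, and values in between are
      spread linearly across the available gray shades.
      """
      lower = level - window / 2.0
      scaled = (pixels.astype(np.float64) - lower) / float(window)
      return (np.clip(scaled, 0.0, 1.0) * 255.0).astype(np.uint8)

  # Example: a simulated 12-bit radiograph windowed with illustrative settings.
  raw = np.random.randint(0, 4096, size=(2048, 2048), dtype=np.uint16)
  display = window_level(raw, window=1500, level=2000)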

Soft copy radiograph evaluation is affected by the way in which the radiographs are displayed on the computer screen. Optimally, one radiograph should be displayed per 2–3 MP monitor so that the information displayed on the screen most accurately depicts the information stored in the digital matrix. Stated another way, “squeezing” a large image onto a small monitor will result in the loss of diagnostic information. The image review software functions of window, level, and zoom should always be used because they increase diagnostic accuracy.4,9,25 Cropping tools or automatic collimation detection should be used at the time of image production to minimize the white areas surrounding the exposed portion of the image, because this reduces excessive backlighting.
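The information lost when a large radiograph is fit to a smaller monitor can be estimated from the pixel counts alone. The image and monitor dimensions below are illustrative assumptions, not values from the text:

  # Fraction of acquired pixels actually shown when a radiograph is minified
  # to fit a monitor in a single view.
  image_w, image_h = 3000, 2500      # ~7.5 MP digital radiograph (assumed)
  monitor_w, monitor_h = 2048, 1536  # typical 3 MP landscape monitor (assumed)

  scale = min(monitor_w / image_w, monitor_h / image_h)
  print(f"minification factor: {scale:.2f}")                # ~0.61
  print(f"fraction of pixels displayed: {scale ** 2:.0%}")  # ~38%; zoom to 1:1 for detail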

Features that improve the clinical utility of software programs include the hanging protocol, the default image resolution, and the user interface. Hanging protocols require that specific keywords be incorporated into the DICOM header information so that the software can recognize them and hang, or display, the radiographs or other images in a specified order and orientation. Having an appropriate hanging protocol can greatly increase radiologists' productivity. The default resolution should be as high as possible, because important diagnostic decisions are made very rapidly upon first seeing a radiograph. The user interface is a key factor in increased or decreased radiologist productivity. A radiologist should be able to use the image viewing system with little or no training, and the system should be user friendly.29 In one study, the reviewers' eyes were fixed on the menu options of a software program for 20% of the total time spent reviewing bone radiographs.32 Software options that either stack the remaining images or display them in a thumbnail format can greatly aid study evaluation.27 The user interface differs between software programs, and this is an important point of comparison in purchasing decisions.
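In essence, a hanging protocol reduces to sorting a study's images by selected DICOM header fields before display. The following minimal sketch assumes the pydicom package is installed and that the acquiring modality populates the BodyPartExamined and ViewPosition header fields; the sort order is illustrative, not a published protocol:

  from pathlib import Path

  import pydicom  # assumes the pydicom package is installed

  # Preferred on-screen order for the view of each body part (illustrative).
  VIEW_ORDER = {"LAT": 0, "VD": 1, "DV": 1}

  def hang_study(study_dir):
      """Return the study's images sorted into display (hanging) order."""
      datasets = [pydicom.dcmread(p) for p in sorted(Path(study_dir).glob("*.dcm"))]
      datasets.sort(key=lambda ds: (str(ds.get("BodyPartExamined", "")),
                                    VIEW_ORDER.get(str(ds.get("ViewPosition", "")), 99)))
      for slot, ds in enumerate(datasets):
          print(f"monitor slot {slot}: {ds.get('BodyPartExamined', '?')} "
                f"{ds.get('ViewPosition', '?')}")
      return datasets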

Disclosure of Conflicts of Interest: The authors have declared no conflicts of interest.

REFERENCES

  1. Ballance D. DICOM and the network. Vet Radiol Ultrasound 2008;49:S29–S32.
  2. Wright M, Ballance D, Robertson ID, et al. Introduction to DICOM for the practicing veterinarian. Vet Radiol Ultrasound 2008;49:S14–S18.
  3. American College of Radiology. ACR technical standards for teleradiology. Reston, VA: American College of Radiology, 2002.
  4. Bacher K, Smeets P, De Hauwere A, et al. Image quality performance of liquid crystal display systems: influence of display resolution, magnification and window settings on contrast-detail detection. Eur J Radiol 2006;58:471–479.
  5. Badano A. PACS equipment overview: display systems. RadioGraphics 2004;24:879–889.
  6. Balassy C, Prokop M, Weber M, et al. Flat-panel display (LCD) versus high-resolution gray-scale display (CRT) for chest radiography: an observer preference study. Am J Roentgenol 2005;184:752–756.
  7. Batchelor J. Monitor choice impacts diagnostic accuracy and throughput. AuntMinnie.com, May 4, 2002. http://www.auntminnie.com/index.asp?Sec=rca&Sub=scar_2002&pag=dis&ItemId=53236 (accessed September 11, 2007).
  8. Doyle AJ, Le Fevre J, Anderson GD. Personal computer versus workstation display: observer performance in detection of wrist fractures on digital radiographs. Radiology 2005;237:872–877.
  9. Graf B, Simon U, Eickmeyer F, et al. 1K versus 2K monitor: a clinical alternative free-response receiver operating characteristic study of observer performance using pulmonary nodules. Am J Roentgenol 2000;174:1067–1074.
  10. Herron JM, Bender TM, Campbell WL, et al. Effects of luminance and resolution on observer performance with chest radiographs. Radiology 2000;215:169–174.
  11. Kotter E, Bley TA, Saueressig U, et al. Comparison of the detectability of high- and low-contrast details on a TFT screen and a CRT screen designed for radiologic diagnosis. Invest Radiol 2003;38:719–724.
  12. Pal S. LCD just as good as conventional monitors for chest CR. AuntMinnie.com, April 30, 2002. http://www.auntminnie.com/index.asp?Sec=sup&Sub=res&Pag=dis&ItemId=53184 (accessed September 11, 2007).
  13. Peer S, Giacomuzzi SM, Peer R, et al. Resolution requirements for monitor viewing of digital flat-panel detector radiographs: a contrast detail analysis. Eur Radiol 2003;13:413–417.
  14. Saunders RS Jr, Samei E. Resolution and noise measurements of five CRT and LCD medical displays. Med Phys 2006;33:308–319.
  15. Song K, Lee J, Kim H, et al. Effect of monitor luminance on the detection of solitary pulmonary nodule: ROC analysis. SPIE Conf Image Perception Perform 1999;3663:212–216.
  16. Prokop M, Neitzel U, Schaefer-Prokop C. Principles of image processing in digital chest radiography. J Thorac Imaging 2003;18:148–164.
  17. Hirschorn D, Dreyer K. Do I still need a grayscale monitor? Imaging Economics: Imaging Informatics, February 2006. http://www.imagingeconomics.com/issues/articles/2006-02_10.asp (accessed September 11, 2007).
  18. Bushberg J, Seibert J, Leidholdt E, et al. The essential physics of medical imaging, 2nd ed. Philadelphia, PA: Lippincott Williams and Wilkins, 2002.
  19. Krupinski EA, Johnson J, Roehrig H, et al. Use of a human visual system model to predict observer performance with CRT vs LCD display of images. J Digit Imaging 2004;17:258–263.
  20. Krupinski E, Roehrig H, Furukawa T. Influence of film and monitor luminance on observer performance and visual search. Acad Radiol 1999;6:411–418.
  21. Ly CK. Soft copy display quality assurance program at Texas Children's Hospital. J Digit Imaging 2002;15(Suppl 1):33–40.
  22. Seto E, Ursani A, Cafazzo JA, et al. Image quality assurance of soft copy display systems. J Digit Imaging 2005;18:280–286.
  23. Krupinski E, Roehrig H. The influence of a perceptually linearized display on observer performance and visual search. Acad Radiol 2000;7:8–13.
  24. Krupinski E, Roehrig H, Fan J. Does the age of liquid crystal displays influence observer performance? Acad Radiol 2007;14:463–467.
  25. Otto D, Bernhardt TM, Rapp-Bernhardt U, et al. Subtle pulmonary abnormalities: detection on monitors with varying spatial resolutions and maximum luminance levels compared with detection on storage phosphor radiographic hard copies. Radiology 1998;207:237–242.
  26. Haak R, Wicht MJ, Hellmich M, et al. Influence of room lighting on grey-scale perception with a CRT and a TFT monitor display. Dentomaxillofac Radiol 2002;31:193–197.
  27. Reiner B, Siegel E, Krupinski E. Digital radiographic image presentation and display. Chicago, IL: RSNA, 2003.
  28. Weiser J, Romlein J. Monitor minefield. Imaging Economics: Imaging Informatics, April 2006. http://www.imagingeconomics.com/issues/articles/2006-04_09.asp (accessed September 11, 2007).
  29. Krupinski E, Kallergi M. Choosing a radiology workstation: technical and clinical considerations. Radiology 2007;242:671–682.
  30. Fifadara D, Averbukh A, Channin D, et al. Effect of viewing angle on luminance and contrast for a five million pixel monochrome display and a nine million pixel color liquid crystal display. J Digit Imaging 2004;17:264–270.
  31. Blehm C, Vishnu S, Khattack A, et al. Computer vision syndrome: a review. Surv Ophthalmol 2005;50:253–262.
  32. Krupinski E, Lund P. Differences in time to interpretation for evaluation of bone radiographs with monitor and film viewing. Acad Radiol 1997;4:177–182.