Surpassing the lateral resolution limit by a factor of two using structured illumination microscopy

SHORT COMMUNICATION

Author

M. G. L. Gustafsson. Tel: +1 415 476 2489; fax: +1 415 476 1902; e-mail: mats@msg.ucsf.edu

Abstract

Lateral resolution that exceeds the classical diffraction limit by a factor of two is achieved by using spatially structured illumination in a wide-field fluorescence microscope. The sample is illuminated with a series of excitation light patterns, which cause normally inaccessible high-resolution information to be encoded into the observed image. The recorded images are linearly processed to extract the new information and produce a reconstruction with twice the normal resolution. Unlike confocal microscopy, the resolution improvement is achieved with no need to discard any of the emission light. The method produces images of strikingly increased clarity compared to both conventional and confocal microscopes.

Introduction

The lateral resolution of the optical microscope is fundamentally limited by the finite wavelength of light. This limit was understood a century ago (Abbe, 1873), and microscope technology approached it shortly thereafter, but the limit remains largely unchallenged. Although some recent non-linear concepts show promise (Klar & Hell, 1999), only one established technique has the ability to go beyond the limit in principle, namely confocal fluorescence microscopy (Minsky, 1961; Pawley, 1995). In practice, unfortunately, its lateral resolution improvement is at best minor (confocal microscopy is of course a useful and popular technique, but this is mainly due to its ability to reject out-of-focus light, not to its lateral resolution properties). The reason is that a confocal microscope detects extended-resolution information only weakly (Gu & Sheppard, 1992), and only if operated with a pinhole significantly smaller than the Airy disk (Wilson, 1995). Such a small pinhole discards much of the desired in-focus emission light together with the unwanted out-of-focus light. Practical biological samples are often too weakly fluorescent to yield a usable signal level if much of the emission light is wasted; to admit more light a larger pinhole is typically used, and little or no lateral resolution enhancement is then possible.

It is, however, possible to achieve lateral resolution well beyond the classical limit without discarding any emission light, namely by using laterally structured illumination in a wide-field, non-confocal microscope (Gustafsson et al., 1997; Heintzmann & Cremer, 1998). In this method, the sample is illuminated with spatially structured excitation light, which makes normally inaccessible high-resolution information visible in the observed image in the form of moiré fringes. A series of such images is processed to extract this information and generate a reconstruction with improved resolution. Preliminary experiments have confirmed the validity of the physical principle (Heintzmann & Cremer, 1998; Gustafsson, 1999). This article demonstrates the full capability of the method. It describes an efficient and rapid microscope implementation that surpasses the resolution limit by a factor of two, in complete agreement with theoretical expectations. The method is demonstrated here in two dimensions (2D), and the straightforward modifications necessary for 3D imaging are discussed. Image comparisons, using both test objects and complex biological structures, demonstrate strikingly superior effective resolution compared to existing conventional and confocal microscopes.

Concept

The concept behind structured illumination microscopy can be easily understood in terms of the well-known moiré effect. If two fine patterns are superposed multiplicatively, a beat pattern – moiré fringes – will appear in their product (Fig. 1a). In our case, one of the patterns being superposed is the unknown sample structure − more precisely, the unknown spatial distribution of fluorescent dye – and the other pattern is a purposely structured excitation light intensity. As the amount of light emitted from a point is proportional to the product of dye density and local excitation light intensity, the observed emission light image is the product of the two patterns and will thus contain moiré fringes. As is clear from Fig. 1(a), such moiré fringes can be much coarser than either of the original patterns, and in particular may be easily observable in the microscope even if one (or both) of the original patterns is too fine to resolve. If the illumination pattern is known, the moiré fringes contain the information about the unknown structure. Thus one can gain access to normally unresolvable high resolution information about the sample by observing its appearance under carefully controlled illumination patterns.

Figure 1.

Concept of resolution enhancement by structured illumination. (a) If two line patterns are superposed (multiplied), their product will contain moiré fringes (seen here as the apparent vertical stripes in the overlap region). (b) A conventional microscope is limited by diffraction. The set of low-resolution information that it can detect defines a circular ‘observable region’ of reciprocal space. (c) A sinusoidally striped illumination pattern has only three Fourier components. The possible positions of the two side components are limited by the same circle that defines the observable region (dashed). If the sample is illuminated with such structured light, moiré fringes will appear which represent information that has changed position in reciprocal space. The amounts of that movement correspond to the three Fourier components of the illumination. The observable region will thus contain, in addition to the normal information, moved information that originates in two offset regions (d). From a sequence of such images with different orientation and phase of the pattern, it is possible to recover information from an area twice the size of the normally observable region, corresponding to twice the normal resolution (e).

Extended resolving ability

To understand the quantitative capabilities of the method, it is useful to think of the sample structure in reciprocal space, that is, its Fourier transform. In that representation, low resolution information resides close to the origin, while higher resolution information resides further away. A conventional microscope can only resolve sample structures with a line spacing coarser than a certain diffraction limit d0, which is about 0.2 µm for the best available objective lenses. Equivalently, it can only detect information that resides within a circular region of radius 1/d0 around the origin of reciprocal space − the observable region (Fig. 1b). Essentially the same circle defines the set of patterns that it is possible to create in the illumination light. Structured illumination does not alter this physically observable region, but it moves information into the region from the outside, and thereby makes that information observable.

As a specific example, consider an illumination light structure that consists of a sinusoidal stripe pattern. Its Fourier transform has only three non-zero points (Fig. 1c). One of these points is at the origin and the other two are offset from the origin in a direction defined by the stripe direction of the pattern, and by a distance proportional to the inverse line spacing of the pattern. When the sample is illuminated with this structured illumination light, the image that is seen through the microscope contains, in addition to the normal image, moiré fringes corresponding to information whose position in reciprocal space has been offset by those same amounts. In particular, the observable region now contains not only the usual information that itself resides there, but also information that originates in two offset regions (Fig. 1d). The parts of those offset circles that fall outside the normally observable region represent new information that is not accessible in a conventional microscope.
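
To make this frequency-mixing argument concrete, the following one-dimensional numerical sketch (an illustration added for this discussion, not part of the original experiment; the specific frequencies and the ideal low-pass "microscope" are arbitrary assumptions) shows a sample frequency lying beyond the detection cutoff reappearing inside it as a beat with the illumination stripes.

```python
import numpy as np

n = 1024
x = np.arange(n)
k_cutoff = 0.10          # hypothetical detection cutoff (cycles per pixel)
k_sample = 180 / n       # sample structure finer than the cutoff (~0.176)
k_illum = 96 / n         # illumination stripes, just inside the cutoff (~0.094)

sample = 1.0 + np.cos(2 * np.pi * k_sample * x)        # fluorescent dye density
illumination = 1.0 + np.cos(2 * np.pi * k_illum * x)   # striped excitation intensity
emission = sample * illumination                       # emitted light is the product


def low_pass(signal, cutoff):
    """Idealized microscope: discard every spatial frequency above the cutoff."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size)
    spectrum[freqs > cutoff] = 0.0
    return np.fft.irfft(spectrum, signal.size)


def amplitude_at(signal, k):
    """Amplitude of the Fourier component nearest to frequency k."""
    spectrum = np.abs(np.fft.rfft(signal)) * 2.0 / signal.size
    freqs = np.fft.rfftfreq(signal.size)
    return spectrum[np.argmin(np.abs(freqs - k))]


conventional = low_pass(sample, k_cutoff)   # the fine sample frequency is lost
structured = low_pass(emission, k_cutoff)   # the moire (beat) component survives

print("sample frequency in conventional image:", amplitude_at(conventional, k_sample))
print("beat frequency in structured image    :", amplitude_at(structured, k_sample - k_illum))
```

Running the sketch, the conventional image retains essentially nothing at the sample frequency, whereas the structured-illumination image contains the beat at |k_sample − k_illum| with half of the original amplitude: precisely the moved information described above.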

The observed image is a sum of these three contributions, and it is not possible to separate them using a single image. However, the coefficients by which they are added together depend on the (known and controllable) phase of the illumination light structure. By recording three or more images of the sample with different illumination phase, the three components can be separated through simple arithmetic, and the information restored to its proper position. [The procedure is largely analogous to that applied in the axial direction in standing wave fluorescence microscopy (Bailey et al., 1994).] If the distance of offset is chosen to be as large as possible (i.e. the illumination pattern is chosen as fine as possible), it is possible to access information out to double the normal resolution in the pattern direction. By repeating this one or more times with the pattern orientated in different directions, one can gather essentially all the information within a circle twice as large as the physically observable region (Figs 1e and 2). With this information, an image of the sample can be reconstructed at double the normal resolution.
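
In code, the "simple arithmetic" amounts to a per-pixel 3 × 3 linear system. The sketch below (my own illustration of the separation step, not the author's software; the mixing model, the variable names and the default modulation depth are assumptions) separates the ordinary component and the two displaced components from three images recorded at known illumination phases.

```python
import numpy as np

# For one pattern orientation, each recorded image D_m is (under the assumed
# model) a pixelwise linear mixture of three information components: the
# ordinary component C0 and the two components C_plus and C_minus that the
# striped illumination has displaced in reciprocal space,
#   D_m = C0 + (mod/2)*exp(+i*phi_m)*C_plus + (mod/2)*exp(-i*phi_m)*C_minus,
# where mod is the modulation depth and phi_m the illumination phase of image m.

def separate_components(images, phases, modulation=0.8):
    """Separate C0, C_plus, C_minus from three phase-shifted images.

    images: three 2-D arrays recorded at one orientation and three phases.
    phases: the corresponding illumination phases in radians.
    """
    mixing = np.array([[1.0,
                        0.5 * modulation * np.exp(+1j * p),
                        0.5 * modulation * np.exp(-1j * p)] for p in phases])
    unmix = np.linalg.inv(mixing)                              # 3x3 unmixing matrix
    stack = np.stack([im.astype(complex) for im in images])   # shape (3, ny, nx)
    c0, c_plus, c_minus = np.tensordot(unmix, stack, axes=(1, 0))
    return c0, c_plus, c_minus


# Usage with synthetic stand-in data (phases of 0, 120 and 240 degrees):
phases = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
images = [np.random.rand(256, 256) for _ in phases]
c0, c_plus, c_minus = separate_components(images, phases)
```

With phases of 0°, 120° and 240° the mixing matrix is well conditioned; in the actual processing the modulation depth and starting phase are not assumed but fitted from the data, as described under Processing below.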

Figure 2.

The reconstruction procedure in reciprocal space. (a–b) Fourier transforms of a microscope image with normal illumination (a), and of a single image of the same object taken with structured illumination (b). Both images contain information only within the observable region, but the structured illumination image includes displaced information from other regions superimposed on the normal information [arrows in (b)]. From a sequence of seven to nine such images, seven different information components can be computationally separated (c), and recombined at their proper positions (d). The recombined data set is then apodized and re-transformed to real space. The reconstruction in (d) contains information twice as far out from the origin as does the conventional image, resulting in doubled resolution in the re-transformed image.

Experimental methods

Overview

A simple structured illumination microscope has been constructed. The illumination light is passed through a line-patterned phase grating located in a secondary image plane of the microscope. The microscope objective lens projects a demagnified image of this grating, with a line spacing close to the diffraction limit of the objective lens, onto the sample. The orientation and phase of the resulting striped illumination pattern are controlled through rotation and lateral translation of the grating.

Microscope

Laser light was spatially scrambled, coupled through a multi-mode optical fibre, and linearly polarized. A linear transmission phase grating was placed in a secondary image plane of the microscope and illuminated from the fibre. The orientation of the grating lines was parallel to the polarization vector of the light. Diffraction orders +1 and −1 from the grating were retained; all other orders were blocked. (The phase grating diffracted 80% of the incoming light power into these two orders.) The two beams were focused so as to form images of the fibre end face near opposite edges of the rear aperture of the objective lens. This caused a high-contrast stripe pattern with a 0.23 µm line spacing to be projected onto the sample (the theoretical resolution limit of the objective was about 0.22 µm). The modulation depth of the stripe pattern was measured to be 70–90%. The grating was mounted on a rotatable, closed-loop piezoelectric translation stage, allowing adjustment of its orientation and lateral position, and thereby of the orientation and phase of the illumination stripe pattern. The polarizer co-rotated with the grating so as to maintain s polarization, for maximum pattern contrast.

It is not necessary to use laser light; the lower coherence of conventional light sources would in fact be an advantage. However, because the illumination light passes through only a small subset of the back focal plane of the objective, the optical invariant (area times solid angle) of the illumination train is less than in a conventional microscope, so conventional extended sources, which have larger values of the optical invariant, would be used inefficiently. This is not a problem for lasers, for which the optical invariant can be near zero. The laser light was scrambled (using a rotating holographic diffuser; Physical Optics Corporation) to decrease its spatial coherence, which has two advantages: it limits the illumination structure to a finite axial extent, which is helpful in 3D imaging, and it makes the system much less sensitive to unwanted interference fringes caused by dust or stray reflections. Fibre coupling was used here to improve reproducibility, but is not essential.

Sample preparation

A drop of fluorescent polystyrene microspheres (red Fluospheres, Molecular Probes, Eugene, OR) suspended in water was air-dried on a cover slip and imaged under glycerol. HeLa cells were grown on cover slips, fixed, labelled with rhodamine phalloidin, and mounted using standard protocols (Weiner et al., 1999).

Acquisition

Three images were acquired using a cooled CCD camera, with the phase of the illumination pattern shifted 120° between each. This procedure was repeated twice with the orientation of the pattern rotated by 60° and 120°, yielding a total of nine images. Each image was acquired at the normal, low resolution pixel size; the pixel density in the output file was doubled during the processing.

Comparison confocal data were acquired on a Leica TCS confocal microscope operated with a pinhole size equal to 1/4 the size of the Airy disk, with the sample located immediately below the cover slip, using a laser power well below the saturation level, and averaging of eight scans. The excitation and emission wavelengths were 546 ± 7 nm and 605 ± 25 nm for conventional, 568 nm and all wavelengths above 590 nm for confocal, and 532 nm and 605 ± 25 nm for structured illumination microscopy. All images were acquired with planapochromatic 100× NA 1.4 oil immersion objectives.

Processing

For each pattern orientation, the three information components were separated using image arithmetic, and Fourier transformed. The precise orientation, line spacing, modulation depth and initial phase of the illumination pattern were determined by comparing the different components in the regions of reciprocal space in which they overlap (see Fig. 1e). The displaced components were then moved to their true positions in reciprocal space, combined through a weighted average (Fig. 2d), and the resulting reassembled image re-transformed to real space.

The reciprocal space moves were actually performed by multiplying by the equivalent cosine wave in real space, so as not to constrain the translation vector to whole pixels in the discrete Fourier transforms. Reassembly was done by multiplying each component by the known (previously measured) 2D optical transfer function (OTF) of the microscope, adding the components pointwise where they overlap, and dividing their sum by the sum of the squares of the OTFs plus a small constant. This procedure is equivalent to OTF-compensating each component, averaging them with weights assigned according to their noise variances, and suppressing low signal-to-noise peripheral regions as in a Wiener filter. The pattern contrast (modulation depth) and phase for each orientation entered as constant complex factors multiplying the OTFs of the displaced components. The reassembled image was apodized with a cosine bell (i.e. the high spatial frequencies were rolled off to decrease ringing artefacts caused by the sharp edges of the new observable region) before re-transforming to real space.
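
As a hedged sketch of this reassembly arithmetic (a paraphrase for illustration rather than the original code; the Wiener constant, the apodization radius and all names are assumed), the components, once moved to their true reciprocal-space positions together with correspondingly shifted and phase-scaled copies of the measured OTF, can be combined as follows:

```python
import numpy as np

def reassemble(shifted_components, shifted_otfs, wiener=1e-3, apodize=True):
    """Combine separated SIM components into one extended-resolution image.

    shifted_components: complex Fourier-space arrays (all orientations and
        orders), each already moved to its true position.
    shifted_otfs: matching OTF copies, shifted the same way and already scaled
        by the measured modulation depth and pattern phase factor.
    """
    numerator = np.zeros_like(shifted_components[0])
    denominator = np.full(numerator.shape, wiener, dtype=float)
    for comp, otf in zip(shifted_components, shifted_otfs):
        numerator += np.conj(otf) * comp      # OTF-weighted contribution
        denominator += np.abs(otf) ** 2       # noise-variance weighting
    combined = numerator / denominator        # generalized Wiener filter

    if apodize:
        # Cosine-bell roll-off towards the edge of the enlarged observable
        # region, to suppress ringing from its sharp boundary.
        ny, nx = combined.shape
        fy = np.fft.fftfreq(ny)[:, None]
        fx = np.fft.fftfreq(nx)[None, :]
        radius = np.sqrt(fx ** 2 + fy ** 2) / 0.5   # normalized to the Nyquist radius
        combined *= np.where(radius < 1.0, np.cos(np.pi * radius / 2.0) ** 2, 0.0)

    # Back to real space; the result is the doubled-resolution reconstruction.
    return np.fft.ifft2(combined).real
```

The division by the summed squared OTFs plus a small constant is the generalized Wiener step described above, and the cosine-bell factor rolls the edge of the new observable region smoothly to zero before the inverse transform.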

Results and discussion

The resolution performance of this system was compared to conventional and confocal microscopy by acquiring images of the same test sample, a cluster of fluorescent microspheres. The results completely confirm the expected resolution gain compared to conventional microscopy (Figs 3a and c). The effective lateral resolution of the structured illumination images also surpasses that achieved in practice by confocal microscopy even under ideal conditions (Fig. 3b).

Figure 3.

A cluster of fluorescent microspheres of nominal diameter 121 nm, as imaged by conventional (a), confocal (b), and structured illumination (c) microscopy. The confocal microscope was operated under maximum resolution conditions. The apparent size (FWHM) of isolated beads is approximately 130 nm in the structured illumination image, compared to 290 and 210 nm for conventional and confocal microscopy.

The 2D observable region of structured illumination microscopy is in theory no larger than that of confocal microscopy. The difference in effective resolution between Figs 3(b) and (c) is instead due to stronger signal levels within the observable region, that is, to a greater efficiency at detecting high-resolution information. This leads to a larger region over which the signal is recoverable, that is, to a larger effective observable region. It should of course be kept in mind that the processing that produced Fig. 3(c) partly compensated for the OTF, while the data in Figs 3(a) and (b) are unprocessed, but such filtering cannot recover a signal that is weaker than the noise level.

The performance on biological samples was evaluated by imaging the actin cytoskeleton in HeLa cells (Fig. 4). Here again, structured illumination microscopy strikingly increases the image clarity. Actin fibres whose separation is well below the resolution limit of the conventional microscope are easily separated using structured illumination. The apparent widths (full width at half maximum, FWHM) of the fine protruding actin fibres are reduced by more than a factor of two, from about 290 nm in the conventional image to about 115 nm.

Figure 4.

The actin cytoskeleton at the edge of a HeLa cell, as imaged by (a, c) conventional and (b, d) structured illumination microscopy. (c, d) Enlargements of the boxed areas in (a) and (b), respectively. Fibres separated by less than the resolution limit of the conventional microscope are well resolved using structured illumination (d). The apparent widths (FWHM) of the finest protruding fibres [small arrows in (a, b)] are lowered to 110–120 nm in (b), compared to 280–300 nm in (a).

Both acquisition and processing are quite rapid. The total exposure time for the data in this report was 1–10 s, and processing required about 30 s for a 512 × 512 pixel final image, including extensive parameter fitting. Once most system parameters are known, further images could be processed in near real time.

We would like to emphasize that the computer processing method used here is completely linear and deterministic. It is known that non-linear, constrained deconvolution algorithms can enhance apparent resolution by making use of a priori information about the sample, such as the fact that the dye density is non-negative (Agard et al., 1989; Carrington et al., 1995). No such algorithms were applied to these images; any resolution gains from such methods would therefore come in addition to the improvements reported here.

The 2D form of structured illumination microscopy used for this demonstration is appropriate only for flat samples, but the same procedure can be applied in 3D simply by acquiring a focal series of images and using the 3D OTF in place of the 2D OTF during processing. In fact, the resolution performance of wide-field microscopes in the third dimension can itself be enhanced by laterally structured illumination similar to that used here (Neil et al., 1997; Neil et al., 1998). A minor complication is that the lateral illumination pattern that is optimal for axial performance differs from the one used here, mainly in having a line spacing roughly twice as coarse. Simultaneous lateral and axial enhancement could be achieved in several ways, for example by using a compromise pattern of intermediate spacing, by using both kinds of pattern in sequence and combining the data, or by using a slightly modified illumination pattern that contains both kinds of frequency component. The last approach can enhance the axial sectioning ability further by making the illumination vary axially as well as laterally; this effect is similar to that of illuminating the coarser grating with fully incoherent light (Neil et al., 1997), but it maintains the maximum lateral resolution enhancement and strong pattern contrast. The axial resolving power using the modified pattern would be equivalent to that of confocal microscopy.

For even greater 3D resolution, laterally structured illumination can be combined with I5M, an interferometric method that enhances axial resolution drastically by observing and illuminating the sample from both directions using two opposing objective lenses (Gustafsson et al., 1995; Gustafsson et al., 1999). The two technologies are compatible and it should be straightforward to combine them (Gustafsson et al., 1997). The combined technique would achieve about 100 nm resolution in all directions.

As an alternative to the procedure of repeatedly applying one-dimensional patterns with different orientation, one could instead use a single, more complex 2D illumination pattern that has structure in more than one direction. With such a pattern, it is possible to improve resolution in all lateral directions without rotating the pattern; the drawback is that a larger number of information components are then superposed in each image, so that a correspondingly larger number of images at different phases is required to separate them.

Although its most obvious applications are in biology, structured illumination microscopy is equally applicable to any field where fluorescent structures are encountered, or to any other imaging mode where the intensity, as opposed to the coherent amplitude, of the illumination interacts with the sample.

In conclusion, it is possible to exceed the fundamental resolution limit of the wide-field fluorescence microscope by a factor of two using structured illumination. Complex biological samples can be imaged at a substantially higher level of detail than is possible in practice with either conventional or confocal microscopes. This advance significantly expands the class of questions that can be answered by light microscopy.

Acknowledgements

I wish to thank John Sedat and David Agard for their help and support, and Daniel Kalman for supplying the HeLa cells. This work was supported in part by the Sandler Family Foundation, and by the NIH through grants GM-25101 and GM-31627.
