No benefit to structured illumination microscopy for scattering imaging


Super-resolution microscopy has received a lot of interest in the past few years, culminating in the 2014 Nobel Prize in Chemistry for the development of the STED and STORM/PALM families of techniques. Around the same time, an interesting (and mischievously titled) commentary appeared in Nature Photonics, claiming to resolve (ho ho) a misconception about a third approach to super-resolution – SIM or ‘structured illumination microscopy’. This is a technique which can be used to improve the resolution by a factor of two. The paper argues that structured illumination microscopy only provides true resolution enhancement for fluorescence imaging and none at all for scattering imaging. This is despite recent papers making claims – and apparently providing experimental evidence – to the contrary.

To understand the paper’s argument as to why this is the case [1], we have to go back to the basic principles of resolution and SIM. Super-resolution techniques seek to overcome the diffraction limit, the fundamental determinant of the best possible resolution of a linear optical system. One way of thinking about the resolution is to say that the system has a finite pass-band for spatial frequencies, and that spatial frequencies higher than this pass-band (the frequencies corresponding to smaller details in the sample) are lost.
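
As a toy illustration of this ‘finite pass-band’ picture, here is a minimal 1-D sketch in plain NumPy (the cut-off value and object are arbitrary choices, not taken from the paper): an object with sharp bars is low-pass filtered, and the fine detail that lived in the high spatial frequencies is simply gone.

```python
import numpy as np

# 1-D toy object: two narrow bars (their sharp edges are high spatial frequencies)
n = 1024
x = np.arange(n)
obj = ((x > 480) & (x < 500)).astype(float) + ((x > 520) & (x < 540)).astype(float)

# Model the imaging system as an ideal low-pass filter; the cut-off (in cycles
# per sample) is an arbitrary stand-in for the diffraction limit
cutoff = 0.01
freqs = np.fft.fftfreq(n)              # spatial frequency axis
passband = np.abs(freqs) <= cutoff     # frequencies the 'lens' lets through

spectrum = np.fft.fft(obj)
image = np.fft.ifft(spectrum * passband).real

# The filtered 'image' is blurred: the two bars start to merge because the
# high spatial frequencies that defined their sharp edges have been lost.
print("object values:", obj.max(), obj.min())
print("image value between the bars:", image[500:520].min(), "vs image peak:", image.max())
```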

When we image a sample with a lens, the amplitude of the light at the back focal plane of the lens is the Fourier transform of the sample field (this follows quite straightforwardly from scalar diffraction theory). This means that low spatial frequencies pass through the centre of the lens, and high spatial frequencies pass near its edge. The maximum angle through which light can be scattered before it misses the lens therefore defines the maximum spatial frequency we can collect. We call the sine of this angle (multiplied by the refractive index of the imaging medium) the numerical aperture; a microscope objective with a higher numerical aperture collects a greater range of spatial frequencies and hence provides better resolution. The effect of this finite frequency range on the image is a blurring.

A convenient way to describe the blurring is by the ‘intensity point spread function’ which, as the name implies, describes the intensity image we would obtain from an infinitesimally small point in the sample. If we consider the sample to be a collection of a large number of these points, then the image is simply the sum of the point spread functions corresponding to each of these points. Mathematically, the effect of the imaging system is therefore to convolve the object with the point spread function.
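
A minimal sketch of that statement for incoherent imaging, with a made-up Gaussian standing in for the real (Airy-pattern) intensity point spread function and arbitrary emitter positions: the image is just the object, treated as a set of point emitters, convolved with the PSF.

```python
import numpy as np

n = 512
x = np.arange(n)

# Object: a few point-like emitters of different brightness
obj = np.zeros(n)
obj[[200, 230, 300]] = [1.0, 0.6, 0.8]

# Intensity point spread function: a Gaussian stands in for the real PSF of a
# microscope; the width sigma is an arbitrary choice
sigma = 8.0
psf = np.exp(-((x - n / 2) ** 2) / (2 * sigma ** 2))
psf /= psf.sum()

# Incoherent image formation: intensities add, so the image is the
# convolution of the object with the intensity PSF
image = np.convolve(obj, psf, mode="same")

print("image peak near x = 200:", image[195:205].max())
```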

However, there are important differences between elastic scattering (coherent) imaging and inelastic fluorescence (incoherent) imaging. In coherent imaging, we preserve phase information, meaning that it matters what angle we illuminate the sample from. In fact, the range of sample spatial frequencies that can be collected by the aperture of the microscope objective depends on the angle of illumination. To illustrate this, imagine we illuminate head-on, along the axis. The centre of the lens then collects the zero spatial frequency, and the two edges collect the +k and –k spatial frequencies (where k is the maximum spatial frequency set by the numerical aperture). But if we instead illuminate at the maximum angle the lens allows, the zero spatial frequency passes through one edge of the lens, +k through the centre, and +2k through the far edge. So, by illuminating over a full range of angles, we collect twice the range of spatial frequencies compared with on-axis illumination.
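
That bookkeeping is easy to write down in 1-D. The sketch below (arbitrary wavelength and numerical aperture, not values from the paper) just encodes the rule that the lens accepts a scattered transverse frequency k + k_ill only if it lies within ±k_NA, so the collected band of object frequencies slides with the illumination tilt, and the union over all tilts spans ±2 k_NA.

```python
import numpy as np

wavelength = 0.5e-6          # illumination wavelength in metres (arbitrary choice)
NA = 0.9                     # numerical aperture (arbitrary choice)
k_na = NA / wavelength       # highest transverse spatial frequency the pupil accepts

def collected_band(k_ill):
    """Object spatial frequencies k collected when the illumination plane wave
    carries transverse frequency k_ill: the scattered wave has transverse
    frequency k + k_ill, and the lens only accepts |k + k_ill| <= k_na."""
    return (-k_na - k_ill, k_na - k_ill)

print("on-axis illumination       :", collected_band(0.0))
print("illumination at +max angle :", collected_band(+k_na))
print("illumination at -max angle :", collected_band(-k_na))
# The union over all illumination angles spans -2*k_na .. +2*k_na,
# i.e. twice the range collected with on-axis illumination alone.
```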

This isn’t the case for incoherent imaging, such as fluorescence, where it’s only the intensity of light hitting the sample that matters – the fluorophore emits in all directions regardless. So with fluorescence imaging, whatever angle we illuminate from, we get the same resolution as if we were illuminating from all angles.

So that’s the limit on resolution, but how does SIM work? The idea is to illuminate the sample with a grid pattern – a series of light and dark lines. We then see a beating effect between the spatial frequency of this illumination pattern and the spatial frequencies of the fluorophore distribution. This beating means that higher spatial frequencies – ones that would normally be lost – are in a sense encoded in lower spatial frequencies which can pass through the imaging system. Of course, this encoded information is then mixed up with genuine lower spatial frequencies, so we have to acquire multiple images with small shifts of the grating in order to sort it all out (a process known as demodulation). A final complication is that the resolution enhancement only occurs along the direction perpendicular to the grating lines, so we also have to acquire sets of images with the grating at different orientations – typically three phase shifts at each of three orientations, or nine images in total.
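
The beating is just frequency mixing: multiplying the fluorophore distribution by a sinusoidal intensity pattern copies its spectrum up and down by the grating frequency, which drags otherwise-lost high frequencies into the pass-band. A bare-bones 1-D sketch of only this mixing step (arbitrary frequencies, no noise, no demodulation):

```python
import numpy as np

n = 2048
x = np.arange(n) / n                     # sample coordinate, 0..1

f_obj = 300.0                            # an object frequency (cycles per field of view)
f0 = 250.0                               # SIM grating frequency
cutoff = 200.0                           # pass-band of the microscope

obj = 1.0 + np.cos(2 * np.pi * f_obj * x)        # fluorophore density
illum = 1.0 + np.cos(2 * np.pi * f0 * x)         # structured illumination intensity

emission = obj * illum                           # fluorescence responds to intensity

freqs = np.abs(np.fft.fftfreq(n, d=1.0 / n))

def strong_frequencies(signal):
    """Return the dominant non-zero frequencies present in a signal."""
    spec = np.abs(np.fft.fft(signal))
    strong = freqs[(spec > 0.1 * spec.max()) & (freqs > 0)]
    return sorted(set(np.round(strong)))

print("object alone, inside the pass-band:   ",
      [f for f in strong_frequencies(obj) if f <= cutoff])        # 300 is lost
print("with structured illumination:         ",
      [f for f in strong_frequencies(emission) if f <= cutoff])   # mixing term at |300-250| = 50 survives
```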

However, the point that is sometimes missed is that this theory is only valid for incoherent imaging, where the sample responds to the intensity of the incident light. In that case, the emitted fluorescence is simply the product of the illumination pattern and the fluorophore distribution, and the image is this product blurred by the intensity point spread function. In the spatial frequency domain, the multiplication is equivalent to a convolution, meaning that we extend the range of sample spatial frequencies passing through the system. In coherent imaging, however, the illumination only multiplies the complex amplitude of the scattered field, and the recorded intensity is the squared magnitude of the coherent sum of the complex amplitudes from all the different scatterers. In this case, diffraction theory shows that we don’t capture any higher spatial frequencies than if we had simply illuminated from a range of angles. This is unsurprising if we remember that one way of generating a grid pattern is to interfere two beams entering the microscope at different angles.
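
To make the contrast concrete, here is a minimal 1-D sketch of the two image-formation models (made-up Gaussian PSF, arbitrary scatterer positions and grating period, idealized cosine grating): for fluorescence the illumination pattern multiplies the intensity before the blur, whereas for scattering it multiplies the complex amplitude, and the squaring only happens at the very end.

```python
import numpy as np

n = 1024
x = np.arange(n)

# Complex amplitude of the object: two point scatterers / emitters
amp = np.zeros(n, dtype=complex)
amp[[500, 520]] = 1.0

# Coherent (amplitude) PSF: a Gaussian stands in for the real sinc-like PSF
sigma = 6.0
h = np.exp(-((x - n / 2) ** 2) / (2 * sigma ** 2)).astype(complex)

period = 20.0
illum_amp = np.cos(2 * np.pi * x / period)        # idealized grating in amplitude
illum_int = illum_amp ** 2                        # what the fluorophores actually see

def conv(a, b):
    return np.convolve(a, b, mode="same")

# Incoherent (fluorescence) SIM: the intensity pattern multiplies the intensity,
# then the result is blurred by the *intensity* PSF |h|^2
image_incoherent = conv(np.abs(amp) ** 2 * illum_int, np.abs(h) ** 2)

# Coherent (scattering) imaging with the same structured illumination: the
# pattern multiplies the complex amplitude, the blur acts on the amplitude,
# and only then do we take |.|^2 to get the recorded intensity
image_coherent = np.abs(conv(amp * illum_amp, h)) ** 2

print("incoherent image peak:", image_incoherent.max())
print("coherent image peak:  ", image_coherent.max())
```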

So why have papers reported resolution improvements beyond the Abbe limit when using SIM for scattering imaging? Wicker and Heintzmann offer the explanation that those authors applied inappropriate methods to determine the resolution of their systems. The papers in question each reported imaging a single microsphere of sub-resolution diameter and plotting a cross-section through it. They then took this to be the intensity point spread function of their system, and quoted its full width as a measure of their resolution. But Wicker and Heintzmann point out that this isn’t valid for a coherent imaging system, where the image is not a convolution of the intensity point spread function with the sample. Rather, we must take the convolution of the coherent (complex amplitude) point spread function with the complex amplitude of the sample, and only then square the resulting complex amplitude to obtain the image intensity. If two sources are close enough that their amplitude point spread functions overlap significantly, there will be interference and the image will differ from that obtained in an incoherent system. The intensity point spread function measured for a coherent system therefore doesn’t tell us whether we can resolve two points, and isn’t a meaningful measure of resolution. This is demonstrated in the paper by numerical simulations, showing that the use of SIM in coherent imaging leads to an apparent reduction in the width of the intensity profile from a single microsphere, but doesn’t improve the ability to discriminate between two neighbouring microspheres.
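
The interference point is easy to see in a toy 1-D model (a rough sketch in the spirit of the argument, not the authors’ simulation; the Gaussian amplitude PSF and the spacing are arbitrary): the coherent image of two in-phase point scatterers is the square of the summed amplitudes, not the sum of two intensity PSFs, so the dip between them – and hence their resolvability – differs from what the single-point profile alone would suggest.

```python
import numpy as np

n = 2048
x = np.arange(n)
sigma = 10.0                                        # amplitude-PSF width, arbitrary
h = np.exp(-((x - n / 2) ** 2) / (2 * sigma ** 2))  # coherent (amplitude) PSF stand-in

def conv(a, b):
    return np.convolve(a, b, mode="same")

# Two in-phase point scatterers, separated by a little more than the PSF width
p1, p2 = 1000, 1022
amp = np.zeros(n)
amp[[p1, p2]] = 1.0

# Coherent imaging: blur the amplitude, then take the intensity
coherent = conv(amp, h) ** 2

# Incoherent imaging: blur the intensity with the intensity PSF
incoherent = conv(amp ** 2, h ** 2)

def dip_contrast(img):
    """Depth of the valley between the two peaks (1 = fully resolved, 0 = none)."""
    mid = img[(p1 + p2) // 2]
    peak = max(img[p1], img[p2])
    return 1.0 - mid / peak

print("dip between the pair, incoherent:", round(dip_contrast(incoherent), 3))
print("dip between the pair, coherent:  ", round(dip_contrast(coherent), 3))
# Constructive interference fills in the valley almost completely in the
# coherent case, even though a single isolated scatterer would give exactly
# the same-looking intensity spot in both cases.
```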

Given that most applications of SIM are in fluorescence imaging, this commentary isn’t going to drastically impact the field of super-resolution imaging. But it’s a fascinating reminder of why it can be essential to truly understand the fundamentals of what we are doing if we want to produce good science.

References

1. Wicker, K. and Heintzmann, R. “Resolving a misconception about structured illumination.” Nature Photonics 8, 342–344 (2014).
