Image Corrections


Illumination Correction

The goal of illumination correction is to remove uneven illumination of the image caused by non-uniform lighting of the field of view, characteristics of the sensor (such as vignetting), or the orientation of the tissue's surface with respect to the light source.

The simplest forms of illumination correction, called "prospective correction", are based on background subtraction. This requires acquiring additional calibration images with the microscopy apparatus: either by averaging a series of images captured with no sample and no light (a dark image), or with no sample and the light on (a bright image).

Starfish can apply this type of background correction through the ElementWiseMultiply Filter. The user is responsible for transforming their calibration images into a correction matrix, which ElementWiseMultiply then applies to even out the illumination.
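As a sketch of how such a correction matrix might be assembled, here is a minimal NumPy example with mocked calibration images (the names and the simple gain-plus-offset camera model are assumptions for illustration, not starfish code):

```python
import numpy as np

# Hypothetical calibration images (mocked here with NumPy):
# `dark` averages frames taken with no sample and no light (sensor offset),
# `bright` averages frames taken with no sample and the light on.
rng = np.random.default_rng(0)
dark = np.full((64, 64), 0.02)
bright = 0.5 + 0.4 * np.linspace(0, 1, 64)[np.newaxis, :]  # uneven illumination
flat = bright - dark  # per-pixel gain of the optical path

# The multiplicative correction is the reciprocal illumination profile,
# scaled so that a perfectly even field would be left unchanged.
mult = flat.mean() / flat

# Mock an acquisition: pixel value = gain * signal + offset.
signal = rng.random((64, 64))
raw = signal * flat + dark

# Subtract the dark frame, then flatten the illumination.
corrected = (raw - dark) * mult  # equals signal * flat.mean() everywhere
```

An array like `mult`, wrapped in an `xr.DataArray` with round/channel/z/y/x dimensions, is the kind of matrix that would be passed to ElementWiseMultiply as its `mult_array` argument.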

The plot below shows a single plane of an in-situ sequencing experiment.


import matplotlib.pyplot as plt
import numpy as np
import xarray as xr

import starfish
import starfish.data
from starfish.types import Axes

experiment = starfish.data.ISS(use_test_data=True)
image: starfish.ImageStack = experiment['fov_001'].get_image('primary')

image_2d = image.sel({Axes.CH: 0, Axes.ROUND: 0, Axes.ZPLANE: 0})

plt.imshow(np.squeeze(image_2d.xarray.values))
plt.show()
[Figure: a single plane of the primary image]


This image was corrected before it was sent to us, but we can introduce an uneven illumination profile ourselves. Below we mock a severe gradient in which the left edge of the image receives only a quarter of the light that the right edge does.

lightness = np.linspace(4, 1, image_2d.xarray.sizes[Axes.X.value])
gradient_data = np.tile(lightness, reps=(image_2d.xarray.sizes[Axes.Y.value], 1))
gradient = xr.DataArray(
    data=gradient_data[np.newaxis, np.newaxis, np.newaxis, :, :],
    dims=(Axes.ROUND.value, Axes.CH.value, Axes.ZPLANE.value, Axes.Y.value, Axes.X.value)
)

# introduce the gradient, overwriting the ImageStack
data = image_2d.xarray.values / gradient.values
image_2d = starfish.ImageStack.from_numpy(data)

# display the resulting image
plt.imshow(np.squeeze(image_2d.xarray.values))
plt.show()
[Figure: the image with the mock illumination gradient applied]


The illumination profile has increased the intensity of the background on the right side of the image. This is problematic for the many spot finding methods that set a single peak-intensity threshold globally across the image: spots in dimly illuminated regions can fall below the threshold and be excluded, biasing any downstream spatial analysis.
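To make the thresholding failure concrete, here is a small synthetic example (plain NumPy, not starfish code): two spots of identical brightness sit on an unevenly lit background, and a single global threshold keeps one but loses the other.

```python
import numpy as np

# Two spots of identical brightness on a uniform background.
image = np.full((50, 100), 0.05)
image[25, 20] += 0.20   # spot in the dim (left) half
image[25, 80] += 0.20   # spot in the bright (right) half

# Uneven illumination: the left edge receives a quarter of the light.
illumination = np.linspace(0.25, 1.0, 100)[np.newaxis, :]
observed = image * illumination

threshold = 0.15                            # one global intensity threshold
detected = observed > threshold
print(detected[25, 20], detected[25, 80])   # False True: the dim spot is lost

# After flattening the illumination, both spots clear the same threshold.
corrected = observed / illumination
print(corrected[25, 20] > threshold, corrected[25, 80] > threshold)  # True True
```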

We use starfish's ElementWiseMultiply to multiply the image by a gradient; here it is simply the same gradient we divided the image by. In a typical microscopy experiment, however, the correction should be derived from the dark or bright calibration images taken for the microscope, and it is likely to be more complex than a simple gradient.

ewm = starfish.image.Filter.ElementWiseMultiply(mult_array=gradient)
corrected_image_2d = ewm.run(image_2d, in_place=False)

The image should now be returned to normal:

plt.imshow(np.squeeze(corrected_image_2d.xarray))
plt.show()
[Figure: the corrected image]

Chromatic Aberration

Chromatic aberration refers to the failure of a lens to focus all colors to the same point. Because multiplexed spot counting experiments typically collect fluorescence signals from several spectral bands and store each color channel as its own image, the resulting artifacts can be difficult to detect.

Starfish currently exposes some basic registration methods, and these can be enough to correct very minor chromatic aberrations. Additionally, non-multiplexed approaches may only require that spots land in the correct cell, which gives some latitude to ignore minor aberrations.
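The translation component of such a registration can be sketched with plain NumPy using phase correlation. `estimate_translation` is a hypothetical helper (not starfish's API), and real chromatic aberration is rarely a pure integer-pixel translation, so treat this as an illustration of the idea only:

```python
import numpy as np

def estimate_translation(reference, moving):
    # Phase correlation: the normalized cross-power spectrum of the two
    # images inverse-transforms to a surface whose peak sits at the
    # integer-pixel shift that registers `moving` onto `reference`.
    cross_power = np.fft.fft2(reference) * np.conj(np.fft.fft2(moving))
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.fft.ifft2(cross_power).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    # Peaks in the upper half of each axis encode negative shifts.
    shape = np.array(corr.shape)
    peak[peak > shape // 2] -= shape[peak > shape // 2]
    return tuple(int(p) for p in peak)

rng = np.random.default_rng(0)
reference = rng.random((64, 64))
# Mock a channel displaced by a small chromatic offset; a circular shift
# keeps the comparison exact at the image borders.
moving = np.roll(reference, shift=(3, -5), axis=(0, 1))

shift = estimate_translation(reference, moving)
registered = np.roll(moving, shift=shift, axis=(0, 1))
print(shift)  # (-3, 5): shifting `moving` by this registers it exactly
```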

However, most multiplexed experiments will require correction beyond starfish's basic translation. At this time starfish does not provide tooling for the correction of chromatic aberrations, so users must correct these errors in their data before submitting it to starfish. That said, we would be very excited to receive code contributions of filters that solve these problems (see contributing to starfish).

To read more about the types of chromatic aberration that can appear in microscopy data, see Wikipedia.


Deconvolution of Optical Point Spread Functions

Deconvolution is a technique for reversing optical distortion introduced by the microscope. It works by assuming that light follows a perfect path through the instrument but is convolved with a "point spread function" (PSF); deconvolving the image with that PSF removes the distortion.

The point spread function can be determined in several ways. Ideally, it is measured during calibration of the microscope, in which case the blur can be removed with the Richardson-Lucy algorithm (API: DeconvolvePSF).
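As an illustration of the idea, here is a minimal NumPy re-implementation of the Richardson-Lucy update (not starfish's DeconvolvePSF; it uses circular FFT convolutions and a synthetic scene, so it is a sketch rather than production code):

```python
import numpy as np

def richardson_lucy(observed, psf_origin, num_iter=30):
    # Minimal Richardson-Lucy deconvolution with circular (FFT) convolutions.
    # `psf_origin` must be centred on the (0, 0) pixel, e.g. via np.fft.ifftshift.
    otf = np.fft.rfft2(psf_origin)
    estimate = np.full(observed.shape, observed.mean())
    for _ in range(num_iter):
        blurred = np.fft.irfft2(np.fft.rfft2(estimate) * otf, s=observed.shape)
        ratio = observed / np.maximum(blurred, 1e-12)
        # Correlating the ratio with the PSF (conjugate OTF) is the RL update.
        estimate *= np.fft.irfft2(np.fft.rfft2(ratio) * np.conj(otf), s=observed.shape)
    return estimate

# Synthetic scene: two point sources blurred by a known Gaussian PSF.
shape = (64, 64)
yy, xx = np.indices(shape)
psf = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()
psf_origin = np.fft.ifftshift(psf)  # move the PSF centre to the origin

scene = np.zeros(shape)
scene[20, 20] = scene[40, 45] = 1.0
blurred = np.fft.irfft2(np.fft.rfft2(scene) * np.fft.rfft2(psf_origin), s=shape)

restored = richardson_lucy(blurred, psf_origin, num_iter=30)
print(blurred.max(), restored.max())  # the restored peaks are much sharper
```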

Learning the PSF from image data

The simplest way to learn the PSF from spot-based data is to learn the shape of the spots. If we assume the spots derive from point sources of light, the average spot shape can form the basis for estimating the PSF needed by Richardson-Lucy (see above). This depends on the data having a relatively uniform spot shape, a condition that not all image-based transcriptomics experiments and relatively few image-based proteomics experiments satisfy.

# TODO incorporate Nick’s vignette.
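A minimal sketch of the average-spot-shape approach, assuming spot centres have already been located by a spot finder (`estimate_psf_from_spots` is a hypothetical helper, and the data here are synthetic):

```python
import numpy as np

def estimate_psf_from_spots(image, spot_coords, radius=6):
    # Average fixed-size windows centred on detected spots; the mean spot
    # shape approximates the PSF if spots are point-like and well separated.
    patches = [
        image[y - radius : y + radius + 1, x - radius : x + radius + 1]
        for y, x in spot_coords
        if radius <= y < image.shape[0] - radius
        and radius <= x < image.shape[1] - radius  # skip spots near the edge
    ]
    psf = np.mean(patches, axis=0)
    psf -= psf.min()        # remove the local background floor
    return psf / psf.sum()  # normalise the PSF to unit mass

# Synthetic data: Gaussian spots of varying brightness at known positions.
image = np.zeros((128, 128))
yy, xx = np.indices(image.shape)
coords = [(30, 40), (60, 90), (100, 25), (80, 60)]
for (y, x), amp in zip(coords, [1.0, 0.6, 0.8, 1.2]):
    image += amp * np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * 1.5 ** 2))

psf = estimate_psf_from_spots(image, coords)
peak = tuple(int(i) for i in np.unravel_index(np.argmax(psf), psf.shape))
print(psf.shape, peak)  # (13, 13) (6, 6): the estimate peaks at its centre
```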

The PSF can also be estimated from the frequency content of the experimental data itself, an approach called "blind deconvolution", for which variants of both the Richardson-Lucy algorithm and the Wiener filter have been proposed. Starfish does not provide any tooling for blind deconvolution.

To read more about image deconvolution, see this article: https://en.wikipedia.org/wiki/Deconvolution#Optics_and_other_imaging


Example Image Correction Pipeline

TODO put together a worked example in the gallery and link it here.

Gallery generated by Sphinx-Gallery