Aberrated Image Recovery of Ultracold Atoms#

%matplotlib inline
import matplotlib.pyplot as plt
import seaborn as sns; sns.set()
import numpy as np
import pandas as pd
import scipy as sp
import skimage as ski
import sklearn as skl
from sklearn import cluster

Overview#

https://raw.githubusercontent.com/illinois-ipaml/MachineLearningForPhysics/main/img/Project_UltraColdAtoms.png

Over the past 20-30 years the groundwork has been laid for precise experimental control of atomic gases at ultracold temperatures. These ultracold atom gas experiments exploit the quantum mechanics of their underlying atomic systems for a diverse set of applications, ranging from the simulation of computationally difficult physics problems [1] to the sensing of new physics beyond the standard model [2]. Most experiments in this field rely on directly imaging the ultracold gas they make, in order to extract data about the size and shape of the atomic number density distribution being imaged.

The goal of this project is to introduce you to some relevant image processing techniques, as well as to get familiar with an image as a data element. As you will demonstrate, images are a kind of data with a very large number of features, but where almost all of those features within some region of interest are highly correlated. Of interest in this project is how the effects of real imaging systems distort the information contained in an image, and how those effects can be unfolded from the data to recover information about the true density distribution.

Capturing all possible kinds of aberrations and noise present in real experimental data of ultracold atom images is outside the scope of the simulated test data in this project. Instead, we limit ourselves to a few key effects: optical aberrations in the form of defocus and primary spherical aberrations, pixelization from a finite detector resolution, and at times toy number density noise simulated as simple Gaussian noise.

Data Sources#

Data is available here on the course GitHub, along with ImageSimulation.ipynb, the notebook used to generate the data. You are encouraged to read through that notebook to see how the simulation works and even generate your own datasets if desired.

You are welcome to access the images however works best for you, but a simple solution is to download the dataset folder to the same path as this notebook (or the one you plan to do your analysis in) and use the following helper function to import an image based on the true parameters of the underlying density distribution—found in the image file name.

def Import_Image(dataset_name, imaging_sys, atom_number, mu_x, mu_y, mu_z, sigma_x, sigma_y, sigma_z, Z04, seed):
    """Import an image following the project file naming structure.
    
    :param dataset_name: str
        Name of the dataset.
    :param imaging_sys: str
        Name of the imaging system (e.g. 'LoNA' or 'HiNA').
    :param atom_number: int
        Atom number normalization factor. 
    :param mu_x: float
        Density distribution true mean x.
    :param mu_y: float
        Density distribution true mean y.
    :param mu_z: float
        Density distribution true mean z.
    :param sigma_x: float
        Density distribution true sigma x.
    :param sigma_y: float
        Density distribution true sigma y.
    :param sigma_z: float
        Density distribution true sigma z.
    :param Z04: float
        Coefficient of spherical aberrations.
    :param seed: int
        Seed for Gaussian noise.
    :return: ndarray
        The imported image.
    """
    file_name = imaging_sys + '.atom_num=%i.mu=(%0.1f...%0.1f...%0.1f).sigma=(%0.1f...%0.1f...%0.1f).Z04=%0.1f.seed=%i.tiff' % (atom_number, mu_x, mu_y, mu_z, sigma_x, sigma_y, sigma_z, Z04, seed)
    im = ski.io.imread(dataset_name + '/' + file_name)
    return im
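Because the true parameters are encoded in the file name, the naming convention can be sanity-checked without the dataset on disk. A quick sketch (the parameter values below are hypothetical examples, not necessarily a file in the dataset):

```python
# Build a file name following the project naming structure, using
# hypothetical example parameters (not necessarily a real file).
imaging_sys, atom_number, seed = 'HiNA', 100000, 1
mu = (0.0, 0.0, 5.0)
sigma = (1.0, 1.0, 1.0)
Z04 = 0.3

file_name = imaging_sys + '.atom_num=%i.mu=(%0.1f...%0.1f...%0.1f).sigma=(%0.1f...%0.1f...%0.1f).Z04=%0.1f.seed=%i.tiff' % (
    (atom_number,) + mu + sigma + (Z04, seed))
print(file_name)
```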

Questions#

In this part of the project you will attempt to use two kinds of blind deconvolution algorithms to recover the measurement response, or point spread function (PSF), of a cold atom imaging system. In experiments we sometimes can’t reliably create gases far below the Rayleigh criterion with which to directly measure the PSF. Additionally, some optical aberrations can’t be corrected for; spherical aberrations in particular represent a source of systematic error that may need addressing. The inability to make resolution-limited objects also poses a problem when trying to bring an image into focus, as several planes will appear to be in focus at once. The ability to suss out the measurement response without much a priori knowledge is therefore an incredibly valuable tool for the fast implementation of new imaging systems.

The images in this dataset have been generated with the same ImageSimulation.ipynb available at the data source link. Again we are using the high numerical aperture (\(\text{NA} = 0.4\)) imaging system with an \(M=20\times\) telescope projecting the image onto the same \(\text{PS}=13\) um detector. The images are limited to small isotropic gases (\(\sigma = (1.0 \pm 0.5)\) um), still above the resolution limit, so we will crop a smaller region of interest of only \(25\) um about the center of the gases and work with less data. As hinted at above, a moderate amount of spherical aberration is present in the response (\(Z_0^4=0.3\)), and images are taken with the object centered at planes \(\pm 5\) um about the true best focus.
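For scale, the Rayleigh criterion for this system can be estimated directly from \(r = 0.61\lambda/\text{NA}\). A quick sketch, assuming a typical alkali imaging wavelength of 780 nm (the wavelength is an assumption here, not stated above):

```python
# Rayleigh criterion r = 0.61 * lambda / NA for the NA = 0.4 system,
# assuming an imaging wavelength of 780 nm (an assumption, not given above).
wavelength_um = 0.780
NA = 0.4
r_um = 0.61 * wavelength_um / NA
print(round(r_um, 2))  # ~1.19 um, comparable to the sigma ~ 1.0 um gases
```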

Also available in the dataset is a truePSF.tiff image, which contains a measurement of the imaging system’s response to an object much smaller than the Rayleigh criterion, i.e. the true PSF of the imaging system. Import this as a separate data object, which you will compare to your fitted responses. Note that the PSF here is also ‘pixel limited’, meaning we can’t resolve the first minima of the diffraction pattern.

Question 01#

Implement a linear deconvolution of the data in order to estimate the PSF of the aberrated imaging system used to generate the data. Display images of the fitted response, and a few images before and after deconvolution with the fitted response. A useful starting place would be to implement a regression algorithm from the sklearn toolbox with Tikhonov regularization.
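One possible structure for that regression, sketched below under simplifying assumptions: if the true object is treated as known (e.g. a Gaussian built from the file-name parameters), each image pixel is a linear function of the unknown kernel, and sklearn's `Ridge` supplies the Tikhonov term. The function and variable names here are illustrative, not part of the project code.

```python
import numpy as np
from scipy.signal import convolve2d
from sklearn.linear_model import Ridge

def estimate_psf(obj, img, ksize=7, alpha=1e-6):
    """Estimate a (ksize x ksize) kernel h with img ~= obj (*) h by
    Tikhonov-regularized least squares: each pixel of img is a linear
    function of h, with coefficients given by a flipped patch of obj."""
    pad = ksize // 2
    padded = np.pad(obj, pad)
    rows = []
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = padded[i:i + ksize, j:j + ksize]
            rows.append(patch[::-1, ::-1].ravel())  # flip -> convolution, not correlation
    X = np.array(rows)
    reg = Ridge(alpha=alpha, fit_intercept=False).fit(X, img.ravel())
    return reg.coef_.reshape(ksize, ksize)

# Quick self-check on synthetic data: blur a Gaussian object with a known
# kernel and confirm the kernel is recovered.
y, x = np.mgrid[-15:16, -15:16]
obj = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
h_true = np.ones((3, 3)) / 9.0
img = convolve2d(obj, h_true, mode='same')
h_est = estimate_psf(obj, img, ksize=3)
```

In practice the fit would pool many images (and the regularization strength would be tuned against the noise level), but the linear model is the same.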

How well does your fitted response compare to the true PSF? Use an explicit metric.
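One simple choice of metric, assuming both responses are stored as nonnegative arrays of the same shape, is the normalized cross-correlation; the names below are illustrative:

```python
import numpy as np

def psf_similarity(p, q):
    """Normalized cross-correlation of two PSF arrays: 1.0 means
    identical up to an overall scale factor."""
    p = p.ravel() / np.linalg.norm(p)
    q = q.ravel() / np.linalg.norm(q)
    return float(p @ q)

a = np.outer([1.0, 2.0, 1.0], [1.0, 2.0, 1.0])  # toy 3x3 "PSF"
print(psf_similarity(a, 5.0 * a))  # scale-invariant: 1.0 up to floating point
```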

Question 02#

Read through chapter 5 of [3], and implement a similarly structured convolutional neural network (CNN) to perform a blind deconvolution of the PSF in the dataset here. [4] and [5] may also provide useful frameworks.
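As a reminder of the primitive such a network stacks, here is a minimal numpy/scipy sketch of a single convolution + ReLU layer (a real implementation would use a deep-learning framework with trainable kernels; all names here are illustrative):

```python
import numpy as np
from scipy.signal import correlate2d

def conv_relu(x, kernels, biases):
    """One CNN layer: 'valid' cross-correlation of a 2D input with each
    kernel, plus a per-channel bias, followed by a ReLU nonlinearity."""
    out = np.stack([correlate2d(x, k, mode='valid') + b
                    for k, b in zip(kernels, biases)])
    return np.maximum(out, 0.0)

rng = np.random.default_rng(0)
x = rng.normal(size=(16, 16))          # toy input image
kernels = rng.normal(size=(4, 3, 3))   # 4 channels of 3x3 kernels
features = conv_relu(x, kernels, np.zeros(4))
print(features.shape)  # (4, 14, 14): 'valid' mode shrinks each side by 2
```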

Compare with the response you get from the linear deconvolution in question 01.

References#

  • [1] S. S. Kondov, W. R. McGehee, J. J. Zirbel, and B. DeMarco, Science 334, 66 (2011).

  • [2] W. B. Cairncross, D. N. Gresh, M. Grau, K. C. Cossel, T. S. Roussy, Y. Ni, Y. Zhou, J. Ye, and E. A. Cornell, Physical Review Letters 119, (2017).

  • [3] C. J. Schuler, Machine learning approaches to image deconvolution. Eberhard Karls University of Tübingen (2014).

  • [4] J. Herbel, T. Kacprzak, A. Amara, A. Refregier, and A. Lucchi, Journal of Cosmology and Astroparticle Physics 2018(07), 054 (2018).

  • [5] R. J. Steriti and M. A. Fiddy, “Blind deconvolution of images by use of neural networks,” Opt. Lett. 19, 575-577 (1994)


Acknowledgements#

  • Initial version: Max Gold with some guidance from Mark Neubauer

© Copyright 2023