Imperfect cameras and how to fix them
I began working on ESA's Sentinel-4 instrument simulator project the moment I joined S[&]T back in May last year. Since I came directly from an M.Sc. in Lattice Quantum Chromodynamics (which involved heavy numerics), where I was used to programming in C++ with the Qt Creator IDE as well as in Python, I was pleasantly surprised to find that the project I was assigned to was written in C++ using Qt Creator, with a dash of Python on the side.
As mentioned, the project involved creating an Instrument Data Simulator (IDS) for the Sentinel-4 satellite's Ultraviolet-Visible-Near-infrared (UVN) spectrometer. Sentinel-4 will be a geostationary (35 786 km) satellite that will monitor trace gases in the atmosphere (e.g. pollution) above Europe once launched in the coming years. In contrast, Sentinel-5P, which S[&]T NL is heavily involved in, is a LEO (around 800 km) satellite in a near-polar orbit, and thus covers the entire Earth. Nevertheless, the spectrometer instruments of the two satellites are quite similar.
To verify that the satellite's correction software works properly pre-launch, one needs to simulate the satellite's output data. The output of our IDS is used as test input to the actual imagery correction processor.
Inside the UVN spectrometer there are two detectors we are simulating: the UVVIS (Ultra-Violet and VISual), spanning 305-500 nm, and the NIR (Near-InfraRed), spanning 750-775 nm. As with all optics, these are susceptible to a range of phenomena that need to be accounted for. For the past six months, we have been working on implementing a simulation of the phenomenon called straylight.
Straylight appears in any system of lenses and causes light to be scattered in hard-to-predict ways. For normal commercial cameras the effect is mostly ignored and hardly noticeable, since it only accounts for 2-3% of the signal, but for scientific imaging spectrometers straylight needs to be subtracted from the signal before more detailed analysis (such as detecting trace gases) is undertaken.
In the left figure, we see the intended light path. In the center we see an example of light hitting the sensor due to unintended reflections off the mirror itself, while in the right one we see straylight scattered off surfaces inside the apparatus. Source: EUMETSAT
To simulate the stray light signal, we were tasked to implement three algorithms to cover different phenomena.
- Convolutional straylight: aims to rectify near-field straylight in the spectral and spatial dimensions (i.e. localized smearing of the signal), computed by convolving a region on the detector with a predefined kernel, which is then subtracted.
- Uniform straylight: computes straylight caused by scattering from optical surface roughness or particulate contamination, in the form of an Airy disk and other aberrations.
- Ghost straylight: a ghost, in the context of straylight, occurs when a portion of the intensity in one "source" region or pixel is transferred to a "target" region or pixel. This happens when (a small fraction of) light is reflected by lens surfaces, prisms or grisms (where most light is transmitted), and when light is reflected by the detector itself (where most light is absorbed).
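As a sketch of the convolutional case, the near-field smearing can be modeled by convolving the detector image with a normalized kernel. The Gaussian kernel and the 3% straylight fraction below are toy assumptions for illustration, not the instrument's actual kernel:

```python
import numpy as np
from scipy.signal import fftconvolve

def convolutional_straylight(image, kernel):
    """Estimate near-field straylight by convolving the detector
    image with a (normalized) straylight kernel."""
    return fftconvolve(image, kernel, mode="same")

# Toy example: a single point source smeared by a small Gaussian kernel.
image = np.zeros((64, 64))
image[32, 32] = 1.0

x = np.arange(-7, 8)
xx, yy = np.meshgrid(x, x)
kernel = np.exp(-(xx**2 + yy**2) / (2 * 3.0**2))
kernel *= 0.03 / kernel.sum()   # assume ~3% of the energy ends up as straylight

straylight = convolutional_straylight(image, kernel)
corrected = image - straylight  # first-order subtraction of the straylight
```

In a correction processor the subtraction would of course use a measured kernel rather than a made-up Gaussian.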
The top image is the input signal, and the bottom is the straylight for the UVVIS detector. Notice that in the upper part of the images (the northern region) there is virtually no signal. This is because we are imaging empty, deep space in this region (there are stars there, which we however don't simulate), so no light comes from it, and the whole detected signal is due to straylight from the Earth. The bright horizontal lines are clouds, and the relatively dark homogeneous area in the lower part of the image is a stripe of the Sahara desert.
Perhaps the most interesting part of the project so far has been deriving the convolution kernels, the uniform straylight matrix and the ghost signals based on more basic simulations. Having just completed an M.Sc. in Computational Physics, I suddenly found myself in need of many of the tools I had learned over the past few years.
To derive the convolutional and uniform kernels, we had been given simulated, ray-traced images for a monopositional, monochromatic input signal for each of the UVVIS and NIR detectors.
A simulated straylight signal. The center is where the actual signal exists, while the straylight is spread outwards before dying off completely.
From these, we needed to separate out the different components of the signal, as they correspond to the different types of straylight (except ghost straylight). What we do is mask a region corresponding to the estimated size of a given effect, and solve the biharmonic equation over the masked-out area.
This may sound strange at first, but the advantage is that solving the biharmonic equation over a masked-out surface ensures that the resulting surface is smooth; that is, its derivatives are continuous.
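A minimal sketch of this inpainting step, assuming a 5-point Laplacian discretization and a small grid (a real implementation would treat the grid boundary more carefully and use an iterative solver for large detectors):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def inpaint_biharmonic(image, mask):
    """Fill the masked pixels by solving the biharmonic equation
    (the Laplacian applied twice equals zero) with the unmasked
    pixels acting as boundary data."""
    ny, nx = image.shape
    n = ny * nx
    idx = lambda i, j: i * nx + j
    # Assemble a 5-point Laplacian over the full grid.
    L = sp.lil_matrix((n, n))
    for i in range(ny):
        for j in range(nx):
            k = idx(i, j)
            L[k, k] = -4.0
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < ny and 0 <= jj < nx:
                    L[k, idx(ii, jj)] = 1.0
    B = (L @ L).tocsr()        # biharmonic operator
    free = mask.ravel()        # pixels to solve for
    fixed = ~free              # known pixels (boundary data)
    u = image.ravel().astype(float).copy()
    # Move the contribution of the known pixels to the right-hand side.
    rhs = -B[free][:, fixed] @ u[fixed]
    u[free] = spsolve(B[free][:, free].tocsc(), rhs)
    return u.reshape(ny, nx)
```

Filling a masked patch inside a constant image, for instance, reproduces the constant, which is the smoothness property the text describes.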
The numerical method we landed on was the conjugate gradient method, which I was already familiar with thanks to my degree.
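For reference, a bare-bones conjugate gradient solver for a symmetric positive definite system looks like this (a textbook sketch, not our production code):

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Solve A x = b for a symmetric positive definite matrix A."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # optimal step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # new A-conjugate direction
        rs = rs_new
    return x

# Usage: a small SPD system.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG converges in at most as many iterations as the system has unknowns, which is what makes it attractive for the large sparse systems that arise from PDE discretizations like the one above.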
Finally, by taking different linear combinations using different masks, we extracted the kernels to be used. For the uniform straylight we constructed a matrix-to-matrix 4D tensor, which basically means that it holds information about how each pixel transfers a small portion of its signal to the rest of the detector.
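The matrix-to-matrix tensor can be sketched as a 4D array applied with a tensor contraction; the shapes and the 2% straylight fraction below are illustrative, not the real detector values:

```python
import numpy as np

# Sketch of a matrix-to-matrix straylight operator T of shape
# (ny, nx, ny, nx), where T[i, j, k, l] is the fraction of the signal
# in source pixel (k, l) that ends up in target pixel (i, j).
ny, nx = 4, 5
rng = np.random.default_rng(0)
T = rng.random((ny, nx, ny, nx))
T *= 0.02 / T.sum(axis=(0, 1))   # each source pixel spreads 2% of its signal

signal = rng.random((ny, nx))
# Contract the source axes (k, l) against the signal.
straylight = np.einsum("ijkl,kl->ij", T, signal)
```

With the normalization above, the total straylight is exactly 2% of the total input signal, which makes the tensor easy to sanity-check.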
The same was done for the convolutional straylight, where kernels were extracted in a similar fashion and then applied to different regions on the detector. As a proof of concept, I constructed blocks from a Voronoi tessellation and assigned a random convolution kernel to each of them.
The idea behind this is that each kernel can carry information about how much the signal is smeared out as straylight in each block or region, so we can tune the smearing to whatever physical detector measurements we may be getting.
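A toy version of this proof of concept, assuming random Gaussian kernels and nearest-seed Voronoi labeling (the seed count, kernel widths and image size are made up):

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(1)
ny, nx = 64, 64

# Random Voronoi seed points; each pixel belongs to its nearest seed.
seeds = rng.uniform(0, [ny, nx], size=(5, 2))
yy, xx = np.mgrid[0:ny, 0:nx]
dist = (yy[..., None] - seeds[:, 0])**2 + (xx[..., None] - seeds[:, 1])**2
labels = dist.argmin(axis=-1)

def random_kernel(size=9):
    """A normalized Gaussian kernel with a random width."""
    sigma = rng.uniform(1.0, 3.0)
    x = np.arange(size) - size // 2
    k = np.exp(-(x[:, None]**2 + x[None, :]**2) / (2 * sigma**2))
    return k / k.sum()

image = rng.random((ny, nx))
result = np.zeros_like(image)
for block in range(len(seeds)):
    mask = labels == block
    # Convolve the whole image, then keep only this block's pixels.
    result[mask] = fftconvolve(image, random_kernel(), mode="same")[mask]
```

Swapping the random kernels for measured ones would then tune the per-region smearing as described above.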
Ghost straylight is perhaps the simplest straylight to illustrate, as a part of the signal on the detector is transferred to some other location. This can be either within the wavelength boundaries of the detector, or it can come from outside the detector.
Different wavelength ranges of the input signal on the left are stretched and translated to other parts of the ghost output signal on the right.
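A ghost can be sketched as stretching and translating a band of rows (assuming rows correspond to wavelength) while transferring a small fraction of its intensity; the row ranges and the 1% fraction below are made up for illustration:

```python
import numpy as np

def add_ghost(image, src_rows, dst_rows, fraction=0.01):
    """Transfer a fraction of the signal in a source row band to a
    destination band, stretching it to fit the destination size."""
    src = image[src_rows[0]:src_rows[1]]
    n_dst = dst_rows[1] - dst_rows[0]
    # Stretch the source band to the destination size by linear
    # interpolation along the wavelength (row) axis.
    old = np.linspace(0.0, 1.0, src.shape[0])
    new = np.linspace(0.0, 1.0, n_dst)
    stretched = np.array(
        [np.interp(new, old, src[:, j]) for j in range(src.shape[1])]
    ).T
    ghost = np.zeros_like(image)
    ghost[dst_rows[0]:dst_rows[1]] = fraction * stretched
    return ghost

# A bright band of rows produces a fainter, stretched ghost elsewhere.
image = np.zeros((100, 8))
image[20:30] = 1.0
ghost = add_ghost(image, src_rows=(20, 30), dst_rows=(60, 80), fraction=0.01)
```

Adding `ghost` to the input signal gives the simulated detector image; the correction processor would have to subtract it again.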
To summarize, straylight is inescapable when using a camera. It's light that should not be there, and it can be remedied in several ways, some of which we have seen here.
We are currently wrapping up our work on the IDS for the Sentinel-4 satellite mission, and we expect to dial down our work on it over the next few years. This was my first project at S[&]T, and it has served as a soft landing from academia into industry. In other words, I'm already looking forward to the next project.
About the author
Mathias Vege is a scientific software engineer at Science & Technology in Norway, who holds a Master's degree in Computational Physics from the University of Oslo. He has experience with large-scale parallelization and high-performance supercomputing clusters, with code repositories of many thousands of lines of code. Mathias is one of the key engineers working on the Sentinel-4 project.