Image Restoration Project

ABSTRACT


As part of the Research and Development effort at the Space Applications Centre (SAC), ISRO (Indian Space Research Organisation), there have been several studies to measure the Modulation Transfer Function (MTF) of a complete imaging system from the actual imagery. The images used for deriving the MTF comprised various scenes, including a specially designed target. This report focuses on two particular algorithms for deriving the system characteristics (specifically, its MTF). The first derives the transfer function from linear step edges, while the second derives the MTF from an array of point sources. These algorithms are tested against standard image restoration measures, and the results are documented. The work also illustrates a means of measuring the response of a sensor.


In the second phase of the documentation, the results are applied to a number of images and compared with other research work. Fine tuning of the filters and parameters used, such as the Signal-to-Noise Ratio (SNR), to yield the best output is discussed, with proper justification. As a conclusion to the project, future extensions of the existing system and further research in the area are explored.


Subject Terms: Image Processing, Image Restoration, Point Spread Function, Modulation Transfer Function, Sample-Scene Phasing.





Project Definition Profile / Problem Statement


There have been several studies to measure the Modulation Transfer Function (MTF) of an imaging system from actual imagery. Much of this work has concentrated on measuring the MTF of the optical system, the electronic system, or other parts of the imaging structure. In contrast, my project at SAC, ISRO focused on deriving the MTF of the imaging system as a whole from the imagery captured by it. The images used for deriving the MTF comprised various scenes, including a specially designed target. We decided to settle on two particular algorithms for deriving the system characteristics.


In the first algorithm, a high resolution satellite image acquired over a linear ground edge with sufficient contrast is analyzed to derive the Line Spread Function (LSF) of the imaging system in both directions: along track (parallel to the satellite motion) and across track (perpendicular to it). The two LSFs are subsequently pooled to give a normalized Point Spread Function (PSF), and this PSF is then transformed to the frequency domain for application to the imagery. The results are also applied to other images acquired by the same imaging system to record their response.
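The following Python sketch outlines the essence of this procedure; it is illustrative only (the edge profiles, function names, and the assumption of a separable PSF are mine, not the project's actual code). The edge spread function is differentiated to obtain the LSF in each direction, the two LSFs are pooled into a normalized PSF, and its Fourier transform gives the system MTF.

    # Edge method sketch: ESF -> LSF -> (separable) PSF -> MTF
    import numpy as np

    def lsf_from_edge(esf):
        """Derive a line spread function from a 1-D edge profile (ESF)."""
        lsf = np.abs(np.gradient(np.asarray(esf, dtype=float)))
        return lsf / lsf.sum()                      # normalize to unit area

    # Hypothetical edge profiles sampled across the along-track and
    # across-track edges of the target.
    esf_along  = [10, 10, 11, 14, 30, 70, 95, 99, 100, 100]
    esf_across = [10, 10, 12, 20, 50, 85, 97, 99, 100, 100]

    lsf_along  = lsf_from_edge(esf_along)
    lsf_across = lsf_from_edge(esf_across)

    # Pool the two LSFs into a normalized 2-D PSF (separability assumed)
    # and take its Fourier transform; the magnitude is the system MTF.
    psf = np.outer(lsf_along, lsf_across)
    psf /= psf.sum()
    mtf = np.abs(np.fft.fftshift(np.fft.fft2(psf)))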


In the second algorithm, the PSF is measured using a two-dimensional array of squares specially constructed at a target site. The design of the target creates ¼ pixel shifts of “point sources” throughout the instantaneous field of view (IFOV) of the satellite. This shifting allows the exploitation of sample-scene phasing (the quarter-pixel shift) to effectively resample the PSF at one-fourth of the IFOV and thereby avoid aliasing in the MTF. The methods for sorting the image data, “assembling” the PSF, and correcting for the rotation of the target relative to the sensor scan direction are implemented. The results are further extended by applying the PSF to the image and documenting the outcome against standard image restoration measures.
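A minimal sketch of the PSF-assembly step is given below, under simplifying assumptions of my own: each point-source response has already been extracted as a small image chip, its quarter-pixel scene phase is known, and the rotation correction has already been applied. The chips are interleaved onto a grid sampled at one-fourth of the IFOV.

    # Sample-scene phasing sketch: interleave point-source responses with
    # known quarter-pixel phases onto an oversampled PSF grid.
    import numpy as np

    def assemble_psf(chips, phases, factor=4):
        """chips  : list of (h, w) arrays, one point-source response each.
        phases : list of (dy, dx) integer offsets, 0..factor-1, giving each
                 chip's scene phase in units of 1/factor of a pixel.
        Returns the PSF resampled at 1/factor of the IFOV."""
        h, w = chips[0].shape
        grid = np.zeros((h * factor, w * factor))
        hits = np.zeros_like(grid)
        for chip, (dy, dx) in zip(chips, phases):
            # A chip with scene phase (dy, dx) supplies every factor-th
            # sample of the fine grid, starting at (dy, dx).
            grid[dy::factor, dx::factor] += chip
            hits[dy::factor, dx::factor] += 1
        # Average where several chips share a phase, then normalize.
        psf = np.where(hits > 0, grid / np.maximum(hits, 1), 0.0)
        return psf / psf.sum()

Because each phase fills in a different subset of the fine grid, the assembled PSF is effectively sampled four times more finely than the IFOV, which is what prevents aliasing in the derived MTF.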


In the first of the two attempts, results are achieved without any a priori knowledge of the imagery or of other components such as the sensors or digitizers. In the second attempt, the sensor parameters and the satellite characteristics are taken into account, the target site is specially constructed to suit the algorithm, and the results are derived using this a priori knowledge. In the first algorithm, the target area is also specially designed to give good contrast; in this case, however, the algorithm uses no a priori knowledge and remains equally effective on other edges that may be found in natural features. The constructed target nevertheless leads to optimal results.


In the final phase of the documentation, the results (the transfer functions derived using the algorithms) are applied to a number of images obtained from the same imaging system and the improvement is documented. Deviations from the algorithms, the importance of certain parameters such as the Signal-to-Noise Ratio, and other related issues are also discussed. As a conclusion to the project, future extensions of the present work are explored.
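One standard way of applying such a transfer function to an image, in which the SNR enters directly as a tuning parameter, is Wiener deconvolution. The sketch below is illustrative only and is not necessarily the filter finally adopted in the project; the PSF is assumed to be normalized to unit sum.

    # Wiener deconvolution sketch: restore an image given the derived PSF.
    import numpy as np

    def wiener_restore(image, psf, snr=100.0):
        """Deconvolve `image` with `psf` using a Wiener filter.
        A larger snr applies less regularization (assumes less noise)."""
        # Embed the PSF in a full-size array and move its centre to the
        # origin so the restoration introduces no spatial shift.
        pad = np.zeros(image.shape)
        pad[:psf.shape[0], :psf.shape[1]] = psf
        pad = np.roll(pad, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)),
                      axis=(0, 1))
        H = np.fft.fft2(pad)                       # system transfer function
        # Wiener filter: conj(H) / (|H|^2 + 1/SNR)
        W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
        return np.real(np.fft.ifft2(np.fft.fft2(image) * W))

Raising the snr value sharpens the result but amplifies noise, which is why its choice has to be justified against the standard restoration measures mentioned above.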





References


01) B. C. Forster and P. Best, 1994, “Estimation of SPOT P-mode point spread function and derivation of a deconvolution filter”, ISPRS Journal of Photogrammetry and Remote Sensing, Vol. 49, No. 6, pp. 32-42.


02) C. Pinilla Ruiz and F. J. Ariza Lopez, 2002, “Restoring SPOT images using PSF-derived deconvolution filters”, International Journal of Remote Sensing, Vol. 23, No. 12, pp. 2379-2391.


03) R. F. Rauchmiller, Jr. and R. A. Schowengerdt, 1988, “Measurement of the Landsat Thematic Mapper modulation transfer function using an array of point sources”, Optical Engineering (Society of Photo-Optical Instrumentation Engineers), Vol. 27, No. 4, pp. 334-343.