‘Sea-thru’ removes water from underwater images
A key part of my work is colour-matching. When possible I make a follow-up visit to the chapel with a digital copy of the processed stained glass image so that I can compare it with the window in situ. While it is essentially impossible to create a perfectly matched set of colours (the colour temperature of the daylight shining through the glass is so changeable), I do endeavour to ensure that the intensity and saturation are appropriate when comparing one colour with another. Being able to apply an algorithm to ensure colour accuracy would be a fantastic bonus. Just such a process has been developed for underwater photography…
Scientific American recently published an article about oceanographer and engineer Derya Akkaynak (University of Haifa, Israel), who created an algorithm that can ‘remove the water’ from underwater images. Such images typically suffer from a colour shift that increases with depth, as well as backscattered light that creates a haze.
The process was detailed in a paper presented in June (Akkaynak Sea-thru Paper) and requires distance information to work: multiple photographs of the same scene are taken from various angles, and these are used to estimate the distance between the camera and the objects in the scene, from which the water’s light-attenuating effect is removed. The process could prove invaluable for oceanographic researchers.
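To give a feel for what ‘removing the water’ involves, here is a minimal sketch of the kind of image-formation model the paper builds on: the observed colour is the true colour dimmed exponentially with distance, plus a distance-dependent backscatter haze. Given a per-pixel depth map, inverting that model recovers an estimate of the true colour. This is an illustrative simplification, not Akkaynak’s actual implementation (Sea-thru estimates the coefficients from the scene itself, and uses different coefficients for the direct and backscattered components); the function name and the example coefficient values below are my own.

```python
import numpy as np

def remove_water(image, depth, beta_d, beta_b, backscatter):
    """Invert a simplified underwater image-formation model.

    image:       H x W x 3 array, observed colours in [0, 1]
    depth:       H x W array, camera-to-object distance in metres
    beta_d:      per-channel attenuation coefficient (direct signal)
    beta_b:      per-channel backscatter coefficient
    backscatter: per-channel veiling light at infinite distance
    (all coefficient values here are hypothetical, for illustration)
    """
    z = depth[..., np.newaxis]  # broadcast depth over the colour channels
    # Subtract the haze contributed by backscattered light...
    direct = image - backscatter * (1.0 - np.exp(-beta_b * z))
    # ...then undo the exponential attenuation of the direct signal.
    recovered = direct * np.exp(beta_d * z)
    return np.clip(recovered, 0.0, 1.0)
```

Red light is absorbed fastest in water, which is why its (hypothetical) attenuation coefficient would be the largest of the three and why uncorrected underwater photographs look blue-green, much as my stained glass images drift with the daylight behind them.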
Ocean Vision. Scientific American 321(6), 16 (December 2019)