
No, You Can’t Fix the Colors in Software Afterwards

or to be precise: you can’t do it very well, in all but the most ideal conditions.

Wot am I talking about? Light balance correction in underwater photography, of course, and to some degree also underwater videography (where it's even harder).

Light, Software and the Iron Law of Computer Science

The problem is that the deeper you dive, the more the warm colors (red, orange) are absorbed by the water. You'll end up with images which are blue-green-grayish, and which simply lack appeal to the human eye (in most cases; black and white images, for instance, can also look awesome in some situations). But generally, you'd want to have a lot of warm colors in your image.
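The falloff follows the Beer-Lambert law: the fraction of light remaining decays exponentially with depth, and the decay constant is much larger for red than for blue. Here is a minimal sketch in Python; the attenuation coefficients are rough ballpark values for clear ocean water that I've assumed for illustration, not measured data:

```python
import math

# Illustrative diffuse attenuation coefficients (per meter) for clear
# ocean water -- assumed ballpark values, chosen only to show the trend.
ATTENUATION = {"red": 0.35, "green": 0.07, "blue": 0.02}

def light_remaining(depth_m: float, color: str) -> float:
    """Fraction of surface light of a given color left at depth (Beer-Lambert)."""
    return math.exp(-ATTENUATION[color] * depth_m)

for depth in (5, 10, 20):
    print(depth, {c: round(light_remaining(depth, c), 3) for c in ATTENUATION})
```

With these coefficients, roughly 17% of the red light survives at 5 meters, 3% at 10 meters, and well under 0.1% at 20 meters, while most of the blue is still there. That matches the diver's experience: red is the first thing to go.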

Typically, underwater photographers use strobes or video lights to get these warm colors back. The artificial light compensates for the lack of light underwater, and specifically for the lack of warm light, which is missing due to the water's selective absorption of sunlight. But you could also fiddle around with software and add some red, so there is no need for a strobe, right? In reality, that works only to a limited degree, and physics is to blame, not your camera or your software.

Let’s look at the spectrum of white light, at the water surface, on a sunny day: white light is really an even mix of all colors from red via orange, yellow, green, blue to violet. That’s what you as a photographer want to shine on your subjects:

If the light in the place where you are has lost some of the warm colors, the spectrum will look more like the one below. This might be the case on a sunny day in the tropics at a depth of maybe 5 to 10 meters. There is still a lot of red left, just less than on the surface.

In the case of such a spectrum, you can just dial up the warm colors in software, and your image will look roughly like it was illuminated by surface white light. Your image will look natural and appealing. A limited amount of color correction is possible. There are dedicated software packages which do just that, and you can also use the levels or curves function in the image processing software of your choice to achieve the effect (I use GIMP).
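What the Levels tool does in this situation boils down to stretching each color channel so that its observed maximum maps back to full scale. Here is a minimal sketch of that idea in pure Python, on a mock three-pixel "image" I made up for illustration (real tools like GIMP's Levels are more sophisticated, but the principle is the same):

```python
# A per-channel "levels" correction in the spirit of GIMP's Levels tool:
# stretch each channel so its brightest value maps to 255.

def channel_gain_correct(pixels):
    """pixels: list of (r, g, b) tuples in 0-255. Returns a gain-corrected copy."""
    maxima = [max(p[c] for p in pixels) or 1 for c in range(3)]
    gains = [255 / m for m in maxima]
    return [tuple(min(255, round(p[c] * gains[c])) for c in range(3))
            for p in pixels]

# A shallow-water scene: red is reduced but still present, so the stretch
# recovers many distinct red shades and the result looks natural.
shallow = [(120, 180, 200), (90, 160, 190), (60, 140, 180)]
print(channel_gain_correct(shallow))
```

Because the red channel still contains plenty of distinct values, amplifying it preserves the gradations between them, which is why this works at shallow depths.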

The situation is different if the light has a spectrum like that shown below. There is very little, if any, red left – a situation akin to a depth of maybe 20 meters or deeper. It's not pitch-dark at these depths, but the color composition of the light has changed significantly.

So, if you take a photograph with no or almost no red in it, you won't be able to tweak the colors in software enough to make red appear. As usual, we can't ever bypass …

the Iron Law of Computer Science: Shit in, shit out!

If you have shit data, the best piece of software can't rescue it. If you have a photograph without red, the best image processing program can't bring the red back. You might be able to rescue some red hues, but there won't be any significant resolution in these reds: there won't be many distinct shades of red in the picture, and it will typically look unnatural and unappealing. You can amplify only the red which your camera recorded; you can't recover the red colors of your subject which your camera's sensor never captured (your image data is shit). You can't pull information about red hues out of your (camera's) ass.
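You can see the "shit in, shit out" effect in a tiny sketch. Suppose the sensor recorded only the red values 0, 1 and 2 across the deep scene (a made-up example, but representative of a nearly empty channel). Multiplying the channel by a large gain produces the same handful of values, just brighter; no new shades appear:

```python
# Sketch: why boosting red at 20 m fails. If the sensor recorded only a
# few distinct red values, a gain yields the same few values, brighter.

def boost_red(pixels, gain):
    """Multiply the red channel of (r, g, b) pixels by gain, clipped to 255."""
    return [(min(255, round(r * gain)), g, b) for r, g, b in pixels]

# Deep scene: the red channel is almost empty -- only 0, 1 and 2 survive.
deep = [(0, 140, 180), (1, 150, 185), (2, 160, 190), (1, 155, 188)]
boosted = boost_red(deep, gain=80)
red_shades = sorted({r for r, g, b in boosted})
print(red_shades)  # only 3 distinct reds: [0, 80, 160]
```

Three distinct red levels in the whole image: that's the posterized, unnatural look you get when you try to conjure red out of data that never contained it.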

The problem is even worse with video, since the codecs generally used to compress the individual frames compress more strongly than the algorithm that creates a JPG, so the data in the video file contains even less information about colors which are barely present in the scene. A RAW image file, in contrast, contains the most data, but even if you use that (so your file contains everything your sensor recorded), if there was no red in the scene, you can't create it via color correction.

The video below explains the relationships between depth and light, and how to correct for blue-ish imagery:

 

Why This is ALSO Important

There is a superstitious belief going around that flash photography is harmful, or even lethal, to fishes. At least in the case of seahorses and frogfishes that's NOT the case, and there are two excellent scientific papers to prove it. I outline the whole situation in this blog post of mine:

Debunked: Flash Photography Kills Seahorses

Now, how is this related to color correction in underwater photography? The folks who wrongly believe that they shouldn't use strobes on seahorses often argue that they can just "fix the colors in software" after they take the shot. As argued above, that generally doesn't work, at least not well. Hence they do something that does not work well (color correction of images taken without a strobe) to solve a problem (seahorses stressed by strobes) which doesn't exist. A bit odd.

I’m always there if anyone needs to have his thinking checked for consistency and scientific realism. You are welcome.

Best Fishes,

Klaus