I have been pondering how what you photograph can affect the way a digital image is captured. As part of my workflow, I render 1:1 versions of the images and then quickly weed out the ones that are not sharp. That requires being able to see some detail in the shot that shows whether the sharpness is there. I have found that, when a Southwest Airlines 737 is in the new color scheme, something odd happens.
Digital image sensors actually capture only one of three colors at each pixel. Each pixel is sensitive to a certain color – either red, green or blue – courtesy of a filter over it. The colors are arranged on the sensor in a pattern called a Bayer pattern. The camera then carries out calculations based on what the pixels around each location see to work out what the actual color should be at that location. This process is known as de-mosaicing. It can be a simple averaging, but more complex algorithms have been developed to avoid strange artifacts.
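To make the simple-averaging idea concrete, here is a toy sketch in Python of bilinear de-mosaicing for an RGGB Bayer layout. This is only an illustration of the averaging approach described above – the function names and the RGGB assumption are mine, and real cameras use far more sophisticated algorithms than this:

```python
import numpy as np

def sum_neighbors(a):
    """Sum over each pixel's 3x3 neighborhood (zero-padded at the edges)."""
    p = np.pad(a, 1)
    return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
               for i in range(3) for j in range(3))

def bilinear_demosaic(raw):
    """Toy bilinear de-mosaic of an RGGB Bayer mosaic.

    raw: 2-D array of sensor values, one color per pixel, laid out as
        R G R G ...
        G B G B ...
    Returns an (H, W, 3) RGB image where each missing color value is
    the average of the nearest same-color pixels.
    """
    h, w = raw.shape
    y, x = np.mgrid[0:h, 0:w]
    # Masks marking which color each sensor pixel actually recorded.
    r_mask = (y % 2 == 0) & (x % 2 == 0)
    b_mask = (y % 2 == 1) & (x % 2 == 1)
    g_mask = ~(r_mask | b_mask)
    rgb = np.zeros((h, w, 3), dtype=float)
    for c, mask in enumerate([r_mask, g_mask, b_mask]):
        plane = np.where(mask, raw, 0.0)          # known values of this color
        total = sum_neighbors(plane)              # sum of known neighbors
        count = sum_neighbors(mask.astype(float)) # how many were known
        rgb[:, :, c] = total / np.maximum(count, 1)
    return rgb
```

On a uniform patch this averaging reconstructs the scene exactly, but along a sharp colored edge – such as dark digits on a bright fuselage – the averages pull in values from the wrong side of the edge, which is the kind of situation where de-mosaicing artifacts can appear.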
When I photograph the new Southwest scheme, something strange occurs around the N number on the rear fuselage. It looks very blotchy, even when every other part of the airframe looks sharp and clear. I am wondering whether the color of the airframe and the color of the registration digits are in some way confusing the de-mosaicing algorithm, resulting in odd elements in the processed image that weren't there in real life. If any of you have photographed this color scheme, can you check whether you saw something similar? Whether you did or didn't, let me know what camera you were shooting with so we can see if it is manufacturer specific or not.