In this blog post, we explore the nature of those black bars that appear when you photograph a TV or monitor screen with a camera, their scientific cause, and a simple solution.
Have you ever tried photographing a TV or computer screen with a camera? You may have noticed something strange: black bands often appear in the photo, even though the screen looked perfectly normal to the naked eye. This can be frustrating, because the camera fails to capture the clear image right in front of you. So what exactly is going wrong?
To understand this phenomenon, you first need to know that the tiny light sources making up the display are not emitting light continuously. Because the picture keeps changing, the display must constantly render a new image. To do this, countless light-emitting elements, such as LEDs, rapidly cycle on and off in short intervals. These elements blink in horizontal rows, each row forming a 'scan line'. On older interlaced displays the odd and even scan lines alternate, while modern displays typically redraw the lines in order from top to bottom; either way, the image is built up line by line. The black bands are simply the rows that happen to be unlit at that instant: they are the display's actual state.
However, our eyes don't perceive this at all, thanks to a phenomenon known as 'persistence of vision', or the afterimage. An afterimage is the persistence of a visual image in the visual system after the light stimulus has been removed; simply put, the image you just saw lingers briefly before your eyes. This persistence lasts roughly 1/16th of a second, and visual changes faster than this cannot be perceived. The same effect plays a crucial role in moving-image media like film and animation. The video we watch is merely a sequence of static images played back rapidly, yet thanks to persistence of vision we do not perceive the individual frames, only continuous motion. In the same way, for displays, the afterimage conceals the flickering scan lines.
Let's briefly return to displays. Displays are described by a 'refresh rate': the number of times a new screen is drawn per second. Most displays on the market have a refresh rate of 60 Hz, meaning the screen is redrawn every 1/60th of a second. In other words, while our visual system retains a stimulus for about 1/16th of a second after it disappears, the display refreshes at the much faster interval of 1/60th of a second. So even during the very brief moment an LED is off, the afterimage fills the gap. This prevents our eyes from seeing the scan lines mid-refresh and lets us perceive a continuous image.
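The arithmetic here is easy to check. As a rough sketch, assuming the 60 Hz refresh rate and the approximate 1/16-second persistence figure above, a few lines of Python show how many refreshes fit inside a single afterimage:

```python
# Illustrative timing check: how many screen refreshes occur
# within one afterimage? (Values assumed from the discussion above.)
persistence = 1 / 16     # approximate afterimage duration, seconds
refresh_rate = 60        # typical display refresh rate, Hz

frame_time = 1 / refresh_rate            # one refresh: ~0.0167 s
frames_per_afterimage = persistence / frame_time

print(f"One refresh takes {frame_time:.4f} s")
print(f"Refreshes during one afterimage: {frames_per_afterimage:.2f}")
```

Since nearly four refreshes happen before the previous afterimage fades, the eye never experiences a dark gap.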
Cameras, however, have no afterimage. A photograph captures only the brief instant the shutter is open. A camera shooting at extremely short exposures, down to hundredths or thousandths of a second, therefore catches the scan lines mid-refresh. In that fleeting moment, the lit rows appear as the original screen, while the unlit rows appear as the 'black bands' we saw.
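To see why a fast shutter catches the bands, consider what fraction of one refresh cycle the shutter actually observes. This is a toy calculation, again assuming a 60 Hz display; the shutter speeds are just illustrative values:

```python
# Toy estimate: fraction of one refresh cycle seen at various shutter speeds.
refresh_rate = 60                 # assumed display refresh rate, Hz
frame_time = 1 / refresh_rate     # duration of one full refresh

fractions = {}
for shutter in (1/1000, 1/250, 1/60, 1/10):
    # How many refresh cycles (or what fraction of one) the shutter spans.
    fractions[shutter] = shutter / frame_time
    print(f"shutter 1/{round(1/shutter)} s -> {fractions[shutter]:.2f} refresh cycles")
```

At 1/1000 s the shutter sees only 6% of a refresh, so most rows are caught while unlit: those rows become the black bands in the photo.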
So how can we prevent these scan lines from appearing? While cameras lack afterimages, they do have a controllable 'exposure'. Familiar examples of long exposure include photographing fireworks tracing words in the air, or photographers capturing the diurnal motion of stars across the night sky. Just as our pupils dilate to gather more light in dark places, a camera can keep its shutter open longer in dark conditions to accumulate light; the aperture is the pathway the light passes through, and the time the shutter stays open is called the 'exposure time'. Because light is literally 'accumulated', any movement during the exposure time is faithfully recorded in the photograph. This can play the role of the afterimage that cameras lack.
Ultimately, you just need to increase the exposure time. A longer exposure keeps the shutter open long enough for every LED on the display to complete its on-off cycle. Since the afterimage in our eyes lasts about 1/16th of a second, setting the exposure time to 1/10th of a second or longer yields a smooth, clear photo of the display; an exposure this long spans several full refresh cycles, so the flicker-induced black bands cannot appear. Of course, a longer exposure also admits more light, making the photo brighter and risking overexposure, so you must compensate with the other controllable factors, the aperture value and the ISO value.
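The brightness trade-off can be quantified too. A minimal sketch, assuming a 60 Hz display and two hypothetical shutter speeds, 1/1000 s (fast, banded) and 1/10 s (slow, band-free): the slow exposure spans several complete refreshes, and the extra light it admits can be expressed in photographic 'stops' to be compensated with aperture or ISO:

```python
import math

refresh_rate = 60                # assumed display refresh rate, Hz
frame_time = 1 / refresh_rate

fast_shutter = 1 / 1000          # hypothetical fast exposure (shows bands)
slow_shutter = 1 / 10            # recommended slower exposure

# The slow exposure spans several complete refresh cycles,
# so every row is lit at some point while the shutter is open.
full_cycles = slow_shutter / frame_time

# Each doubling of exposure time admits one extra "stop" of light.
extra_stops = math.log2(slow_shutter / fast_shutter)

print(f"Refresh cycles covered: {full_cycles:.0f}")
print(f"Extra light admitted: {extra_stops:.2f} stops")
```

Roughly 6.6 extra stops is a lot of light, so in practice you would stop down the aperture, lower the ISO, or reduce the screen's brightness to keep the shot from blowing out.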
Thus, the black bands invisible to our eyes were not ‘strange things’ but the display’s true appearance—the flickering ‘scan lines’. It was simply that our eyes were tricked into not noticing them. The human eye is a true all-purpose camera, capable of automatically adjusting all functions like contrast, depth, and focus. Yet, when it comes to displays, perhaps the camera was seeing things more accurately than our eyes?
With the advancement of cameras, we have delved deep into the world of digital images. The world captured by cameras records reality in a different way than the human eye, and this difference sometimes provides new visual experiences. Ultimately, what we ‘see’ depends not just on what our eyes perceive, but on how we recognize and interpret it. In this sense, devices like cameras become tools that expand our vision.