Friday, 3 January 2020

Why Fireworks Look So Bad on TV

Watching fireworks on television is nothing like watching them in real life. Compared with the live experience, televised fireworks suffer from dull colors and muted explosions. At best, they look flat and uninspiring; at worst, they can look like badly rendered CGI. A recent Wired UK article dives into why fireworks are so hard to get right, even on modern television sets with high-end color processing and 4K displays.

Fireworks, it turns out, are a nearly worst-case scenario for technology to capture. They combine a dark sky and black backgrounds with high-intensity, high-contrast light, which makes them very difficult to reproduce within the dynamic range of modern TV sets. For those still angry that LCDs replaced CRTs: yes, CRTs can offer better dynamic range than LCDs, but that doesn’t fully address the issue in this case. Some of the most common chemical compounds in fireworks emit wavelengths of light that aren’t “within the physical possibility of being seen” on many TVs, presumably because most TV broadcasting is still in SDR and uses the older Rec. 709 color gamut.

Rec. 709 versus Rec. 2020. Image by Wesley Knapp

A TV with 100 percent Rec. 709 coverage (equivalent to sRGB’s color gamut) can only reproduce the colors inside the left-hand triangle, while a display capable of full Rec. 2020 would cover a much larger range of colors. At present, however, even top-end displays are only hitting about 80 percent of Rec. 2020, with 85-90 percent expected by the end of the year. Unfortunately, where fireworks are concerned, this isn’t the only problem. The Wired UK article points to the 50fps recording speed of European TV cameras, noting that fireworks explode much faster than this in real life. A TV broadcast simply isn’t fast enough to capture how fireworks really look, and the MPEG-2 and H.264 standards use encoding methods that can result in a further loss of detail. Black areas of the screen are also more prone to compression artifacts in general, which can exacerbate image quality problems.
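To make the gamut point concrete, here is a minimal sketch of how one could check whether a CIE xy chromaticity falls inside a display's gamut triangle. The primary coordinates below are the published xy values for Rec. 709 and Rec. 2020; the `deep_green` sample point is a hypothetical saturated green chosen for illustration, not a measured firework color.

```python
# Published CIE xy chromaticities of the R, G, B primaries.
REC_709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def in_gamut(xy, primaries):
    """Point-in-triangle test via edge cross products (same-sign rule)."""
    px, py = xy
    signs = []
    for i in range(3):
        ax, ay = primaries[i]
        bx, by = primaries[(i + 1) % 3]
        # Cross product tells us which side of edge (a -> b) the point is on.
        signs.append((bx - ax) * (py - ay) - (by - ay) * (px - ax))
    # Inside the triangle iff the point is on the same side of all three edges.
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

deep_green = (0.20, 0.70)  # hypothetical saturated green outside sRGB
print(in_gamut(deep_green, REC_709))   # False: clipped on an SDR broadcast
print(in_gamut(deep_green, REC_2020))  # True: representable in Rec. 2020
```

A color like this would be clamped to the nearest edge of the Rec. 709 triangle before broadcast, which is one way a vivid green burst ends up looking washed out on screen.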

I thought the issue might be exaggerated, but looking at videos of New Year’s fireworks, you can definitely see the problems. By about 1:30 in this video, the light reflecting off the haze of smoke makes it much harder to see the fireworks exploding overhead. If you’ve ever seen fireworks in real life, you’ll know the Mark 1 Eyeball does a better job of naturally compensating for these factors. I can’t claim to have done an exhaustive search on YouTube for the best fireworks video, but the video above was substantially better than some of the alternatives. This video shows the problem with frame rate and compression artifacts fairly well, for example. The video above may not be best-case, but it’s far from worst-case.

OLED panels may offer better viewing conditions, with their perfect blacks, but the limits of the color gamut and the video stream are going to weigh against watching any fireworks celebration. Granted, most people probably don’t do that very often — the 4th of July would be the only time I’d think Americans might tune into fireworks — but it’s interesting to see a discussion of all the reasons why the content tends to look so poor. It’s not just one technological factor but several interlocking issues that collectively produce the problem.


from ExtremeTech https://www.extremetech.com/electronics/304045-why-fireworks-look-so-bad-on-tv
