The rapid development of LEDs is driving a stronger push for consistency and improved colour rendition in production. David Fox reports for IBC365.
Broadcast and location lighting has been transformed over the past few years, moving from an era of large, power-hungry lights to smaller, slimmer, lightweight fixtures that draw far less power.
This is down to the rapid development of light-emitting diodes (LEDs), which have gone from being cheap and nasty, producing unnatural skin tones, to high-quality full-spectrum lights that not only render flattering skin tones but often offer a wide array of colours and effects as well.
One of the keys to this change has been the ability to properly measure the quality of the light. The long-established CRI (colour rendering index) only measures how a light source appears to the human eye, which is fine for theatrical or museum lighting but not for the digital sensors in modern cameras.
Instead there is now the Television Lighting Consistency Index (TLCI), which has been adopted by the EBU and was devised to work with typical camera sensors. Any light rated between about 85 and 100 will produce colour errors so small that a colourist would not need to correct them, and anything above 50 should be fit to broadcast with a little work in the grading suite. Most new lights now come with a TLCI rating, and most score close enough to the top that grading is not a concern.
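For readers who want those thresholds spelled out, here is a minimal, purely illustrative Python sketch that maps a published TLCI score to the rough grading effort described above. The band boundaries are simply the figures quoted in this article, not an official EBU classification, and the behaviour below 50 is an assumption.

def tlci_guidance(score: float) -> str:
    """Rough guidance bands based on the figures quoted above (illustrative only)."""
    if score >= 85:
        # Errors so small that a colourist would not need to correct them.
        return "no correction needed"
    if score > 50:
        # Broadcastable after a little work in the grading suite.
        return "minor grading required"
    # The article gives no guidance below 50; assume heavier correction is needed.
    return "expect significant correction"

for score in (97, 72, 43):
    print(score, "->", tlci_guidance(score))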
“If a light source [has a TLCI rating that] is 90 or over it just works and everything is fine,” says Alan Roberts, the colour scientist and former BBC engineer who was the driver behind the TLCI standard.
He believes that the standard is “pushing the…