Automated Brand Color Accuracy for Real-Time Video

Published On: April 26, 2020

Sports fans know their team's colors and notice inconsistencies when their beloved team's colors show up incorrectly. Repetition and consistency build brand recognition, and that equity is valuable and worth protecting. Accurate on-screen display of specified colors is affected by many factors, including the capture source, the display screen technology, and the ambient light, which can vary broadly throughout an event due to changes in time and weather (for outdoor fields) or mixed artificial lighting (for indoor facilities). Changes to any of these factors demand adjustments to the output in order to maintain visual consistency.

In the current industry-standard workflow, color management is handled by a technician who manually corrects footage from up to two dozen camera feeds in real time to keep the feeds consistent. In contrast, the AI-powered ColorNet system ingests live video, adjusts each video frame with a trained machine learning model, and outputs a color-corrected feed in real time. The system is demonstrated using Clemson University's orange specification, Pantone 165. The machine learning model was trained on a dataset of raw and color-corrected videos of Clemson football games, with the corrected footage produced manually in Adobe Premiere Pro. Training on these pairs teaches the model to target specific color values and adjust only the targeted brand colors pixel by pixel, without shifting any surrounding colors in the frame, generating localized corrections that adapt automatically to changes in lighting and weather. ColorNet reproduces the manually created correction masks with high fidelity while remaining computationally efficient enough for real-time prediction. This approach circumvents human error in color correction while continuously compensating for the negative impacts of lighting and weather changes on the display of brand colors.
Work is underway to beta test this model in real-time broadcast streaming during a live sporting event with large-format screens, in partnership with Clemson University Athletics.
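The core idea of a localized, color-targeted correction can be sketched outside of any learned model: build a soft mask of pixels near the brand color and pull only those pixels toward the reference value, leaving the rest of the frame untouched. The sketch below is illustrative only; the reference sRGB triple for Pantone 165 and the `radius`/`strength` parameters are assumptions, not values from the ColorNet system.

```python
import numpy as np

# Reference sRGB value for Clemson orange (Pantone 165). This exact triple
# is an assumption for illustration, not taken from the ColorNet work.
REFERENCE = np.array([245, 102, 0], dtype=np.float32)

def correct_brand_color(frame, reference=REFERENCE, radius=60.0, strength=0.8):
    """Pull pixels near the reference color toward it, leaving others alone.

    frame: H x W x 3 uint8 array. Returns a corrected uint8 array.
    A soft mask (1.0 at the reference color, falling to 0.0 at `radius`)
    localizes the correction, in the spirit of the per-pixel masks the
    trained model produces.
    """
    f = frame.astype(np.float32)
    # Per-pixel Euclidean distance to the reference color (H x W x 1).
    dist = np.linalg.norm(f - reference, axis=-1, keepdims=True)
    # Soft mask: only pixels within `radius` of the brand color are touched.
    mask = np.clip(1.0 - dist / radius, 0.0, 1.0)
    # Blend masked pixels toward the reference; others pass through unchanged.
    corrected = f + strength * mask * (reference - f)
    return np.clip(corrected, 0, 255).astype(np.uint8)
```

Applied per frame, a drifted orange pixel (e.g. washed out by stadium lighting) moves toward the reference, while distant colors such as green turf get a zero mask and pass through unchanged. A learned model replaces the fixed distance threshold with corrections inferred from the manually graded training pairs.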

Emma Mayes | Clemson University | Clemson, South Carolina, United States
John Paul Lineberger | Clemson University | Clemson, South Carolina, United States
Michelle Mayer | Clemson University | Clemson, South Carolina, United States
Andrew Sanborn | Clemson University | Clemson, South Carolina, United States
Hudson Smith | Clemson University | Clemson, South Carolina, United States
Erica Walker | Clemson University | Clemson, South Carolina, United States
