This project explores a method for enhancing virtual streams of live events by automating LED lighting fixtures in a viewing room to complement the video signal displayed there. The method analyzes the video footage in real time, using K-means clustering to identify the dominant colors in each frame, followed by a series of processing steps that maintain temporal consistency in color choices and select colors appropriate for a lighting fixture, that is, colors that are generally bright and saturated. The algorithm is tested in a virtual visualization of a 10-light marquee superimposed on a live video feed, and it shows promise, producing effective color matching and dynamic lighting effects automatically and in real time.
Four frames from a rendering of a live concert recording at Coachella. Chosen lighting-ambience colors are displayed in "marquee" form around the top edges of the screen; the bottom edge shows the colors detected in that frame alone. Note that the chosen colors reflect temporally consistent choices and that colors unsuitable for lighting are disregarded (higher saturation or brightness is required). The algorithm is also designed to introduce variability, keeping the lights dynamic over time even when the detected colors stay static.
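The following is a minimal sketch of the per-frame pipeline described above, assuming an OpenCV/scikit-learn stack. The function names, the saturation/brightness thresholds, and the exponential-smoothing step are illustrative assumptions, not the project's actual implementation.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans

# Assumed thresholds for "usable as lighting" (HSV saturation/value in [0, 1]).
SAT_MIN, VAL_MIN = 0.4, 0.5

def dominant_colors(frame_bgr, k=5):
    """Cluster a downsampled frame's pixels; return k RGB centers in [0, 1]."""
    small = cv2.resize(frame_bgr, (160, 90))           # downsample for speed
    pixels = small.reshape(-1, 3).astype(np.float32)
    km = KMeans(n_clusters=k, n_init=4).fit(pixels)
    return km.cluster_centers_[:, ::-1] / 255.0        # BGR -> RGB

def filter_lightable(colors_rgb):
    """Keep only colors bright and saturated enough for an LED fixture."""
    hsv = cv2.cvtColor(colors_rgb.reshape(-1, 1, 3).astype(np.float32),
                       cv2.COLOR_RGB2HSV).reshape(-1, 3)
    ok = (hsv[:, 1] >= SAT_MIN) & (hsv[:, 2] >= VAL_MIN)
    return colors_rgb[ok]

def smooth(prev_rgb, new_rgb, alpha=0.15):
    """Exponentially blend toward the new color for temporal consistency."""
    return (1 - alpha) * prev_rgb + alpha * new_rgb
```

In a real-time loop, `dominant_colors` and `filter_lightable` would run on each incoming frame, with `smooth` blending each fixture's current color toward its newly assigned target so the lights shift gradually rather than flickering frame to frame.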
