I’ve had this idea rattling around in my head for a while now: project coloured light behind the TV that matches the colours on the screen. It should theoretically allow for a more immersive viewing experience. This is not new – it’s been done before by companies like Philips, Samsung and Panasonic. However, those were high-end TVs with integrated capabilities. There are also third-party solutions that connect inline to the TV’s video signal to capture the colour info, but with HDMI encryption (HDCP) becoming mainstream, this is not an option. I wanted to be able to retrofit this solution onto any TV with any signal, and that’s what I’ve accomplished with this project. No PC required! Once I got all the parts, the build only took a few hours.
Here’s how it works:
- A Raspberry Pi with a camera is placed about 3-5 meters away from the TV. The camera is aimed at the center of the TV.
- A Python script running on the Pi takes a snapshot at the lowest resolution possible – 32×32 pixels.
- The Python script evaluates a small cluster of pixels in the center of the screen and averages their RGB values.
- The averaged RGB values are transmitted via XBee.
- An Arduino with an XBee shield and RGB LED shield receives the RGB values and uses the shield to adjust the colour of an RGB LED strand.
- The RGB LED strand is affixed to the back of the TV and projects light on the wall behind the TV.
- This process repeats about twice per second.
- When the TV is off, nothing is transmitted and the Arduino turns off the LEDs.
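The Pi-side loop described above might look something like this sketch. I’m assuming the picamera library and a USB XBee adapter on /dev/ttyUSB0 – the port, baud rate, cluster size, and function names are my placeholders, not the author’s actual code:

```python
import io
import time


def average_centre(pixels, width=32, height=32, cluster=4):
    """Average the RGB values of a cluster x cluster block at the
    centre of a row-major RGB byte buffer."""
    x0 = (width - cluster) // 2
    y0 = (height - cluster) // 2
    r = g = b = 0
    for y in range(y0, y0 + cluster):
        for x in range(x0, x0 + cluster):
            i = (y * width + x) * 3  # 3 bytes (R, G, B) per pixel
            r += pixels[i]
            g += pixels[i + 1]
            b += pixels[i + 2]
    n = cluster * cluster
    return (r // n, g // n, b // n)


def main():
    # Hardware loop: needs picamera and pyserial on an actual Pi.
    import picamera
    import serial

    xbee = serial.Serial("/dev/ttyUSB0", 9600)
    with picamera.PiCamera(resolution=(32, 32)) as camera:
        while True:
            buf = io.BytesIO()
            camera.capture(buf, format="rgb")  # raw 32x32 RGB frame
            rgb = average_centre(buf.getvalue())
            xbee.write(bytes(rgb))  # three raw bytes: R, G, B
            time.sleep(0.5)        # roughly twice per second

# On the Pi, call main() to start the loop.
```

One nice property of sending three raw bytes is that the Arduino side can simply read them straight into its analogWrite calls, and silence on the link (TV off) naturally maps to “turn off the LEDs”.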
Key Challenges/Lessons Learned:
- Aiming the camera is a critical step in making this work. I found that my aim was off initially, and it was actually picking up the glow of the RGB LED, which created a wicked feedback loop that made everything look green. I FTP’ed into the Pi while it was running and downloaded the image to figure out exactly where it was pointing so I could re-aim.
- The wall behind my TV is painted green, so when I project coloured light, it does not appear to be the correct colour. I temporarily fixed this by putting up large sheets of white paper behind the TV. I’ll see what I can do to tune the software to show the proper colour.
- In order to avoid heavy computation across lots of pixels, some sort of diffusion was required. I tried wax paper and other translucent materials, but they did not work. I also tried playing with some of the features of the camera, including “blur”, but that did not work well either. Ultimately, the answer was to drop the resolution and let the camera handle the diffusion for me. Now, one pixel represents roughly a 10×10 cm area of my TV screen.
- The Pi Camera seems really flaky. It worked fine during prototyping, but when I reconnected the ribbon cable to assemble the case, it stopped working. After much fiddling, I could not get it to work consistently, so I contacted Adafruit Tech Support. I left it off while waiting for a response and then tried it again after a few days – it suddenly started working perfectly. I have no explanation. I am not a fan of the ribbon cables used to connect to the Pi’s CSI interface. However, I wanted to use that instead of a USB camera in order to get better performance.
- The world needs more/better options for the Pi camera case! I scoured the Internet and was only able to find a few decent options. The one I did pick got bad reviews, but I was able to mod it to make it work. I used a tapered reamer drill bit to slightly enlarge the hole. I also used hot glue to keep the Pi in place on the pegs. Finally, I glued the case shut since it kept popping open.
- I’m considering just 3D printing my own case since it needs to be set at a specific angle to center the camera on the TV. Right now, I have folded paper underneath, which is functional for testing but ugly. Also, the XBee USB dongle is sticking out, and it would be nice to protect/hide it.
- I modified the /etc/rc.local file to kick off a shell script that ran the Python script. I ran it as a forked process using the “&” suffix, which worked well. Inside the shell script, I started with a 10-second pause to give the Pi time to boot. It was a bit of a pain to get it all working since my Linux skills are a bit rusty, but it turned out well in the end.
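The startup chain described in that last point can be sketched like this: /etc/rc.local forks a wrapper script with a trailing “&”, and the wrapper pauses before launching Python. The file names and paths here are my placeholders, not the author’s actual ones:

```shell
#!/bin/sh
# start_ambilight.sh -- launched from /etc/rc.local (before its final
# "exit 0") with a line like:  /home/pi/start_ambilight.sh &

sleep 10                       # give the Pi time to finish booting
python /home/pi/ambilight.py   # kick off the colour-sampling script
```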
Here is a picture of what the Pi actually sees – the super-low 32×32 pixel resolution diffuses the picture for easier processing.
Raspberry Pis have been around for about 4 years now, but I must admit that I’ve struggled to find projects that justify their capabilities. Most of my needs are well met by a microcontroller board like an Arduino, and a Pi is overkill. For me, the Raspberry Pi’s main differentiated value (compared to a traditional microcontroller board) comes in its ability to work with peripherals like video. So, I was excited to finally find a project worthy of the advanced capabilities of a Pi. Although I had an older Pi (2A), I decided to spend some extra money and buy a Pi 3 in order to get the increased processing power and integrated WiFi for remote control.
Overall, this was a bit more expensive than I would have liked, but the end result looks great and I learned a lot along the way.