This is a story of how a simple hardware hack got surprisingly out of hand.
Summer of Hacks 2019
It all started in the summer of 2019 during the Oxford Summer of Hacks. One of the events put on was a hardware hack day, where beginner tinkerers could learn about programming real things with the help of more knowledgeable people in the room. I was somewhere in the middle of the scale: I knew how to program, so I managed to teach a kid and his dad to write some code for a robot powered by a BBC micro:bit, but I needed help with even the most basic hardware tasks.
I had a vague idea of what I wanted to work on during this hack day: after reading a blog post from the Raspberry Pi Foundation I ordered a small screen thinking I could get it to display the Tube status. However, that didn’t arrive in time, so I had to improvise with some LEDs instead.
My first task that day was to get a red and a green LED to be controlled independently from my Raspberry Pi (that I won at the Hackference Hackathon). The red one would show the status of the Central Line, and the green one would represent the District Line. It was easy to get the data from Transport for London: they make the data available for free in a format that’s easy to parse.
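The post doesn't show the status-fetching code, but TfL's Unified API serves Tube line statuses as JSON. Here's a minimal sketch of how that data can be parsed; the endpoint URL and the helper names are my assumptions, not taken from the original project:

```python
import json
from urllib.request import urlopen

# TfL's Unified API serves line statuses as JSON; light use needs no API key.
TFL_STATUS_URL = "https://api.tfl.gov.uk/Line/Mode/tube/Status"

def parse_statuses(lines):
    """Map each line's name to its headline status description,
    e.g. {'Central': 'Good Service', 'District': 'Minor Delays'}."""
    return {
        line["name"]: line["lineStatuses"][0]["statusSeverityDescription"]
        for line in lines
    }

def fetch_statuses():
    """Fetch live statuses from TfL (requires network access)."""
    with urlopen(TFL_STATUS_URL) as response:
        return parse_statuses(json.load(response))

# A trimmed-down sample of the response shape, for illustration:
sample = [
    {"name": "Central",
     "lineStatuses": [{"statusSeverityDescription": "Good Service"}]},
    {"name": "District",
     "lineStatuses": [{"statusSeverityDescription": "Minor Delays"}]},
]
print(parse_statuses(sample))
```

The Central LED would then be driven from the "Central" entry and the District LED from the "District" one.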
In not much time I’d managed to wire up the LEDs to two separate pins on the Raspberry Pi, and I had a thing that worked!
Getting started with Neopixel
By this point, I’d run out of LED colours, but I still had 9 more Underground lines to display so I decided to borrow someone’s string of 50 addressable LEDs. This string had three wires: one for power, one for ground and the other for data. I used a Python library provided by Adafruit that allowed me to set colours for any individual LED on the string. If I wanted to make the first bulb light up orange, there’s not much code involved:
import board
import neopixel

pixels = neopixel.NeoPixel(board.D18, 50, auto_write=True,
                           pixel_order=neopixel.RGB, brightness=0.1)
pixels[0] = (255, 100, 0)
D18 is the pin I connected the data wire to, as seen on this graphic adapted from pinout.xyz:
I also connected the 5V and ground wires directly to the Pi. That probably wasn't the best idea, since most tutorials say to power the LEDs from a separate supply to avoid overloading the Pi. That's why I set the overall brightness to 0.1, or 10%. That's still plenty bright for indoors!
That last line of code sends the right signals through the data wire to light up that first LED in orange. Colours are specified using amounts of the three primary colours of light – red, green and blue – with values between 0 and 255. Orange is made using full red (255) and a bit of green (100). If I wanted to make a bulb flash I would keep on updating the colour to black (0, 0, 0) and back again every second. It worked well enough for the hack day and that’s all that mattered.
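The flashing described above is just alternating between the colour and black once a second. A minimal sketch of that pattern, with the actual hardware write left as a comment (the helper name is mine, not from the original project):

```python
import itertools

OFF = (0, 0, 0)  # black: all three channels at zero

def blink_sequence(colour):
    """Alternate forever between a colour and off (black)."""
    return itertools.cycle([colour, OFF])

# On the Pi this would drive a real pixel, something like:
#   for colour in blink_sequence((255, 100, 0)):
#       pixels[0] = colour
#       time.sleep(1)

# Without hardware we can still inspect the pattern:
seq = blink_sequence((255, 100, 0))
print([next(seq) for _ in range(4)])
```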
The code that I wrote for that day lives on GitHub.
Once the event finished, I gave the LED string back and that should have been where I stopped.
But I didn’t stop there.
I mentioned this project to one of my colleagues who suggested I could extend the project by putting the lights onto a Christmas tree. I thought that seemed a fun idea so I bought my own set of lights. It was this WS2811 set: the same standard that I used for the Oxford hack day.
That evening I kept having more ideas, namely: how could I get a Christmas tree to post updates on Twitter? The project got out of hand surprisingly quickly. I made a to-do list of all the features I wanted and registered the @TflTree account straight away.
The basic idea was simple enough: this tree will play a sequence of colours representing the status of all the Underground lines. When the status changes it should record a video along with some generated speech and tweet it. The video should also have subtitles: autoplaying videos on Twitter always start muted, so adding subtitles will make it less likely that people will miss the start of the content, plus they’re essential for people who are deaf / hard of hearing.
I purchased a 3ft artificial tree from Pound Crazy in Shepherd’s Bush for £6.99.
I put the tree in the corner of my living room and decorated it with the LEDs. The Raspberry Pi and camera, in their case, were precariously balanced on a Gorillapod (which in turn was precariously balanced on a bin). That particular corner of the living room was quite far from a power socket, but an extension lead and a phone charger were just long enough to provide power to the Pi.
I ordered some wires to make it easier to connect the LEDs to the Pi without me having to solder anything.
While they didn't require any soldering, they did fall apart very easily. A slight knock could disconnect the wires, and I'd have to reattach them. Perhaps some solder would have been a good idea after all.
Since I wanted the LEDs to be repeatedly flashing/fading/flickering independently of each other, I thought I’d create a library that would handle that functionality separately from the code that was responsible for recording the videos. Early on in the project I wasn’t sure if I wanted to run the lights on a separate Raspberry Pi, so I made the library easily configurable so that it could either run in the same process or over HTTP to another device on the local network.
In the end I decided to run the LEDs on the same Raspberry Pi but in a separate process, so it was using HTTP but on the same device. The LED process needed to be run as root, so by separating it out, I could run the rest of my project on a user with fewer privileges.
I haven't published the library on PyPI, but feel free to take a look at the code on GitHub.
Text to speech
For generating speech, I had a few options: programs I could run on the Raspberry Pi itself, and some cloud services too. Amazon Polly would have produced realistic-sounding output, but I quite liked the idea of something that sounded a bit more robotic. I tried two tools on the Pi – eSpeak and Pico – and the latter had better pronunciation of certain important words like "Bakerloo", so I used that one. I was also glad to see that speech could be generated faster than real time and saved to a .wav file for later use.
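Pico is usually driven through the pico2wave command-line tool (packaged as libttspico-utils on Debian-based systems like Raspbian). A sketch of how the invocation could be built from Python – the helper name is my own, and the actual subprocess call is left commented since it needs the tool installed:

```python
import subprocess

def pico_command(text, wav_path, language="en-GB"):
    """Build a pico2wave invocation that renders `text` to a .wav file."""
    return ["pico2wave", "--lang", language, "--wave", wav_path, text]

cmd = pico_command("There is a good service on all other lines.",
                   "/tmp/status.wav")
# On the Pi: subprocess.run(cmd, check=True)
print(cmd)
```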
I generated separate .wav files for each “chunk” of audio. That is, one file was created for the announcement that the Victoria Line has severe delays due to a signal failure, and a separate one was created to say that there was a good service on all other lines. This was initially so that it was easier to synchronise subtitles to the audio, but it was super handy for changing the lights when the speech changed subject. By structuring my code with accessibility in mind from the start, it became easier to add new functionality, and I think this could be the case with many people’s projects.
Adding subtitles to the video was an interesting task. Looking at videos on Twitter, most subtitled videos have the text “burned in” to the video, that is they’re part of the picture and can’t be disabled. Other videos have subtitles as an overlay that will disappear as soon as you turn up the volume on your phone. Since I’d never seen a feature to upload a subtitle file along with a video, I thought it might have been a feature exclusive to Twitter Media partners, so I investigated burning in the text to the video first.
Turns out it's possible to burn subtitles into a video using ffmpeg (a free command-line tool that can do various operations on audio and video), and it can run on a Raspberry Pi. The problem is that it takes ages: a 27-second video took a whopping 34 minutes to process! Considering that I was hoping for these videos to be timely, such a long delay wasn't acceptable.
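The burn-in step described above uses ffmpeg's subtitles filter (which requires an ffmpeg build with libass). A sketch of how that invocation might look from Python – the file names and helper are illustrative, and the slow re-encode itself is left commented:

```python
import subprocess

def burn_in_command(video_in, srt_path, video_out):
    """Build an ffmpeg invocation that renders an SRT file into the
    picture itself (the slow, CPU-heavy step on a Pi)."""
    return [
        "ffmpeg", "-i", video_in,
        "-vf", f"subtitles={srt_path}",  # draw the subtitles onto each frame
        "-c:a", "copy",                  # audio stream is copied untouched
        video_out,
    ]

cmd = burn_in_command("tree.mp4", "status.srt", "tree_subtitled.mp4")
# On the Pi: subprocess.run(cmd, check=True)
print(" ".join(cmd))
```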
I then investigated AWS Lambda, to see if that would be faster. I was surprised to discover that it is possible to bundle ffmpeg into your Lambda, and it can run very quickly as well. It was good to know that that was an option, but since I had everything else running on this little Raspberry Pi I thought it was almost a shame to have this one bit of processing happen separately, so I went back to look into uploading the subtitle file to Twitter.
After a little bit of research, I discovered that adding a subtitle file was indeed a feature of Twitter Media Studio, but since I’m not a Twitter Media Partner I couldn’t use that tool. I looked into Twitter’s API documentation and it seemed like it should be possible to add subtitles programmatically. I was using the open source library python-twitter to help me upload video and send tweets, but it didn’t support uploading of subtitle files. Poor documentation on Twitter’s side meant that it took quite a bit of guesswork, but I eventually managed to get a subtitle file uploaded and associated with a video! I even managed to contribute my subtitle code back to python-twitter, so it should be available to everyone else in a future release!
The file format required by Twitter is SRT: a feature-sparse but easy-to-read format. Each entry consists of a sequence number, a timestamp range and the text itself. I couldn't find a standards or recommendations document, so I guessed.
1
00:00:00,000 --> 00:00:03,844
The Bakerloo Line is closed. Train service resumes later this

2
00:00:03,844 --> 00:00:04,844
morning.

3
00:00:04,844 --> 00:00:09,184
There is a planned closure on the Circle Line. Sunday 29

4
00:00:09,184 --> 00:00:11,773
December, no Circle Line service.
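Generating entries in that shape is mostly a matter of formatting the timestamps correctly. A minimal sketch, with helper names of my own invention rather than from the original project:

```python
def srt_timestamp(seconds):
    """Format a duration in seconds as an SRT timestamp,
    e.g. 3.844 -> '00:00:03,844'."""
    millis = round(seconds * 1000)
    h, rem = divmod(millis, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def srt_entry(index, start, end, text):
    """One SRT cue: sequence number, timestamp range, then the text."""
    return f"{index}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n"

print(srt_entry(1, 0.0, 3.844,
                "The Bakerloo Line is closed. Train service resumes later this"))
```

The start and end times for each cue fall out naturally from the durations of the per-chunk .wav files mentioned earlier.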
After the first night of the tree running I thought it would be a good idea to light up the tree when a video was recording.
I wanted each video to have consistent lighting, whether my housemates and I were in the house or not. I originally put a lamp next to the tree, connected to a smart plug (that I won at the Hackference Hackathon), so that when the Pi needed to record a video it could turn the lamp on, then turn it off when it had finished recording.
The tree was perhaps too well-lit! It didn’t feel very festive.
And then I remembered that my living room has a smart bulb (yet another thing I won from the Hackference Hackathon). With a bit of Googling I worked out how to operate it from the Pi, making sure to only turn the light off if the Pi had turned it on: if my housemates were in the living room with the light already on they probably wouldn’t appreciate the Pi turning it off randomly.
By this point I’d also set the white balance of the camera so that the red used for the Central Line didn’t look pink, and I was very happy with how it was running.
I managed to get TfL Tree online and posting updates on the 1st December. It was running pretty much solidly for just under a month, until the night of the 30th. I’d gone back to stay with the family for Christmas and New Year but I made sure I could log into the Pi remotely.
Unfortunately, even with remote login to a hardware project, there are some things you can’t fix: like when the precariously balanced Raspberry Pi gradually gets more and more lopsided. The wires connecting the Pi to the LEDs weren’t very good at staying connected either, meaning the slightest movement could cut their power. On the 30th, the Pi finally tipped over enough so that the wires to the LEDs disconnected, and that was it for the tree: all I could do was watch from afar.
While the project didn’t gain as much traction as I thought it might have, I found it worthwhile all the same. It was great to learn about what can and can’t be done on the Raspberry Pi; I got some more practice coding in Python; I learnt how to tweet programmatically; but I think the most important outcome from this project was my contribution to the python-twitter library. Even if one person uses that library to add a subtitle file to a Twitter video, this whole project would have been worth it.
Will TfL Tree make a comeback in 2020? Only time will tell…