Have Yourself a Matrix LED Christmas

Making an interactive Christmas jumper

Jo Franchetti
10 min read · Nov 29, 2019


After my previous project, the tweet-controlled wedding dress, I wanted to make another wearable using the lessons I learnt while making the dress. Since the festive season is now upon us, it seemed like a great opportunity to make an IoT Christmas jumper!

Christmas jumpers often have a large graphic on the front: Santa, a snowman, a reindeer. I wanted to make a jumper that could change the graphic it displayed depending on what Christmas song was playing. That way, at Christmas parties, I’d always be wearing the right picture for the moment!

TLDR: This is a long blog post, but I wanted to explain each bit, rather than give those interested the frustration of the “How to draw an owl” meme. If you’d rather just see some code, the repo is here: https://github.com/thisisjofrank/LED-Matrix-Jumper

Tech the Halls!

What’s Involved?

  • Some sort of wearable display that can be attached to a jumper
  • A microcontroller, small enough to fit inside a jumper, to change the graphic on the display
  • A device to listen for Christmas songs
  • A music recognition service
  • A way to get the recognised song to the microcontroller
  • A battery

The Display

For the sake of affordability, I decided to make a 16 x 16 pixel display for the jumper. This would be made from addressable RGB LEDs. Each LED in the array can be set to a specified colour to show a graphic. It would be low resolution, but who doesn’t love pixel art?!

Imagine the graphic below is an array of 64 individual pixels; setting the pixels to the colours shown would display a basic graphic, in this case a gift. Some LEDs are set to red, some to green and some to black (which is off).

A basic representation of a pixel display

I made so many mistakes while making the dress that I was determined not to repeat them this time around. The dress had 100 individual RGB LEDs in it, each of which I soldered into a chain with copper wire. Each LED had 6 solder points, which meant that I soldered 600 tiny connections. Not only was this backbreaking and eye-straining, but I then learnt that each point was a weak spot: when repeatedly bent, the connections would eventually snap, an undesirable trait in a wearable, which has to bend with the person wearing it.

The individual RGB LEDs in the dress

Instead of individual LEDs, I decided to use LED strip for this project. The strip has solder points integrated into the ‘ribbon’ which makes them much less likely to snap. LED strip is great because you can cut it along the marked lines to make it whatever length you need.

Addressable RGB LED strip

You can also see that the strip helpfully has directions labelled on it, showing which way to connect it up to the board for data, power and ground.

You can buy LED strip with different distances between the pixels; the denser the LEDs, the more expensive the strip. I bought this strip, which has 2cm between each light and came with a waterproof casing (that I ended up removing). The important things to note when choosing which product to buy are the voltage (for this project we want 5V), the pixel density, and whether the lights are RGB or RGBW (we’ll talk about why later).

To turn the strip into a two-dimensional array, I sewed lines into two pieces of fabric so that I could slot each strip in and hold it in place. The backing fabric is a thick cotton, to protect the array, and the front is a thin lining fabric that lets the lights shine through clearly. I measured the distance between each strip to make sure that the distance between the lights was 2cm both horizontally and vertically; this keeps the aspect ratio of the images correct. (I am not very good at sewing, so please excuse the occasional wiggle in my lines!)

Pockets in fabric ready for each LED strip

Next up, each of the cut strips needed to be connected together, which I did with short lengths of single core copper wire. The final array looked like this:

A 16 x 16 array of RGB LEDs

The Microcontroller

In order to set the colours of the lights, I needed a microcontroller board. I chose the Adafruit Feather Huzzah, a small 5cm x 2cm board that is easy to hide inside a jumper. It has WiFi, which means I can connect it to the internet and get the recognised songs from the web to the lights. It can take USB power or battery power and it isn’t too expensive. You can write code for the board using either the Arduino IDE or the VSCode extension. I have already written about getting a board and lights set up in this post, so I won’t go over that here, but as mentioned before, double check whether you’ve got RGB or RGBW lights and set the right option when you’re setting up your strip in the code:

Adafruit_NeoPixel strip = Adafruit_NeoPixel(
  NUM_LIGHTS,
  PIN,
  NEO_GRB + NEO_KHZ800 // NEO_GRB or NEO_GRBW
);

Below is a video of the array, once the board was connected, with all of the lights set to blue.

Showing the flexibility of the LED Array

Jingle Fails

Sending an image to the array

The LEDs in the array are addressable, which means that we can use the board to set each one to a specified colour. They take RGB values to define that colour. From our earlier example, an array of the necessary RGB values might look like this:
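
(A sketch with illustrative values, using the 8 x 8 gift from earlier; the jumper’s real images are 16 x 16.)

const R = [255, 0, 0]; // red ribbon
const G = [0, 255, 0]; // green wrapping
const K = [0, 0, 0];   // black, i.e. off

// the 8 x 8 gift as RGB triples, read left to right, top to bottom
const gift = [
  K, K, K, R, R, K, K, K,
  G, G, G, R, R, G, G, G,
  G, G, G, R, R, G, G, G,
  R, R, R, R, R, R, R, R,
  R, R, R, R, R, R, R, R,
  G, G, G, R, R, G, G, G,
  G, G, G, R, R, G, G, G,
  G, G, G, R, R, G, G, G,
];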

The eagle-eyed amongst you may have spotted a mistake that I made at this point. When we (and the computer) read this array, we read it from left to right, going down each line, but that isn’t how I soldered the array together. I soldered it like this, so that I wouldn’t have trailing wires:

Oops, I soldered the LED strip together in a bizarre direction

I have since affectionately termed this ‘snake mode’. Unfortunately for me this meant that in order for my images not to appear jumbled when I sent the colours to the LEDs, I needed to reverse every other line of the RGB array. Oops!

I wrote some JavaScript to do this for me. It takes the array, slices it into 16 separate arrays of 16 values each, then loops through the rows and reverses the order of every other one. Finally, it concatenates the 16 separate arrays back into a single array.

// Slice the flat array into rows (chunkSize = 16 for the jumper),
// reverse every other row to match the 'snake' wiring, then
// flatten the rows back into a single array.
function snakeMode(arr, chunkSize = 16) {
  const chunked = [];
  for (let i = 0; i < arr.length; i += chunkSize) {
    chunked.push(arr.slice(i, i + chunkSize));
  }
  for (let i = 0; i < chunked.length; i++) {
    if (i % 2 === 0) {
      chunked[i].reverse();
    }
  }
  return [].concat(...chunked);
}
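
Calling snakeMode(gift, 8) on the gift sketch from earlier would return the pixels in the order the soldered strip actually expects.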

As an example, here is a Santa hat and a candy cane in snake mode.

A Santa hat and a candy cane in snake mode, represented with CSS Grid

Recordin’ Around the MediaStream…

Listening for Christmas Music

I needed a device with a microphone that could listen to my surroundings. Luckily, we all tend to carry one of these around with us: a phone. I set up a web app with Node which could record the sound around me and make the data available to a music recognition service. To record the sound I used the MediaStream Recording API, which uses the MediaRecorder interface to take the data from a MediaStream and deliver it for processing or saving to disk.

Using getUserMedia to get the media stream (in this case, audio) from the microphone, I captured 15-second chunks of encoded audio and stored them in an array buffer; this is what gets sent to the music recognition service.

function startRecording(recorder) {
  // storeChunks (defined elsewhere) collects the encoded audio into an array buffer
  recorder.ondataavailable = (e) => storeChunks(e, recorder.stream);
  recorder.start();
  // stop after 15 seconds, then immediately begin the next recording
  window.setTimeout(() => {
    recorder.stop();
    startRecording(recorder);
  }, 15 * 1000);
}

navigator.mediaDevices.getUserMedia({ audio: true, video: false })
  .then(function(stream) {
    const mediaRecorder = new window.MediaRecorder(stream);
    startRecording(mediaRecorder);
  })
  .catch(function(err) {
    console.log(`The following error occurred: ${err}`);
  });

I built a front end with a button that triggered getUserMedia and started the recording. I hosted the app on Heroku, which provided a URL for me to visit in my phone’s web browser.
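
As a sketch (the element id here is hypothetical, not the original app’s markup), the front end only needs a button to kick things off, since browsers require a user gesture before they will prompt for microphone access:

// 'start-button' is an illustrative element id
document.getElementById("start-button").addEventListener("click", () => {
  navigator.mediaDevices.getUserMedia({ audio: true, video: false })
    .then((stream) => startRecording(new window.MediaRecorder(stream)))
    .catch((err) => console.log(`Could not access the microphone: ${err}`));
});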

API Wish it Could be Christmas Everyday

Music recognition

I used the AudD API. It costs $2 per 1000 requests, which is affordable for the jumper. They have a great Telegram bot for account creation, API key generation and payment. After some trial and error with their service and documentation, I discovered that the API works best if you provide it with a URL rather than an audio file, which meant that I needed somewhere to upload my audio recordings. As an aside, it would have been great if AudD had provided some working example code for me to look at; if you’re writing an API, please provide demo code as well as documentation!

Do They Node it’s Christmas?

Uploading the audio recordings

Microsoft’s Azure Blob Storage provides a container for uploading blobs (which are basically collections of binary data) and then creates a URL for each blob. In order to use Blob Storage, you’ll need an Azure account and an API key. I used their npm package, @azure/storage-blob, which provides methods to upload blobs and get URLs back.
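
A minimal sketch of the upload, assuming the v12-style @azure/storage-blob API, a connection string in an environment variable and a container that already exists (all of the names here are illustrative):

const { BlobServiceClient } = require("@azure/storage-blob");

async function uploadRecording(buffer, blobName) {
  const serviceClient = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING
  );
  // 'recordings' is a hypothetical container name
  const containerClient = serviceClient.getContainerClient("recordings");
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  await blockBlobClient.upload(buffer, buffer.length);
  return blockBlobClient.url; // the URL to send to the recognition service
}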

With the URL provided by Blob Storage, it was then possible to send this to the AudD API with a POST:

var request = require("request");

var data = {
  'url': url_from_blob_storage,
  'return': 'timecode',
  'api_token': your_api_token
};

request({
  uri: 'https://api.audd.io/',
  form: data,
  method: 'POST'
}, function (err, res, body) {
  console.log(body);
});

And it worked! It returned a song title when I played some music!

So now I had a song title, which could be anything. I needed to decide which songs I wanted the jumper to recognise, then map those song titles to the relevant images to display on the jumper.

We’re makin’ a list, we’re checkin’ it twice

Mapping song titles to RGB value arrays

I made 13 Christmas-themed pixel images and converted them from PNGs to arrays of RGB values using ImageMagick. It provides a command line tool; running $ magick convert image.png image.txt outputs a list of every pixel’s RGB values, which can then be parsed into an array.
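
ImageMagick’s .txt output lists one pixel per line (e.g. "0,0: (255,0,0) #FF0000 red"), so it needs a little parsing. Here is a sketch, assuming an 8-bit image so the channel values are already 0–255:

const fs = require("fs");

// Parse ImageMagick's pixel enumeration into an array of [r, g, b] triples
function parsePixels(file) {
  return fs.readFileSync(file, "utf8")
    .split("\n")
    .filter((line) => !line.startsWith("#") && line.includes("("))
    .map((line) => {
      // each line holds the channel values in parentheses, e.g. (255,0,0)
      const [r, g, b] = line.match(/\(([^)]+)\)/)[1].split(",");
      return [parseInt(r, 10), parseInt(g, 10), parseInt(b, 10)];
    });
}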

I then stored each of the 13 RGB value arrays on the Adafruit Feather Huzzah so that it would be able to reference an array from an image name:

key_value_pair images[] {
  {
    "default",
    {{255, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, ...
  },
  {
    "tree",
    {{0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 0, 0}, {0, 255, 0}, ...
  },
  {
    "santa",
    {{0, 0, 0}, {185, 27, 24}, {185, 27, 24}, {185, 27, 24}, ...
  },
  ...

The language used to program the Huzzah is a mix of C with some bits of C++ thrown in, so it doesn’t have a built-in dictionary type. The board also has very limited memory (only 4MB of flash), which means that the things we take for granted in JavaScript, like dictionaries that can hold big sets of data and take up lots of memory, just aren’t feasible. What I used instead is an array of arrays. (The 13 images stored this way took my memory usage on the board up to 89%.) It was then possible to reference the key-value pairs and get the pixel values from the image names. The next job was to get the image names from my Node server to the board.

I made another dictionary, this time on the Node server, mapping the song titles to image names:

const map = {
  "jingle bell rock": "bell",
  "let it snow! let it snow! let it snow!": "snow",
  "rockin' around the christmas tree": "tree",
  "santa claus is coming to town": "santa",
  "deck the halls": "holly",
  "we wish you a merry christmas": "pud",
  ...
};

Then I added an API endpoint to my Node app that, based on the last song recognised, looks up the key in the dictionary and returns the matching image name over HTTP. This gave my Huzzah something to make a GET request to. Bonnie Eisenman has written a brilliant blog post on how to get your Huzzah board connected to WiFi and how to make a GET request: https://blog.bonnieeisenman.com/blog/making-get-requests-from-the-adafruit-feather-huzzah/
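
A minimal sketch of that endpoint, assuming Express and a lastRecognisedSong variable kept up to date by the recognition step (both names are illustrative, not necessarily the original code); it looks the title up in the map from the previous section:

const express = require("express");
const app = express();

// Hypothetical variable, updated whenever AudD recognises a new song
let lastRecognisedSong = "jingle bell rock";

// The Huzzah polls this endpoint and gets back a plain text image name
app.get("/image", (req, res) => {
  res.send(map[lastRecognisedSong.toLowerCase()] || "default");
});

app.listen(process.env.PORT || 3000);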

Let it show, let it show, let it show!

So now:

  • The app can record music from its surroundings,
  • It can send that recording to Azure and get a URL back,
  • It can send that URL to AudD and get a song title back,
  • It can look up the relevant image name to match the song title,
  • It can put that image name on an endpoint,
  • Which the Huzzah can make a GET request to,
  • The Huzzah can then look up which RGB values to display based on the image name it receives and send them to the LEDs.

And it can illuminate the jumper!

We wish you a Merry Christmas and an API New Year!

Now you can build your own music-controlled jumper! If this post inspires you to build something, I’d love to see it, and if you have any questions or would like to see me give a talk about the jumper, then drop me a message on Twitter: @thisisjofrank

A heartfelt and giant thank you to David Whitney who not only paired with me on the code, but made it much more beautiful than I ever could and kept me on track when I was deep in procrastination mode 💖.
