We love this time of year — the lights, the celebrations, and of course, the annual ecentricarts holiday card. This year’s is our most ambitious one to date, with plenty of whiz-bang design and development flourishes that we’re excited to share.
Great projects aren’t created from the ether; they require people, planning, and process to be executed well. Luckily, our day jobs have taught us a thing or two about running a successful project, and we treated our holiday card like we would any client project, using the same processes and tools from strategy through launch.
We hope you enjoy this peek behind the curtain, and that you love the finished card as much as we do.
The results of our Design Studio session.
We kicked off the project in October, starting with a collaborative brainstorming session with the whole creative and strategy team. The centrepiece of this session was a Design Studio, the purpose of which is to land on a strong creative concept. It involves sketching, presenting, critiquing, more sketching, and voting to choose the final concept. Ours? An animated card of 15 ecentricartisans (and a dog) singing a holiday standard.
Wireframe, Design & Prototype
Following the kick-off, Nancy, UX Designer, created simple, interactive wireframes. Wires are a key part of our process, allowing us to identify any gaps in our thinking as well as validating functionality. While they do take time up front, they always save time in the end. This set of wires was no different — as soon as we saw them, we realized we had a lot of unnecessary features obscuring the overall message and that we needed to simplify the experience. The wires also provided value to Stephen, who was able to start prototyping early on, so that he could think through and solve for anything that might come up in development.
Next, we created an interactive, animated design prototype using Photoshop and Axure to help nail down the transitions and animations — a vital step for Stephen so he could see the full “vision” of what we were going for. (It was important to get the lowering and flickering of the neon sign just right, for example.)
With the skeleton of our experience solidified, it was time to flesh it out. Because of our timeline, we couldn’t leave much to chance, so we held a series of planning sessions. In them we chose a classic holiday song, found sheet music, created storyboards, and did a few test runs with instruments. We even serenaded the entire office during beer o’clock one day!
As one of the major pieces of functionality of our card is the song, we knew we needed more polish than we could get from GarageBand. So we headed to Syndicate — a real recording studio — who helped us bring everything together. After laying down instrumentals, our CTO Michael sang a lovely rendition of the song that our other vocalists were able to sing along to. Each person sang the song twice: a “nice” and a “unique” version. (If you’ve listened to the final song already, you may have noticed that we ended up using the unique versions.) For extra flourish, we also recorded a bunch of sound effects, including bells, a slide whistle, shouts, and voiceovers.
Bella the bulldog.
With the song mixed and recorded, we needed to add the visuals. We’ve become quite adept at putting together a photo shoot, which comes in handy for projects like this. The art direction drew inspiration from a mash-up of sources: Think Wes Anderson set meets Monty Python, with a whole bunch of Christmas cheer on top.
Igor, Designer, ran the in-house photo shoot, using holiday wrapping paper and props to add that extra layer of holiday kitsch. Special guest: Bella the bulldog.
After the shoot, we got the images ready for Stephen, who'll walk you through the rest of the project.
Thanks, Michelle. On the surface, the concept for this year’s card seemed pretty simple: Sync a grid of 15 animated ecentricartisans to a holiday tune.
That’s it? Sure thing, no problemo — easy peasy, lemon squeezy!
While the build had a lot of moving parts to deal with, at its core we just needed to nail down four key items to ensure the rest fell into place:
- Each member of the choir needed to have a unique character animation
- Choir members needed to move together, as variable sized groups
- Each grouping needed to follow the timing and elements of the song
- The audio and choir needed to start simultaneously and remain synced
Ok, so maybe not peasy easy, but still nothing too crazy, right?
The desired Pythonesque cutout style made for an early win on the animation front, with no need for strict lip/motion/instrument syncing. To handle each choir member as a separate entity, we had two basic options: animated GIF, or CSS sprite. We opted for sprites in this case, as GIFs require double the assets loaded. (Because there's no “pause” control for a GIF, we would have had to toggle visibility between a still image and its animated counterpart. Double the assets means a longer loading time, which means a less enjoyable user experience.)
This basic principle was the same approach behind all of your favourite classic 8-bit (also 16- and 32-bit) video games. Mario images courtesy of Nintendo.
To keep our scope reasonable and the sprite animation modular, we limited each choir member's individual frames to three. Every choir member’s background image contains a trio of distinct frames (or poses) that follow the same pattern. This allowed us to set up the keyframes, animation loop, and sprite positioning only once, and then apply it to all 15 individuals in the choir.
Each choir member’s sprite sheet starts with their static, natural pose on the left, followed by two alternate frames to the right.
We did end up adding an optional speed variable to each choir member, however, which was multiplied against the default length of the global animation loop. This gave us a bit of variety, as some characters’ actions looked better at a fast pace, while others looked better slowed down.
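The shared sprite setup can be sketched in a few lines. This is an illustrative reconstruction, not the production code: the function names, the frame width, and the base loop length are all assumptions.

```javascript
// Illustrative sketch of the shared sprite settings (names and values
// are our assumptions, not the shipped code).
const BASE_LOOP_MS = 810; // assumed default length of the global animation loop

// Each sprite sheet holds three poses side by side; shifting the
// background by one frame width selects the next pose.
function backgroundPositionForFrame(frameIndex, frameWidthPx) {
  return `-${frameIndex * frameWidthPx}px 0`;
}

// A member's loop duration is the shared default scaled by their optional
// speed multiplier: below 1 speeds the loop up, above 1 slows it down.
function loopDurationMs(speed = 1) {
  return BASE_LOOP_MS * speed;
}
```

Because every member follows the same three-frame pattern, the keyframes only need to be defined once; each member just supplies a different background image and, optionally, a speed value.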
For every beat of the song, we needed to have x number of choir members running their individual in-motion states. We based x on the song itself — layers of singing, instruments, percussion, etc. that are simultaneously active. Our choir had 15 members, so that gave us a potential x value anywhere between 0 and 15 — and that value changed at every interval.
We chose arrays for serving x — an object full of objects containing arrays, to be more precise. Each interval of the song (we based intervals on beats ... more on that later) has a corresponding entry in the sequence’s JSON object, which stores a collection of predetermined choir members to be activated.
Storing the sequence data in JSON format kept the groupings easily labelled and editable by any ecentricartisan — regardless of development chops — if performers needed to be added to, or dropped from any given interval.
This is where the whole concept required a bit more thinking. Fortunately, music is built around measures, which are set by the tempo. A simple composition (like our selected tune, along with most pop songs) keeps to a 4/4 time signature — 4 beats per measure (or bar) — making for a steady, predictable beat, perfect for programmatically syncing animation. The tempo itself is expressed as beats per minute (BPM).
While this image is technically a screen grab from Adobe Audition instead of ProTools (the software on which our stellar track was so expertly mixed), it does illustrate the majestic beauty of a musical waveform.
For our recording session, we stuck with a speedy 148 BPM. Thanks to our savvy friends at Syndicate, we were actually able to hold to our timing. A set BPM is a helpful value to work with, since we all know seconds run at a fixed rate of 60 per minute and milliseconds at 1,000 per second. At 148 BPM, each beat lasts roughly 405 milliseconds (60,000 ÷ 148), which pegged our recurring trigger interval (two beats) at a solid 810 milliseconds.
Every 0.81 seconds, we needed to hit the sequence object to trigger the next set of ecentric sprites. In theory.
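For the curious, the arithmetic works out like this: at 148 BPM a single beat lasts 60,000 ÷ 148 ≈ 405 ms, and a two-beat trigger interval comes out to roughly 810 ms. (The two-beat grouping is our reading of the setup, since that is how the figures line up.)

```javascript
// BPM-to-milliseconds arithmetic behind the 810 ms figure.
function msPerBeat(bpm) {
  return 60000 / bpm; // 60,000 ms per minute ÷ beats per minute
}

// Assuming the card triggers every two beats (our inference):
function triggerIntervalMs(bpm, beatsPerTrigger = 2) {
  return msPerBeat(bpm) * beatsPerTrigger; // 148 BPM → ~810.8 ms
}
```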
Naturally, some performers ran for less than a full beat. If a percussion animation ran an entire beat while its sound covered only a quarter of that time, for example, the result looked pretty sloppy. To remedy this, we added another value to each beat’s object, similar to the timing modifier on individual choir members’ sprite animations. Every call to the sequence object returns the group array along with a number, which gets multiplied against the timer (a default of 810 milliseconds) that calls the next group. Our quarter-time percussionist, for instance, would have a multiplier of 0.25 returned with his or her group. Because each next round is triggered by a local timeout rather than a global timer, no scope issues interfere with making the call only once the specified time has elapsed.
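A minimal sketch of that trigger loop, with our own function and field names (the scheduler is injectable here purely to make the logic easy to follow and test; the real card would just use setTimeout):

```javascript
// Reconstruction of the beat-driven trigger loop described above
// (function and field names are illustrative, not the shipped code).
const DEFAULT_BEAT_MS = 810;

function runSequence(sequence, activate, schedule = setTimeout) {
  function step(interval) {
    const entry = sequence[interval];
    if (!entry) return; // past the last interval: the song is over
    activate(entry.members); // switch this group's sprites to their in-motion state
    // A quarter-time entry (multiplier 0.25) calls the next group after ~202 ms.
    const delay = DEFAULT_BEAT_MS * (entry.multiplier ?? 1);
    schedule(() => step(interval + 1), delay); // local timeout, not a global timer
  }
  step(0);
}
```

Because each step arms its own timeout, a short interval can't collide with a longer one; the next call simply happens once the current entry's scaled delay has elapsed.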
Now, if I put an extra 0.25 on the chorus, then I should have 0.25 off the next verse, though it’s actually running early, so it should be an extra 0.75, then I’ll make up the 0.5 over here...
Audio and (newer) browsers generally play well together. Thanks to updates that arrived with the HTML5 spec, we had a suite of extremely helpful API options for working with audio (many of which share functionality with video). Any one of onloadstart, onloadedmetadata, oncanplay, or oncanplaythrough will fire an event to signal a specific point in the file’s loading progress. We chose oncanplaythrough*, which fires when the browser estimates that enough of the audio has loaded that, given the track’s duration, loading will finish before playback catches up to it.
Because our sequence object runs separately from the audio, it’s vital that the track plays without buffering or load-time interruptions that would break sync with the sprite animations. That’s one loaded asset and one reliable event fired. We also have two additional JSON files that need to be loaded and parsed before the user can click “Play”, along with images and other assets. Since our JSON is loaded asynchronously (independently of the initial page load, when many assets are typically included), and could therefore become available any time during or after page load, we set up a series of pings. A ping fires every time an external asset finishes loading. Each time the ping function is called, it checks the number of loaded files against a preset goal value. Once the goal is matched and we know the show can run successfully, the loading screen is removed and the Play button is enabled. Thanks to our rigid BPM, matching sequence object, and replicable sprite settings, we wound up with a functional match of audio and in-browser animation that starts and ends on time.
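The ping mechanism reduces to a tiny counter. A sketch, with invented names:

```javascript
// Minimal reconstruction of the loading "ping" gate (names are ours,
// not the shipped code). Every asset's load callback — the audio's
// oncanplaythrough, each JSON fetch, image onload, etc. — calls the
// same ping function.
function createLoadingGate(goal, onReady) {
  let loaded = 0;
  return function ping() {
    loaded += 1;
    if (loaded === goal) onReady(); // e.g. hide the loader, enable "Play"
  };
}
```

Each loader just needs a reference to the shared ping; none of them need to know about, or wait on, the others.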
This all adds up to our friends and family enjoying the holiday card for the silly charm of it, without having to even think about the technology making it run.
Let the show begin!
*While oncanplaythrough is great when it works, we eventually ran into some trouble with the preload trigger on iOS devices. In an effort to curb wasted data and battery life, Apple has dictated that audio and video files must not start loading until the user interacts with the page (generally with a click on a “Play” button), which is a pretty reasonable policy. If you check out the holiday card on your desk/laptop and an iOS device side by side, you’ll notice a slightly different behaviour in the “Play” button to “Loading…” message sequence.