Rewinding Foo Fighters Touring History

With Intel, After Effects, Dataclay, and 18 cores

10 min read · Oct 27, 2018


According to Songkick, the Foo Fighters have played over 1,300 concerts since 1995, traveling 1,269,259 miles by van, bus, and chartered jumbo jet. Now think about how many hours of music were performed and consider just how many people had a chance to hear those songs live. That’s a lot of Everlong.

While everyone argues over streaming counts and digital chart positions, I can’t help but be floored by the importance of live music data. Double-clicking a song in a piece of software seems so minor next to the emotional weight of attending a concert. Don’t get me wrong, listening to music by any means is highly emotional. But in terms of the data we store, a play count seems trivial next to concert attendance. And while streaming data only goes back as far as the service has existed, live music data predates the web itself.

These are the kinds of thoughts that ran through my mind at Songkick while looking for ways the service could differentiate itself in the music business. I should also recognize both Bandsintown and Setlist.fm for their contributions in this space. The reality is, every city traveled to, concert performed, setlist played, and encore executed is data that represents hard work by artists forging intimate connections with fans. This is data that should be celebrated.

It’s in this spirit that I approached the Foo Fighters to conduct an experiment in rewinding their tour history as part of an ongoing Instagram story series titled “This Day In Foo Fighters Touring History.” It was also an excellent opportunity to test the capabilities of the 18-core machine Intel had sent me after reading about my MacBook Pro catching fire during the Marilyn Manson render project. I figured rendering over 1,300 unique high-res 15-second Instagram story videos would be a good benchmark.

You can view an example render and read on to learn how it was done.

Storyboard

A simple storyboard made in Google Drawing

In order to make posting easy for Foo Fighters, I decided to generate a single Instagram story video for each past concert rather than spreading the story over several videos. This decision also established strict guidelines on what I was creating, based on Instagram’s own limitations: the video should be 1080px wide by 1920px tall and under 15 seconds in length. Knowing these parameters, I put together a very abstract storyboard outlining each moment of the story and its duration.

Initially, we would present a familiar graphic that establishes the video’s purpose as part of our ongoing series. I know from reading Hooked that the mystery around “what show” we’re rewinding to today helps get users excited and engaged. Is it going to be a show near me? Maybe even a show I attended? I’ll invest a couple of seconds to find out.

I then wanted to present a fun “rewind” animation that takes us back to the year the event actually occurred. Animation is not one of my strengths, but I initially thought about flipping through old photos from each year or scrolling through a horizontal timeline. Whatever I went with needed to be visually pleasing and quick without being too hectic.

In the center of the story, we reveal the concert we’re celebrating today. I knew this would be a tricky visual to lay out, given how much the lengths of location and venue text vary.

I then wanted to reveal something unique about the show. While it’s tempting to build a scraper that finds photos from past Foo Fighters concerts, realistically the easiest content to add is the setlist, thanks to setlist.fm. It also fits nicely within the limited runtime. If a setlist doesn’t exist for a show, we simply drop this part of the video.

Finally, we encourage relevant users to take the next step by sharing their memories using a hashtag unique to each concert.

Design

“Long Collage To Render”

Figma was my tool of choice to tackle the initial design of this thing. Inspired by some of the high level story designs of brands like Spotify, I laid out the composition as a long horizontal collage. By doing this, I could immediately get a feel for the video by simply panning the composition from one side to the other. Since we’re “rewinding” history, I decided to start the composition on the right and pan it backwards.

The typography and layout styling are drawn from the band’s recent “Concrete and Gold” release. There’s a big gold gradient rewind symbol “<<” right at the beginning to establish the direction we’re going: backwards. For the sake of the collage, I dropped in a photo for each year between the current date and the featured concert’s date, and challenged myself to find a more scalable solution to that problem once I moved into After Effects. To add some depth, I brought in a paper texture for the setlist and dropped in Dave’s signature for added effect. There are also some subtle arrows pointing to the left that I hoped would further establish the time travel effect.

Once the manager and I felt comfortable with this static layout, it was time to take it into After Effects and begin testing the limits of the layout with real data.

After Effects

After Effects setup including Templater and Render Garden

I organized elements of the layout using sub-compositions within After Effects where it made sense. To begin with, I created a huge 6480 x 1920 composition called “collage,” which included the entire layout from the static design. I then placed the collage composition onto my main composition, which is Instagram’s recommended 1080 x 1920, knowing that I would be transforming it horizontally throughout the animation. The collage composition also had a few sub-compositions of its own, such as “setlist” and “location,” which handled laying out the type dynamically.
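If you’re curious what that structure looks like in script form, here’s a rough ExtendScript sketch of the same setup. The names, duration, and frame rate are my assumptions for illustration, not values pulled from the actual project:

```javascript
// Rough sketch of the comp structure described above.
// Run from File > Scripts in After Effects; names and timings are assumed.
var proj = app.project;

// The wide "collage" comp holds the entire horizontal layout.
var collage = proj.items.addComp("collage", 6480, 1920, 1.0, 15, 30);

// The Instagram-sized "main" comp acts as a camera onto the collage.
var main = proj.items.addComp("main", 1080, 1920, 1.0, 15, 30);
var collageLayer = main.layers.add(collage);

// Pan from the right end of the collage back to the left over 15 seconds.
// The layer's anchor point sits at the collage's center (3240, 960).
var pos = collageLayer.property("Position");
pos.setValueAtTime(0, [1080 - 3240, 960]); // right edge flush with frame
pos.setValueAtTime(15, [3240, 960]);       // left edge flush with frame
```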

Speaking of dynamic content, if you followed along with my Marilyn Manson project, you’ll know that I used the excellent After Effects plugin Templater by Dataclay to add dynamic data to my composition. This project is no different. Make sure you check out their amazing YouTube tutorials to learn the basics. Once Templater is installed, you simply add the “Templater Settings” effect to the layer you wish to make dynamic. That layer can consist of images, text, and more. You can then use the provided Templater UI to preview your data sources and create dynamic replications of your composition.
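As I understand it, a Templater JSON data source is just an array of objects whose keys line up with the dynamic layer names in your project. Here’s an invented example row using my layer names; the venue, location, and hashtag are placeholders, not a real concert:

```json
[
  {
    "date": "June 12, 1997",
    "venue": "Example Arena",
    "location": "Springfield, USA",
    "years": "21",
    "setlist": "Monkey Wrench\nMy Hero\nEverlong",
    "hashtag": "#FooHistory19970612"
  }
]
```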

I decided to use the setlist.fm API as my data source. In addition to locations, venues, and dates, they were able to provide setlist information. Go figure! I wrote a simple Node script that pulled and organized the information I required into a minimal JSON file, which I was able to reference in the Templater UI.
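Here’s a condensed sketch of that script. The endpoint and headers follow setlist.fm’s documented REST API, but treat the field handling as approximate, and the artist MBID, hashtag format, and file names are placeholders of my own:

```javascript
// Condensed sketch of the pull script; field handling is approximate.
const fs = require("fs");
const https = require("https");

const API_KEY = process.env.SETLISTFM_API_KEY;
const ARTIST_MBID = "<foo-fighters-mbid>"; // look up on musicbrainz.org

// Fetch one page of the artist's setlists from the setlist.fm REST API.
function getPage(page) {
  const options = {
    hostname: "api.setlist.fm",
    path: `/rest/1.0/artist/${ARTIST_MBID}/setlists?p=${page}`,
    headers: { "x-api-key": API_KEY, Accept: "application/json" },
  };
  return new Promise((resolve, reject) => {
    https.get(options, (res) => {
      res.setEncoding("utf8");
      let body = "";
      res.on("data", (chunk) => (body += chunk));
      res.on("end", () => resolve(JSON.parse(body)));
    }).on("error", reject);
  });
}

// Boil each setlist down to the fields the Templater layers need.
async function main() {
  const rows = [];
  for (let page = 1; ; page++) {
    const data = await getPage(page);
    if (!data.setlist || data.setlist.length === 0) break;
    for (const s of data.setlist) {
      const year = parseInt(s.eventDate.slice(-4), 10); // "dd-MM-yyyy"
      rows.push({
        date: s.eventDate, // reformat for display as needed
        venue: s.venue.name,
        location: s.venue.city.name + ", " + s.venue.city.country.name,
        years: String(new Date().getFullYear() - year),
        hashtag: "#FooHistory" + s.eventDate.split("-").reverse().join(""),
        setlist: (s.sets.set || [])
          .map((set) => (set.song || []).map((song) => song.name).join("\n"))
          .join("\n"),
      });
    }
    // A real script should throttle here to respect the API's rate limit.
  }
  fs.writeFileSync("concerts.json", JSON.stringify(rows, null, 2));
}

main();
```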

Rather than cover the tedious work of handling overflowing text from longer song titles and venue names, I want to quickly share how I handled the animated text that rewinds the year number. A hidden dynamic text layer called “years” on the main composition receives the total number of years since the date of the concert, and a second, visible text layer displays the animated count. Following a popular tutorial, I used a slider control effect to tween between 0 and 100 and referenced that number in an expression on the animated text layer.

```javascript
total = parseInt(thisComp.layer("years").text.sourceText);
i = effect("Slider Control")("Slider") / 100;
// Round the product so the counter always shows whole years.
Math.round(total * i) + " years ago";
```
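To drive it, the slider on the hidden controller is keyframed from 0 to 100 across the rewind segment, so the label counts from “0 years ago” up to the full total as the pan plays out, whatever that total happens to be for a given concert.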

The rewind visual I settled on is a live Foo Fighters video, which I chose to play backwards with a VHS distortion effect. This proved easier and more scalable than the photo flipping effect I had been considering.

Render

With my design, composition, and data prepared, it was time to do some rendering! If you recall from my Manson project, this was quite the ordeal last time and cost me a MacBook Pro. This new composition is of similar length but is much more complicated. On the flip side, we only need to render about 1,500 of these videos vs. the 25,000 for Manson. So, am I going to sacrifice my iMac Pro to the cause? Perhaps provision a Mac Pro box on something like MacStadium? Not this time! As I mentioned earlier, Intel was kind enough to send me one of their 18-core systems, and I’ve been dying to put it through its paces.

Well, as it turns out, After Effects isn’t set up to take advantage of multiple cores by default. While running a test render, I noticed that my CPU usage would plateau around 17%. Upon further investigation, my worries were confirmed in the Adobe forums: for the last three or four versions of AE, Adobe hasn’t offered a multiprocessing solution. Bummer.

Instead of losing hope, I dropped an email to Dataclay. They quickly suggested that I check out the third-party After Effects plugin RenderGarden by Mekajiki. RenderGarden promises a setup that gets AE to use all of your cores for rendering. After watching their excellent tutorial videos, it was clear that they achieve this by breaking larger compositions into pieces and passing each piece to its own rendering process. The rendered pieces are then assembled and encoded as needed. Excited, I configured RenderGarden without trouble and ran a test.

It was faster, but not much faster. The issue? RenderGarden was splitting up compositions that were already pretty small (15 seconds) and then putting them back together. What I really needed for this use case was the ability to simply render a couple of videos in parallel; they don’t need to be split into pieces since they’re already so small. I spent a bit of time thinking through my options but ended up sending RenderGarden an email as well. Their response? It can be done!

If you read through the RenderGarden documentation, you’ll learn that it is built on two basic concepts: seeds and gardeners. A gardener is a process that does the actual rendering, and it’s suggested that you run half as many of these as you have cores; I chose to run eight. A seed count determines how many pieces each video is split into for rendering and reassembly. RenderGarden told me that if I simply set this number to one, the plugin wouldn’t split anything up and would instead run eight unique video renders in parallel. Amazing!
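To make that setting concrete, here’s a hypothetical Node sketch of the same idea built on Adobe’s stock aerender command-line tool: a fixed pool of eight “gardener” workers, each pulling whole 15-second jobs off a shared queue, with no splitting. Paths, comp names, and the one-project-per-video layout are placeholders; this illustrates the concept rather than how RenderGarden is actually implemented.

```javascript
// Hypothetical illustration of "one seed, eight gardeners" using Adobe's
// stock aerender CLI: a pool of workers, each rendering whole videos.
const { spawn } = require("child_process");

const AERENDER = "/Applications/Adobe After Effects CC 2018/aerender";
const GARDENERS = 8; // roughly half the machine's 18 cores

// One job per replicated project (placeholder file layout).
const jobs = [];
for (let i = 0; i < 1185; i++) {
  jobs.push([
    "-project", `renders/concert-${i}.aep`,
    "-comp", "main",
    "-output", `out/concert-${i}.mov`,
  ]);
}

// Run one aerender process to completion.
function render(args) {
  return new Promise((resolve, reject) => {
    spawn(AERENDER, args, { stdio: "inherit" }).on("exit", (code) =>
      code === 0 ? resolve() : reject(new Error(`aerender exited ${code}`))
    );
  });
}

// Each "gardener" pulls the next whole video off the shared queue.
async function main() {
  await Promise.all(
    Array.from({ length: GARDENERS }, async () => {
      while (jobs.length) await render(jobs.shift());
    })
  );
}

main();
```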

Equipped with the knowledge I required, I decided to run a couple of different tests for your enjoyment. First, I rendered batches of 10 and 50 videos using the default After Effects Media Encoder, which renders videos one at a time. Then I tried the same quantities with 4, 8, and 16 gardeners! Here are the results (times are minutes:seconds):

| Videos | AE Render | 4 Gardeners | 8 Gardeners | 16 Gardeners |
|--------|-----------|-------------|-------------|--------------|
| 10     | 14:07     | 19:16       | 5:10        | 3:26         |
| 50     | 26:29     | 29:00       | 16:00       | 13:30        |

Looking at this chart, it’s clear that the benefits of RenderGarden don’t kick in until you’re running 8+ gardeners: 16 gardeners delivered roughly a 4x speedup on the 10-video batch and about 2x on the 50-video batch, while 4 gardeners were actually slower than the stock render thanks to the splitting overhead. I decided to use 10–12 gardeners to render the final total of 1,185 videos. That should be enough to speed things up significantly. So… how long did it take?

About 12 hours.

Is that fast? I don’t know. The whole process leaves a lot to be desired, starting with After Effects’ lack of multi-core support. RenderGarden is a nice solution in the meantime, but I would love to see Adobe step up. While the Intel machine had no trouble distributing and handling the job, the real bottleneck starts with composition replication via Dataclay, and without multi-core support the stress on After Effects really compounds. One option might be to run the 2014 version of AE, which was the last to support multi-core rendering. I will continue experimenting with all of this software and hardware and publish my findings here.

Thanks

Photo by Brantley Gutierrez

I seem to always be thanking the Foo Fighters and for good reason; they are always willing to experiment with new ideas and have a rich history worth celebrating. Thanks to the band and their organization for allowing me to develop this project. Speaking of history, thanks to setlist.fm for building a community that allows fans to share live music data and then making that data available via an API. If you ask me, this is one of the most emotionally relevant APIs in all of music.

Thanks again to Dataclay for providing the technology that makes this dynamic video generation possible and to their team for supporting all my use cases. Thanks to Mekajiki for taking it upon themselves to build a solution to After Effects’ lack of multiprocessing support. You should definitely give RenderGarden a try.

And finally, thanks to Intel for heeding the call of a developer in need. I look forward to further exploring what this 18-core box is capable of and sharing the results here on my blog.

