Broadcasting a Dynamic YouTube Video for Slipknot
Putting the “We” in We Are Not Your Kind

Slipknot has had a year of triumphs and tragedies in the lead-up to last Friday’s release of their new album We Are Not Your Kind. Well, the reviews are in and the critical response to the record has been overwhelmingly positive. Last Monday, I pitched a concept for record release day, was granted approval on Tuesday, and launched the damn thing Thursday night. It was an incredible display of developer-friendly systems coming together for a common purpose: gathering maggots.
So, what was the idea? Well, we knew we wanted to develop a Spotify and Apple Music powered player to begin with (inspired by a recent Lamb of God campaign). In addition, we thought it would be compelling to provide a visual of fans and their listening activity. The great thing about establishing a custom player is that it allows you to publish your own set of realtime events, which you can use to build a visualization (or simply an understanding) of how your album is performing. I decided we would let fans take photos of themselves while listening and also geocode their IP addresses into pairs of coordinates for a map visualization. This was all par for the course if you’ve seen my work in the past. However, there was an additional aspect I wished to experiment with: YouTube Live.
Rather than create a web-accessible visualization, I thought it might be interesting to instead broadcast the visualization live on Slipknot’s YouTube channel. This would create a shared meeting point for the album release, allowing everyone to see the spread of the record and giving fans the ability to share a photo. For a fanbase as loyal as Slipknot’s, this type of gathering made sense.
At this moment, we are still broadcasting and you may visit the web app to stream the record and take a photo of yourself. Stick around to learn about some of the key systems at work.
Progressive Web Album

The core web app is a custom album player powered by the Apple Music MusicKit API and Spotify Platform. I would suggest checking out my Lamb of God case study for my thinking on what makes progressive web albums important.
I actually spent some time earlier this summer creating a Vue.js component which handles the logic of wrangling both Spotify and Apple Music into one cohesive interface. At the moment, this engine sits privately on my Bit collection, which allows me to easily bring this functionality into projects and evolve the component as necessary.
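In spirit, the engine wraps each service in a small adapter with a common surface so the rest of the app never cares which one the fan authorized. A rough illustration only — the real component is private, and the Spotify bits are stubbed here:
// One tiny adapter per service, same interface for both.
let adapters = {
  apple: {
    authorize: () => MusicKit.getInstance().authorize(),
    play: () => MusicKit.getInstance().play()
  },
  spotify: {
    authorize: () => { /* kick off the Spotify OAuth flow */ },
    play: () => { /* start playback via the Spotify SDK */ }
  }
}

let engine = (service) => adapters[service]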
One of the bummers of authenticated streaming is device coverage. In particular, Spotify will not stream full tracks on mobile devices. For Lamb of God, the fallback simply told users as much, but I wanted to provide a “clips” player this time around. So, I wrote the simplest HTML audio element player in JavaScript: it receives an array of MP3 URLs and track meta, then defines functionality for toggling playback and jumping through the queue.
let tracks = [{
  id: '',
  name: '',
  url: '',
  duration: 30000
}]

let player = new preview.Player({
  tracks: tracks,
  onload: () => {
    // loaded
  }
})

player.togglePlay()
Once I clean up the preview player, I’ll throw it up on a gist for you to check out. 🧼
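Until then, here’s the gist in sketch form — one audio element walking an array of tracks (not the final code, just the idea):
// A thin wrapper around a single HTML audio element.
let preview = {}

preview.Player = function ({ tracks, onload }) {
  this.tracks = tracks
  this.index = 0
  this.audio = new Audio(tracks[0].url)
  this.audio.addEventListener('canplay', onload, { once: true })

  // play or pause the current clip
  this.togglePlay = () => {
    this.audio.paused ? this.audio.play() : this.audio.pause()
  }

  // jump forward (1) or backward (-1) through the queue
  this.skip = (direction) => {
    this.index = (this.index + direction + this.tracks.length) % this.tracks.length
    this.audio.src = this.tracks[this.index].url
    this.audio.play()
  }
}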
Live Photography
I’m no stranger to camera work, having built a camera-powered app as recently as the David Bowie Space Oddity campaign. Check out that case study or Quote Camera for an understanding of how I integrate WebRTC to create custom camera experiences in the browser.
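The heart of those experiences is only a few lines of WebRTC. Roughly:
// Stream the webcam into a <video>, then grab a frame onto a canvas
// whenever the fan takes a photo.
let video = document.querySelector('video')
let canvas = document.createElement('canvas')

navigator.mediaDevices.getUserMedia({ video: { facingMode: 'user' } })
  .then(stream => {
    video.srcObject = stream
    return video.play()
  })

let takePhoto = () => {
  canvas.width = video.videoWidth
  canvas.height = video.videoHeight
  canvas.getContext('2d').drawImage(video, 0, 0)
  return canvas.toDataURL('image/jpeg')
}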
Typically these camera campaigns involve a siloed individual making content, but our project required the photos be brought into a shared visual. That requires temporary storage and real-time event broadcasting. I decided to create an AWS S3 bucket which I would upload photos directly to with a pre-signed URL. This forced my otherwise static website into the world of server processes. 😔 Luckily, my host Netlify had me covered. Following a few tutorials on the subject, I created a Netlify Function which creates a pre-signed URL for putting objects on S3. I then used Uppy and its AWS S3 plugin as the interface for managing file uploads. Uppy signs the upload with the Netlify Function and returns the URL of the uploaded file.
let uppy = Uppy({
  autoProceed: false
})

uppy.use(AwsS3, {
  getUploadParameters (file) {
    return fetch(`/.netlify/functions/presign?key=${file.name}`)
      .then(response => response.json())
  }
})

uppy.on('upload-success', (file, data) => {
  let url = data.body.location
})
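For reference, here’s roughly what that function boils down to. This is a sketch assuming the aws-sdk package and a bucket name in an environment variable; the real key handling surely differs:
// functions/presign.js — returns { method, url }, the shape Uppy expects
const AWS = require('aws-sdk')
const s3 = new AWS.S3()

exports.handler = async (event) => {
  let url = s3.getSignedUrl('putObject', {
    Bucket: process.env.BUCKET, // assumption: bucket name lives in env
    Key: event.queryStringParameters.key,
    Expires: 300 // signature stays valid for five minutes
  })

  return {
    statusCode: 200,
    body: JSON.stringify({ method: 'PUT', url: url })
  }
}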
This URL is then sent to the visualization interface using Pusher.
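Pusher events get triggered from the server, so another tiny function handles that hop. A sketch, with channel and event names that are illustrative rather than the production ones:
// Trigger a Pusher event carrying the photo URL so the visualization,
// subscribed to the same channel, can pick it up.
const Pusher = require('pusher')

const pusher = new Pusher({
  appId: process.env.PUSHER_APP_ID,
  key: process.env.PUSHER_KEY,
  secret: process.env.PUSHER_SECRET,
  cluster: process.env.PUSHER_CLUSTER
})

pusher.trigger('visualization', 'photo', { url: url })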
IP Address to Heatmap
We knew we wanted to include a map of activity as part of our visualization showing where fans were coming from. Initially, I thought I would build this on top of Mapbox, but their terms weren’t clear on whether broadcasting their map live required a special license. Instead, I focused on coming up with a simple custom solution: if I could turn an IP address into a pair of coordinates and then turn those coordinates into screen pixels, I could create a simple visual.
MaxMind provides an excellent GeoIP service with a companion JavaScript library which will turn a user’s IP address into a location at several levels of fidelity. It’s not perfect, but it’s close enough for our abstract visual. Each pair of coordinates is then sent to the visual using Pusher.
geoip2.city(data => {
// data.location.longitude
// data.location.latitude
}, error => {
console.log(error)
}, {})
We can then take the coordinates and convert them into pixels based on a Mercator projection. Thank you, Stack Overflow.
let coordinateToPixel = (height, width, coordinate) => {
  let x = (coordinate[0] + 180) * (width / 360)
  let latRad = coordinate[1] * Math.PI / 180
  let mercN = Math.log(Math.tan((Math.PI / 4) + (latRad / 2)))
  let y = (height / 2) - (width * mercN / (2 * Math.PI))
  return [x, y]
}
With screen pixels readily available, you can come up with any kind of visual. The client envisioned a heat map which showed “hot” points around the world. You can just about imagine my surprise when I stumbled onto mourner’s simpleheat library, which does exactly that. Not that it’s surprising mourner had developed a solution; more that Vladimir’s libraries always seem to make it into my projects. (See suncalc on David Bowie.) Anyway, simpleheat is nice and simple, just the way I like it. You pass in some data and visualization parameters and it renders a heatmap onto a canvas of your choice.
let heat = simpleheat(canvas)

heat.max(15)

heat.gradient({
  0.5: 'white',
  0.75: 'black',
  1.0: 'red'
})

heat.add([x, y, z])

heat.draw()
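On the receiving end, the visualizer subscribes to the same channel with pusher-js and feeds each pair of coordinates through the projection into simpleheat. Here’s the simplest form of that wiring (channel and event names are illustrative; the real draws are queued, as covered below):
// Subscribe and pipe each incoming location into the heatmap.
let pusher = new Pusher(key, { cluster: cluster }) // your Pusher app key
let channel = pusher.subscribe('visualization')

channel.bind('location', (data) => {
  let [x, y] = coordinateToPixel(canvas.height, canvas.width, [
    data.longitude,
    data.latitude
  ])
  heat.add([x, y, 1]) // the third value is the point's intensity
  heat.draw()
})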
Of course, I wasn’t the only person excited about integrating simpleheat into this Slipknot project.
Single File (Maggots)
One thing I was keen on preventing was the visual crashing (to the best of my ability). I suspected both photos and locations would come in at a high rate, especially at launch. Since the photos arrive as URLs from Pusher, they must first be loaded before being added to the interface. To do this, I rely on a handy promise function.
let loadImage = (url) => {
  return new Promise((resolve, reject) => {
    let img = new Image()
    img.onload = () => {
      resolve(img)
    }
    img.onerror = reject
    img.crossOrigin = 'Anonymous'
    img.src = url
  })
}
Rather than try to load every photo the moment it comes in, I decided a queue would make more sense. Luckily, Async provides an aptly named queue method which runs tasks at a concurrency you decide. To keep things chill, each photo is loaded one at a time.
import queue from 'async/queue'

let photoQueue = queue((task, callback) => {
  loadImage(task.url)
    .then(img => {
      // add image to visualization
      callback()
    })
}, 1)

photoQueue.push({
  url: url
})
I used a similar setup to receive and draw adjustments to the heatmap. It’s a less costly function but just in case…
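In sketch form it mirrors the photo queue, wrapping each update so redraws happen one at a time (the heatQueue name is mine, not from the production code):
// Queue heatmap updates at a concurrency of one, same as the photos.
let heatQueue = queue((task, callback) => {
  let [x, y] = coordinateToPixel(canvas.height, canvas.width, task.coordinate)
  heat.add([x, y, 1])
  heat.draw()
  callback()
}, 1)

heatQueue.push({
  coordinate: [longitude, latitude]
})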
Visualizer
I’ve already shared how we created the heatmap, so what about the grid of glitched out photos? As the photos are loaded one at a time, they are drawn onto a buffer canvas in a random cell on the screen.
let buffer = document.createElement('canvas')

let drawImage = (img) => {
  let context = buffer.getContext('2d')
  let split = 4
  let cellWidth = buffer.width / split
  let cellHeight = buffer.height / split

  // scale and crop the photo to fit a single cell
  let scaled = loadImage.scale(img, {
    maxWidth: cellWidth,
    maxHeight: cellHeight,
    cover: true,
    crop: true
  })

  // paint it into a random cell of the grid
  context.drawImage(
    scaled,
    Math.floor(Math.random() * split) * cellWidth,
    Math.floor(Math.random() * split) * cellHeight,
    cellWidth,
    cellHeight
  )
}
This Brady Bunch intro is then passed through a set of Seriously.js WebGL filters which desaturate, blur, glitch, and grain the photo grid composition into something more Slipknot-like. Seriously pipes one filtered scene to another and finally to a waiting canvas in your view.
let screen = document.getElementById('screen')
let seriously = new Seriously()

let source = seriously.source(buffer)
let target = seriously.target(screen)

let blur = seriously.effect('blur')
let glitch = seriously.effect('tvglitch')
let grayed = seriously.effect('hue-saturation')
let noise = seriously.effect('noise')

blur.source = source
glitch.source = blur
grayed.source = glitch
noise.source = grayed
target.source = noise

seriously.go()
Even though our visual is based on real-time data, it isn’t always changing. For this reason, you should tell Seriously to redraw the visual at a constant rate. This is also an opportunity to randomize some of the glitching effect.
let render = () => {
  glitch.distortion = Math.random() * 0.1
  source.update()
  requestAnimationFrame(render) // keep the redraw loop going
}

requestAnimationFrame(render)
You can then use some CSS z-indexing to place the map on top of the screens.
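In sketch form, that’s just absolute positioning plus a couple of z-index values on the two canvases:
// Layer the heatmap canvas above the Seriously output canvas.
screen.style.position = 'absolute'
screen.style.zIndex = '1'

canvas.style.position = 'absolute'
canvas.style.zIndex = '2'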
Broadcast

While broadcasting seems to me like the most magical part of this, it is really the easiest. I have a spare supercomputer Intel sent me after reading my Marilyn Manson case study, along with a bitchin’ internet connection. So, I downloaded the incredible Open Broadcaster Software (OBS) and investigated its feature set. OBS allows you to stream to all major services, including YouTube Live. You can choose from a plethora of sources like video, audio, and the one we need: browser. In the end, we simply chose the browser source, set the URL to our visual, and clicked “Start Streaming.” As of this writing, OBS and YouTube have broadcast our stream for over 50 hours.
Technology aside, I think this is a very simple and affordable way to create a meeting point for an album premiere. It gives fans a place to share their feelings about the new release while showing the record’s evolving reach throughout the world. The photos show that these are real fans gathering around a shared purpose: their love for this artist.
Thanks

I’ve said it once and I’ll say it again: Slipknot is a perfect band to work for, and their passionate fans are made for digital experiences. I just want to thank the band and 5B Artist Management for continually allowing me to build the campaigns I believe in and trusting in four-day turnarounds. 🤘🏻