Building an Interactive Tracklist Reveal for Pop Smoke
Shoot for the Stars, Aim for the Moon
Pop Smoke passed away tragically in February of this year, ahead of the release of his debut album Shoot for the Stars, Aim for the Moon. Since then, his family has set up a foundation in his honor and moved forward with the release of the record. I had the honor of helping his estate and Republic Records develop an interactive tracklist reveal ahead of the album’s release this Friday. The build came together quickly over a handful of days, and I also used the time to educate myself on Bashar Barakah Jackson. The recent New York Times piece is a must-read to better understand the ambition and artistry of this young man.
The simple app we put together allows fans to literally point their phones at a virtual star field to reveal the titles of all 19 songs off the album. Check out the app at www.shootforthestars.app and read on to understand how it was developed.
Camera & Motion & Orientation
As it turns out, I had already developed an actual celestial navigator for Foo Fighters some time ago, so I had a general idea of how I might approach this project. Back then, the Device Motion & Orientation API was freely available (read: not very secure), but now it requires user permission. Luckily, iOS 13 added a simplified permissions dialog to obtain access if the user allows. I already covered this topic here. Similarly, gaining access to the camera couldn’t be simpler with WebRTC.
let stream = await navigator.mediaDevices.getUserMedia({
  audio: false,
  video: {
    facingMode: 'environment'
  }
})
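From there, the stream just needs to be attached to a <video> element to display the live camera feed. A minimal sketch, assuming a hypothetical #camera element:
// The element should carry muted and playsinline attributes so iOS
// allows it to autoplay behind the WebGL canvas.
let video = document.querySelector('#camera')
video.srcObject = stream
await video.play()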
I’m most proud of the extremely simple permissions screen, which allows the user to grant access to both of these items in whatever order they please. It also handles any issues a user may face along the way. I plan on using a similar flow for future projects.
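For the motion and orientation half of that screen, the iOS 13+ request boils down to a call to DeviceOrientationEvent.requestPermission(). A rough sketch, with a helper name of my own choosing:
// Must be triggered by a user gesture, e.g. tapping the permission button.
async function requestOrientationAccess () {
  if (typeof DeviceOrientationEvent !== 'undefined' &&
    typeof DeviceOrientationEvent.requestPermission === 'function') {
    let response = await DeviceOrientationEvent.requestPermission()
    return response === 'granted'
  }
  // Browsers without the permission API expose the events freely.
  return true
}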
Simple Three Scene
I was dabbling in Three.js recently on a Khruangbin project but finally got to use their new module-based installation process this time around. It is very clean. For this project, we just need to import the basics and their Device Orientation controls from the npm installation.
import * as THREE from 'three'
import { DeviceOrientationControls } from 'three/examples/jsm/controls/DeviceOrientationControls.js'
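To give the snippets below some context, a typical setup looks roughly like this; the camera and renderer settings are assumptions on my part, not the production values.
// A scene, a perspective camera at the origin, and a transparent
// renderer so the camera feed can show through behind the canvas.
let scene = new THREE.Scene()
let camera = new THREE.PerspectiveCamera(
  75,
  window.innerWidth / window.innerHeight,
  0.1,
  100
)
let renderer = new THREE.WebGLRenderer({ alpha: true, antialias: true })
renderer.setSize(window.innerWidth, window.innerHeight)
document.body.appendChild(renderer.domElement)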
Here are some bits and pieces from the Three.js work:
In order to activate the controls, initialize them with a reference to the camera.
let controls = new DeviceOrientationControls(camera)
Don’t forget to call controls.update() in your animation loop as well.
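Here’s a minimal render loop sketch tying that together, assuming the renderer, scene, camera, and controls from above:
function animate () {
  requestAnimationFrame(animate)
  // Pull the latest device orientation into the camera each frame.
  controls.update()
  renderer.render(scene, camera)
}
animate()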
I added a wireframe sphere to help emphasize the virtual hemisphere the user is gazing up at and to debug the placement of the stars.
let geometry = new THREE.SphereGeometry(50, 24, 24)
let material = new THREE.MeshBasicMaterial({
  color: 0xFFFFFF,
  opacity: 0.1,
  transparent: true,
  wireframe: true
})
let sphere = new THREE.Mesh(geometry, material)
scene.add(sphere)
The stars are simply Points objects placed in the northern hemisphere of the wireframe sphere using spherical coordinates. Initially, I was using a single Points object but then realized I couldn’t easily adjust the opacity of a single star once the user revealed it. Instead, I created a Points object for each star. We decided to place the stars randomly but evenly. However, I wrote an integration that would have allowed us to manually declare the phi and theta of each one. We didn’t have time to fill out the polar grid though. 😅
tracks.forEach((track, index) => {
  let phi = (Math.random() * 60 + 30) * Math.PI / 180
  let theta = (index / 19 * 360) * Math.PI / 180
  let vector = new THREE.Vector3().setFromSphericalCoords(
    50,
    phi,
    theta
  )
  let geometry = new THREE.Geometry().setFromPoints([vector])
  let material = new THREE.PointsMaterial({
    blending: THREE.AdditiveBlending,
    color: 0xFFFFFF,
    map: texture,
    opacity: 1,
    size: 10,
    transparent: true
  })
  let star = new THREE.Points(geometry, material)
  scene.add(star)
})
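The texture referenced by the material above is just a small glow sprite loaded ahead of time. Something along these lines, where the file path is a placeholder of mine:
// Hypothetical path; any soft, circular sprite will do for the star glow.
let texture = new THREE.TextureLoader().load('/img/star.png')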
With stars in the sky and the ability to observe them, we can use a Raycaster to detect when stars come into the user’s center view. Once this happens, we can display the title and lower the opacity of the star so users can keep track of their progress in the star field.
First, initialize a raycaster. Note: you may want to adjust the threshold depending on the size and density of your objects.
let raycaster = new THREE.Raycaster()
raycaster.params.Points.threshold = 5
Then, in your animation loop, check for intersections with any of our “Points” and take appropriate action.
raycaster.setFromCamera(new THREE.Vector2(), camera)
let intersects = raycaster.intersectObjects(
  scene.children.filter(child => child.type == "Points")
)
if (intersects.length) {
  intersects[0].object.material.opacity = 0.15 // show title
} else {
  // hide title
}
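Those show/hide comments are where the DOM comes in. In the real build the titles are rendered with Blotter (covered below), but purely for illustration, a plain text element works; this sketch assumes each star stashed its track title in userData when it was created and that the page has a hypothetical #title element.
// star.userData.title would be assigned inside the tracks.forEach() above.
let titleElement = document.querySelector('#title')
if (intersects.length) {
  let star = intersects[0].object
  star.material.opacity = 0.15
  titleElement.textContent = star.userData.title
} else {
  titleElement.textContent = ''
}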
Liquid Titles
In order to add a bit of delight to the track titles themselves, I employed Bradley Griffith’s excellent Blotter library to add a liquid effect to each. Since Blotter doesn’t support line breaks out of the box, I created a simple Vue.js component to handle this dynamically. Here’s a simplified version of the solution in CodePen form.
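For a sense of what Blotter involves, its documented LiquidDistortMaterial setup boils down to something like the sketch below. The text, font settings, and target element here are placeholders rather than the production values.
// Build a Blotter text, wrap it in the liquid distortion material,
// and append the resulting canvas to an element on the page.
let text = new Blotter.Text('Shoot for the Stars', {
  family: 'serif',
  size: 64,
  fill: '#FFFFFF'
})
let material = new Blotter.LiquidDistortMaterial()
let blotter = new Blotter(material, { texts: text })
blotter.forText(text).appendTo(document.querySelector('#liquid-title'))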
Thanks again to Republic Records, management, and the family for allowing me to be a part of this. Rest in power, Pop Smoke! Shoot for the Stars, Aim for the Moon is out this Friday, July 3rd.