Reverse engineering Apple Music's background gradient
Motivation

Apple Music, in its fullscreen now-playing view, shows an animated, flowing gradient in the background that appears to sample colors from the playing song’s album art.
I spent a lot of time trying to figure out how this could work before giving in and actually doing a bit of research into it. I ended up going down a multi-layered rabbit hole (as most of my projects do) to reproduce it myself¹. This was a relatively short project: I started looking into it on the afternoon of October 20…
…and wrapped up on the evening of October 25. The vast majority of the work that actually made it into the final project was done in the last two days, which ought to give you a sense of how finicky these projects are. There are still things my reproduction doesn’t do that drive me crazy, but that’s for later.

Initial exploration

This all started off with a message I sent to Gemini on a whim, after doing about five minutes of googling:
Gemini managed to dig up this wonderful thread from developer/designer Sam Henri Gold, which is the key resource I used for practically the entire project. Notably, the effect does not work by sampling colors from the album art and blending them, as most gradients of this type do; here’s how Sam explains it:
This is quite smart; the notable advantage is that all of the functionality of the gradient can be implemented as a series of fragment shaders. There are three key steps:
At the time, though, I didn’t fully understand the idea (mainly because I didn’t even know how shaders worked), so I shot off a DM to Sam:
At the time of writing, Sam hasn’t replied. I was on my own. Also note that there are a huge number of associated configurable constants: how many copies of the artwork, the size of each copy, the speed at which they move, the standard deviation of the Gaussian blur, the blur kernel size, the twist angle, the twist radius, the number of twist effects, and so on. I had no idea what these should be either.

I decided the best way to resolve both issues (ambiguity about how the effect worked, and the choice of constants) was to reverse-engineer the Apple Music app myself. This wasn’t easy, considering I was basically trying to debug a proprietary app on a proprietary platform, both owned by Apple, which has of course built in many safeguards against reverse engineering.

Gemini threads:
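To make that parameter space concrete, here’s a rough TypeScript sketch of the knobs involved. Every name and default below is my own guess for illustration, not a value pulled out of Apple Music:

```typescript
// Hypothetical tuning parameters for the effect. All names and values
// here are illustrative guesses, not constants extracted from Apple Music.
interface GradientConfig {
  artworkCopies: number;   // how many copies of the album art to draw
  copyScales: number[];    // relative size of each copy
  copySpeeds: number[];    // angular speed of each copy, in radians/sec
  blurSigma: number;       // standard deviation of the Gaussian blur
  blurKernelSize: number;  // width of the blur kernel, in pixels
  twistAngle: number;      // maximum rotation applied by a twist, in radians
  twistRadius: number;     // radius of the twisted region, in UV space
  twistCount: number;      // how many twist effects to layer
}

const guess: GradientConfig = {
  artworkCopies: 3,
  copyScales: [1.0, 1.5, 2.2],
  copySpeeds: [0.1, -0.15, 0.2],
  blurSigma: 40,
  blurKernelSize: 255,
  twistAngle: Math.PI / 2,
  twistRadius: 0.75,
  twistCount: 2,
};
```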
Trying to reverse-engineer the desktop app

I first looked at how Sam had done his research: he had used FLEX, which is basically the equivalent of DevTools for iOS. However, FLEX requires an iPhone, and the iPhone needs to be jailbroken. I unfortunately do not have an iPhone, let alone a jailbroken one. I thus turned to the Xcode debugger as a last resort. Xcode wasn’t the Hail Mary it may seem to be: it was extremely slow, had safeguards built in to prevent debugging Apple’s own apps, and often froze in the middle of debugging. But it was a start.

Xcode has two main ways to debug an external app: load its executable, or attach to a running process. When using the former, Apple Music would generally detect immediately that it was running in a debug environment and crash itself:
so I mostly used the latter. This enabled me to debug Apple Music for several minutes (and, of course, much longer if/when I paused it), although it did occasionally still crash after detecting the debugger. This was enough to start doing basic work, though.
My main goal here was to disable the blur shader, so that I could get a better internal look at how the effect worked, similar to what Sam had done. Gemini recommended I use the Metal capture system built into Xcode to debug the shaders. Unfortunately, the capture button was disabled, I assume as a result of me debugging a first-party app.
I reviewed Sam’s FLEX demo again and noticed he didn’t do anything Metal-specific — rather, he found the blur object in the heap and directly edited its sigma value. I tried to do something similar using the LLDB debugger, but inevitably ran into issues. I was able to find the object:
But LLDB would complain something along the lines of
Gemini suggested a lot of other ways to get around this, such as injecting a dynamic framework to swizzle the MPSImageGaussianBlur initializer so that it always sets sigma to zero, but none of them ended up panning out. This part of the project alone lasted around three days. By the second day I was starting to worry it wouldn’t work, so I decided to go another route.

Gemini threads:
Reimplementing from scratch

At this point, I concluded that the fastest route to answering my questions would be to write the effect myself and then tune it until it matched the Apple Music one. I initially decided to try wgpu and WGSL, since my graphics-nerd friends recommended them (and also because Rust = ⚡blazing fast⚡), but quickly burned out after it took about 200 lines of code to draw a triangle, and 350 for a simple image. The concepts of textures, fragment shaders, and vertex shaders weren’t super clear to me at the time either, which probably added to the confusion². I decided to try WebGL instead on a whim, which led to the fumble of the century:
Yeah, I had completely failed to realize that WebGL and WebGPU are entirely different APIs, each with its own shader language. After getting acclimated, though, I managed to port a basic program to TypeScript with WebGL. And it was so much simpler 😅
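For a sense of what “a basic program” boils down to, here’s a condensed sketch of drawing an image as a fullscreen quad in raw WebGL. It’s my own illustration (with a made-up image path), not the project’s actual code, and it skips error handling:

```typescript
// Condensed sketch: draw an image as a fullscreen quad with WebGL.
// Illustrative only -- error handling and cleanup are omitted.
const canvas = document.querySelector("canvas")!;
const gl = canvas.getContext("webgl")!;

const vertexSrc = `
  attribute vec2 aPosition;
  varying vec2 vUv;
  void main() {
    vUv = aPosition * 0.5 + 0.5;          // clip space -> UV space
    gl_Position = vec4(aPosition, 0.0, 1.0);
  }`;
const fragmentSrc = `
  precision mediump float;
  varying vec2 vUv;
  uniform sampler2D uArtwork;
  void main() { gl_FragColor = texture2D(uArtwork, vUv); }`;

function compile(type: number, src: string): WebGLShader {
  const shader = gl.createShader(type)!;
  gl.shaderSource(shader, src);
  gl.compileShader(shader);
  return shader;
}

const program = gl.createProgram()!;
gl.attachShader(program, compile(gl.VERTEX_SHADER, vertexSrc));
gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fragmentSrc));
gl.linkProgram(program);
gl.useProgram(program);

// Two triangles covering the whole canvas.
const quad = new Float32Array([-1, -1, 1, -1, -1, 1, -1, 1, 1, -1, 1, 1]);
gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
gl.bufferData(gl.ARRAY_BUFFER, quad, gl.STATIC_DRAW);
const loc = gl.getAttribLocation(program, "aPosition");
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

// Upload the album art once it loads, then draw.
const image = new Image();
image.src = "album-art.jpg"; // hypothetical path
image.onload = () => {
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
  gl.bindTexture(gl.TEXTURE_2D, gl.createTexture());
  // Clamp + linear filtering so non-power-of-two images work in WebGL 1.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
  gl.drawArrays(gl.TRIANGLES, 0, 6);
};
```

From here, each step of the effect is just another fragment shader run over the output of the previous pass.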
I used this tutorial to figure out how to render images, and also as a good source of truth for how WebGL works in general. I jumped in headfirst without reading any of the earlier parts of the tutorial, which might have been a mistake, but oh well. The development itself was relatively boring, but after a day or two of coding I ended up with this TypeScript code, which implemented all of the needed effects. It looked like this (again, using the Breach album art for the demo):
It was relatively good, but not yet perfect:
Gemini threads:
Reverse engineering the web app

It was at this point that I realized the obvious thing, which was that Apple Music Web had a very similar view. It’s not exactly the same as the macOS app’s background, which I’ll talk a bit more about later, but it was close enough to continue with. I initially searched for general terms I might expect to find in a shader. This wasn’t trivial, as the file itself was over 12,400 lines of untyped, minified JavaScript. The minification meant that no variable or class names remained, making it very hard to understand what was going on; unfortunately, Apple remembered not to include source maps, unlike the recent incident with the App Store web app. After manually hunting around a bit, I compiled a list of shaders. There were a few common ones, like one to apply a transformation matrix to a texture, and a few specialized ones. Notably, I found the shader that implemented the infamous twist effect!
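I won’t reproduce the minified source here, but the classic twist distortion it implements looks roughly like the following reconstruction (my sketch, not Apple’s code): each UV coordinate inside a radius is rotated around a center point by an angle that falls off toward the edge, which produces the characteristic swirl.

```typescript
// Rough reconstruction of a twist fragment shader -- not Apple's code.
// UVs inside uRadius are rotated around uCenter by an angle that fades
// to zero at the edge of the region, producing the swirl.
const twistFragmentShader = /* glsl */ `
  precision mediump float;
  varying vec2 vUv;
  uniform sampler2D uTexture;
  uniform vec2 uCenter;   // center of the twist, in UV space
  uniform float uRadius;  // radius of the affected region
  uniform float uAngle;   // maximum rotation, in radians

  void main() {
    vec2 offset = vUv - uCenter;
    float dist = length(offset);
    if (dist < uRadius) {
      // Rotation strength falls off smoothly from the center to the edge.
      float percent = (uRadius - dist) / uRadius;
      float theta = percent * percent * uAngle;
      float s = sin(theta);
      float c = cos(theta);
      offset = vec2(offset.x * c - offset.y * s,
                    offset.x * s + offset.y * c);
    }
    gl_FragColor = texture2D(uTexture, uCenter + offset);
  }
`;
```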
There were a few shaders which confused me, like one that ostensibly did a box blur, even though I knew Apple Music on Mac used Gaussian blurs. I later realized that shader was actually used to implement a Kawase blur, which approximates a Gaussian blur with several cheap passes (sketched a little further down). It was at this point that I actually bothered to scroll to the bottom of the file and read the last line:
This was very important, as it clarified the I looked into the constructor for
The key things to note here are that
worked perfectly!
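Back to the blur for a second: a Kawase blur approximates a Gaussian by running several cheap passes, each averaging four diagonal taps at a growing offset, instead of one expensive pass with a huge kernel. A single pass might look something like this sketch (mine, not the code from Apple’s bundle):

```typescript
// One pass of a Kawase-style blur -- my sketch, not the bundle's shader.
// Run repeatedly (ping-ponging between framebuffers) with uOffset growing
// each pass (e.g. 0.5, 1.5, 2.5, ...) to approximate a Gaussian blur.
const kawaseFragmentShader = /* glsl */ `
  precision mediump float;
  varying vec2 vUv;
  uniform sampler2D uTexture;
  uniform vec2 uTexelSize;  // 1.0 / texture resolution
  uniform float uOffset;    // per-pass tap distance, in texels

  void main() {
    vec2 o = uTexelSize * uOffset;
    gl_FragColor = 0.25 * (
      texture2D(uTexture, vUv + vec2( o.x,  o.y)) +
      texture2D(uTexture, vUv + vec2(-o.x,  o.y)) +
      texture2D(uTexture, vUv + vec2( o.x, -o.y)) +
      texture2D(uTexture, vUv + vec2(-o.x, -o.y)));
  }
`;
```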
The animation itself, however, was still a black box to me. I could make very broad changes by enabling or disabling whole shaders, but otherwise couldn’t change the animation at all. After a lot more poking around, I eventually realized that the vast, vast majority of the file was actually just a minified, embedded copy of Pixi.js! Pixi is a “wrapper” of sorts around WebGL and similar standards which provides a game engine-like API on top of them. After checking the

This took quite a bit of work, basically the entirety of Saturday, but I eventually managed to finish decompiling the code and get a TypeScript reconstruction working! You can see the TypeScript code here. The code itself is less than 100 lines, meaning that the other 12,300 lines of the file were just Pixi.js (and the adjacent

I could finally get a good look inside the inner workings of the animation. It was quite interesting:
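To give a flavor of the kind of API Pixi exposes (and why the real logic ends up so short), here’s a minimal sketch in the same spirit: a few copies of the artwork drifting under a heavy blur. It uses a Pixi v7-style API with values I made up; it is not the actual reconstruction.

```typescript
import { Application, Sprite, BlurFilter } from "pixi.js";

// Minimal Pixi v7-style sketch (made-up values, not the real reconstruction):
// a few artwork sprites spinning at different speeds under a strong blur.
const app = new Application({ width: 512, height: 512 });
document.body.appendChild(app.view as HTMLCanvasElement);

const sprites = [1.2, 1.8, 2.5].map((scale, i) => {
  const sprite = Sprite.from("album-art.jpg"); // hypothetical path
  sprite.anchor.set(0.5);
  sprite.position.set(256, 256);
  sprite.scale.set(scale);
  sprite.rotation = i * 2; // stagger the starting angles
  app.stage.addChild(sprite);
  return sprite;
});

// Blurring the whole stage melts the copies into a soft gradient.
app.stage.filters = [new BlurFilter(48)];

// The ticker drives the animation: each copy spins slowly at its own rate.
app.ticker.add((delta) => {
  sprites.forEach((sprite, i) => {
    sprite.rotation += 0.002 * (i + 1) * delta;
  });
});
```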
What I found likely isn’t how the native animations work (more on this later), but it’s good enough to get a mesmerizing animation for the web version. I made a few more tweaks and have put up a demo here. The demo was actually built with Solid, making this my first ever Solid project. It was quite an interesting development experience; I ran into a few weird issues with stores that led me to just use regular signals, which I think is a result of me not yet having a full intuition for Solid’s mental model, but it was overall very nice! Getting Astro to play nice in a project with both React and Solid wasn’t fun, but I managed to get it to work.

Gemini chats:
Conclusion

This was a very fun exploration! I learned a lot about reverse engineering and shaders/WebGL, so I think it was definitely worth the week of work. It still doesn’t perfectly match the native Mac view, which drives me a little crazy. Take this example, where the blur on some parts of the background is clearly sharper than on others:
A chat with Gemini seems to suggest there could be some kind of edge-preserving blur at play here, but I’m not sure. A few future directions I could take this are actually learning wgpu (despite its verbosity…) and porting the visualization to run natively, perhaps converting the shaders via wgpu’s

As always, thanks for reading!

Footnotes