If you missed the first blog post or the second update, check them out here:
Update #1: The Backstory of an Audio Visualizer for the Touch Bar
Update #2: Finished Colors & Automatic Aggregate Device Creation!
How We Got Here
It’s been about 3 months, so I’m sure you’re all excited to hear about the system audio integration (don’t worry, there is a sample video showing this below). But before we get to that, I’d like to retrace the path we took to get here and describe the work that has taken place up to this point.
Previously, we had left off with my newly found developer and me teaming up to work on the app together. He was an amazing person to talk to and work with, and we created a substantial prototype in Electron. Unfortunately, after the prototype he got incredibly busy with life and couldn’t commit to developing it with me anymore. I’m really happy I got to team up with him, and the project probably wouldn’t be where it is today without him.
The Prototype
Our prototype was designed and created using Electron. Electron is a great tool for devs who know JavaScript – you can build full-fledged desktop apps using just JavaScript! And I have enough working knowledge of JavaScript to understand what was going on inside the app. Win-win.
However, Electron left us with some problems that needed to be sorted out. Mainly:
The audio was only accessible by playing the song through the app.
In order to use the audio visualizer, you needed to have the mp3 file downloaded so you could load it into the app. This could have been solved by using the system audio, similar to what we eventually accomplished, but there were other issues.

There was no way to keep the visualizer open when you clicked off the app’s window.
I tried to fix this by forcing the window to always stay in front. I successfully put the window in front, only to be disappointed that you still had to click the window to see the visualizer. There is open source code to force an app’s Touch Bar configuration to stay open, but it was not compatible with Electron.

The efficiency was not good.
Using JavaScript and Electron to produce an audio visualizer is pretty inefficient. The app had to read the audio buffer, process the audio with calculations, draw the rectangular bars on a canvas, take a picture of the canvas, convert the picture from a PNG to base64, and then display the image on the Touch Bar – 30 times a second! The CPU usage would sit around 100% and sometimes climb to 120%, which would kill the battery quite fast.
Electron is not natively compatible with Better Touch Tool.
As far as I know, we wouldn’t have been able to make a native integration with Better Touch Tool. Rather, we would have had to send the images we were creating to BTT in base64, which would have decreased the efficiency of the app even further.
Although Electron wasn’t going to be feasible for building the full app we wanted, it still made a fantastic prototype that didn’t take too long to put together. If anybody is curious, we used an open source Electron audio visualizer to produce the spectrum and put it on the Touch Bar from there. The open source project can be found here.
Turning Prototype into Reality
So, we were at the point where we realized our prototype needed to be rebuilt using Swift and an assortment of languages with the letter “C” in their names. It was time to see what we could scrape together.
System Audio
Let’s start with the audio. Accessing the system audio on Macs can be a pain in the butt if you want to do it by hand. There’s no public API to grab hold of the system audio so you can process or record it. It has been like this forever, and you need third-party code to access the system audio.
Luckily, Soundflower is an open source app that has been around for quite a long time; it creates a virtual audio device so you can read the system audio. There are plenty of examples of how to capture and process sound from a Mac’s internal microphone, and from there it’s a matter of adjusting the code to read from the Soundflower input instead, as sketched below.
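For a rough idea of what that looks like, here’s a minimal sketch (assuming Soundflower is already set as the default input device – device selection is omitted):

```swift
import AVFoundation

// Minimal sketch: tap the default input device (assumed here to be
// Soundflower) and receive raw PCM buffers for processing.
let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    // buffer.floatChannelData now holds the latest system audio samples,
    // ready to be handed off to the FFT.
}

do {
    try engine.start()
    RunLoop.main.run()   // keep the script alive so the tap keeps firing
} catch {
    print("Failed to start audio engine: \(error)")
}
```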
Audio Visualizer
I’m not a Swift guy. I’m barely a JavaScript guy, but I know Python pretty well. I know enough coding to understand what’s going on in other languages, but I wouldn’t be able to write much myself. When I attempted to create some visuals in Swift, it was as if I were a little guppy in the ocean trying to be a shark.
So what do you do when you have a really awesome idea and no idea what’s going on? You hire out. Even though the options were expensive and limited the last time I looked to hire somebody, I wanted to try again because I had a much better idea of the project’s requirements and where we could start.
I only had 3 responses to my Upwork job posting invitations, and only one person replied to my follow-up. He had 3 reviews, all 5 stars, but on much smaller projects. I took a leap of faith that when he said he could do all the math to make the visualizer look good, he would actually be able to do it. And… he killed it! I personally think he made the visuals even better than the prototype’s. So, shoutout to you, developer who shall remain nameless.
Using the FFT Spectrum & Algorithms
To work with the FFT spectrum, you can process audio using Apple’s Accelerate framework. Accelerate provides image processing, signal processing, matrix math, and vector operations. It’s an efficient framework, which makes it perfect for animating visuals based on FFT data.
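As a hedged illustration (not our developer’s actual code), here’s roughly what one frame of FFT work looks like with Accelerate’s vDSP, using a made-up 440 Hz test tone as input:

```swift
import Foundation
import Accelerate

// Illustrative sketch: turn a 1024-sample window into per-bin magnitudes.
let n = 1024
let samples = (0..<n).map { sin(2 * Float.pi * 440 * Float($0) / 44_100) }

// Forward DFT; the imaginary input is all zeros for real audio samples.
let dft = vDSP.DFT(count: n, direction: .forward,
                   transformType: .complexComplex, ofType: Float.self)!
var realOut = [Float](repeating: 0, count: n)
var imagOut = [Float](repeating: 0, count: n)
dft.transform(inputReal: samples,
              inputImaginary: [Float](repeating: 0, count: n),
              outputReal: &realOut, outputImaginary: &imagOut)

// Magnitude per frequency bin; each bar on the Touch Bar would map to a
// group of these bins.
let magnitudes = zip(realOut, imagOut).map { sqrt($0 * $0 + $1 * $1) }
```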
For the animations themselves, we used SwiftUI, since that’s what our developer was most comfortable with. We could have also used AppKit views or CALayer, but SwiftUI turned out to be a great choice. It is much lighter than I was expecting (after watching my CPU hit 100% with the prototype); our visualizer now sits around 30% CPU usage.
While that may sound like a lot, it’s not actually 30% of the computer’s resources. Rather, it’s 30% of a single CPU core. And since most laptops these days have between 4 and 8 cores, this is a pretty light load for what we are doing here. Here is a cool project you can check out that uses SwiftUI and shows why it’s so good for animating in real time.
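To give a flavor of the SwiftUI side, here’s a hypothetical bar-spectrum view (not our actual implementation) – SwiftUI animates the height changes between FFT frames for us:

```swift
import SwiftUI

// Hypothetical sketch: draw one bar per frequency band and let SwiftUI
// animate the height changes each time a new set of magnitudes arrives.
struct SpectrumView: View {
    var magnitudes: [Float]   // one value per bar, already smoothed

    var body: some View {
        HStack(alignment: .bottom, spacing: 2) {
            ForEach(magnitudes.indices, id: \.self) { i in
                RoundedRectangle(cornerRadius: 1)
                    .fill(Color.green)
                    .frame(width: 4, height: CGFloat(magnitudes[i]) * 30)
            }
        }
        // Animate between frames at the visualizer's ~30 fps update rate.
        .animation(.linear(duration: 1.0 / 30.0), value: magnitudes)
    }
}
```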
Forcing the Visualizer to Stay Open
As you saw earlier in the post, when you click off the prototype, the visualizer goes away. A Swift app has the same issue: whenever you change focus to a new app, that app’s Touch Bar configuration is shown instead.
We lucked out big here by making use of an open source project that wraps a private Touch Bar API, Touch Bär. This allows our app to place an item in the control strip; when pressed, it opens the visualizer, and the visualizer stays open while you browse and do other things.
Better Touch Tool Plugin
Talking about forcing the app’s Touch Bar configuration to stay open leads us right into this topic. I absolutely love Better Touch Tool, and many of you readers are Better Touch Tool users.
Update: You can use AVTouchBar inside BetterTouchTool with a little bit of configuration. I will write a blog post on how to do this soon.
Well, until we get a Better Touch Tool plugin made for AVTouchBar, you will have to quit Better Touch Tool in order to enjoy AVTouchBar. The reason is that Better Touch Tool overrides the private Touch Bar API, so the visualizer closes whenever you focus on an app other than AVTouchBar.
With that being said, we should be able to make a BTT plugin with a custom view that displays the visualizer. WHICH WOULD BE THE ULTIMATE AUDIO VISUALIZER SETUP FOR THE TOUCH BAR EVER. Sorry, I get excited thinking about that. Let’s hope it can all work itself out and we get this done.
What’s Left to Do
For the initial beta release, all that’s left is to touch up the visualization, add some different colors, and smooth out one more crucial step in the process: creating an aggregate audio device from your speakers and the system audio when the app is launched, then returning to the previously used output when the app is closed. That way you don’t have to create or change anything in your audio settings, and the integration is seamless.
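For the curious, here’s a hedged sketch of that step using Core Audio. The device UIDs below are placeholders – the real app has to look them up at runtime, and restoring the old default output on quit is left out:

```swift
import CoreAudio

// Hypothetical sketch: build an aggregate device that sends audio to the
// speakers while also exposing it to Soundflower for the visualizer.
func createAggregateDevice() -> AudioDeviceID? {
    let description: [String: Any] = [
        kAudioAggregateDeviceNameKey: "AVTouchBar Aggregate",
        kAudioAggregateDeviceUIDKey: "com.example.avtouchbar.aggregate",
        kAudioAggregateDeviceSubDeviceListKey: [
            // Placeholder UIDs – real code queries the system for these.
            [kAudioSubDeviceUIDKey: "BuiltInSpeakerDevice_UID"],
            [kAudioSubDeviceUIDKey: "SoundflowerEngine:0"],
        ],
    ]

    var deviceID = AudioDeviceID(0)
    let status = AudioHardwareCreateAggregateDevice(description as CFDictionary,
                                                    &deviceID)
    return status == noErr ? deviceID : nil
}

// On quit, AudioHardwareDestroyAggregateDevice(deviceID) removes it again.
```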
That about wraps up this update. As we finish development of this app, my plan going forward is to make the beta available on Patreon for those who want early access. The early adopters will help guide the app to its initial full release by reporting bugs, suggesting new features, and all that good stuff. I imagine this will happen in a few weeks’ time.
Sign up to stay updated!

Jake is a professional baseball player who was drafted by the Toronto Blue Jays in 2016. He played in the minor leagues for the Blue Jays for 5 years until he was selected by the Miami Marlins in the 2020 Rule 5 draft. In his spare time, he enjoys creating technology videos on YouTube and pursuing creative technologies, including an audio visualizer for the Touch Bar on MacBook Pros.