Late in 2017 I released a concept album / EP about time paradoxes called “Reactor Zero”. I used computer type-and-talk apps to create the audio. While working on the album I started to have the idea of making an alternate story based on the music and initial concepts. This is the story of how “Reactor Zero: Eradication Protocol” was made…
So the original album was released, people generally seemed to like it, and several indie stations were giving the tracks airplay. As an unsigned nobody, that is about all you can realistically hope for. I was starting to work on some remixes and collaborations, but in the back of my mind I wanted to tell another sci-fi story.
I reached out to several other independent artists I knew from Twitter and Atom Collector Records and asked them if they would be interested in working with me to tell a new story using the framework created by the first album. Some passed, but my friend from Operation Neptune Spear loved the idea, jumped at it, and agreed to work on it with me.
The first part was easy. I had been recording on iPhone and then mixing on my Mac Mini. I muted all of the vocal parts and sent the files with BPM info to ONS (Operation Neptune Spear). We agreed on some initial concepts and I told him to take the story wherever he wanted to go with it. ONS came up with brilliant ideas about a rampant AI as opposed to time paradoxes. Slowly but surely he wrote all of the tracks and recorded his vocals by aggressively belting them into his iPad. He uploaded the files to Dropbox and sent me links.
So once I had all of his parts, I needed to re-integrate them into the original mixes. One thing I did not want to do was just insert new lyrics and call it a day. So in iOS GarageBand, I imported the original vocal-less MP3 files and created individual tracks for ONS’s parts. Knowing that he was playing the role of a really pissed-off computer that was going to destroy the world, it would not seem right to have his vocals remain as they originally were. So I computerized him…
To computerize the vocals I made several copies of them and layered several effects onto each track, such as bitcrusher, phaser, distortion and others. From there, I isolated those tracks and, with the iOS FX mastering function, went through each song's vocals one at a time, playing with them and adding stutters, repeats, glitches and other digitizing effects. I exported the combined computerized track as an uncompressed file and then re-imported it into GarageBand as its own track, replacing the modified originals. From there I balanced his natural vocals against the computerized ones to get the effect I was looking for. As you listen to the EP, pay close attention to the sounds… they turned out super cool, and it honestly took a while…
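If you are curious what a bitcrusher is actually doing under the hood, here is a minimal pure-Python sketch (my own illustration with made-up parameters, not GarageBand's actual implementation): the "crushed" sound comes from quantizing the signal to fewer bits and faking a lower sample rate with sample-and-hold.

```python
# Minimal bitcrusher sketch -- the two ingredients of that lo-fi digital
# grit: bit-depth reduction (quantizing) and sample-rate reduction
# (sample-and-hold). Parameter names here are hypothetical.

def bitcrush(samples, bit_depth=8, rate_divisor=4):
    """Quantize floats in [-1.0, 1.0] to `bit_depth` bits and hold each
    value for `rate_divisor` samples to fake a lower sample rate."""
    levels = 2 ** (bit_depth - 1)
    out = []
    held = 0.0
    for i, s in enumerate(samples):
        if i % rate_divisor == 0:
            # Snap the sample to the nearest of 2^(bit_depth-1) levels,
            # then hold that value until the next "slow" sample tick.
            held = round(s * levels) / levels
        out.append(held)
    return out

# A smooth ramp comes out stepped and gritty:
crushed = bitcrush([i / 100 for i in range(100)], bit_depth=4, rate_divisor=8)
```

Lower `bit_depth` and higher `rate_divisor` both make it harsher, which is roughly what the plugin's knobs are controlling.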
From there, based on how the vocals felt, I started toying with instrumentation. I added second and third guitar parts (with Prometheus, my rebuilt 7-string), a handful of industrial synths and additional layers of drums to give it more of an industrial / EDM / darkwave feel. Once I had everything sounding right, it was time to export the tracks back to the Mac via AirDrop and start mixing and mastering.
Listening to the initial track, it needed something else to give it the right feeling. I reached out to Skyline Tigers and BetaPSI, who I had worked with in the past (and hope to work with in the future because they are exceptionally talented and fun to work with, although one uses Ableton, but that is a totally different story). Skyline came up with an excellent scientist narrative that I was able to cut and paste into ONS’s parts in System Status: Eradication. I liked what BetaPSI had done, but I had nearly mastered everything when I realized that I had forgotten to add it in, and then for the life of me could not find the file. (If you read this, BetaPSI, I was too embarrassed to ask you for the file, but it was totally awesome.) Whether or not I had the file, I decided not to search it out and add more in, because I felt the mix was great and the story was perfect. BetaPSI’s part may have ended up as a narration on Requiem: Eradication, but it felt right to leave it instrumental… Maybe we will add a remixed version as a variation 3 later on.
So after bouncing the songs back and forth between my collaborators, we hammered out a good mix and master. From there it was time to hype it and eventually release it to the general public to either love it, hate it or not be aware of it… Kind of how it goes.
So that is how it happened. Tech-wise, I bounced MP3s to Dropbox, sent them out, uploaded them as new tracks into iOS GarageBand, added new tracks and imported them as loops, processed them and kicked them back out to reimport them with added effects, and then sent everything back to Mac GarageBand for finishing…
Moving forward, I am going to omit the Mac portion of what I do and experiment with mastering only on the iPhone to see how the audio turns out. With some of the new updates, and things I discovered that I had not known about before, it could be a lot of fun and even simpler.
So, what do you think? Did you listen to the album? Did you like it? Do you have any questions on how I recorded anything?
I would love your feedback and to also share what I have learned so others can take things to their own next level.
Thanks for reading and have a great day!