ZETA is a collaboration between Paul Ortiz (Chimp Spanner), Daniel Tompkins (TesseracT) and Katie Jackson. The UK artists seek to push their own creative boundaries by exploring epic soundscapes that intertwine with stunning visuals.
This unique project fuses the retro, synth-heavy sound of the 80s with futuristic, breathtaking imagery, bringing past and future together in a Cyberpunk-esque package. With influences spanning metal, future garage, retrowave, prog, classical music and various game and film soundtracks, their music embraces the sounds of electronica, but with textures and layers inspired by the whole musical spectrum.
We had a chat with Paul about creating music for ZETA and how Reason plays a big role in the creative process.
Tell us a bit about how ZETA came about and what your intention was when launching the project!
I guess it kinda formed by accident! So I'd known our singer Dan for a while through the Progressive Metal scene – I was busy with my project Chimp Spanner and he sings for TesseracT. We'd always planned on working together but just never got around to it. It wasn't until I shared a song of my partner Katie's that he approached me, thinking it was a song of mine. After I explained the mixup we decided that it'd be awesome to all work together, and here we are! Originally we'd intended to make a futuristic/chill kind of album, and then for a while it was all-out Synthwave, and then it naturally settled somewhere in the middle. I think it works because we all have a shared love of influences old and new, ranging from Tears for Fears, George Michael and Vangelis to Ghost in the Shell, Future Garage, sci-fi games and all of that.
With this being an (almost entirely) electronic album, what was your approach to producing it, as opposed to the guitar-centered albums you've done previously?
Well the workflow was very different for me. I'm used to just writing on my own, instrumentally. With ZETA, what'd usually happen is Katie would give me a MIDI file and a demo mixdown from Cubase. I'd listen a couple of times for reference and then dump the MIDI in Reason and basically start from scratch, then embellish with guitars or add new sections, chord changes, etc. So I guess it was more like re-mixing than anything. Then we'd send it off to Dan to do his thing, get the stems back and edit them in Reason, then figure out what needed to stay or go in the mix to make them fit. So yeah, for someone who's used to doing everything all at once, it was a very different experience to bounce the songs around between three people. But it seems to have worked well. Of course, some songs I wrote directly in Reason from start to finish, but in either case the focus was on drums and bass. I found that once I nailed the rhythm section everything else fell into place, which really isn't too dissimilar to how I approach guitar music.
How did Reason help you creatively when writing music for the album?
It's just fun! We tried Cubase at first; Silent Waves is actually the only track not made in Reason, and it would've been too if I'd been able to find the project. But I just wasn't happy with the sounds I was getting. Everything was kind of "cold", and I found the environment kinda taxing to work in, especially when it comes to automation. So we made the decision early on to switch. With Reason it felt like I was playing with a bunch of cool toys rather than working. I've accumulated so many REs over the years that I had a device for just about every job, and where I didn't, I just made one myself in the Combinator. But yeah, more than anything it's just that fun factor. And then of course on a technical level, the clip-based automation is just such a time-saver. You can go really crazy with it and not have to worry about setting things back to the right position afterwards. In Cubase I'd normally just leave stuff as it is, because I can't be dealing with my parameters being left at the wrong value after MIDI or host automation.
OK, synth nerd alert: what was the most used synth on the album?
Tough one! I'd say Antidote, just because it's so versatile. It's great for those dark unison Future Garage style basslines, as well as pads and leads. But beyond that, I used a lot of The Legend and Viking (wanted that authentic Moog kinda feel). And I'm pretty sure Quadelectra's Jackboxes are on every track (707, 808, Linn Drum). The Kings of Kong ReFill is also fantastic if you want even more retro drum machines. That features a lot also.
Any special, secret Reason production trick used in the process?
Well there's a tonne of side-chaining haha. Kinda comes with the synthwave/future territory. Typically what I'd do is take all my melodic elements (except for lead instruments and vocals) and put them in a group channel called "SC". Then I'd either key the compressor using audio from the kick, or more often than not I'd just use Pump RE and trigger it via MIDI. Having certain instruments outside of the side-chain group keeps the mix from sounding too ducked and keeps those elements more in focus. Also Audiomatic's Tape and Bottom presets got a lot of use on the album. I have no idea what they do, but they make the mixes sound kinda warm and fuzzy, and I like that. Scream's Tape setting is also great for warming up basses and kick drums. Distortion isn't necessarily a destructive tool. It can be really musical.
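The group-sidechain "pump" Paul describes can be sketched numerically. This is a minimal illustration only — the envelope shape, depth and release values here are invented, not Pump RE's actual behaviour — showing a MIDI-style ducking curve applied to an "SC" bus while lead elements bypass it:

```python
import numpy as np

SR = 44100  # sample rate used throughout this sketch

def pump_envelope(n_samples, bpm=120, depth=0.6, release_s=0.25, sr=SR):
    """Gain curve that ducks hard on every beat, then recovers.

    Mimics triggering a ducker via MIDI on each quarter note rather
    than keying a compressor from the kick audio.
    """
    beat = int(sr * 60 / bpm)                   # samples per beat
    t = np.arange(n_samples)
    pos_in_beat = (t % beat) / sr               # seconds since last beat
    recover = np.minimum(pos_in_beat / release_s, 1.0)
    return (1.0 - depth) + depth * recover      # dips to 1-depth, back to 1.0

def mix_with_pump(sc_bus, lead_bus, bpm=120):
    """Duck only the grouped "SC" elements; leads/vocals bypass it."""
    env = pump_envelope(len(sc_bus), bpm=bpm)
    return sc_bus * env + lead_bus
```

Because the lead bus never passes through the envelope, it keeps its level while the bed "breathes" underneath it — which is exactly why keeping some instruments outside the side-chain group stops the mix from sounding too ducked.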
Any tips and tricks for mixing vocals in Reason?
Hmm, considering this was my first time mixing vocals, I think it might be me who needs a few tips and tricks! But I mean, it was a learning experience. I'd say automate. Lots. I'm kind of a set-and-forget guy normally, but for vocals it just doesn't work. You have to really ride the faders and "play" the mix. Also try using ducking on your reverbs. So you could send a lead vocal to a nice long reverb with a compressor after it. Then use the Spider to take a copy of that dry vocal and send it to the sidechain input of the compressor. Kinda like a lazy man's automation. When there's singing there's less reverb. When there's no singing, there's more reverb. Works pretty well most of the time.
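The reverb-ducking routing above — dry vocal fed into a compressor's sidechain after the reverb — can be approximated offline with an envelope follower. A hedged sketch: the attack/release times, threshold and ducking amount are invented for illustration, not taken from any Reason device.

```python
import numpy as np

SR = 44100  # sample rate for this sketch

def envelope_follower(x, attack_s=0.005, release_s=0.2, sr=SR):
    """One-pole follower tracking the dry vocal's absolute level."""
    a_att = np.exp(-1.0 / (attack_s * sr))
    a_rel = np.exp(-1.0 / (release_s * sr))
    env = np.zeros_like(x)
    level = 0.0
    for i, s in enumerate(np.abs(x)):
        coef = a_att if s > level else a_rel
        level = coef * level + (1.0 - coef) * s
        env[i] = level
    return env

def duck_reverb(wet, dry_vocal, amount=0.8, threshold=0.05):
    """Turn the reverb down while the dry vocal is active.

    Roughly what the Spider-into-sidechain routing does: singing
    present -> less reverb; gaps -> the reverb swells back up.
    """
    env = envelope_follower(dry_vocal)
    gain = 1.0 - amount * np.clip(env / threshold, 0.0, 1.0)
    return wet * gain
```

When the dry vocal is silent the gain sits at 1.0 and the reverb passes untouched; while the vocal is loud the wet signal is pulled down by `amount`, so the tail only blooms in the gaps between phrases.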
Could you share any synth patches used on the album?
Well a lot of the patches are really not that complicated; most of the basses and pads are really sort of "naked", in that they're not dressed up with a lot of effects or complex routing. It's mostly sawtooth oscillators (either dual detuned or something with a rich/wide unison section like Korg MonoPoly or Antidote) and then a suitable amp/filter envelope depending on whether it's a bass or a pad or whatever. I've included a few patches here, although they're not much to look at!
Download Zeta's Reason presets here!
(Please note that some of the patches require Rack Extensions)
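For readers without those Rack Extensions, the basic recipe Paul describes — dual detuned sawtooths plus a suitable amp envelope — can be sketched in a few lines. The detune amount and attack time below are guesses for a pad-style sound, and there's deliberately no filter or effects:

```python
import numpy as np

SR = 44100  # sample rate for this sketch

def saw(freq, dur_s, phase=0.0, sr=SR):
    """Naive (non-band-limited) sawtooth — fine for a sketch."""
    t = np.arange(int(dur_s * sr)) / sr
    return 2.0 * ((freq * t + phase) % 1.0) - 1.0

def detuned_saw_pad(freq, dur_s, detune_cents=12, attack_s=1.0, sr=SR):
    """Two saws detuned a few cents apart, with a slow amp attack."""
    ratio = 2 ** (detune_cents / 1200)          # cents -> frequency ratio
    voice = saw(freq * ratio, dur_s, sr=sr) + saw(freq / ratio, dur_s, 0.5, sr=sr)
    env = np.minimum(np.arange(len(voice)) / (attack_s * sr), 1.0)
    return 0.5 * voice * env                    # slow swell, no filter
```

Shorten the attack and add a snappy decay instead and the same two oscillators turn into a bass patch — which matches the "same naked patch, different amp/filter envelope" idea above.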
A few people have asked about the snare on The Distance. And I can tell you it's a layered 707 snare, 707 low tom, and the BBGunSnare_BSQ sample from the Reason FSB, all running into a gated reverb! Ohhh and guitars are almost entirely presets from Kuassa's excellent amp REs!
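That layered-snare-into-gated-reverb chain can also be sketched offline. A rough illustration only: the exponential impulse response stands in for a real reverb device, and the 120 ms gate hold time is an arbitrary choice, not taken from the track.

```python
import numpy as np

SR = 44100  # sample rate for this sketch

def layer(*samples):
    """Sum drum samples of different lengths into one layered hit."""
    out = np.zeros(max(len(s) for s in samples))
    for s in samples:
        out[: len(s)] += s
    return out

def gated_reverb(hit, ir, hold_s=0.12, sr=SR):
    """Convolve the hit with a reverb impulse response, then hard-gate
    the tail so the wash cuts off abruptly — the classic 80s trick."""
    wet = np.convolve(hit, ir)
    n_open = int(hold_s * sr)   # how long the gate stays open
    wet[n_open:] = 0.0          # slam the gate shut
    return wet
```

The abrupt cut is the whole point: the reverb adds size and density to the layered hit, while the gate stops the tail from smearing the groove.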
Follow ZETA on YouTube, Facebook, Spotify, Bandcamp.
Listen to ZETA's new album here:
https://www.youtube.com/watch?v=WauHp-nt5DA&list=PLUe-KUapQ3AU336POXnHwMt5l2Qj4Bjkd