PerforModule Recommends: Effects Order

While we all know that there is no such thing as a perfect or ideal FX chain order for all situations because it totally depends on context, i have gradually developed some general preferences for the order of effects in a signal chain. Recently updating all my templates for Live 11 has further honed my thoughts on the matter.

We can of course swap around the sequential ordering of effect devices, either for a specific intended result or as a matter of experimentation just to see if an alternate routing happens to sound better on given audio.

As usual when sharing my ideas, it is recommended that you not simply adopt the structure as presented, but rather that you test it out in practice and modify things over time to suit your particular style, keeping notes and updating your own templates as you go. Maybe you think the way i place transient shapers before compressors is idiotic. That’s totally fine!

I’ll share below my go-to effects order, and (most importantly)… WHY.
While some of the choices are probably pretty unorthodox, none of them are arbitrary; they all have reasons. Are they bad reasons? Good reasons? Who knows. But i like to think they are built on logical rationale.

Keep in mind you’re seldom if ever going to need all these types of effects on any single track, but for times when you are using even two different processor types, some guidance as to their ordering might prove useful. Resist the urge to add more effects to a chain just because you can. The fewer processors required to get a sound how you want, usually the better.

Scroll to the bottom for a handy cheat-sheet!

~`~

1: High Cut Filter

Why this first? By removing the upper end of the frequency spectrum, the chance of audible aliasing being generated by subsequent plugins is greatly reduced. If, on the other hand, you apply a high cut filter after effects that have already reflected non-musical aliasing distortion down across the spectrum, it will do nothing to clean up the aliasing already embedded below the filter cutoff.

Simply put, applying high cut filtering at the very start of tracks’ signal chains when possible is a sensible workflow for maintaining the highest fidelity audio possible at all times. Try it — you’ll like it!

Sometimes it’s not convenient to place the high cut filter at the very start, because it’s coupled to an EQ or channel strip that sits later in the chain. In these cases the options are to: A) just use it where it is and not worry about it, B) create a duplicate of said plugin at the start with only its high cut function enabled, or C) use an extra high cut plugin at the start of the signal chain. I usually go with option A or C.
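To make the aliasing point concrete, here’s a tiny Python/NumPy sketch (a toy demonstration with made-up frequencies and drive amounts, not a mixing recipe). A 15 kHz tone pushed into tanh saturation at 44.1 kHz generates a 3rd harmonic at 45 kHz, which folds back down to roughly 900 Hz as an inharmonic artifact; filtering before the saturation shrinks that artifact, while filtering afterwards can’t touch it, since 900 Hz is well below the cutoff.

```python
import numpy as np
from scipy.signal import butter, sosfilt

SR = 44100
t = np.arange(SR) / SR
x = 0.9 * np.sin(2 * np.pi * 15000 * t)          # bright source content
high_cut = butter(8, 10000, btype="lowpass", fs=SR, output="sos")

def level_db(sig, freq, width=50):
    """Rough spectral magnitude (dB) in a narrow band around `freq` Hz."""
    spec = np.abs(np.fft.rfft(sig * np.hanning(len(sig)))) / len(sig)
    f = np.fft.rfftfreq(len(sig), 1 / SR)
    return 20 * np.log10(spec[(f > freq - width) & (f < freq + width)].max() + 1e-12)

filter_then_drive = np.tanh(4 * sosfilt(high_cut, x))   # high cut BEFORE saturation
drive_then_filter = sosfilt(high_cut, np.tanh(4 * x))   # high cut AFTER saturation

# The folded-back 3rd harmonic lands near |45000 - 44100| = 900 Hz.
print("aliasing artifact, filter first:", round(level_db(filter_then_drive, 900), 1), "dB")
print("aliasing artifact, filter after:", round(level_db(drive_then_filter, 900), 1), "dB")
```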

2: Pitch Effects

Includes: pitch shifting, formant-shifting, vibrato, tuning and pitch correction type effects.

Why basically at the beginning? Placing pitch-shifting and other similar effects after distortion, saturation, etc. can reduce their effectiveness, as they generally have a harder time tracking pitch the more complex a signal’s harmonic content is (i.e. the less it resembles a pure sine wave). Modulation effects can be even worse in this regard and seriously mess up pitch-tracking.

By the same token, using a high cut filter prior to pitch tracking can potentially increase accuracy by stripping away upper harmonic information not related to the melodic tones (as long as you don’t set it so low that it mutes the roots).

Pitch shifting type effects can often generate unnatural artifacts, which are more likely to be “smoothed over” by later modulation processing, such as chorusing and reverb.

Placement Order Exceptions: Sometimes you might want to tone-shape, transient shape, gate, or compress a signal before it hits pitch effects in order to sculpt the sound to increase the effectiveness of the pitch tracking. Perhaps try this if the pitch tracking on the raw signal seems to struggle at times.

3: Input Console Saturation & Distortion

If you’re using some sort of “preamp” or “console” type plugin (for example, tape saturation), this is where you’ll want to stick it to emulate the effect of running your signal into an analog console, after which the other processing steps occur. (Alternately or additionally, you could use an Output Console… see Step 13 below.)

Here is also where i’ll generally place any additional distortion boxes such as overdrive, fuzz, bit-crushing, or even certain more brutal-sounding clippers.

Why not place it after compression, transient shaping, etc.? Since distortion itself has a dynamics-flattening effect on the transient contour, applying it first means you may need far less compression afterwards for the same end result.
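As a quick sanity check on that claim, here’s a little NumPy sketch (synthetic drum-like hit, arbitrary drive amount) comparing crest factor, i.e. the gap between peak and RMS level, before and after saturation. The driven version has a smaller gap, which is exactly the work a compressor would otherwise have to do.

```python
import numpy as np

sr = 44100
t = np.arange(sr // 2) / sr
hit = np.exp(-30 * t) * np.sin(2 * np.pi * 110 * t)    # decaying 110 Hz "drum" hit

def crest_db(x):
    """Peak-to-RMS ratio in dB; higher = spikier, more dynamic."""
    return 20 * np.log10(np.max(np.abs(x)) / np.sqrt(np.mean(x ** 2)))

driven = np.tanh(6 * hit) / np.tanh(6)                 # saturate, re-normalize the peak

print("crest factor, clean :", round(crest_db(hit), 1), "dB")
print("crest factor, driven:", round(crest_db(driven), 1), "dB")
```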

Placing distortion plugins after a high cut filter can often help to reduce their digital “fizziness”, leading to a more authentic analog grit tone.

Certain types of “enhancer” effects can fit well in this placement step as well; for example, harmonic exciters.

Placement Order Exceptions: depending on the distortion used, you may want to low- and/or high-cut filter first, so that the distortion isn’t reacting to frequency content that isn’t going to be in the final signal anyways. On the other hand, you might want the filtered signal to be pre-colored by fullband distortion. EQing before distortion can also change the way it reacts to a great degree.

Sometimes you might want distortion way later on in the chain, after dynamics and/or modulation effects. However, i generally find that most often if i’m using some distortion, harmonic enhancement, etc. on something, it works well fairly early in the effects chain.

4: Equalization

I usually prefer to EQ prior to dynamic effects in order to sculpt the track a bit before it hits them. For example: reduce low-mid mud and the compressor then won’t react to it as much. Or reduce a high freq spike so that a transient shaper acts more mellow. However, at other times i want to do the opposite. See the placement order exceptions under the compressor section below for more on that.

Many EQs have cut filters built in. If this is the case, you may not need additional cut filter plugins for the track, and steps 1 and 4 can effectively be combined (although dedicated filters are often helpful anyways).

Linear Phase EQs… When To Use? The most obvious use-case for linear phase EQs is when EQing a duplicated part, or when audio is being sent to multiple locations which are then being summed — otherwise the phase offset between the copies is almost certain to be problematic. For example, if you have a Return Channel in a mix with some tracks sending audio to it, with partially-wet reverb and saturation, and you want to EQ it… a linear phase EQ might be a good idea so that the phase stays lined up with the tracks sending audio. Would any EQs on those source tracks have to be linear phase too? Only if an EQ is processing a track post-send, since then only the direct path gets the phase shift; any EQ that sits pre-send affects both the direct and sent signals equally, so it doesn’t matter.

5: Transient Shaping

Why before compression? Since transient shapers are level-agnostic (i.e. they look at and react to the comparative ADSR sloping of a waveform rather than its absolute energy level), it makes the most sense to me to place them first, where they can have strict control over transients, sculpting them carefully to creatively alter the way subsequent dynamic devices such as gates or compressors react. A few possibilities include (with a rough sketch of the underlying idea after this list)…

Transient Shaving: shaving off peaky transients a bit first can help make a compressor react more smoothly and less abruptly.

Sustain Attenuation: dipping the sustain portion prior to hitting a signal with a gate can have the effect of giving it increased hysteresis, making the gate cut out the signal at a lower threshold with less chance of removing valid signal you want to retain.

Transient Boosting: pushing up all transients and then hitting them with limiting, clipping, or fast compression afterwards can add a forward-leaning aggression to intense material… just be careful about unwarranted distortion.

Sustain Fluffing: increasing the sustain portion can help bring out ambient room tone, and a touch of it prior to a reverb can enhance the spaciousness of a room when you’re looking for a thick, squashed sound. Watch out for unnatural-sounding envelope transitions with moderate to large amounts.
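Here’s the promised sketch of what “level-agnostic” means in practice: a generic differential-envelope transient shaper (a textbook-style design with made-up time constants, not any particular plugin’s algorithm). A fast and a slow envelope track the same signal, and it’s the ratio between them, not the absolute level, that drives the boost or cut.

```python
import numpy as np

def envelope(x, attack_ms, release_ms, sr=44100):
    """One-pole envelope follower on the rectified signal."""
    a_att = np.exp(-1.0 / (sr * attack_ms / 1000.0))
    a_rel = np.exp(-1.0 / (sr * release_ms / 1000.0))
    env = np.zeros_like(x)
    prev = 0.0
    for i, s in enumerate(np.abs(x)):
        coeff = a_att if s > prev else a_rel
        prev = coeff * prev + (1.0 - coeff) * s
        env[i] = prev
    return env

def transient_shape(x, attack_db=6.0, sr=44100):
    """Boost (positive attack_db) or shave (negative) transients."""
    fast = envelope(x, attack_ms=1.0, release_ms=100.0, sr=sr)
    slow = envelope(x, attack_ms=30.0, release_ms=100.0, sr=sr)
    # How "transient-y" each moment is, independent of absolute level:
    transientness_db = 20 * np.log10((fast + 1e-9) / (slow + 1e-9))
    gain_db = np.clip(transientness_db, 0.0, None) * (attack_db / 12.0)
    return x * 10 ** (gain_db / 20.0)
```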

6: Gating / NR

It may seem intuitive to place a gate/expander first in a signal chain (and it’s usually done that way), but here it ends up roughly halfway through. Gating before compressing can sometimes help prevent the compressor from unnaturally raising the low-level noise floor of a recording (aka “sustain swelling”). Placing a gate after cut filtering or transient shaping can help it ignore energy-producing content which you don’t care to retain, leading to improved performance.

Noise Reduction effects also fit in this placement step, usually, for similar reasons.

Placement Order Exceptions: Sometimes it makes more sense to place a gate earlier or later in the chain. Placed earlier, it can prevent noise from being boosted in the first place by cutting it out before it gets enhanced. Placed later, it can clean up something that was added by a processor. Gating after delay effects can trim the tail end of the repeats, avoiding congestion.
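Here’s a bare-bones gate sketch with hysteresis (two thresholds: one to open, a lower one to close), just to show the mechanism the Sustain Attenuation trick in step 5 plays into. Thresholds and times are made up, and a real gate would also smooth the gain to avoid clicks.

```python
import numpy as np

def gate(x, open_db=-30.0, close_db=-40.0, release_ms=50.0, sr=44100):
    """Hard gate with hysteresis: opens above open_db, closes below close_db."""
    rel = np.exp(-1.0 / (sr * release_ms / 1000.0))
    env, is_open = 0.0, False
    gain = np.zeros_like(x)
    for i, s in enumerate(np.abs(x)):
        env = max(s, env * rel)                 # simple peak-hold style detector
        level_db = 20 * np.log10(env + 1e-9)
        if is_open:
            is_open = level_db > close_db       # stays open until the *lower* threshold
        else:
            is_open = level_db > open_db        # only opens at the higher threshold
        gain[i] = 1.0 if is_open else 0.0       # (a real gate would ramp this smoothly)
    return x * gain
```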

7: Multiband Dynamics / DeEssing

Why place this category before wideband (aka singleband, normal) compression? Well, with multiband dynamics it’s possible to be more precise (surgical), homing in on only a particular frequency range to compress. Careful adjustments can help prepare a signal for how a subsequent wideband compressor will react to it. A multiband dynamics processor that also includes expansion (gating) capabilities (like Ableton’s) can be considered to cover step 6 as well as step 7, in which case an additional gate is likely not needed.

I include Dynamic EQ and De-Essing together in this placement step as they serve similar purposes: to home in on and dynamically affect only specific frequency zones.

Placement Order Exceptions: sometimes a multiband dynamics unit is used as a “maximizer” for the purpose of juicing up overall loudness. In these situations it can make sense to have it placed closer to the end of the chain.
Other times, you may want a bit of dynamic action going on even before the EQ, in which case multiband dynamics can be a better choice than a singleband compressor since it can sculpt tone to a degree.

8: Compression

I almost always prefer to EQ before compression, as the EQ’d contour will alter how the compressor reacts (usually in a beneficial way). Compression is usually placed pre-modulation, as certain types of modulation effects can alter peak and RMS levels quite a bit, unpredictably changing how a compressor might react. However, you might instead want to place it post-modulation for the very same reason: to control wayward dynamic movements caused by the modulation. Or maybe use two compressors: one for “groove”, and one for “tidying”.

Placement Order Exceptions: When a signal has unpleasant frequency spikes, i often move the compressor before the EQ in the chain so that it intentionally squashes down the transients featuring that frequency more (requiring less work from the EQ afterwards). It all depends on whether it’s more helpful for the full frequency range to get ducked during those moments… or not. This is helpful for “tidying”, but not so much for “groove”. It also tends to sound more musical on individual instruments than on group busses.
Another example of when to EQ post-comp might be if you want to do a boost but you don’t want that extra energy to impact the dynamic action… say to get just a bit more 60Hz oomph on a 2buss without the entire mix ducking more every time the kick hits.
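For anyone curious what’s happening under the hood, here’s a bare-bones feedforward compressor sketch (a generic design with arbitrary settings, not any specific plugin). The relevant bit for the pre/post-EQ question is that the detector only reacts to whatever you feed it: EQ placed before changes every gain-reduction decision, while EQ placed after only reshapes the already-compressed result.

```python
import numpy as np

def compress(x, threshold_db=-18.0, ratio=4.0, attack_ms=5.0, release_ms=80.0, sr=44100):
    a_att = np.exp(-1.0 / (sr * attack_ms / 1000.0))
    a_rel = np.exp(-1.0 / (sr * release_ms / 1000.0))
    env = 0.0
    out = np.zeros_like(x)
    for i, s in enumerate(x):
        level = abs(s)
        coeff = a_att if level > env else a_rel
        env = coeff * env + (1.0 - coeff) * level        # detector: follows the input it is given
        level_db = 20 * np.log10(env + 1e-9)
        over = max(0.0, level_db - threshold_db)
        gain_db = -over * (1.0 - 1.0 / ratio)            # gain computer
        out[i] = s * 10 ** (gain_db / 20.0)
    return out

# eq -> compress(x): the detector never hears the mud the EQ removed.
# compress(x) -> eq: the whole signal ducks whenever the still-present mud or 60 Hz energy spikes.
```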

9: Modulation

Phasers, flangers, choruses, rotary speakers, width enhancers… all that type of shtuff.
Generally placed late on in an effects chain (or in parallel) rather than early on. Note that although they are pitch-based, chorus effects generally fit into the modulation category rather than the pitch effects category since they are meant to add smear and imprecision, not to add clarity and precision.

Placement Order Exceptions: sometimes you might want to modulate your reverb tails by placing modulation effects after the reverb instead of before. It’s a less “natural” but more “organic” choice. Is that a paradox?

10/11: Delay / Reverb…
& 11/10: Reverb / Delay

I often like to augment Delays and Reverbs with each other, for more complex spatial depth. But…
Delay before Reverb? Reverb before Delay? Wet FX in parallel? What to do?

When a delay is going into a reverb, those delay repeats are going to get diffused (mushier; smeared across time). On the other hand, when a reverb is going into a delay, you will be left with rhythmic pulsations of reverb tail slices. The former is more “soft” feeling; the latter notably rhythmic. So my recommendation is to place delay before reverb for most melodic instruments, but to use delay after reverb for most drums and percussive, staccato instruments.

Keep in mind that running these sorts of effects in serial effectively adds their decay times and multiplies their feedbacks together, so you might want to dial back length parameters when doing so.
You could also run them both in parallel only, so they won’t be affecting each other at all (i.e. sum together the dry, fully wet delay, and fully wet reverb signals). Try this for a less blurry option if you find you don’t like the way the reverb and delay are affecting each other in either serial configuration.
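Here’s a rough sketch of the three routings using impulse-response stand-ins (a sparse tap pattern as the “delay”, exponentially decaying noise as the “reverb”); the tap times, decay lengths, and mix amounts are purely illustrative.

```python
import numpy as np
from scipy.signal import fftconvolve

sr = 44100
rng = np.random.default_rng(0)

delay_ir = np.zeros(sr)                                  # one second of echoes
for n, g in enumerate([0.6, 0.36, 0.22, 0.13]):          # feedback-style decaying taps
    delay_ir[(n + 1) * int(0.25 * sr) - 1] = g           # a repeat every 250 ms

reverb_ir = rng.standard_normal(int(1.5 * sr)) * np.exp(-np.linspace(0, 8, int(1.5 * sr))) * 0.05

def wet_mix(dry, ir, wet=0.35):
    """Blend the dry signal with its convolution through `ir`."""
    return (1 - wet) * dry + wet * fftconvolve(dry, ir)[: len(dry)]

x = np.zeros(3 * sr)
x[0] = 1.0                                               # a single "hit" to audition the tails

delay_into_reverb = wet_mix(wet_mix(x, delay_ir), reverb_ir)   # repeats get diffused: soft
reverb_into_delay = wet_mix(wet_mix(x, reverb_ir), delay_ir)   # tail gets sliced: rhythmic
parallel = (x + 0.35 * fftconvolve(x, delay_ir)[: len(x)]
              + 0.35 * fftconvolve(x, reverb_ir)[: len(x)])    # neither affects the other
```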

•Delay into Reverb: use for most melodic instruments and slow, wooshy percussion such as a brushed crash cymbal.

•Reverb into Delay: use for most drum tracks and for staccato, transient-laden instruments such as mallets or fast-picked guitars.

•Parallel: use for clean, cohesive spatial enhancement of groups or return tracks.

But what about convolution vs algorithmic reverb? Placing a slight amount of algorithmic reverb after a convolution reverb can make it a little more “alive”; less “cold”-seeming. On the other hand, placing an algorithmic reverb before a convolution reverb (particularly with a small, ambient room type IR) can help place the artificial reverb into a “distinct physical space”.

Placement Order Exceptions: in this list (which is intended for in-series, on-track effects), reverb and delay are shown near the end of the chain. However, if using them primarily on aux (return) channels, it can make sense to have them near the start of their chain instead, followed by other effects such as compressors and filters. Then you will have an “effected reverb” to send tracks to.

12: Limiter / Maximizer

Since individual tracks in a mix aren’t all going to want to be pushed right up to 0dB, don’t be shy about setting limiter thresholds low enough in order to massage some peaks and sculpt the sound a bit, for character, by ear. Doing a bit of tasteful limiting on individual channels is likely to lead to a mix that is easier for the mastering engineer to work with, and can help avoid the temptation to overdo 2buss processing since the summed peak level will be more in control.

So what’s the difference between a limiter and a maximizer? Usually something billed as a “maximizer” is a brickwall limiter combined with additional stuff going on to “juice up” the sound, such as saturation. They tend to generate a whole lot of THD and are not recommended for mastering, but can be fun to beef up single tracks.

13: Output Console Saturation

Just like the input console, this can be used to add a glimmer of analog feel. Effects after the limiter? What the heck!? Track limiters are generally not slamming up against 0dB the way a limiter on the 2buss might be, so there ought to be plenty of headroom for a little touch of additional flavor post-limiter.

Balancing the input and output saturation drives can be a way to pseudo-emulate analog gain-staging, sculpting the character differently as one saturator is hitting the raw signal (and affecting the subsequent processes) while the other is affecting the already-processed signal.

Remember, it’s completely optional, and always A/B the results, as the added THD may be doing more harm than good. Sometimes i’ll use an input console, sometimes an output console, sometimes both, sometimes neither.

14: Fader / Balance

Having a volume fader after all other effects is super useful for automating the level in an arrangement without altering the gain-staging of the effect chain in any way (which would disturb all the dynamic thresholds, etc). Some controls for re-balancing the stereo panning can also be useful at this point. While you could use the DAW’s channel faders to automate level, doing so “locks those in”, so i recommend keeping them free for fine-tuning the balance of tracks against each other during the mixing phase. The same goes for panning; by using a dedicated panning plugin for creative automation during song arrangement / production, you can keep the mixer’s panning controls open for adjustment at any subsequent point in time.

Sometimes, having a fader at the start of an effects chain makes more sense than at the end… for example, if you want to adjust the amount of signal pushing into a preamp. But usually then you’ll still want an output fader.

15: Low Cut Filtering

What? A low cut filter after the output fader? That’s as crazy as using a preamp after a limiter!

Ok, so hear me out. Certain effects (notably compression & limiting) can generate a degree of DC offset (ultra low frequency content). Just like with the High Cut at the very start of the chain, the Low Cut is positioned strategically to clean up the low end at the most ideal moment. If we cut lows early in the chain and an effect (or volume automation) then alters the sound, DC can be generated afterwards, and we’re stuck with it. By placing low cut filtering as the VERY last step, we ensure no bonus offset sneaks in. Woohoo!

The caveat? Well, filtering can alter peak level, of course. So you might have to readjust your limiter’s threshold while both it and the filter are engaged, if you need to match a specific peak or true peak output value (say, for mastering).

My favorite VST plugin for post-filtering in this manner lately has been the free RC Filter by Xhip, since it can be set as low as 0.1 Hz(!).
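Under the hood, that kind of ultra-low cut is essentially a DC-blocking one-pole high-pass in the classic y[n] = x[n] - x[n-1] + R*y[n-1] form. Here’s a generic textbook sketch of the idea (not Xhip’s actual code; the 20 Hz cutoff and the asymmetric-clip example are chosen so the offset disappears quickly in a one-second demo, and the same topology works down at 0.1 Hz, it just settles more slowly).

```python
import numpy as np

def dc_block(x, cutoff_hz=20.0, sr=44100):
    """One-pole DC-blocking high-pass: y[n] = x[n] - x[n-1] + R*y[n-1]."""
    R = 1.0 - (2.0 * np.pi * cutoff_hz / sr)   # pole radius; closer to 1 = lower cutoff
    y = np.zeros_like(x)
    prev_x = prev_y = 0.0
    for i, s in enumerate(x):
        prev_y = s - prev_x + R * prev_y
        prev_x = s
        y[i] = prev_y
    return y

# Asymmetric clipping (limiter-ish behavior) shifts the waveform's average off zero:
sr = 44100
sig = np.sin(2 * np.pi * 100 * np.arange(sr) / sr)
clipped = np.clip(sig, -0.5, 0.9)              # bottom clipped harder than the top -> DC offset
print("DC before:", round(clipped.mean(), 4), " DC after:", round(dc_block(clipped, sr=sr).mean(), 4))
```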

Placement Order Exceptions: Sometimes you might want to use a cut filter early on, say to prevent distortion or reverb from being driven by low-frequency content. In that case it’s up to you whether to also use an extra one later on to clean up DC.

16: Analysis

You’ll want to analyze the signal after everything has happened to it. If you group the entire effects chain into a rack, except for the analyzer (whose GUI is kept open), you can switch the effects chain off & on at will to compare the unprocessed and wet signals in the analyzer.
Some people like using the same analyzer plugin for all tracks, for consistency. I like using different types of analyzers for different types of tracks, usually one level-based and one spectrum-based on each.

I wouldn’t bother to put an analyzer at the start of a signal chain since you can just disable the chain to see the analyzer respond to the dry original audio. Exceptions to this are certain plugins which require analyzing both the input and output of an fx chain to ascertain the delta (difference) signal.
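As a tiny aside on that “delta” idea: the difference signal is just the processed output minus the time-aligned dry input. A minimal sketch, assuming the chain’s latency is already compensated (as plugin delay compensation in a DAW normally handles):

```python
import numpy as np

def delta(dry, wet):
    """What the effects chain actually changed: processed minus dry."""
    n = min(len(dry), len(wet))
    d = wet[:n] - dry[:n]
    # how big the change is, relative to the dry signal's RMS level
    change_db = 20 * np.log10((np.sqrt(np.mean(d ** 2)) + 1e-12) /
                              (np.sqrt(np.mean(dry[:n] ** 2)) + 1e-12))
    return d, change_db
```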

~`~




The Takeaway

Think carefully about two things when you place an audio effect device into an existing effects chain:
•How the previous devices in the chain will affect the new device.
•How the new device you are placing is going to affect the devices after it in the chain.

If you’re not sure, test out different possibilities. Maybe you like to put distortion pedals after your reverb. Who am i to judge? Sometimes it makes no difference. Two static EQ plugins in series? Likely doesn’t matter which one goes first. My own Elemental Mixing templates diverge from the archetype quite a bit here and there, for various reasons (usually when a single plugin serves two or more functions that aren’t adjacent in the list).

Basically: Don’t randomly place effects wherever. Think about what they are going to do, and why you might want (or not want) it to happen a certain way. Also, don’t worry about if you’re using the “proper” effects ordering or not. As they say, “if it sounds good, it is good.”

 

Effect Order Cheatsheet

Click here to grab the Effect Order Cheatsheet in ods format.

  • 1. High Cut
  • 2. Pitch
  • 3. Input Console & Distortion
  • 4. Equalization
  • 5. Transients
  • 6. Gating / NR
  • 7. Multiband / DeEss
  • 8. Compression
  • 9. Modulation
  • 10/11. Delay & Reverb
  • 12. Limiter
  • 13. Output Console
  • 14. Fader
  • 15. Low Cut
  • 16. Analysis

Peace out, y’all!
