
How to Use the Transformer Object in Logic's Environment


The transformer object in Logic's Environment is one of its most used and most important objects. In basic terms, it looks for MIDI events that match a set of user-defined conditions and then alters those events according to a second set of conditions. In short, as its name suggests, it transforms MIDI.
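Before we touch the Environment, it may help to see that idea in a more abstract form. The following is a purely illustrative Python sketch (Logic exposes no such scripting interface; the event fields and function are my own invention) of the match-then-alter logic a transformer performs. We will fill in the real conditions and operations for our setup in the steps below.

```python
# Purely illustrative: a MIDI event modelled as a dict of the four fields
# Logic's monitors display (status/type, channel, data byte 1, data byte 2).
def transform(event, conditions, operations):
    """If every condition matches the event, overwrite the given fields;
    otherwise pass the event through unchanged."""
    if all(event[field] == value for field, value in conditions.items()):
        return {**event, **operations}
    return event

# A generic example: remap any control-change event on channel 1 to channel 3.
event_in = {"status": "control", "channel": 1, "data1": 20, "data2": 64}
print(transform(event_in,
                conditions={"status": "control", "channel": 1},
                operations={"channel": 3}))
```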

In my last tutorial, we looked at how to create a multi-instrument in Logic's Environment which you can brush up on here. In this tutorial we will take things a bit further and begin to get a feel for transformer objects. We will do so by using them to change the data flowing from just one parameter of our bass multi-instrument in order to control the cutoff knobs of all three instruments equally and in real time.

Step 1

Just so we have a solid reference point, this is the screen we left off on in the last tutorial.

This is the loop (I've changed the Digital Mono synth to the ESE as it is better suited for this demonstration).

Step 2

The first thing we want to do is some prep work in the Environment (Window>Environment or ⌘8) so we can see what is going on. To do this, I first deleted the existing patch cables with the Eraser tool. Next, I dragged the individual tracks apart so there was space in between to accommodate a monitor object. I then went to the New menu, pulled out a monitor object, Option-dragged it twice to create two copies, and placed them as you see below. Finally, I dragged the patch node at the top of each channel to its neighboring monitor.

Step 3

Here, I've opened the ESE and turned the cutoff knob a bit. Immediately information appears in the monitor cabled to the ESE channel. The meaning of this information will be explained in depth later, but for now it is helpful just to know that it translates to the position of the cutoff knob.

Step 4

I then went to the arrange page and opened the multi-instrument's automation (View>Track Automation or keystroke A) and click-held the track's automation parameter menu (it reads Ch. 1 in the image). I then selected MIDI control 20 as the parameter I wanted to automate (20 being the easiest undefined MIDI parameter to select in this instance).

Step 5

This step consisted of simply drawing a basic automation curve with the arrow tool by clicking to create the nodes along the track and dragging them.

Step 6

Step 6 was quick as well. I just went back to the Environment and attached another monitor object, this time to the multi-instrument object, and pressed the space bar. As soon as the playhead began to play, the monitor displayed every piece of MIDI information the multi-instrument was outputting; namely automation and note event data as it was received in real time.

Step 7

In this step, I pulled a transformer object from the NEW menu in the Environment and patched a free multi-instrument node to its left side (input) and dragged the transformer's node (output) to the input of the ESE channel strip.

Step 8

This gets us to the meat of the tutorial, which is hopefully where you will begin to understand how MIDI is read and understood in Logic.

First, we need to understand a bit more about the monitor objects we have filled with data. The data in the monitors is arranged in four columns, with the most recent entry at the bottom. The first column displays what type of data is being sent (fader, note, control, etc.), the second shows the channel number the data is being sent on, the third gives the value of the first data byte (generally the numerical name of what you are trying to control; 20 means parameter 20 of the multi-instrument in this case), and the fourth gives the value of the second data byte (generally the value of the parameter you are trying to control, from 0 to 127).
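For those curious how those four columns relate to raw MIDI, here is a small sketch assuming a standard three-byte control-change message (the monitor shows you the decoded form, not the bytes themselves):

```python
# A standard MIDI control-change message is three bytes:
# status byte (0xB0-0xBF, low nibble = channel), controller number, value.
raw = bytes([0xB0, 20, 100])          # CC 20 on channel 1 with value 100

kind    = "control" if raw[0] >> 4 == 0xB else "other"  # column 1: event type
channel = (raw[0] & 0x0F) + 1         # column 2: channels are shown as 1-16
data1   = raw[1]                      # column 3: which parameter (here, 20)
data2   = raw[2]                      # column 4: its value, 0-127

print(kind, channel, data1, data2)    # -> control 1 20 100
```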

As such, looking at the multi-instrument's monitor, I know it is sending out control data (that funky symbol means control data) on channel 1 from parameter 20 at a varying value according to the position of the automation curve.

Step 9

Now it is just a matter of opening the transformer and inputting the data.

Inside the transformer, I see four columns which correspond exactly with those of the monitor object (type, channel, data byte 1, data byte 2) arranged in two rows. The top row tells the transformer what to look for while the bottom row tells the transformer what to change the data to.

Accordingly, I need to tell the transformer to look for control data and only control data. I do this by choosing the equal sign in the top Status menu and then choosing Control in the menu that appears directly below it.

I only want the transformer to act on control data passed through channel 1 (as that is the channel my monitor tells me the multi-instrument is on), so I follow a similar process in the upper channel column by choosing an equal sign and plugging a 1 into the menu that subsequently appears.

Similarly, the monitor tells me that the numerical code for the automation parameter is 20. As I only want to affect that parameter, I once again choose the equal sign and then plug a 20 into the lower menu.

As the automation curve is going to be my main cutoff control and I want it to be able to fully open and fully close (have a range of values from 0 to 127), I do not want to specify a second data byte value for the transformer to look for, so I leave the last column set to All.
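Summing up the top row as a hypothetical Python predicate (just a restatement of the settings above, not anything Logic actually runs):

```python
def matches(event):
    """Top row of the transformer: what to look for."""
    # Data byte 2 is left at All, so it is never tested here.
    return (event["status"] == "control"    # Status: = Control
            and event["channel"] == 1       # Channel: = 1
            and event["data1"] == 20)       # Data byte 1: = 20
```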

Step 10

I now look at the monitor object attached to the ESE's channel strip to see the data associated with the ESE's cutoff. It gives me the data in the same order as that of the multi-instrument. I see the cutoff knob is a fader (F) sent on channel 2 and is parameter number 2, with the fourth column telling me the position of the knob.

Back in the transformer, I now need to input the data into the bottom row of the transformer in order to define what the MIDI control data should be changed to (in this case changed into the cutoff control of the ESE). I want fader data to be output from the transformer so I 'fix' the status to fader by choosing Fix in the bottom status menu and Fader in the menu that appears directly below it.

Similarly, I want to change the channel and parameter data that is going to be output and do so by 'fixing' them both to the numeral 2 in the same way as described above. I again leave the second data byte open as I want the incoming data of the automation curve to remain unchanged (0 on the automation curve will be a closed cutoff while 127 will be an open cutoff).

Now, according to the signal flow, all data will flow from the multi-instrument into the transformer, which will look for the control data we specified in the top row, 'transform' it into the fader data we specified in the bottom row, and output it to the ESE.
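Put together, the whole object behaves like this sketch (hypothetical Python mirroring the settings; matches is the predicate summarized in the previous step):

```python
def transform(event):
    """Match the top row; if it matches, apply the bottom-row fixes."""
    if matches(event):
        return {"status": "fader",        # Status: Fix -> Fader
                "channel": 2,             # Channel: Fix -> 2
                "data1": 2,               # Data byte 1: Fix -> 2 (ESE cutoff)
                "data2": event["data2"]}  # Data byte 2: passed through untouched
    return event                          # everything else flows on unchanged

print(transform({"status": "control", "channel": 1, "data1": 20, "data2": 127}))
# -> {'status': 'fader', 'channel': 2, 'data1': 2, 'data2': 127} (cutoff fully open)
```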

As such, automation parameter 20 of the multi-instrument in effect becomes the cutoff control of the ESE. Open the ESE and hit Play and you will both see and hear this happen as the cutoff knob moves according to the multi-instrument's automation.

Step 11

After the initial transformer was programmed, I copied it twice, patched the multi-instrument to both copies, and then ran one transformer to the EXS24 and one to the ESM. I then turned the cutoff knobs of the newly patched instruments, ascertained their data via their respective monitors, and plugged that data into the bottom rows of their respective transformer objects (in this case fader, 2 and 25 for the EXS24 and fader, 2 and 3 for the ESM, once again leaving the last data byte unchanged).
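In sketch form, the three copies are the same match with different bottom-row fixes. The numbers below are the ones read off my monitors, so treat them as placeholders for whatever your own monitors report:

```python
# Bottom-row "fix" values per instrument, as read from each monitor.
CUTOFF_TARGETS = {
    "ESE":   {"status": "fader", "channel": 2, "data1": 2},
    "EXS24": {"status": "fader", "channel": 2, "data1": 25},
    "ESM":   {"status": "fader", "channel": 2, "data1": 3},
}

def cutoff_events(event):
    """Yield one transformed copy per instrument, mirroring the three
    parallel transformer objects patched off the multi-instrument."""
    if matches(event):                       # same top-row conditions as before
        for name, fix in CUTOFF_TARGETS.items():
            yield name, {**fix, "data2": event["data2"]}

for name, out in cutoff_events({"status": "control", "channel": 1,
                                "data1": 20, "data2": 64}):
    print(name, out)
```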

Once that was done, I colored (Option-C) and appropriately labeled the individual objects in the Environment's inspector so as to keep things tidy and easy to understand.

This is the final loop demonstrating the cutoff parameter in action across all three instruments by using just a single control.

Step 12

A reader pointed out in the comments section of my previous tutorial that using the volume fader of the multi-instrument will automatically lock the volume data of every instrument down the chain to its own value, thereby erasing the volume balance between the channels. Personally, I had never run into this problem, as I had always bussed my sounds to a submix immediately and used the submix channel's fader, which was simply more convenient.

As such, bussing the sounds to a submix still seems the best bet for creating an accurate master fader, but we can take an extra step and disable the multi-instrument's volume fader to guard against accidentally erasing the volume balance.

Here, I used another transformer and set the topmost parameter to 'Filter Matching Events'. I then told it to look for control events that matched 7 (7 being the standard MIDI controller number for channel volume). I didn't have to touch the channel number in this instance, since the transformer was only going to be put on the multi-instrument, and I did not touch the second data byte as I wanted to filter the entire range.
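As a final sketch (hypothetical Python again, just mirroring the settings), 'Filter Matching Events' simply means a matching event is dropped rather than rewritten, while everything else passes through:

```python
def volume_filter(event):
    """Drop any control event whose first data byte is 7 (channel volume);
    pass every other event through unchanged."""
    if event["status"] == "control" and event["data1"] == 7:
        return None   # filtered out: never reaches the instruments
    return event

print(volume_filter({"status": "control", "channel": 1, "data1": 7, "data2": 90}))
# -> None: the multi-instrument's own volume fader is now ignored
print(volume_filter({"status": "control", "channel": 1, "data1": 20, "data2": 90}))
# -> passes through, so the cutoff automation still works
```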

I then inserted the volume filter just after the multi-instrument in the signal chain and bussed the individual tracks to Bus 1 to act as a submix channel and master fader.

Although the transformer takes some getting used to, once understood, it holds a lot of interesting possibilities. I suggest playing around with it to learn some of these possibilities and realize some ways in which you might incorporate them in your own work. Try making the value of one cutoff go up while making another go down, or have a resonance value increase along with the cutoff of the same instrument. The more you realize how the tools you are using work, the better off your music and production process will be.
