Composer Status Update

It's been a week of microsurgery as I set out on the path of orchestrating the design I have.  I am barely a third of the way through.  Why is it taking so long?  Well, what I didn't tell you is that I'm also rendering a MIDI prototype at the same time with one of my orchestral sample libraries.  Purists would sniff and say it will never sound like that in a live setting, and that it shouldn't be done.  The counterargument is that orchestral prototyping is genuinely useful for understanding how the dynamics work with the instruments and how those markings should be incorporated into the score.  Crescendos, decrescendos, forte, and piano, in combination or in isolation, can tell you a lot.  A prototype won't tell you everything, and it can indeed be misleading if you aren't experienced, but it is still useful if you understand what it can and cannot provide.

MIDI orchestration is a deep subject.  I'm somewhat limited in my toolkit and should really look for some more string libraries.  For now, though, I'm working with what I have and not digging too deep into everything I own.  The secret to making it realistic is in how you layer and how you use controller information to build in performance data.  Combining sample variations into layers and switching between them can add realism.  Alternating up bows and down bows is one example.  Giving notes different stresses, or layering multiple libraries in legato passages and crossfading between them in real time during crescendos and decrescendos, is another.
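
To make the idea concrete, here is a minimal sketch (not my actual session) using the mido library, assuming a string patch that maps keyswitch notes 24 and 25 to down-bow and up-bow articulations and reads CC1 (the mod wheel) as the crossfade between layered dynamic samples.  The keyswitch numbers and pitches are placeholders; every library maps these differently.

```python
# Sketch: keyswitched bowings plus a CC1 crossfade during a crescendo.
# Assumes keyswitches 24/25 = down/up bow and CC1 = dynamic layer crossfade.
import mido

TICKS = 480  # ticks per quarter note

mid = mido.MidiFile(ticks_per_beat=TICKS)
track = mido.MidiTrack()
mid.tracks.append(track)

DOWN_BOW_KS, UP_BOW_KS = 24, 25  # hypothetical keyswitch notes; varies by library

def keyswitch(note):
    """Fire a short keyswitch note to select the articulation."""
    track.append(mido.Message('note_on', note=note, velocity=1, time=0))
    track.append(mido.Message('note_off', note=note, velocity=0, time=10))

def crescendo_note(note, beats, cc_start=30, cc_end=110, steps=16):
    """Hold one note while ramping CC1 so the patch crossfades
    between soft and loud sample layers during the crescendo."""
    track.append(mido.Message('note_on', note=note, velocity=80, time=0))
    step_ticks = (beats * TICKS) // steps
    for i in range(steps):
        value = cc_start + (cc_end - cc_start) * i // (steps - 1)
        track.append(mido.Message('control_change', control=1, value=value,
                                  time=step_ticks))
    track.append(mido.Message('note_off', note=note, velocity=0, time=0))

# Alternate bowings bar by bar, each with its own dynamic shape.
for ks, pitch in [(DOWN_BOW_KS, 62), (UP_BOW_KS, 64)]:
    keyswitch(ks)
    crescendo_note(pitch, beats=4)

mid.save('string_sketch.mid')
```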

For my attempt, I am simply adding expressive controller data for dynamics, adjusting velocities, and layering a soloist onto each section.  The soloist sample adds the realism of bowing and other micro detail to the line.  You can do the same thing with live players, recording three or four musicians and using samples for the remainder of the section, for budgetary reasons.  I'm also raising the level of the release samples in each instrument, which helps with a more legato sound.
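
As a rough illustration of that kind of editing pass, the sketch below (again with mido) compresses note velocities into a narrower band, so dynamics are carried mostly by controller data, and doubles the section's line onto a second channel for a solo patch layered on top.  The file names and channel layout are placeholders, not my actual project setup.

```python
# Sketch: compress velocities and double the section line on a solo channel.
import mido

SECTION_CH, SOLO_CH = 0, 1  # assumed channel layout

src = mido.MidiFile('violins_1.mid')
out = mido.MidiFile(ticks_per_beat=src.ticks_per_beat)

for track in src.tracks:
    new_track = mido.MidiTrack()
    out.tracks.append(new_track)
    for msg in track:
        if msg.type in ('note_on', 'note_off') and msg.channel == SECTION_CH:
            vel = msg.velocity
            if msg.type == 'note_on' and vel > 0:
                # Map 0-127 into roughly 60-100 so key velocity stops
                # dominating the dynamics.
                vel = int(60 + (vel / 127) * 40)
            new_track.append(msg.copy(velocity=vel))
            # Double the line on the solo channel, slightly softer, so the
            # soloist's bow noise and micro detail sit on top of the section.
            solo_vel = max(1, vel - 15) if msg.type == 'note_on' and vel > 0 else vel
            new_track.append(msg.copy(channel=SOLO_CH, velocity=solo_vel, time=0))
        else:
            new_track.append(msg)

out.save('violins_1_layered.mid')
```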

In the recording stage there are also some other tricks.  Placing the instruments properly in the stereo field and in the reverb can give them a 3D position: more reverb and less dry signal pushes an instrument toward the back of the stage.  EQ can carve out frequencies that interfere with each other and mask unrealistic residual overtones.
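
Here is a back-of-the-envelope sketch of the placement idea: constant-power panning for left/right position, with the dry level falling off as an instrument moves toward the back while the reverb send grows.  The exact curves are taste rather than gospel; the numbers below are illustrative only.

```python
# Sketch: map a stage position to pan gains, dry level, and reverb send.
import math

def stage_position(pan, depth):
    """pan: -1.0 (hard left) to 1.0 (hard right); depth: 0.0 (front) to 1.0 (back).
    Returns (left_gain, right_gain, dry_gain, reverb_send) as linear gains."""
    angle = (pan + 1.0) * math.pi / 4.0             # map pan to 0..pi/2
    left, right = math.cos(angle), math.sin(angle)  # constant-power pan law
    dry = 1.0 / (1.0 + 3.0 * depth)                 # dry signal drops toward the back
    send = 0.3 + 0.5 * depth                        # wet share grows with distance
    return left, right, dry, send

# Example: second violins slightly left of center, mid-depth on the stage.
print(stage_position(pan=-0.4, depth=0.5))
```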

With some experience you can become pretty quick at editing a MIDI score until it sounds production realistic.  This movement is pretty close so far, but there is still a lot left to do and rework before it's recording ready.  Dynamics and velocities could use more adjustment.  Additional library samples could be auditioned and layered into the mix.  Notes that come up again and again on the same instrument might be replaced or varied in some sonic way so they don't sound exactly the same.  The instruments could also be better adjusted internally, with ADSR (attack, decay, sustain, release) and other controls.  The overall dynamics are too loud in places.  But for now it's good enough for government work.
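
For reference, a toy ADSR envelope generator is below; this is the generic textbook shape, not any particular library's implementation, and the times and sustain level are made-up illustrative values.

```python
# Sketch: a textbook ADSR amplitude envelope.
import numpy as np

def adsr(attack, decay, sustain_level, release, hold, sr=44100):
    """Rise over `attack` seconds, fall to `sustain_level` over `decay`,
    hold for `hold` seconds, then fade to silence over `release`."""
    a = np.linspace(0.0, 1.0, int(attack * sr), endpoint=False)
    d = np.linspace(1.0, sustain_level, int(decay * sr), endpoint=False)
    s = np.full(int(hold * sr), sustain_level)
    r = np.linspace(sustain_level, 0.0, int(release * sr))
    return np.concatenate([a, d, s, r])

# A slower attack and longer release blur note boundaries, which is part of
# what makes sampled strings feel more legato.
env = adsr(attack=0.15, decay=0.1, sustain_level=0.8, release=0.4, hold=1.0)
```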

It's been very good practice to get into this again.  This movement is all strings, and it's very challenging.

As for other thoughts, I will definitely need a movement before this one.  And for this movement or section, I can now see a couple of derivatives.  Firstly, there is the long movement for orchestra, which is the absolutely truthful representation of the polemic or aesthetic statement.  It also demands quite a bit from the listener (not quite the right words, but it is longer and sets up a psychological universe quite heavily, and in a manipulative way, with eight-voice counterpoint in places).  Secondly, a shorter version for string quartet, which is aimed more at a traditional audience and is a less demanding performance.  Thirdly, the same short string quartet version with the addition of a bass, for string orchestra.
