Summative Report

To begin the process, as a pair, we planned which microphones would be placed where and began to make the connections. As each cable was plugged into the XLR splitter rack, the corresponding channel strip was labelled with a Sharpie and tape. The microphone used for the kick drum was a D112, the snare top and bottom were SM57s, the overheads were C1000s, the guitar was miked with a single SM57 and the vocalist used an SM58. SM57s are industry-standard live microphones for these particular applications because they are dynamic, handle high SPLs, pick up little bleed and are hard to break. 57s also have a moulded frequency response with a natural low cut and some excitement in the high end, making them suitable for instruments with a lot of harmonic content, i.e. a snare drum or guitar. The D112 has a warm response in the low end of the frequency spectrum and the SM58 is brightened in the mid range, making them both suitable for these applications. The C1000s are condenser microphones with a cardioid/hyper-cardioid polar pattern and a frequency response tailored to provide a brightened top end, making them a suitable choice for the role of overheads, as more distant microphones mostly receive and replicate the top end of the sound.

In retrospect, the process of setting up took a little too long, but a cautious approach helped to ensure the connections were made correctly. After everything was connected, I took charge of the main desk to ensure the signal was routing to it correctly. To achieve this, my partner, Ikenna, stood on the stage and scratched the microphones whilst I boosted the gain using the PFL setting on the Soundcraft Live 8 and analysed the LED meter for a response. The only microphone to fail this test was the right C1000. Here began our first troubleshooting scenario.
After following the signal path for this microphone all the way through and ensuring everything was cabled correctly and phantom power was provided, the fault was placed on the microphone itself, and the overhead was scrapped and left out of the mix because time was against us. In an ideal world this problem would have been solved, but the severity of a lost overhead is less than that of a delayed gig. Once this process had taken place, I quickly removed the most troublesome frequencies from the EQ by ringing out the system. I found trouble and got caught up on a frequency that I later found to be at around 800 Hz. Once this initial setup had taken place, we were able to get the band in to soundcheck. I went through the instrumentalists in the order of drummer, bassist, guitarist and then vocalist, checking each aspect of their instrument was set to a correct gain using PFL on the mixing desk. Once I had done this, I could place the instruments in the mix and set levels. I noted that the vocalist was the quietest of the instruments and attempted to mix around this; however, this created some difficulty because the vocalist was slightly unpracticed, didn't use the microphone correctly and wouldn't sing during the setting of levels, only speak. This lack of microphone knowledge hindered the mixing process because I was unable to adjust to a consistent vocal level. However, I could have reduced the volume of the other instruments to improve the overall mix. During this process, I also failed to engage the EQ on some channel strips, meaning my adjustments weren't taking effect. In the future, it would be invaluable to remember to do this. The bass guitar was DI'd through the amplifier; during the verse-chorus soundcheck there were a few loud cracks and bangs within the system. We then discovered that the bass guitar had a loose jack input, and that the XLR used to DI the bass was also loose.
Unable to fix the bass guitar at that point, we settled for fixing the DI connection by inserting a DI box and encouraged the bassist to be careful with his instrument. During this soundcheck, I attempted to feed in the auxiliary effects, but they didn't seem to be working. To fix this problem I checked the signal path from the point at which I knew it worked onwards, eventually finding that the effects had been unplugged and replaced with similarly labelled cables. I wired the auxiliary effects back in correctly, set the gain and introduced them to the mix.

 

In reflection, the aspects that needed the greatest improvement during this assessment were organisation and cable management. To be successful in the live music industry, tasks must be managed with time efficiency and safety. The pressure of assessment found my partner and me being overly careful with processes we were both actually comfortable with, so keeping a cool head and a mind set to the task would help to create a more productive, successful process. Cable management was also very poor, I believe, due to the lack of prioritisation within our workflow. Whilst having to go back through already tidied cables to uncover problems hindered the tidying process, if this task had been approached as an ongoing responsibility, it would never have become unmanageable. Alongside this, my lack of knowledge of this particular desk found me making silly mistakes whilst mixing and removed my ability to watch the band, as I had to spend a large amount of time locating things on the desk. In conclusion, to improve my ability in this area I should be more practiced in live mixing scenarios and more organised in my approach to time allocation and time management.

 

 

 

Mixing with subgroups

 

Subgroups utilise a bus send to assign multiple channels to one subgroup channel control, usually a stereo channel with independent left and right faders. This allows us to control an assigned number of instrument channels with one hand, and given the multitude of things an engineer must take into consideration on a mixing console, whilst listening to and watching the band, this can make things more manageable (especially in large band scenarios).

 

To engage subgroups, you must locate the in-out switches that connect the sends to the group channels, usually located just above the faders near the mute, PFL and any other sends on the mixer. Once switched in, it is usual practice to disengage the main mix sends to ensure that there is a single control for the mix. From this point, the audio is sent directly to the group channel controls and can be adjusted there.
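The routing described above can be sketched in code. This is a minimal, hypothetical model of subgroup summing (the function and channel names are illustrative, not any real console's API): channels assigned to the group are summed and scaled by the group fader, while unassigned channels feed the main mix directly.

```python
# Minimal model of subgroup routing: channels assigned to the group are
# summed and scaled by the group fader before reaching the main mix.
# All names here are illustrative, not a real console's API.

def mix_with_subgroup(channels, group_assign, group_fader, main_fader):
    """channels: dict of name -> signal level; group_assign: names routed to the group."""
    # Channels switched in to the subgroup have their main mix sends disengaged,
    # so the group fader becomes their single point of control.
    group_sum = sum(level for name, level in channels.items() if name in group_assign)
    direct_sum = sum(level for name, level in channels.items() if name not in group_assign)
    return (group_sum * group_fader + direct_sum) * main_fader

# Example: three drum channels controlled by one subgroup fader
channels = {"kick": 0.8, "snare": 0.6, "overhead": 0.4, "vocal": 0.7}
drums = {"kick", "snare", "overhead"}
print(mix_with_subgroup(channels, drums, group_fader=0.5, main_fader=1.0))
```

Pulling `group_fader` down attenuates the whole kit at once, which is exactly the one-handed control the subgroup provides.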

Multimedia: Sound Sweetening

Game of Thrones Battle Scene Dragon Roar

 

Modern sound design relies on the human ability to match image with sound. A lot of the sounds heard in films today exploit this ability to create sounds that are actually larger than life. By layering the sound that we expect with sounds that we subconsciously respond to, it is possible to achieve new sounds that are more captivating and more effective in directing the audience.

When creating a dragon's roar, it is hard to match the sound to that of the real subject, as nobody has ever witnessed a dragon roar, nor will they. So, a little imagination must be employed.

 

To draw influence and spark inspiration, I researched similar sound design productions and, after a while, found inspiration in the sound design of the original Jurassic Park movie.

 

https://www.youtube.com/watch?v=qwWvO4UgJiU

 

To create a dragon sound I employed the methods used in that production: combining the sounds of a bear and a dolphin and applying pitch-shifting and time-stretching techniques, with impressive results. To add a little more dissonance, I layered in a recording of a tyre screech with a few effects, adding screech to the roar. I then recorded a whistling kettle and placed this over the top, to add air and top end to the dragon's war cry.
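The layering and pitch-shifting process can be sketched as a simple signal chain. This is only an illustration: synthetic sine tones stand in for the actual bear, dolphin, tyre and kettle recordings, and the pitch shift is a naive resampling (which, like tape-speed tricks, lowers pitch and stretches time together).

```python
import numpy as np

SR = 44100  # sample rate

def tone(freq, dur=1.0):
    """Synthetic stand-in for a recorded layer (bear growl, dolphin call, etc.)."""
    t = np.linspace(0, dur, int(SR * dur), endpoint=False)
    return np.sin(2 * np.pi * freq * t)

def pitch_shift(signal, ratio):
    """Naive pitch shift by resampling: ratio < 1 lowers pitch and stretches time."""
    idx = np.arange(0, len(signal), ratio)
    return np.interp(idx, np.arange(len(signal)), signal)

def layer(*signals):
    """Sum the layers at equal length, normalised to avoid clipping."""
    n = min(len(s) for s in signals)
    mix = sum(s[:n] for s in signals)
    return mix / np.max(np.abs(mix))

growl = pitch_shift(tone(110), 0.5)   # low layer, dropped an octave and doubled in length
call = pitch_shift(tone(1760), 0.8)   # bright layer, pulled down slightly
roar = layer(growl, call)
```

Real productions would pitch-shift without changing duration (and vice versa), but the principle of stacking shifted layers into one composite sound is the same.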

Collaboration in technology-based music

Collaborating in music has the potential to be both beneficial and restrictive. Introducing a second consciousness to the creation process can provide new influences, techniques, experience and ideas, but it may also mean that more pre-emptive organisation is involved, possibly slowing the creative process.

 

There is only one thing more volatile than a human and a computer making live music together, and that is multiple humans with multiple computers creating music. Unlike traditional musical instruments, where the user can create expression simply by adjusting the way in which they play, these expressions must be pre-explained to the computer, making musical ideas such as stops, time changes, inversions and tempo changes much harder to consciously materialise. Conversely, the amount that can be done by two humans and two computers, when planned, could easily match that of a full band.

The restrictions involved in this process are comparable to those of a traditional band: it is simply a matter of what the members are physically and creatively able to do. It does, however, create more choices to ponder. The contemplations involved in a traditional band situation are simply what each member should do with their designated instrument. When these limitations are removed, the process can become overwhelming, and the creative freedom can actually result in a less polished product. More choice isn't necessarily better. Having access to such a vast array of digital instruments can lead to frustration, especially when individuals conflict and all seek the 'lead' role. This may result in a gradual increase in volume as members compete for presence. Another variable to consider is that two people means four hands, and when there are a thousand buttons and knobs to twist throughout a song, more hands are definitely an advantage.

The creative process within this type of music has its differences but is effectively the same. The preference between collaboration and solo work is very subjective and personal. Some artists believe that creating music should be a lonesome process, as it allows a more accurate translation of personal emotive influence. This point of view is explained much more expansively by the artist Claudio in her TEDx talk.

 

It is abundantly obvious, despite this, that collaboration has many benefits. It allows personal development and for our art to be affected by others. George Harrison once said that all of The Beatles' catalogue consists of "personal ideas filtered through the minds of each member of the band". Whilst The Beatles are exceptional in many ways, this is an effective way of describing the process of collaboration in any type of music: it allows others to interact with and affect the process. An effective model for collaboration in technology-based music is to have one operator of technology and one traditional instrumentalist, allowing the stylistic elements of each to be combined into a cohesive, emotive work. A good example of this comes in the form of a track named 'Little Nerves', created by Binkbeats, an experimental technology-based musician, and Niels Broos, a more traditional musician, even though his instruments of choice are more technological than most might consider a musician's to be.

 

 

This performance boasts many impressive, persuasive, sonically interesting sounds. Watching this process is beyond impressive, and it is a perfect example of the amount of rehearsal, organisation and planning that must be employed to collaborate successfully within technological music.

Insert VS. Auxiliary audio processing

Audio processing is an essential part of almost all music. Bar the old acoustic guitar around the campfire, it's used in almost every aspect of music to enhance the listening experience for the audience. Audio processing hardware used with mixing desks, outside of the standard channel strip effects, is utilised in one of two methods: insert or auxiliary. These two terms effectively describe the placement of the effect in the signal chain. Insert effects are applied directly to a channel's signal pre-fader, whereas auxiliary effects utilise a split signal returned to a different channel, allowing signals to be blended together. Which effect should be placed where varies hugely with the desired outcome, but as a rule of thumb, EQs and dynamics processors such as compressors, limiters and gates are the main uses for insert effects in a live scenario. Auxiliary sends may be used for effects like reverbs, delays and harmonisers (essentially any effect you wish to have a wet/dry control for). Auxiliaries also use send and return buses to allow multiple channels to feed signal through the same effects.
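The difference between the two placements can be shown with a small sketch. The "compressor" and "reverb" below are toy stand-ins (a hard-knee gain reduction and a single echo tap), chosen only to show the routing: the insert effect replaces the channel signal in series, while the auxiliary effect processes a split of the signal that is blended back against the untouched dry path.

```python
import numpy as np

def compressor(signal, threshold=0.5, ratio=4.0):
    """Toy dynamics processor, standing in for an insert effect."""
    out = signal.copy()
    over = np.abs(signal) > threshold
    # Gain above the threshold is reduced by the ratio
    out[over] = np.sign(signal[over]) * (threshold + (np.abs(signal[over]) - threshold) / ratio)
    return out

def reverb(signal, delay=2, decay=0.5):
    """Toy single-tap echo, standing in for an auxiliary effect."""
    wet = np.zeros_like(signal)
    wet[delay:] = signal[:-delay] * decay
    return wet

dry = np.array([0.0, 0.3, 0.9, -1.2, 0.5])  # a channel's signal

# Insert: the effect sits in series and replaces the channel signal pre-fader.
inserted = compressor(dry)

# Auxiliary: a split of the signal is sent away, processed and blended back,
# leaving the dry path intact and giving an independent wet/dry balance.
aux_send = 0.3
blended = dry + reverb(dry * aux_send)
```

Turning `aux_send` down to zero leaves `blended` identical to `dry`, which is exactly why wet/dry effects belong on auxiliaries rather than inserts.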

Stage Setup: Cable routing

In a live performance scenario, having knowledge of cable routing is a necessity. The cable connections of a stage setup could be likened to the human nervous system: an assemblage of wires, all interconnected, providing the pathways of a stable body, and if a connection happens to be wrong, there can be severe consequences. The signal begins on stage at the sound source (the instruments), with the end goal being the speakers. Of course, during this journey there are multiple processes it must undergo.

 

The sound must first be converted to an electrical signal, most commonly with the use of microphones, otherwise by direct injection. From here the sound will be processed via a mixing desk. However, in professional productions, sound is usually split between two mixing desks – one front of house and one side-stage for monitoring. This is done via a splitter-box rack. In scenarios where only a single desk is employed, instruments go directly into a multicore and into the desk, with monitors managed by auxiliary outputs, which feed into separate amps. From the splitter rack, the signals run to each mixing desk's channel inputs. The signal can then be processed here to adjust dynamics, EQ and insert or auxiliary effects. The main output of the front-of-house desk is fed to amplifiers and then to the speakers, usually via Speakon cables. The main outputs of the monitor desk (of which there are usually more) are fed to amps and then to monitor speakers (wedges). The result is a separately managed mix, which allows more focus to be applied to each aspect of the production.

 


Improvisation

Improvisation within music is a long-standing, inspired practice. Traditionally, improvisation allows a musician to be more free and expressive within their performance, allowing momentary emotions to be translated via an instrument and losing the judgement that tends to come with polished products. Within technology-based music, improvisation finds itself in an isolated, confusing world of its own. Music within technology generally involves programmed patterns and repetitive cycles, which makes the prospect of improvisation hard to grasp.

In essence, improvisation is reacting to spontaneous stimuli. In the more traditional concept of music, this stimulus comes in the form of animated change from fellow musicians or personal emotion. When working within a computer, this spontaneity is hard to achieve because the computer requires instruction to function. This can, however, be offset by certain workflow habits and pre-emptive actions. Within Ableton Live there are a few methods of introducing this. One may use programming instructions such as follow actions to achieve a randomly generative song sequencer; this does, however, require the pre-planned element of writing parts to place in clips, or the practiced ability to do this on the fly. This becomes more interesting when one considers the possibility of using Ableton as a clip looper: with or without random follow actions, Ableton can be used to loop large collections of sounds, either spontaneously created or pre-inserted, launching them spontaneously to create unique sounds and structures.
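The idea of a randomly generative song sequencer can be sketched outside of Ableton. This is not Ableton's API, just an illustration of the follow-action concept: each clip names the clips that may follow it, and playback walks that table at random (the clip names and transition table are invented for the example).

```python
import random

# Sketch of a randomly generative sequencer in the spirit of Ableton's
# follow actions; clip names and the transition table are illustrative.
follow_actions = {
    "intro":  ["verse", "verse", "break"],  # duplicates act as crude weighting
    "verse":  ["chorus", "break"],
    "chorus": ["verse", "outro"],
    "break":  ["verse"],
    "outro":  [],  # no follow action: playback stops
}

def generate_sequence(start, seed=0, max_clips=32):
    """Launch clips by repeatedly picking a random follow action."""
    rng = random.Random(seed)
    clip, sequence = start, [start]
    while follow_actions[clip] and len(sequence) < max_clips:
        clip = rng.choice(follow_actions[clip])
        sequence.append(clip)
    return sequence

print(generate_sequence("intro"))
```

The pre-planned element the text mentions is all in the table: the clips must be written in advance, but the order in which they are launched is decided in the moment.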

FKJ Improvisation Session

Methods like this are used by FKJ throughout his music. FKJ usually implements an AKAI APC40, a Maschine controller, a laptop and assorted instruments including Rhodes keys, guitar, bass and saxophone. FKJ uses this equipment to form a live looping session, in either an improvised manner or a more planned structure. A good example of an improvised session by this artist is his Eton Messy session at Red Bull Studios. The track begins with a simple guitar motif, and new instruments are introduced periodically and mixed live to create a unique, random performance. To improvise in this manner requires a lot of experience, and one must already have preconceived ideas of one's music to create such a cohesive, fluid piece; in this sense it can almost certainly be compared to a more traditional musician's ideology. One can, however, draw heavy comparisons between this and his other work, arguing that a lot of his playing throughout this track is not true improvisation, but instead contains many familiar motifs and musical ideas, caused by that same experience and muscle memory drawing fingers to familiar keys. So is this truly improvised?

Yes and no. It must be said that the definition of improvisation is definitely subjective. If your definition describes improvisation as the reaction of one's application of music to external stimulus, then perhaps not, because all of the components of this composition are self-created and understood. However, the unplanned nature of the song allows for personal interpretation. Personally, if a piece of music is unplanned or unformed as an idea before its performance, it is in fact improvised.

 

FKJ live improv session

Immediate actions

Self-driven development is a fundamental skill within the modern music industry. With such intense competition in all areas of the industry, to diversify is to survive. To ensure progress within this overwhelmingly broad task, immediate actions should be implemented. An immediate action describes an event to be undertaken with a specific outcome in mind.

To encourage progress within my Work Based Learning, the actions I must undertake in this oncoming week are:

  • Acquire stems from Daniel June
  • Create a Gantt chart outlining desired progress
  • Correspond with John from The Moonbase to confirm placement
  • Create a mockup website design

 

With these actions in place, I can ensure that development is efficient and regular. This is hopefully reinforced by the Gantt chart, allowing me to implement planned action on a regular basis and keep my progression organised and professional.

Communication within the industry

Success within the music industry relies upon the ability to network and communicate with peers, colleagues, collaborators and professionals. Whether it's sending a mixtape to an ideal endorser or approaching an industry professional about work, the foundational approach taken during this process could be the key to success or the doorway to failure.

When approaching people within the music industry as an entry-level member, it's more than likely the case that you, the approacher, stand to gain a lot more than them, the approached. For this reason, the way in which we approach is the most important aspect. One should be mindful of self-presentation whilst also taking into consideration the way in which the receiver would like to be approached. When attempting to find a way through this maze, if appropriate, one could be more personal with one's correspondents, investigating them via social media and understanding their personality before sending the message. Where possible, it may also be an idea to step outside of instant messaging and pick up the phone. Vocal tone and impromptu conversational improvisation can reap many benefits and help portray your message in a way that the written word just cannot (much more information is processed via spoken conversation).

The subject of a message should be clear, concise and well planned. It is likely that whoever is being contacted receives such correspondence very frequently and would simply ignore frivolous, wordy messages.

Multimedia: Cue Sheets

Within a soundtrack there are numerous sonic events, with the most emphasis being placed upon these five:

  • Dialogue
  • Music
  • SFX
  • Foley
  • Ambience

 

When planning a soundtrack for a multimedia product, a cue sheet is used to organise each of these sounds. Cue sheets are effectively a timeline of events – a script for the sound producer, if you will. Cue sheets are made via a method called spotting, which is effectively exploring the media and planning for specific events along a timescale. These are useful because the vast number of responsibilities placed upon a sound designer makes it easy to miss opportunities to have an effect; having a solid plan of action makes this less of a possibility.
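A cue sheet is, at heart, just structured data. The sketch below shows one possible shape for it (the cue entries are invented for illustration): each spotted event records a timecode, one of the five categories above and a description, and sorting by timecode turns the spotting notes into a working timeline.

```python
# A minimal cue sheet as a data structure: each cue marks a timecode,
# a category and a description, produced during spotting.
# The entries here are illustrative, not from a real production.
cues = [
    {"time": "00:01:12", "category": "SFX",      "description": "door slam, off-screen"},
    {"time": "00:00:05", "category": "Ambience", "description": "room tone, low hum"},
    {"time": "00:00:47", "category": "Dialogue", "description": "lead character enters"},
]

# Sorting by timecode turns the spotted events into a working timeline.
# (HH:MM:SS strings sort correctly as plain text.)
cue_sheet = sorted(cues, key=lambda cue: cue["time"])

for cue in cue_sheet:
    print(f'{cue["time"]}  {cue["category"]:<9} {cue["description"]}')
```

Spotting in any order and sorting afterwards means no event is lost just because it was noticed out of sequence.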

 
