How Do You Collaborate Part 2: An In-Depth Look At Collaborative Workflows

Bob Brown returns with part 2 of his series on creative collaboration. 

In my first post about music collaboration, I looked at the challenges of building DAWs based on technology that has evolved over the past 100 years. I talked about how musicians use both serial and parallel workflows when recording music. Finally, I looked at how time and space affect musicians' workflows. In this post, I will look at three workflows most musicians use as part of their recording process.

Non-Real Time (File Sharing) Workflows

Today, most music production happens in phases or stages: tracking, editing, mixing, and mastering. The first three phases may be repeated several times throughout the creative process as the song progresses and new ideas are brought to the project.

The core idea of the non-real time production workflow is that each phase occurs separately, and in order. Realistically, however, these phases do not follow a strict sequence and are often done in parallel. With each additional recording session, many takes are laid down, and the engineer must select the best take or comp pieces of several takes together to create the final take.
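
To make the idea of comping concrete, here is a toy sketch in Python. The data model is purely hypothetical, not any DAW's actual project format: each take gets a quality score per song section, and the comp picks the best take for each section.

```python
# Toy sketch of comping: choose the best take for each section of a song.
# The scores are hypothetical numbers an engineer might assign while auditioning.
takes = {
    "take1": {"verse": 7, "chorus": 9},
    "take2": {"verse": 9, "chorus": 6},
    "take3": {"verse": 8, "chorus": 8},
}

sections = ["verse", "chorus"]

# For each section, pick the take with the highest score.
comp = {s: max(takes, key=lambda t: takes[t][s]) for s in sections}
print(comp)  # {'verse': 'take2', 'chorus': 'take1'}
```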

These workflows all depend on one or more people completing a piece of work on the project and handing it off to another person for additional work. Ideally, everyone contributes quickly and the production moves along without delays. Ultimately, each piece of work happens at each person's own pace, in non-real time.

An example of this type of workflow is a band working on a song together, where the project may be shared with another musician or studio using Dropbox or some other file sharing system. Once shared, the other studio can add additional parts such as a bass track. When the additional tracks are added or edited, they are sent back to the main studio to be incorporated into the master project.
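
As a rough illustration of this kind of hand-off, the sketch below copies freshly bounced stems into a timestamped folder inside a shared Dropbox directory. The folder names and layout are my own assumptions, not any studio's actual setup.

```python
# Minimal sketch of a non-real time hand-off, assuming stems have already
# been bounced to WAV files in a local "bounces" folder and that
# ~/Dropbox/song-project is the folder synced between studios.
import shutil
import time
from pathlib import Path

LOCAL_BOUNCES = Path("bounces")                    # freshly exported stems
SHARED = Path.home() / "Dropbox" / "song-project"  # folder synced between studios

def hand_off_stems() -> Path:
    """Copy the current stems into a timestamped folder in the shared drive."""
    version = time.strftime("%Y%m%d-%H%M%S")
    dest = SHARED / f"handoff-{version}"
    dest.mkdir(parents=True, exist_ok=True)
    for stem in LOCAL_BOUNCES.glob("*.wav"):
        shutil.copy2(stem, dest / stem.name)       # copy2 preserves timestamps
    return dest

if __name__ == "__main__":
    print(f"Stems handed off to {hand_off_stems()}")
```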

Near-Real Time (Offline) Workflows

In some workflows things move very quickly. Specialized tools that help automate, manage, and move projects or pieces of projects can make the workflows feel “near-real time”. The core idea of near-real time is that changes to projects or pieces of projects are moved between different DAWs quickly and automatically with little interaction from the user.
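
The sketch below illustrates the "notify on change" idea using Python's third-party watchdog library to watch a shared project folder. The notify_peers() function is a placeholder assumption; products like Ohm Studio use their own proprietary sync protocols.

```python
# Minimal sketch of near-real time change notification.
# Requires: pip install watchdog
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

WATCH_PATH = "shared-project"  # hypothetical synced project folder

def notify_peers(path: str) -> None:
    # Placeholder: a real system would push the change to the other machines.
    print(f"Changed: {path} -> notifying collaborators")

class ProjectChangeHandler(FileSystemEventHandler):
    def on_modified(self, event):
        if not event.is_directory:
            notify_peers(event.src_path)

if __name__ == "__main__":
    observer = Observer()
    observer.schedule(ProjectChangeHandler(), WATCH_PATH, recursive=True)
    observer.start()            # watches in a background thread
    try:
        while True:
            time.sleep(1)       # keep the main thread alive
    except KeyboardInterrupt:
        observer.stop()
    observer.join()
```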

Some DAWs, such as Steinberg Nuendo and Ohm Force's Ohm Studio, have tightly integrated features for sharing tracks and clips between multiple computers in "near-real time". These programs feature specialized systems for notifying different machines when changes are made and moving the changes between computers quickly.

In other cases, very clever studio engineers have learned how to work with their tools to speed up the process when their DAW does not have built-in functionality. One of these tricks is to copy over the raw audio file that the project file references. This "tricks" the DAW into reading the newly changed audio even though the DAW did not make the change.
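
Here is a minimal sketch of that trick, with hypothetical file paths: back up the file the project references, then copy the newly edited audio over it, keeping the same path and name so the project picks it up.

```python
# Minimal sketch of the "swap the referenced audio" trick described above.
# Do this only while the session is closed or the file is not actively
# streaming; paths are hypothetical.
import shutil
from pathlib import Path

referenced = Path("Project/Audio Files/dialog_take3.wav")  # file the project references
edited = Path("incoming/dialog_take3_edited.wav")          # file edited on another system

backup = referenced.with_name(referenced.name + ".bak")
shutil.copy2(referenced, backup)   # keep the original in case of problems
shutil.copy2(edited, referenced)   # same path + name, so the DAW is "tricked"
print(f"Replaced {referenced.name}; original saved as {backup.name}")
```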

An example of this type of workflow is sound-for-picture post production (mixing a movie). Since a movie has many tracks, sometimes numbering in the hundreds, each contributing music, dialog, and sound effects, the work of creating, editing, and mixing all of these sounds is typically spread across several sound engineers. While the mixing engineer is working on getting all of the track levels correct, the movie director may want a small change made. This change is typically done by someone other than the mixing engineer. Instead of pausing the entire mixing process, the sound editor will make the change on a separate system and then share the files back to the main mixing system.

Real Time Workflows

Certain circumstances demand an experience that is, as much as possible, similar to being in the same studio at the same time. Tracking vocals or an instrument, working on a mix, or final review and approval of the finished song are all situations where it can be critical to perform a task and have immediate feedback. The core idea behind real time production is that the performance or media is streamed between locations with the absolute minimum latency, or with sophisticated latency compensation. Having communication (audio and/or video) in both directions between the locations is also key to real time workflows.

When most musicians think about real time workflows, they immediately ask if they can jam together with other musicians in different locations. Unfortunately, this type of workflow does not work as well as desired due to a combination of high latency and poor audio quality from compression or glitches in the audio stream.
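
A rough back-of-envelope calculation shows why. Musicians are commonly said to struggle to stay in time once the delay between them exceeds roughly 25 to 30 milliseconds, about the acoustic delay of standing ten metres apart. The numbers in the sketch below are illustrative assumptions, not measurements from any real link.

```python
# Back-of-envelope latency budget for cross-country jamming.
# All figures below are illustrative assumptions, not measurements.
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at ~20 degrees C

def acoustic_delay_ms(meters: float) -> float:
    """Delay heard when standing this far from another player in a room."""
    return meters / SPEED_OF_SOUND_M_S * 1000.0

# Hypothetical one-way budget for an LA <-> NY audio stream:
buffering_ms = 5.0 + 5.0  # audio interface capture + playback buffers
network_ms = 35.0         # assumed coast-to-coast one-way network latency
codec_ms = 5.0            # low-latency encode/decode

total_ms = buffering_ms + network_ms + codec_ms
print(f"Two players 10 m apart: {acoustic_delay_ms(10):.1f} ms")  # ~29 ms
print(f"One-way stream estimate: {total_ms:.1f} ms")              # ~50 ms
# The stream alone already exceeds the ~25-30 ms "same room" comfort zone.
```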

Two key examples of workflows involving real time media streaming are "talent capture" and "review and approval". With talent capture, a studio in Los Angeles might be recording an artist in New York. The studios would transfer projects and files between each other and stream the performance between the studios in real time. This allows a producer in one studio to be part of recording an artist in another studio while providing feedback. With review and approval, a mix studio would stream the current mix of a song to another studio where a producer or artist could listen to the mix and provide feedback. The producer and mix engineer would be able to communicate and make changes in real time without the need to create bounces and move files around.

Many studios use specific tools that perform well in real time. Software such as Skype, Apple GarageBand with iChat integration, Source Elements Source Connect, and Steinberg VST Connect Pro enables studios to work with each other in a more seamless fashion.

When working with Source Connect, the main studio transfers a project file or audio/video reference material in non-real time to the secondary studio. Once the secondary studio has received the project file and reference material, the two studios connect a live audio stream. With an integrated transport control, the engineers start both systems playing together. In the secondary studio, the artist performs to the local reference material while their performance is streamed to the main studio. The performance is recorded in both locations, and Source Connect keeps the recordings in sync in both the main and secondary studios. The trick to all of this is how Source Connect properly synchronizes the audio/video playing in both studios with the performance of the artist. Even though there is latency between the studios, everyone hears the performance in sync, as they would expect.
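
The sketch below illustrates only the general latency-compensation idea; it is not Source Connect's actual implementation. If the main studio knows the measured one-way latency of the incoming stream, it can shift the recorded performance back by that many samples so it lines up with the local reference.

```python
# Minimal sketch of latency compensation: shift a recorded stream earlier
# by the measured stream delay so it aligns with the local reference.
# This is an illustration, not Source Connect's actual method.
import numpy as np

SAMPLE_RATE = 48_000  # samples per second

def compensate(recorded: np.ndarray, latency_ms: float) -> np.ndarray:
    """Advance the recorded stream by the measured latency."""
    offset = int(round(latency_ms / 1000.0 * SAMPLE_RATE))
    aligned = recorded[offset:]          # drop the late start
    return np.pad(aligned, (0, offset))  # zero-pad so the length stays equal

if __name__ == "__main__":
    # Fake "performance": a single click arriving 100 ms late
    # because of an assumed 100 ms stream delay.
    stream = np.zeros(SAMPLE_RATE)
    stream[int(0.1 * SAMPLE_RATE)] = 1.0
    aligned = compensate(stream, latency_ms=100.0)
    print("Click now at sample:", int(np.argmax(aligned)))  # -> 0
```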

While Source Connect is not designed for “real time jamming” where artists at different locations all perform together while listening to each other, there are other pieces of software that try to solve this problem, such as eJamming, Cockos Ninjam, and JamLink.

About: Bob Brown has worked with many top tech firms and was part of the team working on collaboration technologies at Avid and, more recently, Gobbler. In the first of his three articles, Bob talked about the concept of creative collaboration, the opportunities, and the challenges.