Produce film music on a laptop? Music that sounds as if it were recorded with a real orchestra and classical instruments? Is that possible? Yes, of course! Like every other auditory work step in the film business, it is possible on a laptop. Here you can find my instructions.
I love laptop computers: these machines are still the most powerful devices we can carry around with us to do really creative and technical work. In this article, written exclusively for Filmpuls, you will find my instructions on how to compose a Hollywood-ready soundtrack on your laptop. Including, of course, the acoustic result.
To make you curious, you will find the soundtrack I composed on my laptop as the introduction to my article (the film does not exist yet; producers are welcome to contact me!).
In the next chapters I will show you, step by step, how you too can achieve such a result with your laptop:
A. How to produce orchestral film music on a laptop
Nothing against film music on a smartphone or tablet: that is possible too, but only to a limited extent. On the other hand, too many people still believe that the digital production of a rich soundtrack requires equipment that competes with NASA's. Or at least with Hans Zimmer and his men and women. If, like Zimmer, I received a million per commission, I wouldn't stop at buying the world's best hardware either.
Because that is where the big difference lies: in the hardware.
The software is more or less the same everywhere, unless you are talking about programs that let pieces of hardware communicate with each other. But the actual software for music production, the DAW (Digital Audio Workstation) as well as the libraries and plug-ins, is the same for all professional film composers.
When it comes to music, sound design and sound mixing, the most common audio programs are all readily available applications.
These programs can be enhanced by purchasing specific software extensions called plug-ins or libraries (just as After Effects can be enhanced with third-party particle and smoke plug-ins).
This article does not cover the requirements you need to meet in order to produce film music. It is about trial and error. The software I used for the composition presented at the beginning is:
- Logic Pro X and
- the Hollywood Strings/Brass/Woodwinds/Choir libraries by the company EastWest.
Everyone is allowed to try this! Fire up GarageBand and off you go! But I think that is exactly where most people fail: simply doing it, having the courage, the enthusiasm and the persistence to try again and again.
The piece discussed in this article reflects my current state of work on it. I have every intention of continuing to work on it. It is quite possible that in a future article I will therefore express quite different views, based on new experience. But back to the here and now:
C. The result before your eyes, right from the start
My composition for this tutorial should sound like a classic movie soundtrack: orchestral instruments and nothing else.
This idea helps me, right at the start, to stay in the right groove and set the right course. Older soundtracks (this is my view) have more melody. They are more emotional, self-contained pieces of music that really carried the audience.
Unfortunately, this is a dying breed of film music, because the big studios don't want to take any risks. A really strong soundtrack is recognizable as such, and that entails economic risk: a composition with character, and therefore high recognition value, either pleases the listener or it doesn't.
I work digitally, not with pencil and manuscript paper. Since it is easy to get lost in the technology a laptop brings with it, we need to know what the goal is. Otherwise the work becomes a literally endless story with many dead ends.
By the way: does any of you remember the soundtracks of the Avengers movies? Me neither. So my soundtrack is modeled on works from Star Wars, Jurassic Park or James Bond: music with dynamics, melody and a strong emotional impact. In my piece, that impact should be threatening and nerve-racking.
When it comes to orchestral music for film, I always have composers like Richard Strauss (Also sprach Zarathustra. Listen to it!!! I'm sure you know it, and you'll thank me for finally bringing you to this monument of a piece of music), Igor Stravinsky (Le sacre du printemps) or Maurice Ravel (Daphnis et Chloé) in the back of my mind.
Because believe me, people: if you are really, really receptive to acoustically created emotions, this will enrich your existence!
So, before I have recorded just one note, I know where the journey should go. Will I really end up there…? That is the exciting thing. Because I don’t know.
D. The laptop. The software. The setup.
Logic Pro is my software of choice, and here it serves one purpose: producing symphonic film music. Logic Pro is just as well suited to producing pumping house tracks or creating sophisticated sound design, or to serving as a recording station and host for apps like Skype or Zoom. Logic Pro (like every other DAW) is so powerful that you have to take care not to lose focus. I also work with a third-party software orchestra from EastWest. The men and women at EastWest have recorded real orchestral musicians and their instruments in every conceivable variation. I then play these so-called samples myself.
Writing film music on a laptop brings with it one very specific challenge: lack of screen space! I strongly recommend working with external monitors; it keeps you from making careless mistakes.
The stereo monitors (by which I mean loudspeakers, not screens) are connected to my laptop via a USB interface. They sit at two corners of an isosceles triangle, and I sit at the third corner, with the apex pointing toward me, of course. This ensures that both speakers are the same distance from my ears. I also often work with headphones. I hate those things you have to stick in your ears; I only use semi-open, ultra-comfortable, high-quality headphones. In this category, too, there are models for 100-150 euros/francs that are really excellent.
I play my music via a MIDI keyboard, connected to the laptop via USB. Or, and in this case it is really worth mentioning because it is very unusual, I connect my guitar to software that translates its frequencies into MIDI and communicates with Logic Pro.
The central technology used in film music is called Musical Instrument Digital Interface (MIDI). Whether Mr. Zimmer or you: you need MIDI, one of the greatest technical achievements in music and sound design production ever.
MIDI is a communication protocol that sends information about musical events from a controller to a host (MIDI-compatible audio software). The controller can be, for example, a MIDI keyboard (which makes absolutely no sound unless it is connected to a MIDI-compatible host).
If I now connect the controller to the computer, and thus to my DAW, via USB or MIDI cable, the magic begins. Because now my keyboard, my guitar or my contrabass clarinet can be anything I want it to be: a drum set, a violin, a church organ or even a piano!
Since almost all DAWs understand the MIDI language, the DAW reads the MIDI parameters that are generated when MIDI messages are transmitted (for example, when a key is struck on the MIDI keyboard).
Typical MIDI parameters are velocity, panning (left/right), aftertouch (what happens after the note is struck: do I release immediately or keep holding?) and pitch. The lowest MIDI parameter value is 0, the highest 127 (128 states in total).
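For the technically curious, here is a tiny Python sketch (my own illustration, not part of any DAW) of what such a MIDI message looks like on the wire: a note-on consists of one status byte and two data bytes, and each data byte is limited to exactly the 0..127 range mentioned above.

```python
# A minimal sketch of a MIDI "note on" message, using only the standard
# library. Byte layout follows the MIDI 1.0 specification: one status byte
# (0x90 = note on, low nibble = channel) followed by two data bytes
# (note number and velocity), each limited to 0..127.

def note_on(note: int, velocity: int, channel: int = 0) -> bytes:
    """Encode a MIDI note-on message as three raw bytes."""
    for value in (note, velocity):
        if not 0 <= value <= 127:          # data bytes have only 7 bits
            raise ValueError("MIDI data bytes must be in 0..127")
    if not 0 <= channel <= 15:
        raise ValueError("MIDI has 16 channels, numbered 0..15")
    status = 0x90 | channel                # 0x90..0x9F = note on
    return bytes([status, note, velocity])

# Middle C (note 60) at medium velocity on the first channel:
msg = note_on(60, 64)
print(msg.hex())  # -> "903c40"
```

The hard 0..127 limit in the code is precisely why MIDI has 128 states per parameter: each data byte may only use 7 of its 8 bits.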
MIDI has the big advantage of offering a wealth of editing possibilities. For example, I can correct the length of a note at any time (see the instructions in the video "Changing note length" below) or adjust the pitch (-> video "Changing pitch").
In addition, MIDI information can very easily be displayed as musical notation. So if I ever get the opportunity to have my film music recorded by a real symphony orchestra, it is relatively easy to get to a score.
Any software instrument on my laptop can play these MIDI notes. So if I want to transfer my double bass line to the violins, I drag the MIDI region from the double bass track to the violin track and move its entire content up a few octaves.
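The octave trick above can be sketched in a few lines of Python (an illustration of the underlying MIDI arithmetic, not an actual Logic Pro feature): one octave is exactly 12 semitones, so moving a line up two octaves simply adds 24 to every note number.

```python
# A sketch of the "drag to another track and move up a few octaves" step:
# in MIDI, transposing by one octave adds 12 to every note number.
# Notes are represented here as (note, velocity) pairs.

def transpose(notes, octaves):
    """Shift every MIDI note number by whole octaves (12 semitones each)."""
    shifted = [(note + 12 * octaves, vel) for note, vel in notes]
    if any(not 0 <= n <= 127 for n, _ in shifted):
        raise ValueError("transposition leaves the MIDI note range 0..127")
    return shifted

# A double bass line (low E1 = MIDI 28, A1 = 33) moved up two octaves
# so the violins can play it:
bass_line = [(28, 80), (33, 90)]
print(transpose(bass_line, 2))  # -> [(52, 80), (57, 90)]
```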
On the whole, this means the rough framework of the composition emerges from three recurring steps: (1) record, (2) adapt, (3) repeat.
Simply recording MIDI and leaving it at that is, unfortunately, not enough. There is a bit more to it: we have to bring humanity into the whole thing.
Electronic music (EDM, house, etc.), for example, is expected to be generated on a computer and is allowed to sound like it. So I can place, tune and interlock everything perfectly, accurate to the sample.
Orchestral music, however, is deeply human: sometimes loud, sometimes quiet, sometimes a bit imperfect, just as we humans are. You have to feel the breath of the musicians and the room they are playing in.
This happens through four central interventions:
1. Give each instrument an EQ

For humanization, each instrument is first given an EQ (equalizer), which allows me to emphasize or attenuate specific frequencies. The following video shows the EQ settings for a double bass and a violin. This way I make sure that overlapping frequencies of the two instruments do not get in each other's way too much and sum up, and that the frequencies in which the bass or the violin presents itself particularly well are emphasized.
So I try to give each instrument its individual place in the frequency spectrum.
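If you want to peek under the hood of such an EQ band, here is a short Python sketch based on the widely used RBJ Audio EQ Cookbook formulas for a peaking filter. The frequencies and gain values are invented for illustration; they are not my actual settings, and this is not Logic Pro's internal code.

```python
import math

def peaking_eq(fs, f0, gain_db, q=1.0):
    """Biquad coefficients for one peaking-EQ band (RBJ cookbook formulas)."""
    a = 10 ** (gain_db / 40)               # amplitude from dB gain
    w0 = 2 * math.pi * f0 / fs             # center frequency in radians
    alpha = math.sin(w0) / (2 * q)
    b = [1 + alpha * a, -2 * math.cos(w0), 1 - alpha * a]
    a_coeffs = [1 + alpha / a, -2 * math.cos(w0), 1 - alpha / a]
    # normalize so the first feedback coefficient is 1
    return ([x / a_coeffs[0] for x in b],
            [x / a_coeffs[0] for x in a_coeffs])

# Example: lift the double bass by +4 dB around 80 Hz at a 44.1 kHz sample
# rate, and pull the violin back -3 dB in the same band to keep them apart:
print(peaking_eq(44100, 80, +4.0))
print(peaking_eq(44100, 80, -3.0))
```

A useful sanity check on these formulas: at 0 dB gain the numerator and denominator coefficients coincide, so the band does nothing at all, which is exactly how a neutral EQ band should behave.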
2. Create and control room reverb
To get closer to classical instruments played by humans, I create a track on which I load a so-called reverb plug-in. In a second step, this kind of effect lets us simulate a room and its sound.
A good reverb plug-in lets me choose whether I want the spatial sound of a broom closet or the enveloping sound of a cathedral. For the film music example in this article I chose a large hall. It should not sound like a cathedral, and certainly not like a broom closet.
Now each instrument needs to be sent to that reverb effect more or less strongly, as required.
I have found that it works best to automate the reverb send of each instrument over time (automating means defining specific values along the timeline; in Adobe Premiere Pro I do this with keyframes). This means I don't just send everything to the reverb at full blast: I choose when and how much. At full level it simply sounds too washy (like a room with bad acoustics). Without any reverb at all, on the other hand, the sound comes across as too dry.
The screen recording in the video below shows, with violet lines, how much signal each instrument feeds into the reverb effect along the time axis. It looks just as labor-intensive as it is!
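What those violet automation lines do can be sketched in a few lines of Python: keyframes are (time, value) pairs, and the curve between them is a simple linear interpolation, exactly like keyframes in a video editor. The times and send levels below are made up for illustration.

```python
# A minimal sketch of "automating a value over time": the automation curve
# linearly interpolates between keyframes, clamping to the first/last value
# outside the keyframed range.

def automation_value(keyframes, t):
    """Linearly interpolate a parameter value at time t from keyframes."""
    keyframes = sorted(keyframes)
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

# Reverb send for the violins: dry at the bar start, swelling to 0.8 by
# second 4, then easing back to 0.3:
violin_send = [(0.0, 0.0), (4.0, 0.8), (8.0, 0.3)]
print(automation_value(violin_send, 2.0))  # halfway into the swell
```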
3. Expression: editing MIDI parameters
In the third step I want to create more dynamics, the way a real classical orchestra does. I want to be able to distinguish between loud and soft. Where necessary, I want to hear the proverbial pin drop. And where necessary, I want the force of my music to be able to drown out shattering glass. That is exactly what parameters such as expression (intensity) are for.
Expression describes the intensity with which an instrument is played. As mentioned before, a MIDI parameter ranges from 0, completely inactive, up to 127, maximally active.
At expression level 0, the virtual violin bow strokes the strings of the instrument completely inaudibly. At level 127 it does so as intensely as possible. Each instrument track is again automated individually. In the screen recording, this step too looks as labor-intensive as it is. The blue lines in the video represent exactly this parameter per instrument, from violin to trombone to French horn.
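Technically, expression travels as MIDI controller change number 11. Here is a small Python sketch (my own illustration, with invented values) of how an automation curve like the blue lines becomes a stream of such messages:

```python
# Expression is MIDI control change #11. Each message is three bytes:
# status 0xB0 | channel (control change), controller number 11, and the
# value, clamped to the 7-bit range 0..127.

def expression_cc(value: int, channel: int = 0) -> bytes:
    """Encode one expression (CC 11) message as raw MIDI bytes."""
    value = max(0, min(127, value))        # MIDI data bytes are 7-bit
    return bytes([0xB0 | channel, 11, value])

# A swell from silent to full intensity in four steps:
for v in (0, 42, 84, 127):
    print(expression_cc(v).hex())
```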
Through this intervention I obtain an effect that can best be compared to the breath of a living being. Breathing represents life itself. The small and large dynamic nuances that result lend my digital film composition on the laptop a further degree of humanity.
In the video you can clearly see how the screen is divided horizontally: the upper half shows all the MIDI notes of the entire music production, while the lower half is reserved for the visualization of the MIDI automation.
4. Pan the instruments in the stereo image

In the fourth step, the stereo image takes centre stage. The final product is a stereo file consisting of two channels, a left one and a right one. Content can be placed anywhere in the stereo image, from the far left to the far right.
An orchestra, for example, has a historically grown arrangement of its instruments on stage: an early form of sound design, if you will.
So I set about distributing all the instruments, group by group, across the stereo image. I deliberately placed low elements like the double basses in the middle of the stereo image, even though this is not their usual position in a classical orchestra, simply because this way the basses remain clearly audible regardless of the playback device (smartphone, tablet, soundbar, etc.).
I did this panning in the plug-in itself, which is why the pan knob of the actual track still sits in the middle afterwards.
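For the curious, the math behind a pan knob can be sketched like this. The snippet uses a common equal-power pan law, not necessarily the exact law Logic Pro applies: left and right gains follow a quarter circle, so perceived loudness stays constant as a source moves across the stereo image.

```python
import math

# Equal-power panning: pan = -1.0 (hard left) to +1.0 (hard right).
# Mapping the pan position onto [0, pi/2] and taking cos/sin keeps
# left_gain^2 + right_gain^2 == 1 everywhere, i.e. constant power.

def equal_power_pan(pan: float):
    """Return (left_gain, right_gain) for pan in [-1.0, 1.0]."""
    angle = (pan + 1) * math.pi / 4        # map [-1, 1] to [0, pi/2]
    return math.cos(angle), math.sin(angle)

left, right = equal_power_pan(0.0)         # double basses in the middle
print(round(left, 3), round(right, 3))     # -> 0.707 0.707
```

The center position yields about 0.707 on each side rather than 1.0; that -3 dB dip is exactly what keeps a source from getting louder as it passes through the middle.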
G. Orchestral film music produced on the laptop: the result
Now I wish you a lot of fun once more with all this background knowledge while listening, for a second time, to my not-quite-finished composition as an example of film music produced on a laptop. This is a composition born purely of my own initiative; there is no release date and no corresponding film (yet?). Besides, the ending does not yet match my ideas: somehow it doesn't work, and due to time constraints it is also not yet "humanized". So as you read this tutorial, I myself don't know how I will finally leave my piece.
H. Music-theoretical properties of my composition
If you are interested in the harmonic, music-theoretical features of my example piece, an orchestral composition created entirely on a laptop, you will find more background information here:
The soundtrack presented and discussed in this article is based on the so-called half-tone/whole-tone scale: the octave is divided into alternating semitone and whole-tone steps. With it, we leave the comfortable, easily accessible tonal material behind, because this scale offers enormously dark, nerve-racking sounds that cannot be achieved with any of the seven modes of the Ionian (major) scale.
Due to its symmetrical structure of semitone and whole-tone steps, the scale has no real point of resolution, which creates great tension. It is also enormously rich in chords.
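That symmetry is easy to verify in a few lines of Python: four pairs of semitone plus whole-tone steps (1 + 2 = 3 semitones each) close the 12-semitone octave exactly.

```python
# The half-tone/whole-tone scale described above: starting from any MIDI
# note, alternate steps of 1 and 2 semitones. Eight scale notes later we
# land precisely on the octave (root + 12).

def half_whole_scale(root: int):
    """Return the 8 scale notes plus the octave, as MIDI note numbers."""
    notes = [root]
    for step in [1, 2, 1, 2, 1, 2, 1, 2]:  # alternating half/whole steps
        notes.append(notes[-1] + step)
    return notes

# Starting on middle C (MIDI note 60):
print(half_whole_scale(60))  # -> [60, 61, 63, 64, 66, 67, 69, 70, 72]
```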
Such tonal material was not really part of the works of Bach, Beethoven and Mozart. Only with composers like Richard Wagner in the Romantic period, and definitely with the impressionists Claude Debussy and Maurice Ravel, did it find its way into our collective sound memory. Composers of so-called New Music such as Igor Stravinsky pushed these and other unusual timbres (the whole-tone scale, etc.) into ever new realms and created some incredibly impressive music (Stravinsky's Le sacre du printemps).
This music production was created entirely on my MacBook Pro, coupled with a MIDI-capable electric guitar.
After that I did a lot of MIDI editing: moving notes around, shortening, raising, lowering and so on. All with tools we all have at our disposal: a commercially available computer and software. A very unusual approach when you think of orchestral music, but times have changed.
Just imagine: artists like Bach, whose only way to preserve music was notation! Ink, quill and paper (and even those were not available in an online shop back then). Unbelievable! In the cold of Leipzig's St. Thomas Church, by candlelight and the scent of incense, Bach nevertheless achieved some of the greatest artistic feats of mankind. Madness!
Famous last words
I listen to a lot of symphonic music. It might be Beethoven's 9th Symphony, a fusion of progressive metal with classical instruments (Dream Theater, Six Degrees of Inner Turbulence), a jazz soloist accompanied by a section of strings and woodwinds (Charlie Parker with Strings) or funky disco sound à la Chic's Le Freak or Good Times. The range of applications for these instruments is huge.
The effect these symphonic instruments have is strong and should not be underestimated. For better or worse.
However, I have listened so much over the years that I have a clear idea of where I want to go in terms of sound. Since, like most of you, I have no real instrumentalists at my disposal, I have produced countless compositions in various genres over the years, and each time I have broken new ground. Only rarely was any of it commissioned work. Rather, it was curiosity, coupled with the joy and the will to experiment and optimize.