Using SMPTE in Live
- Live Versions: All
- Operating System: All
What is SMPTE?
SMPTE timecode is a standard for labeling frames of video or film. The standard was developed and defined by the Society of Motion Picture and Television Engineers (hence the name SMPTE) and allows for accurate editing, synchronization, and identification of media.
What is SMPTE used for?
Even though most productions are digital now, SMPTE timecode is still very relevant for syncing video across platforms and programs. Frames are the backbone of a video's timing, so three different units of time are in play when scoring music to film: samples (audio), beat time (music) and frames/SMPTE (video).
How is SMPTE timecode derived?
SMPTE timecode appears as hour:minute:second:frame (for example, one hour would be written as 01:00:00:00). The frame rate is derived directly from the data of the recorded medium: in other words, the frame rate is inherent to the media, and can differ for film vs. digital, video vs. audio, and color vs. black and white.
SMPTE timecode is linear, which is why it's commonly called LTC (Linear Timecode). This means that it doesn't change speed: 00:45:00:01, for example, is always 45 minutes, 0 seconds, frame 1.
With audio, samples rather than frames are the smallest division, so audio time reads HH:MM:SS:Samples (hours, minutes, seconds, samples).
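As a sketch of the arithmetic above (assuming a fixed, integer frame rate; drop-frame formats such as 29.97 fps need special handling and are not covered here; the function names are illustrative, not part of any real API):

```python
# Convert an HH:MM:SS:FF SMPTE timecode string to an absolute frame
# count, assuming a fixed integer frame rate. Drop-frame rates such
# as 29.97 fps skip frame numbers and are not handled here.
def smpte_to_frames(timecode: str, fps: int = 25) -> int:
    hours, minutes, seconds, frames = (int(part) for part in timecode.split(":"))
    total_seconds = hours * 3600 + minutes * 60 + seconds
    return total_seconds * fps + frames

# The audio equivalent counts samples instead of frames.
def audio_time_to_samples(timecode: str, sample_rate: int = 48000) -> int:
    hours, minutes, seconds, samples = (int(part) for part in timecode.split(":"))
    total_seconds = hours * 3600 + minutes * 60 + seconds
    return total_seconds * sample_rate + samples
```

For example, one hour at 25 fps is 90,000 frames, while 00:45:00:01 at 25 fps is 67,501 frames.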
What is an example application of SMPTE?
The classic example is a physical medium with SMPTE timecode recorded as an audio signal on one of its audio tracks, for example a VHS tape of a movie with SMPTE timecode on one of the audio channels. The SMPTE signal is sent to a tape machine, playback system or DAW, which "chases" the signal to play back audio in time with the film. The movie starts at 00:00:00:00 SMPTE time and your music also starts at 00:00:00:00 in beat time.
How do you sync video to music if the start times are not the same?
It's more complicated when the start of the video and the start of the music are not the same, for example when you're given the whole movie but are only working on one scene.
Let's say the scene you're working on starts at 00:45:00:01 SMPTE time (45 minutes, 0 seconds, frame 1). In order for beat time 00:00:00:00 to land at 00:45:00:01 SMPTE time, you need to offset SMPTE time and beat time from each other to get correct playback. Programs that support SMPTE timing offer ways to adjust the relationship between beat time and SMPTE time so that this works.
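A minimal sketch of that offset calculation, assuming a fixed tempo and an integer frame rate (all names here are illustrative, not any DAW's API): convert the scene's SMPTE start time to seconds, then to beats at the session tempo, and use the result as the offset between beat time and SMPTE time.

```python
# Where does SMPTE time 00:45:00:01 fall in beat time? Assumes a
# fixed tempo and a fixed integer frame rate (24 fps here).
def smpte_to_seconds(timecode: str, fps: int = 24) -> float:
    hours, minutes, seconds, frames = (int(p) for p in timecode.split(":"))
    return hours * 3600 + minutes * 60 + seconds + frames / fps

def seconds_to_beats(seconds: float, bpm: float = 120.0) -> float:
    return seconds * bpm / 60.0

scene_start = smpte_to_seconds("00:45:00:01", fps=24)    # about 2700.04 s
offset_beats = seconds_to_beats(scene_start, bpm=120.0)  # about 5400.08 beats
# Shifting beat time by -offset_beats puts beat 00:00:00:00 at the scene start.
```

If the scene later moves (as in the re-edit below), recomputing with the new SMPTE start time gives the new offset.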
If the director edits another scene in the film, the internal timing of your scene is still the same, but now your scene starts at, say, 00:43:03:15 (43 minutes, 3 seconds, frame 15). So you need to be able to adjust the relationship between SMPTE time and beat time accordingly to keep everything in sync.
How can I use SMPTE in Live?
Live is unable to sync to incoming SMPTE, nor does it generate SMPTE natively. However, there are two workarounds for sending an SMPTE signal out of Live:
1. Use specially generated LTC SMPTE audio files
You can generate an LTC SMPTE audio file with different frame rates and with different start times here.
Then load the LTC SMPTE file into an audio track in the Arrangement View, making sure to turn off Warp in the clip. If the video is on another machine, route the LTC SMPTE audio from one machine into an input on the other (using a dedicated output on your audio interface).
If you are trying to sync Live to another application on the same computer, then you need to route audio between the two programs.
2. Use this Max for Live device to output an LTC SMPTE audio signal. (macOS only)
This free Max for Live device outputs an LTC SMPTE audio signal with the current playback position of the active clip, taking warp markers and tempo changes into account so that the sync always stays aligned with clip playback.
This option works on macOS only and requires Live Suite, or Max for Live as an add-on to Live Standard.
See this tutorial from Ableton Certified Trainer Will Doggett, which explains how to generate LTC files and use them in Live.