Although closed captions and subtitles look very similar, they are actually designed for two different audiences. If you don’t know why you need to subtitle or caption your videos, check out my other post first to understand the need.
Subtitles provide written text for the dialogue of a video or clip, whether that is spoken by characters on or off screen, narrators and so on. Subtitles are generally intended for audiences who can hear to some degree, but need the written word to help them follow the voices.
Captions not only do what subtitles do, but they also describe environmental sounds and other relevant sounds that the viewer may need. Captions are generally intended for deaf or hard of hearing audiences, who may miss aural cues that are important to the story being told. The captions may run across the bottom of the screen, or may be positioned near the source of the sound on screen (to show who is speaking, for example).
Important! There is a fine line between giving the viewer enough information, and giving them too much.
If a sound is not relevant to what is being said or to the story being told, it is safe to say it won't need captioning. The second example below shows where a sound should be captioned. Sounds that don't need captioning would be closing a car door or gravel underfoot; a gunshot or sirens, however, would need captioning.
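To make this concrete, here is a minimal subtitle file in the common SubRip (SRT) format showing how a relevant sound effect sits alongside dialogue. The timings, text and speaker name are invented for illustration:

```srt
1
00:00:01,000 --> 00:00:03,000
[SIRENS WAILING]

2
00:00:04,500 --> 00:00:07,000
SARAH: We need to get out of here. Now.
```

Each cue is a number, a start and end timestamp, and the text to display; sound descriptions are conventionally set in square brackets (and often capitals) to distinguish them from speech.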
Open or Closed?
You have two choices when you decide you want to caption/subtitle your clip. You either want them on screen all the time (burnt-in), or you want them so that the viewer can switch them on or off depending on their needs. If they are on all the time, they are called Open Captions. If you want them to be user-switchable, they are called Closed Captions. There are pros and cons to both:
Open Captions (AKA “open” subtitles)
- They are burnt in, so everyone sees them.
- No complex hardware or software needed to show captions.
- User just presses play.
- Can be “burnt in” for free (as long as you have the correctly formatted subtitle file).
- Not everyone wants to see them.
- If using a small screen, the writing will be tiny (think mobile devices).
- The text may not be clear if the video file is re-encoded.
- It is not possible to archive or index the captions separately as a transcript.
- It is not possible to choose a language; only one can be shown without taking up more space on the screen.
Closed Captions (AKA “closed” subtitles)
- They can be turned on or off, depending on the need.
- The user can customise the size, opacity and font of the captions shown.
- The captions will still be readable on a smaller screen.
- The captions can be “embedded” as a player option for free (as long as you have the correctly formatted caption file).
- It is possible to archive / index the spoken word separately as a transcript.
- You can create captioned files in various languages that the end user can then choose which one to use.
- The video will require a caption compatible file player.
- Not all devices and apps will show captions (e.g. Facebook on an older phone).
- The captions are not automatic; the viewer needs to play the video and then choose the caption option.
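As a sketch of how the two approaches differ in practice, here is what each might look like with ffmpeg, assuming it is installed with subtitle support; the filenames are hypothetical stand-ins for your own files, and the commands are held in variables and printed so you can inspect them before running anything:

```shell
#!/bin/sh
# A sketch, not a definitive workflow. Assumes ffmpeg is available;
# input.mp4 and captions.srt are hypothetical filenames.

# Open captions: render ("burn") the text onto the frames. This
# re-encodes the video, and the captions can never be switched off.
burn_in='ffmpeg -i input.mp4 -vf subtitles=captions.srt open_captions.mp4'

# Closed captions: copy the audio/video streams untouched and mux the
# subtitle file in as a selectable track (mov_text is the MP4 subtitle codec).
embed='ffmpeg -i input.mp4 -i captions.srt -c copy -c:s mov_text closed_captions.mp4'

printf '%s\n%s\n' "$burn_in" "$embed"
```

Note the trade-off the lists above describe: the open-caption route re-encodes the whole video, while the closed-caption route is a fast stream copy but relies on the player offering a subtitle menu.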
In the above example, a deaf or hard of hearing viewer would not know that there is a faint buzzing sound to indicate an electrical current, nor the sound of the electrocution/explosion.
In the example above, by including the [BIRDS CHIRPING] in the captions, it lets the viewer know that whilst there is no speech in this part of the show, there is still sound that they need to be aware of to help them understand what they are seeing. Without this description, they wouldn’t know that it was the only sound and that he was isolated.
I came across an excellent styling guide by the Canadian Broadcasting Corporation, published in 2003, which is still well worth reading despite its age. Download it direct from here.