Why captions?
Whenever I mention at conferences or meetups that accessible video for blind watchers should have audio descriptions, I am asked a lot of questions: How precise should the audio descriptions be? What should be omitted? Should colors be mentioned (yes, they should)? What video players play audio descriptions? Et cetera. In other words, people are curious about the details and best practices.
Whenever I say that accessible video for blind watchers should have accessible captions, there is usually just one question: “Why?”
The answer is simple – as far as captions are concerned, blind watchers have exactly the same needs as sighted watchers.
Why people use video captions
Seeing what cannot be heard
Captions primarily assist the deaf and hard of hearing community. For people who cannot hear, a video without captions is completely inaccessible. Even for a video with no soundtrack at all, the absence of sound should be noted in the video title, the video description and/or a short caption at the start of the video. Otherwise, a deaf person would not know whether they are missing anything.
Another large group of people who use captions are those who do not want to hear the soundtrack, though their hearing is not impaired. More than 80 percent of Facebook users on mobile devices watch muted videos.
Are captions, meant as an alternative to sound, important for blind people? Absolutely. There are deaf and hard of hearing people within the blind community. For watchers with combined vision and hearing disabilities, good transcripts are the most useful, but captions followed on a braille display are also helpful to some degree.
Following spoken text
Seeing an on-screen text version of what is being spoken in the video helps many people. When the speech in the video is not clear, captions help to understand what has been said. But even when the audio is clear, captions are still important for non-native speakers of the language. They help the watcher to better understand and follow conversations and events.
Blind watchers have at least the same difficulty following speech in a video as sighted watchers. In fact, sighted watchers often have less difficulty, as they can see contextual cues that help them understand what is being said.
Listening to the video’s audio and to captions read aloud at the same time makes watching harder, but the advantages are worth it, especially when the captions can be turned on and off quickly and easily. One can compare this experience to watching movies where the translation is provided by a human narrator: there, the narrator’s recorded voice also overlaps the original sounds of the movie, including the dialogue.
Understanding content in a foreign language
When you do not know the language of the video, captions are the only tool to understand it.
Is this important for blind watchers? Of course. I do not know a single blind person who speaks all the world’s languages and could watch and understand videos in all of them. In reality, access to foreign-language video media is a huge issue for people with disabilities.
Captions in cinemas are almost never accessible to blind watchers. Captions on streaming services are often not accessible either, or their accessibility depends on the platform you’re using (Windows, iOS or Android). Most of the time, as a blind person, you’re simply unable to watch French, German or Spanish films unless you know the language or have someone read the captions to you.
Jumping backward and forward
Sighted people use their sight to move through a video: you recognize the scene you want to reach and stop rewinding. Blind watchers use hearing to recognize the scene, but sometimes captions work better than audio. A good example is a video in a foreign language, such as a conference lecture: only with captions can you find the moment you are looking for, unless you are able to see the slides presented by the speaker.
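To make this concrete: caption cues on the web carry timestamps, so they can double as navigation targets. Below is a minimal TypeScript sketch (an illustration only, assuming an HTML5 video whose caption track is already loaded) that searches the cue text for a phrase and seeks the video to the matching cue.

```typescript
// Hypothetical helper: jump to the first caption cue containing a phrase.
// Assumes an HTML5 <video> with one caption track whose cues are loaded
// (i.e. the track mode is "showing" or "hidden").
function jumpToPhrase(video: HTMLVideoElement, phrase: string): boolean {
  const cues = video.textTracks[0]?.cues;   // the caption track's cue list, if loaded
  if (!cues) return false;

  const needle = phrase.toLowerCase();
  for (let i = 0; i < cues.length; i++) {
    const cue = cues[i] as VTTCue;
    if (cue.text.toLowerCase().includes(needle)) {
      video.currentTime = cue.startTime;    // seek to where the phrase is spoken
      return true;
    }
  }
  return false;
}

// Usage (hypothetical): find the part of a lecture that mentions "neural networks".
// jumpToPhrase(document.querySelector("video")!, "neural networks");
```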
Problems with captions
Captions are equally important for sighted and blind watchers. So what is the problem with videos that have captions? Are they not usable for blind watchers?
The main issue is that most captions are not really accessible to screen readers, so blind watchers usually cannot take advantage of the captions provided with a video. Here I’ll discuss the most common issues, some of which are specific to a particular technology, e.g. online players.
Open captions
Open captions are inaccessible to any kind of reading software. Unlike closed captions, they are burned into the video image and cannot be turned off, so there is no separate text for a screen reader to read. At the same time, it’s important to remember that closed captions are not accessible by default either: you need to make sure the video player you use presents the caption text in an accessible way.
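To illustrate the difference, closed captions on the web are typically delivered as a separate timed-text file (for example WebVTT) attached to the video, so the caption text exists independently of the picture. A minimal TypeScript sketch, with hypothetical file names:

```typescript
// Minimal sketch: closed captions live in a separate text track, not in the
// video frames. File names below are hypothetical.
const video = document.createElement("video");
video.src = "lecture.mp4";
video.controls = true;

const track = document.createElement("track");
track.kind = "captions";           // closed captions, switchable on and off
track.label = "English";
track.srclang = "en";
track.src = "captions.en.vtt";     // the caption text is plain timed text
track.default = true;

video.appendChild(track);
document.body.appendChild(video);
```

Whether a screen reader can actually reach that text still depends on how the player renders it, which is what the following issues are about.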
Captions not read automatically by screen readers
The most common problem with captions is that they are not read automatically by screen readers. Captions should be read just after they appear on-screen. Captions that are not read automatically are almost useless to blind watchers.
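One way a web player can make this happen is to mirror the active cue into an ARIA live region, so a screen reader announces each caption as it appears. A minimal sketch of that approach (an illustration, not the behaviour of any specific player):

```typescript
// Minimal sketch: announce caption cues automatically via an aria-live region.
// Assumes an HTML5 <video> with one caption track.
const video = document.querySelector("video") as HTMLVideoElement;
const captionTrack = video.textTracks[0];
captionTrack.mode = "hidden"; // keep cue events firing even if captions are drawn elsewhere

const liveRegion = document.createElement("div");
liveRegion.setAttribute("aria-live", "polite"); // announced without stealing focus
document.body.appendChild(liveRegion);

captionTrack.addEventListener("cuechange", () => {
  const cue = captionTrack.activeCues?.[0] as VTTCue | undefined;
  if (cue) {
    liveRegion.textContent = cue.text; // screen readers pick up the text change
  }
});
```

Because the cue text then exists as ordinary text in the DOM, the same approach also helps with the next issue: the user can pause the video and reach the caption with the screen reader’s reading cursor.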
Caption text is not accessible to screen readers
When the caption text is complicated – i.e. it includes important or complex data, or requires closer attention for any other reason – you may wish to pause the video and read the caption more carefully. Blind watchers do the same. When the caption text is not exposed to a screen reader, a blind watcher cannot navigate to the caption, read it, spell out unclear words, copy the text to make a note, and so on.
Also note that when captions are not read automatically by a screen reader, reading captions manually is the only way of accessing them.
Captions read incorrectly
Another issue can occur when scrolling captions are used. Scrolling captions place each new line below the previous one, pushing the previous line up. If not properly tagged, the previous line of captioning is re-announced when a new line is added. The result is a complete mess: a screen reader reads line A, then line B, but then line B becomes line A and is read again. And if the text is split phrase by phrase rather than line by line, you get a kind of looping effect where you hear the same phrase many times, interrupted by new phrases. You can observe this phenomenon very well on YouTube – just play any captioned video in Chrome, Edge or Firefox for Windows with NVDA or JAWS running.
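When a player does expose cues as text, the repetition can be avoided by announcing each cue only once. A rough sketch, building on the live-region idea above:

```typescript
// Minimal sketch: announce only cues that have not been announced yet,
// so a rolled-up line is not re-read when a new line pushes it upwards.
const video = document.querySelector("video") as HTMLVideoElement;
const track = video.textTracks[0];
track.mode = "hidden";

const liveRegion = document.createElement("div");
liveRegion.setAttribute("aria-live", "polite");
document.body.appendChild(liveRegion);

let lastAnnounced = "";

track.addEventListener("cuechange", () => {
  const active = track.activeCues;
  if (!active || active.length === 0) return;
  const newest = active[active.length - 1] as VTTCue; // the line that just appeared
  if (newest.text !== lastAnnounced) {
    lastAnnounced = newest.text;
    liveRegion.textContent = newest.text; // older, still-visible lines stay silent
  }
});
```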
How to test captions
Before you publish the video, make sure captions are accessible to screen readers:
- Test whether captions can be accessed and turned on and off using the keyboard and, on mobile devices, by swiping and tapping with a screen reader running.
- Check if captions are read automatically and correctly by screen readers. Test both desktop and mobile platforms.
- If you provide captions in a foreign language, check whether the screen reader automatically switches the synthesizer language to read them. If it does not, check that the captions are tagged with lang or localization attributes, and that they are tagged correctly (see the sketch after this list).
- Check if the caption text can be read manually – with a screen reader turned on, navigate to the caption text and check if it is read.
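On the language point above, here is a minimal sketch of what correct tagging can look like for web video, assuming hypothetical French captions in captions.fr.vtt:

```typescript
// Minimal sketch: tag both the caption track and the element that exposes
// the caption text with the caption language, so screen readers can switch
// to a matching synthesizer voice. File names are hypothetical.
const video = document.querySelector("video") as HTMLVideoElement;

const track = document.createElement("track");
track.kind = "captions";
track.label = "Français";
track.srclang = "fr";            // language of the timed-text file
track.src = "captions.fr.vtt";
video.appendChild(track);

const liveRegion = document.createElement("div");
liveRegion.setAttribute("aria-live", "polite");
liveRegion.lang = "fr";          // the lang attribute lets the screen reader pick a French voice
document.body.appendChild(liveRegion);
```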
Note that besides the above-mentioned steps, you should make sure the video player you use is generally accessible to people with disabilities. AccessibilityOz is constantly updating and refining our accessible video player, OzPlayer, to make sure it remains the most accessible player available.
Summing up
Although captions most notably benefit deaf and hard of hearing people, they are also used by other groups, including people with low or no vision. The blind and low vision community should not be forgotten when captions are prepared and tested. Blind watchers have the same needs as all other watchers as far as captions are concerned. Keep that in mind when designing your media content.
Comments
Hi, I’m neither deaf nor blind, but I still need captions for the only foreign language I know fairly well. I started learning it a bit too late (when I was 15), so my listening skills are not very strong. Without captions I wouldn’t be able to follow the plot. I always play films for my students with captions; it helps them a lot.
Thank you for letting us know how other people feel. It broadens our point of view. It’s priceless, because it’s hard to imagine what you haven’t experienced. It’s very interesting.
Best wishes
Zosia
What would you recommend doing caption-wise for a video-only video (no dialogue) in which words and quotes appear on screen throughout? A person who is deaf or hard of hearing could see the words, but not a person who is blind or has low vision. A transcript would be an obvious fix, but how about captions that describe the visuals and include the words shown on screen?
Hi, I’m trying to make my website and social media accessible, and I’m wondering if social media is now more accessible. If I’m uploading images or short videos to social media, can I do the following?
– use the caption option on TikTok
– write a transcript or image description in the original post on Facebook (or would I add a transcript in a Word document)?
– write an image description or captions in an Instagram comment section.