Success Criterion 1.2.3 - Level A
Audio Description or Media Alternative (Prerecorded)
Ensure a transcript or audio description is provided for videos, so that users who cannot see what is displayed still have access to the visual information. The content can then be read in the case of a transcript, or heard in the case of an audio description.
Impact
A transcript is useful for anyone who has trouble seeing or understanding the visual content of videos.
People who are blind can hear what is shown in a video through an additional audio track with audio description.
Check
“Is a transcript or audio description available for all videos?”
This can be checked visually; no assistive technologies are needed.
Solution
Add transcript
- Android
- Jetpack Compose
- iOS
- SwiftUI
- Flutter
- React Native
- .NET MAUI
- Xamarin
Transcript - Android
On Android, you can use a TextView to display written text. Don't forget to put it in a ScrollView to make the text scrollable.
<ScrollView
android:layout_width="match_parent"
android:layout_height="match_parent">
<TextView
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:text="Appt transcript" />
</ScrollView>
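If you set the transcript from code, you can do so through the TextView. The sketch below assumes the TextView above has been given the illustrative id transcript and that the transcript text lives in a string resource named appt_transcript; both names are assumptions, not part of the layout above.

```kotlin
// Assumes android:id="@+id/transcript" on the TextView above and a
// string resource named appt_transcript; both names are illustrative.
val transcriptView = findViewById<TextView>(R.id.transcript)
transcriptView.text = getString(R.string.appt_transcript)
```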
Transcript - Jetpack Compose
In Jetpack Compose, you can use a Text composable to display written text. Don't forget to add the verticalScroll modifier to make the text scrollable.
Text(
    text = "Appt transcript",
    modifier = Modifier.verticalScroll(rememberScrollState())
)
Transcript - iOS
On iOS, you can use a UITextView to present a transcript. A UITextView is scrollable by default. You can also choose to place one or more UILabels in a UIScrollView.
// Option 1
let transcript = UITextView()
transcript.text = "Appt transcript"
// Option 2
let transcript = UILabel()
transcript.text = "Appt transcript"
transcript.numberOfLines = 0 // allow the label to wrap over multiple lines
let scrollView = UIScrollView()
scrollView.addSubview(transcript)
Transcript - SwiftUI
In SwiftUI, you can use Text to present a transcript. To enhance user interaction, consider enabling text selection, allowing users to copy or interact with the text. For long transcripts, it's best to embed the text in a scrollable container such as a ScrollView.
private var transcript: String = "Appt video transcript."
var body: some View {
ScrollView {
// Place other views here
Text(transcript)
.textSelection(.enabled)
}
}
Transcript - Flutter
With Flutter, you can use Text to display written text. Make sure to wrap the Text widget in a SingleChildScrollView and to set the overflow parameter to TextOverflow.visible. Also, the softWrap parameter needs to be set to true so the text wraps instead of overflowing its container.
SingleChildScrollView(
child: Text(
'Appt transcript',
softWrap: true,
overflow: TextOverflow.visible,
),
)
Transcript - React Native
In React Native, you can use Text to display written text. Make sure to wrap the Text component in a ScrollView to enable scrolling.
<ScrollView>
<Text>
Appt transcript
</Text>
</ScrollView>
Transcript - .NET MAUI
In MAUI, you can put a Label inside a ScrollView to display a scrollable transcript.
<ScrollView>
<Label Text="{Binding Transcript}" />
</ScrollView>
Transcript - Xamarin
In Xamarin, you can use Label to display written text. Make sure to wrap the Label in a ScrollView to enable scrolling.
<ScrollView>
<Label x:Name="transcript" Text="{Binding ApptTranscript}" />
</ScrollView>
Add audio description
- Android
- Jetpack Compose
- iOS
- SwiftUI
- Flutter
- React Native
- .NET MAUI
- Xamarin
Audio description - Android
As of Android 4.1, MediaPlayer has support for multiple audio tracks. Use the selectTrack method to select the correct audio track.
The code example below shows a basic implementation of selecting an audio description track embedded inside a video.
val player = MediaPlayer.create(this, R.raw.video)
try {
    // Select the first embedded audio track, e.g. the audio description track
    val index = player.trackInfo.indexOfFirst { trackInfo ->
        trackInfo.trackType == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_AUDIO
    }
    if (index >= 0) {
        player.selectTrack(index)
    }
    player.start()
} catch (e: Exception) {
    e.printStackTrace()
}
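If your app uses ExoPlayer instead of MediaPlayer, you can express a preference for audio tracks that are flagged as describing video. This is a sketch, assuming the AndroidX Media3 ExoPlayer dependency; it is not part of the MediaPlayer approach above.

```kotlin
// Sketch assuming the androidx.media3:media3-exoplayer dependency.
// Prefer audio tracks flagged as describing video (audio description).
val player = ExoPlayer.Builder(context).build()
player.trackSelectionParameters = player.trackSelectionParameters
    .buildUpon()
    .setPreferredAudioRoleFlags(C.ROLE_FLAG_DESCRIBES_VIDEO)
    .build()
```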
Audio description - Jetpack Compose
In Jetpack Compose, you can use the Android MediaPlayer, which has support for multiple audio tracks. Use the selectTrack method to select the correct audio track.
The code example below shows a basic implementation of selecting an audio description track embedded inside a video.
var mediaPlayer: MediaPlayer? by remember { mutableStateOf(null) }
var error by remember { mutableStateOf<String?>(null) }
val currentContext = LocalContext.current
DisposableEffect(Unit) {
val player = MediaPlayer.create(currentContext, R.raw.video)
mediaPlayer = player
try {
// Select the first embedded audio track, e.g. the audio description track
val index = player.trackInfo.indexOfFirst { trackInfo ->
    trackInfo.trackType == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_AUDIO
}
if (index >= 0) {
    player.selectTrack(index)
}
player.start()
} catch (e: Exception) {
e.printStackTrace()
error = e.message
}
onDispose {
player.release()
mediaPlayer = null
}
}
Audio description - iOS
On iOS, AVPlayer has support for audio description tracks. Users can enable audio description through Settings > Accessibility > Audio Descriptions. Turning on audio description works automatically if you add the public.accessibility.describes-video characteristic to the audio description track.
The code example below shows a basic implementation of enabling audio description embedded inside a video.
let composition = AVMutableComposition()
guard let videoUrl = Bundle.main.url(
forResource: "Appt",
withExtension: "mp4"
) else {
return
}
let videoAsset = AVURLAsset.init(url: videoUrl)
// Add video track to composition
if let videoTrack = try await videoAsset.loadTracks(withMediaType: .video).first,
let videoCompositionTrack = composition.addMutableTrack(
withMediaType: .video,
preferredTrackID: kCMPersistentTrackID_Invalid
) {
do {
try await videoCompositionTrack.insertTimeRange(
CMTimeRange(start: .zero, duration: videoAsset.load(.duration)),
of: videoTrack,
at: .zero
)
} catch {
// Handle the error, e.g. log it
print(error)
}
}
// Find and add the audio description track
for track in try await videoAsset.load(.tracks) {
if try await track.load(.mediaCharacteristics).contains(.describesVideoForAccessibility) {
if let audioCompositionTrack = composition.addMutableTrack(
withMediaType: track.mediaType,
preferredTrackID: kCMPersistentTrackID_Invalid
) {
do {
try await audioCompositionTrack.insertTimeRange(
CMTimeRange(start: .zero, duration: videoAsset.load(.duration)),
of: track,
at: .zero
)
} catch {
// Handle the error, e.g. log it
print(error)
}
break
}
}
}
Audio description - SwiftUI
In SwiftUI, you can integrate AVPlayer, which has support for audio description tracks. Users can enable audio description through Settings > Accessibility > Audio Descriptions. Turning on audio description works automatically if you add the public.accessibility.describes-video characteristic to the audio description track.
The code example below shows a basic implementation of enabling audio description embedded inside a video.
let composition = AVMutableComposition()
guard let videoUrl = Bundle.main.url(
forResource: "Appt",
withExtension: "mp4"
) else {
return
}
let videoAsset = AVURLAsset.init(url: videoUrl)
// Add video track to composition
if let videoTrack = try await videoAsset.loadTracks(withMediaType: .video).first,
let videoCompositionTrack = composition.addMutableTrack(
withMediaType: .video,
preferredTrackID: kCMPersistentTrackID_Invalid
) {
do {
try await videoCompositionTrack.insertTimeRange(
CMTimeRange(start: .zero, duration: videoAsset.load(.duration)),
of: videoTrack,
at: .zero
)
} catch {
// Handle the error, e.g. log it
print(error)
}
}
// Find and add the audio description track
for track in try await videoAsset.load(.tracks) {
if try await track.load(.mediaCharacteristics).contains(.describesVideoForAccessibility) {
if let audioCompositionTrack = composition.addMutableTrack(
withMediaType: track.mediaType,
preferredTrackID: kCMPersistentTrackID_Invalid
) {
do {
try await audioCompositionTrack.insertTimeRange(
CMTimeRange(start: .zero, duration: videoAsset.load(.duration)),
of: track,
at: .zero
)
} catch {
// Handle the error, e.g. log it
print(error)
}
break
}
}
}
Audio description - Flutter
With Flutter, you can use the better_player package to let users select different audio tracks.
The code example below shows a basic implementation of changing audio tracks.
BetterPlayerController controller = BetterPlayerController(
const BetterPlayerConfiguration(
controlsConfiguration: BetterPlayerControlsConfiguration(
enableAudioTracks: true,
),
),
betterPlayerDataSource: BetterPlayerDataSource.file(
'assets/appt.mp4',
useAsmsAudioTracks: true,
),
);
void changeAudioTrack(int track) {
if (controller.betterPlayerAsmsAudioTracks?[track] != null) {
controller.setAudioTrack(controller.betterPlayerAsmsAudioTracks![track]);
}
}
Widget build(BuildContext context) {
return BetterPlayer(controller: controller);
}
Audio description - React Native
In React Native, the react-native-video package has support for switching audio tracks. This allows you to offer users a way to switch to an audio description track.
Note: the audio tracks must be encoded in the video file; they cannot be added programmatically.
import Video from 'react-native-video';
<Video
source={require('./appt.mp4')} // illustrative path
selectedAudioTrack={{
type: "language",
value: "en" // language code of the audio description track
}}
/>
Audio description - .NET MAUI
In MAUI, you can use MediaElement to embed videos. Unfortunately, there is no built-in support to switch to an audio description track. In this case, a custom control is required. You can consider the following options:
- Use a Custom Handler to create a control from scratch.
- Use a Platform Behavior to extend the MediaElement component.
- Create a native binding to expose a native third-party library that supports this feature via Native Library Interop for .NET MAUI.
Audio description - Xamarin
On Xamarin, you can use MediaElement to embed videos. Unfortunately, there is no built-in support to switch to an audio description track.