APL Video
The Video component displays a video player that can play a single video or a series of videos. The embedded video player doesn't have any controls. Instead, the Video component provides the events and commands necessary to build controls for the video player. For details about how to control the video player, see the PlayMedia and ControlMedia commands.
Skills that use the Video component must provide a way to pause the video content by voice and with an on-screen button.
Properties
The Video component has the following properties in addition to the base component properties. See Properties of all components for the meaning of the columns.

Property | Type | Default | Styled | Dynamic | Description
---|---|---|---|---|---
audioTrack | One of: foreground, background, none | foreground | No | No | Audio track to play on.
autoplay | Boolean | false | No | No | When true, the video begins playing automatically after it loads.
muted | Boolean | false | No | Yes | When true, mute the audio for the video.
onEnd | Array of commands | [] | No | No | Commands to run when the last video track is finished playing.
onPause | Array of commands | [] | No | No | Commands to run when the video switches from playing to paused.
onPlay | Array of commands | [] | No | No | Commands to run when the video switches from paused to playing.
onTimeUpdate | Array of commands | [] | No | No | Commands to run when the playback position changes.
onTrackUpdate | Array of commands | [] | No | No | Commands to run when the current video track changes.
onTrackReady | Array of commands | [] | No | No | Commands to run when the current track state changes to ready.
onTrackFail | Array of commands | [] | No | No | Commands to run when an error occurs and the video player can't play the media.
preserve | Array of strings | [] | No | No | Properties to save when reinflating the document with the Reinflate command.
scale | One of: best-fill, best-fit | best-fit | No | No | How the video should scale to fill the space.
screenLock | Boolean | true | No | Yes | When true, disable the interaction timer while the video plays.
source | URL or source array | [] | No | Yes | Video source or sources.
When the Video component is the source or target of an event, the following values are reported in event.source or event.target:
{
// Video-specific values
"type": "Video",
"currentTime": Integer, // Current playback position in the current track, expressed in milliseconds
"duration": Integer, // Duration of the current track in milliseconds. Returns -1 if the track duration is unknown.
"ended": Boolean, // True if the video is in the ended state
"paused": Boolean, // True if the video is in a paused state
"muted": Boolean, // True if the video audio is muted
"trackCount": Integer, // Total number of video tracks
"trackIndex": Integer, // Index of the current track (0-based)
"trackState": notReady | ready | failed, // State of the current track
"url": URL, // The URL of the current track
// General component values
"bind": Map, // Access to component data-binding context
"checked": Boolean, // Checked state
"disabled": Boolean, // Disabled state
"focused": Boolean, // Focused state
"height": Number, // Height of the component, in dp (includes the padding)
"id": ID, // ID of the component
"opacity": Number, // Opacity of the component [0-1]
"pressed": Boolean, // Pressed state
"uid": UID, // Runtime-generated unique ID of the component
"width": Number // Width of the component, in dp (includes the padding)
}
height and width
The height
and width
of the Video
component default to 100dp
when not specified.
audioTrack
The audioTrack property assigns the media content to foreground or background audio, or mutes it entirely (none). Foreground audio interleaves with speech commands and sound effects. Background audio plays behind speech commands and sound effects. Only one audio source may be foreground or background at a time.
Value | Description
---|---
foreground | Audio plays on the foreground track. Speaking with the Alexa voice causes this media to pause.
background | Audio plays on the background music track. It stops any existing background audio. Speaking with the Alexa voice may cause this media to duck or pause briefly.
none | Audio content is ignored and only the video content is played.
With audioTrack set to foreground, the PlayMedia command doesn't "finish" until all media tracks have finished. Thus, a simple command sequence can run that interleaves media content and speech, as shown in this example.
"onPress": [
{
"type": "PlayMedia",
"componentId": "myVideoPlayer",
"source": URL,
"audioTrack": "foreground"
},
{
"type": "SpeakItem",
"description": "This will run after the media finishes playing",
"componentId": "myAnswerBox"
}
]
When a PlayMedia
command has audioTrack
set to background
or none
, the audio "finishes" immediately, and does not wait for the media content to end. Background media does not respond to touching the screen. For example, in the following sequence the SendEvent
command fires immediately and does not wait for the media to finish playing.
"onPress": [
{
"type": "PlayMedia",
"componentId": "myVideoPlayer",
"source": URL,
"audioTrack": "background"
},
{
"type": "SendEvent",
"description": "This will run immediately",
"arguments": ["Media has started, but hasn't stopped yet"]
}
]
autoplay
When true
, the video automatically starts playing as soon as the video loads. When false
, you must use a command to explicitly start the video playback. For details about the commands to control video playback, see the PlayMedia and ControlMedia commands. The autoplay
property defaults to false
.
Don't set autoplay to true and also set shouldEndSession to false to keep the session open for voice input. This combination causes the microphone to open during video playback. For details about how to get voice input when the video finishes, see Manage voice input after playing a video.
muted
When true
, mute the audio for the video player. Ignored when audioTrack
is set to none
. The muted
property defaults to false
.
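Because muted is a dynamic property, you can change it at runtime with the SetValue command. The following is a minimal sketch; the component ID myVideoPlayer is illustrative:

```json
{
  "type": "SetValue",
  "componentId": "myVideoPlayer",
  "property": "muted",
  "value": true
}
```

Run the same command with "value": false to restore the audio.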
onEnd
The onEnd handler runs when the last video in the sequence finishes playing, including any repeats. The onEnd handler can run multiple times. For example, a video might play through to the end and stop, then receive a seek command that rewinds it to an earlier point, then receive a play command, and then play through to the end and stop again.
The event generated has the following form:
"event": {
"source": {
"type": "Video",
"handler": "End",
... // Component source properties
},
"trackIndex": Integer, // Will be equal to trackCount - 1.
"trackCount": Integer,
"trackState": String,
"currentTime": Integer, // Will be equal to or greater than duration
"duration": Integer,
"paused": true,
"ended": true
}
Refer to Event source for a description of event.source
properties.
The onEnd
event handler runs in normal mode in the component data-binding context.
onPause
The onPause
handler runs when the video playback intentionally switches from playing to paused. This might occur because the video player reached the end of the last video, a command stopped the playback, or the user advanced to the next track and interrupted the current playback. The onPause handler doesn't run if the video player has to pause playback to download video content, or when playback can't continue due to an error and the onTrackFail handler runs instead.
The event generated has the following form:
"event": {
"source": {
"type": "Video",
"handler": "Pause",
... // Component source properties
},
"trackIndex": Integer,
"trackCount": Integer,
"trackState": String,
"currentTime": Integer,
"duration": Integer,
"paused": true,
"ended": BOOLEAN
}
Refer to Event source for a description of event.source
properties.
The onPause
event handler runs in normal mode in the component data-binding context.
onPlay
The onPlay
handler runs each time the video playback switches from paused to playing. This can occur when the video has autoplay set to true, from a PlayMedia command, or from a ControlMedia play command.
The event generated has this form:
"event": {
"source": {
"type": "Video",
"handler": "Play",
... // Component source properties
},
"trackIndex": Integer,
"trackCount": Integer,
"trackState": String,
"currentTime": Integer,
"duration": Integer,
"paused": false,
"ended": BOOLEAN
}
Refer to Event source for a description of event.source
properties.
The onPlay
event handler runs in normal mode in the component data-binding context.
onTimeUpdate
The onTimeUpdate
handler is invoked when the playback position of the
current video changes. The handler is invoked on a "best effort"
basis; there is no guaranteed frequency of invocation.
The event generated has the form:
"event": {
"source": {
"type": "Video",
"handler": "TimeUpdate",
... // Component source properties
},
"trackIndex": Integer,
"trackCount": Integer,
"currentTime": Integer,
"trackState": String,
"duration": Integer,
"paused": BOOLEAN,
"seekable": BOOLEAN,
"ended": BOOLEAN
}
Refer to Event source for a description of event.source
properties.
The onTimeUpdate
handler runs in fast mode in the component data-binding context.
onTrackUpdate
The onTrackUpdate
handler is invoked when the active video track changes. This can happen during normal video sequence playback as the player advances to the next video track or as a result of a command issued against the player.
The event generated has this form:
"event": {
"source": {
"type": "Video",
"handler": "TrackUpdate",
... // Component source properties
},
"trackIndex": Integer,
"trackCount": Integer,
"trackState": String,
"currentTime": Integer,
"duration": Integer,
"paused": BOOLEAN,
"ended": BOOLEAN
}
Refer to Event source for a description of event.source
properties.
The onTrackUpdate
handler runs in fast mode in the component data-binding context.
onTrackReady
The onTrackReady
handler runs for a track when trackState changes from notReady to ready. This happens before playback starts for each track.
The event generated in case of a successful load has the following form:
"event": {
"source": {
"type": "Video",
"handler": "TrackReady",
... // Component source properties
},
"trackIndex": Integer,
"trackState": "ready"
}
Refer to Event source for a description of event.source
properties.
The onTrackReady
event handler runs in fast mode in the component data-binding context.
The handler runs in the following scenarios:
- The first source is valid and the video player gets the media information required to start the playback. This happens after the document inflates and before the playback starts. In this scenario, the handler generates an event with the following form:
"event": {
"source": {
"type": "Video",
"handler": "TrackReady",
... // Component source properties
},
"trackIndex": 0,
"trackState": "ready"
}
If the source is invalid or the media isn't supported, onTrackFail runs instead.
- The video player advances to the next track after onTrackUpdate if the trackState property on the onTrackUpdate event is notReady. For example, advancing to the second track runs onTrackUpdate, followed by onTrackReady. Assuming the second track has a duration of 5000 milliseconds, these handlers generate the following events:
"event": {
"source": {
"type": "Video",
"handler": "TrackUpdate",
... // Component source properties
},
"trackIndex": 1,
"trackCount": N,
"trackState": "notReady",
"currentTime": 0,
"duration": 5000,
"paused": false,
"ended": false
}
"event": {
"source": {
"type": "Video",
"handler": "TrackReady",
... // Component source properties
},
"trackIndex": 1,
"trackState": "ready"
}
onTrackFail
The onTrackFail handler runs for a track if an error occurs when starting playback, during playback, or when moving to a track. Errors might occur if the source URL isn't accessible, or if the media content is unsupported or malformed.
The player doesn't advance to the next track after a failure, and remains in the state it was in when onTrackFail runs.
The event generated has the following form:
"event": {
"source": {
"type": "Video",
"handler": "TrackFail",
... // Component source properties
},
"trackIndex": Integer,
"trackState": "failed",
"currentTime": Integer,
"errorCode": Number // Platform defined numerical error
}
Refer to Event source for a description of event.source
properties.
The onTrackFail
event handler runs in fast mode in the component data-binding context.
The handler runs in the following scenarios:
- The video player fails to load the first track for playback, or the media content for the first track is malformed or unsupported. Playback won't start and the video player won't advance to the next track automatically. In this scenario, onTrackFail generates the following event:
"event": {
"source": {
"type": "Video",
"handler": "TrackFail",
... // Component source properties
},
"trackIndex": 0,
"trackState": "failed",
"currentTime": 0,
"errorCode": Number
}
- The video player advances to a track that fails to load, or is malformed or unsupported. In this scenario, onTrackFail runs after onTrackUpdate. For example, assume the video player advances to a track with an invalid URL. The onTrackUpdate handler runs, followed by onTrackFail. The handlers generate the following events:
"event": {
"source": {
"type": "Video",
"handler": "TrackUpdate",
... // Component source properties
},
"trackIndex": 1,
"trackCount": N,
"trackState": "notReady",
"currentTime": 0,
"duration": 0,
"paused": false,
"ended": false
}
"event": {
"source": {
"type": "Video",
"handler": "TrackFail",
... // Component source properties
},
"trackIndex": 1,
"trackState": "failed",
"currentTime": 0,
"errorCode": Number
}
- The video player advances to a new track, runs onTrackUpdate followed by onTrackReady, and begins to play the track. After playback begins, an error occurs and playback can't continue. Playback stops when onTrackFail runs, and the video player won't advance to the next track automatically. For example, assume the video player is playing the second track in the sequence, with a duration of 5000 milliseconds, and an error occurs at 1500 milliseconds. Playback stops, and the onTrackFail handler runs and generates the following event:
"event": {
"source": {
"type": "Video",
"handler": "TrackFail",
... // Component source properties
},
"trackIndex": 1,
"trackState": "failed",
"currentTime": 1500,
"errorCode": Number
}
preserve
An array of dynamic component properties and bound properties to save when reinflating the document with the Reinflate
command.
A Video
component has the following component-specific property names you can assign to the preserve
array:
- source – The array of tracks.
- playingState – The state of the player (playing or paused).
The source
option saves the current list of sources from the old video player and restores them in the new video player. This includes the currently selected track and position within that track.
The playingState
option saves whether or not the video is currently playing. If playingState
is not preserved, then the autoplay
property in the video component is used to decide if the video player should automatically start playing video.
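For example, to keep the video playing across a reinflation, list both options in the preserve array. The following is a minimal sketch; the component ID and URL are placeholders:

```json
{
  "type": "Video",
  "id": "myVideoPlayer",
  "preserve": ["source", "playingState"],
  "autoplay": true,
  "source": "https://example.com/videoClips/intro.mp4"
}
```

When the Reinflate command runs, the new video player resumes with the same track, position, and play/pause state.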
scale
Scales the video within the container.
Name | Description |
---|---|
best-fill | Scale the video so that it fills the container with no letterboxing. The top/bottom or left/right sides of the video are hidden if the video has a different aspect ratio than the container. |
best-fit | Scale the video so that it fits within the container. Letterbox blocks are applied to the sides or top/bottom of the video if it has a different aspect ratio than the container. |
screenLock
The screenLock
property controls the interaction timer for the document when a video is playing. When the screenLock
property is true
(the default), playing the video disables the timer. When screenLock
is false
, the document lifecycle doesn't change.
source
Specifies the video clip or sequence of video clips to play. The source
property can be either a plain URL (string) or an array of source data.
When you provide multiple videos in an array, the player plays each video in turn. The source
property of the Video
component and the url
property of each source follow the rules of "array-ification". For the url
property, you can provide a plain string as a single URL, as well as an object with a url
property.
Each line in the following example shows a valid way to set the source
property.
"source": URL
"source": [ URL ]
"source": { "url": URL }
"source": [ { "url": URL } ]
"source": [ URL1, { "url": URL2 } ]
The most general way of specifying the media sources is to fully expand the definition:
"source": [
{
"description": "The first video clip to play",
"offset": 150, // Skip the first 150 milliseconds
"url": URL1,
},
{
"description": "The second video clip to play",
"url": URL2,
"repeatCount": -1 // Repeat forever
},
{
"description": "This video clip will only be reached by a command",
"url": URL3
}
]
The following minimal definition is equivalent to the previous definition:
"source": [
{
"offset": 150, // Skip the first 150 milliseconds
"url": URL1,
},
{
"url": URL2,
"repeatCount": -1 // Repeat forever
},
URL3
]
When source
is a data array, it has the following structure:
Property | Type | Default | Description
---|---|---|---
description | String | "" | Optional description of this source material.
duration | Number | none | Duration of time to play, expressed in milliseconds. If zero or negative, plays the entire stream.
url | URL | REQUIRED | Media source material.
textTrack | Array of textTrack objects | [] | Text tracks for this video source, used to provide closed captions during playback.
repeatCount | Integer | 0 | Number of times to loop the video.
entities | Array of entities | [] | Entity data to set when reporting this media to Alexa.
offset | Number | 0 | Offset to start playing at in the stream, in milliseconds.
duration
Duration of time to play, in milliseconds. Leave unset to play the full video. When set to a duration shorter than the actual media clip, the video player stops after playing for the specified time. Setting duration to a time longer than the actual media clip doesn't add any extra playing time. If duration is zero or negative, the video player plays the entire media clip.
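For example, a source entry that skips the first second of a clip and then plays ten seconds of it might look like the following sketch; the URL is a placeholder:

```json
{
  "url": "https://example.com/videoClips/main.mp4",
  "offset": 1000,
  "duration": 10000
}
```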
url (urls)
The URL of the media source. Must be an HTTPS URL. Determine which video formats the device supports by checking the video property of the Viewport.
textTrack
The textTrack
property holds data about the text track or multiple text tracks that can be added to a video source. Use the property to provide text to display as closed captions during video playback.
A textTrack
object has the properties shown in the following table.
Property | Type | Default | Description
---|---|---|---
description | String | "" | Optional description of this text track.
url | URL | REQUIRED | Source for this text track. Provide a URL to a SubRip Subtitle File (SRT).
type | String | REQUIRED | How this text track is meant to be used. Must be set to caption.
A video source can have a single textTrack
with type
set to caption
. If you provide additional textTrack
objects for the same video source, the video player ignores them during playback.
If the first textTrack
fails to load, the video player continues to play the video without any captions. Subsequent tracks aren't tried.
The following examples show valid ways to set the textTrack
property:
"textTrack": [{ "url": URL, "type": "caption" }]
"textTrack": [{ "description": "caption track", "url": URL, "type": "caption" }]
The following example shows the source
array for a Video
component. Each source
in the array has its own textTrack
.
{
"source": [
{
"description": "intro",
"url": "https://example.com/videoClips/intro.mp4",
"textTrack": [
{
"description": "intro caption",
"url": "https://example.com/videoClips/intro.srt",
"type": "caption"
}
]
},
{
"description": "main",
"url": "https://example.com/videoClips/main.mp4",
"textTrack": [
{
"description": "main caption",
"url": "https://example.com/videoClips/main.srt",
"type": "caption"
}
]
}
]
}
The Video
component supports text tracks in SubRip Subtitle File (SRT) format.
Users can control whether they see captions during video playback. On an Echo Show device, turn on the Settings > Accessibility > Captioning > Closed Captioning setting. For more details about these settings, see Turn On Captioning on Echo Devices with a Screen.
repeatCount
The number of times to repeat playing this media. Defaults to 0, which means to play once through and stop. If set to -1, the video will repeat forever.
offset
The offset
from the start of the media where it should start playing, expressed in milliseconds. Defaults to 0, which means that play begins at the start of the media. A video with a positive repeatCount
value will restart playing the media at the same offset each time.
Video State
The video player has an exposed state with the values shown in the following table.
Property | Type | Description
---|---|---
trackIndex | Integer | Current track in the source array (0-based index).
trackCount | Integer | Total number of tracks in the source array.
trackState | notReady, ready, or failed | State of the current track.
currentTime | Integer | Current playback position in the current track, expressed in milliseconds.
duration | Integer | Duration of the current track, in milliseconds. Returns -1 if the track duration is unknown.
paused | Boolean | True when the video is in a paused state.
ended | Boolean | True when all video tracks have finished playing.
The video event handlers expose these states as separate properties in the event
property.
The following example shows how to send a request to the skill with information about video playback when the video begins to play. The following onPlay
handler runs the SendEvent
command and includes the event
properties in the arguments
.
{
"onPlay": {
"type": "SendEvent",
"arguments": [
"Track index is ${event.trackIndex}",
"Track count is ${event.trackCount}",
"Track media state is ${event.trackState}",
"Current playback position is ${event.currentTime} milliseconds",
"Track duration is ${event.duration}",
"Playback is ${event.paused ? 'paused' : 'running'}",
"Playback is ${event.ended ? '' : 'not '}ended"
]
}
}
The following example shows the UserEvent request the skill receives. The arguments
property of the request includes the video playback data specified in the arguments
property of the command.
{
"request": {
"type": "Alexa.Presentation.APL.UserEvent",
"requestId": "amzn1.echo-api.request.1",
"timestamp": "2021-09-16T23:29:52Z",
"locale": "en-US",
"arguments": [
"Track index is 0",
"Track count is 3",
"Track media state is ready",
"Current playback position is 0 milliseconds",
"Track duration is 32439",
"Playback is running",
"Playback is not ended"
],
"components": {},
"source": {
"type": "Video",
"handler": "Play",
"id": "videoPlayerId"
},
"token": "videoHandlersExampleToken"
}
}
trackState
Returns a value that describes the state of the current track or media source.
State | Description
---|---
notReady | Initial or default state for a track. Playback can't start in this state.
ready | Track is ready and playback can start.
failed | Track has failed and playback can't start or continue at the current playback position.
Sample video
{
"type": "Video",
"height": "100%",
"width": "70%",
"alignSelf": "center",
"shrink": 1,
"autoplay": true,
"audioTrack": "foreground",
"id": "videoPlayerId",
"source": [
"https://d2o906d8ln7ui1.cloudfront.net/videos/AdobeStock_277864451.mov",
"https://d2o906d8ln7ui1.cloudfront.net/videos/AdobeStock_292807382.mov"
]
}
The following example shows a full document with the Video
component. This example uses the AlexaTransportControls
and AlexaSlider
responsive components to provide UI controls for the video.
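A minimal sketch of such a document follows; the video URL is a placeholder, and the APL and alexa-layouts versions are illustrative:

```json
{
  "type": "APL",
  "version": "2023.2",
  "import": [
    { "name": "alexa-layouts", "version": "1.7.0" }
  ],
  "mainTemplate": {
    "items": [
      {
        "type": "Container",
        "width": "100%",
        "height": "100%",
        "items": [
          {
            "type": "Video",
            "id": "videoPlayerId",
            "width": "100%",
            "grow": 1,
            "shrink": 1,
            "autoplay": true,
            "audioTrack": "foreground",
            "source": "https://example.com/videoClips/intro.mp4"
          },
          {
            "type": "AlexaTransportControls",
            "mediaComponentId": "videoPlayerId",
            "primaryControlSize": 50,
            "secondaryControls": "skipForwardBackward"
          }
        ]
      }
    ]
  }
}
```

The AlexaTransportControls component targets the video player through mediaComponentId, so the play/pause and skip buttons control the Video component above it.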
Playback intents
You must implement built-in intents to support voice-based playback control.
Manage voice input after playing a video
To accept voice input after a video ends, use the onEnd event handler to invoke a SendEvent command. Your skill then handles the subsequent UserEvent request. Send a response with shouldEndSession
set to false
to accept voice input. Your response should include appropriate outputSpeech and reprompt objects to ask your Alexa customer for input.
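This pattern might look like the following sketch in the document; the argument string and URL are illustrative:

```json
{
  "type": "Video",
  "id": "videoPlayerId",
  "autoplay": true,
  "source": "https://example.com/videoClips/intro.mp4",
  "onEnd": [
    {
      "type": "SendEvent",
      "arguments": ["videoEnded"]
    }
  ]
}
```

Your skill then handles the resulting UserEvent request and responds with shouldEndSession set to false, along with outputSpeech and reprompt.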
Devices that don't support video
Some devices with screens do not support video playback. On a device that does not support video, the Video
component remains on the screen, but displays no content, so users see a blank area on the screen. Provide an alternative experience for devices that don't support video.
The disallowVideo
property in the data-binding context returns true
when the device does not support video. Use this property in the conditional logic in your APL document.
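For example, you might use the when property to swap in an Image on devices without video support. The following sketch assumes the property is exposed as environment.disallowVideo in the data-binding context; the URLs are placeholders:

```json
{
  "type": "Container",
  "items": [
    {
      "when": "${!environment.disallowVideo}",
      "type": "Video",
      "source": "https://example.com/videoClips/intro.mp4"
    },
    {
      "when": "${environment.disallowVideo}",
      "type": "Image",
      "source": "https://example.com/images/fallback.png"
    }
  ]
}
```

Only the first child whose when expression evaluates to true is inflated, so the device shows either the video or the fallback image.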
Alternatively, you can check for video support in your skill code. Check the context.Viewport.video
property in the request sent to your skill.
Related topics
Last updated: Oct 29, 2024