/* -*- Mode: C++; tab-width: 2; indent-tabs-mode: nil; c-basic-offset: 2 -*- */
/* vim:set ts=2 sw=2 sts=2 et cindent: */
/* This Source Code Form is subject to the terms of the Mozilla Public
 * License, v. 2.0. If a copy of the MPL was not distributed with this
 * file, You can obtain one at http://mozilla.org/MPL/2.0/. */
/*

Each media element for a media file has one thread called the "audio thread".

The audio thread writes the decoded audio data to the audio
hardware. This is done in a separate thread to ensure that the
audio hardware gets a constant stream of data without
interruption due to decoding or display. At some point
AudioStream will be refactored to have a callback interface
where it asks for data and this thread will no longer be
needed.

The element/state machine also has a TaskQueue which runs in a
SharedThreadPool that is shared with all other elements/decoders. The state
machine dispatches tasks to this to call into the MediaDecoderReader to
request decoded audio or video data. The Reader will callback with decoded
samples when it has them available, and the state machine places the decoded
samples into its queues for the consuming threads to pull from.

The MediaDecoderReader can choose to decode asynchronously, or synchronously
and return requested samples synchronously inside its Request*Data()
functions via callback. Asynchronous decoding is preferred, and should be
used for any new readers.
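
As a rough sketch of that request/callback flow (illustrative only -- the
plumbing below is simplified, and in this revision the results are actually
routed through the MediaDecoderReaderWrapper callbacks such as
mAudioCallback):

  mReader->RequestAudioData();
  // ... later, on the state machine task queue:
  OnAudioDecoded(aAudioSample);                 // sample pushed into the audio queue
  OnNotDecoded(MediaData::AUDIO_DATA, aReason); // or a decode failure / end of stream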

Synchronisation of state between the threads is done via a monitor owned
by MediaDecoder.

The lifetime of the audio thread is controlled by the state machine when
it runs on the shared state machine thread. When playback needs to occur
the audio thread is created and an event dispatched to run it. The audio
thread exits when audio playback is completed or no longer required.

A/V synchronisation is handled by the state machine. It examines the audio
playback time and compares this to the next frame in the queue of video
frames. If it is time to play the video frame it is then displayed, otherwise
it schedules the state machine to run again at the time of the next frame.
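
That decision is roughly the following (simplified; the rendering call is a
placeholder and nextFrame stands for the head of the video queue):

  int64_t clock = GetClock();                  // audio clock, or a video-only clock
  if (nextFrame && nextFrame->mTime <= clock) {
    // hand the frame to the VideoFrameContainer / compositor
  } else if (nextFrame) {
    ScheduleStateMachineIn(nextFrame->mTime - clock);
  }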

Frame skipping is done in the following ways:

  1) The state machine will skip all frames in the video queue whose
     display time is less than the current audio time. This ensures
     the correct frame for the current time is always displayed.

  2) The decode tasks will stop decoding interframes and read to the
     next keyframe if they determine that decoding the remaining
     interframes will cause playback issues. They detect this by checking:
       a) whether the amount of audio data in the audio queue drops
          below a threshold whereby audio may start to skip;
       b) whether the video queue drops below a threshold where it
          will be decoding video data that won't be displayed due
          to the decode thread dropping the frame immediately.
     A rough sketch of this check follows the list.
     TODO: In future we should only do this when the Reader is decoding
           synchronously.
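
The keyframe-skip decision (see NeedToSkipToNextKeyframe()) is roughly the
following; names are simplified and the low-video-frame constant is
illustrative, not the real one:

  bool audioStarved = GetDecodedAudioDuration() <
                      mLowAudioThresholdUsecs * mPlaybackRate;     // case a)
  bool videoBehind  = VideoQueue().GetSize() < kLowVideoFrames &&
                      mDecodedVideoEndTime < GetMediaTime();       // case b)
  bool skipToKeyframe = audioStarved || videoBehind;
  // skipToKeyframe is passed along with the next video decode request so
  // the reader can jump ahead to the next keyframe.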

When hardware accelerated graphics is not available, YCbCr conversion
is done on the decode task queue when video frames are decoded.

The decode task queue pushes decoded audio and video frames into two
separate queues - one for audio and one for video. These are kept
separate to make it easy to constantly feed audio data to the audio
hardware while allowing frame skipping of video data. These queues are
threadsafe, and neither the decode, audio, nor state machine threads should
be able to monopolize them, and cause starvation of the other threads.

Both queues are bounded by a maximum size. When this size is reached
the decode tasks will no longer request video or audio depending on the
queue that has reached the threshold. If both queues are full, no more
decode tasks will be dispatched to the decode task queue, so other
decoders will have an opportunity to run.
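
In pseudocode, the gating done by DispatchDecodeTasksIfNeeded() looks roughly
like this (simplified):

  if (NeedToDecodeAudio()) {
    EnsureAudioDecodeTaskQueued();    // no-op if a request is already in flight
  }
  if (NeedToDecodeVideo()) {
    EnsureVideoDecodeTaskQueued();
  }
  // NeedToDecodeAudio()/NeedToDecodeVideo() return false once
  // HaveEnoughDecodedAudio()/HaveEnoughDecodedVideo() report that the ample
  // thresholds have been reached, so a full queue stops further requests.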

During playback the audio thread will be idle (via a Wait() on the
monitor) if the audio queue is empty. Otherwise it constantly pops
audio data off the queue and plays it with a blocking write to the audio
hardware (via AudioStream).
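
Conceptually the audio loop is the following (simplified; not the actual
implementation, and the stop flag and stream variable are placeholders):

  while (!stopAudioThread) {
    while (AudioQueue().GetSize() == 0 && !AudioQueue().IsFinished()) {
      monitor.Wait();                      // idle until more audio is pushed
    }
    RefPtr<MediaData> audio = AudioQueue().PopFront();
    audioStream->Write(...);               // blocking write to the hardware
  }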
*/

#if !defined(MediaDecoderStateMachine_h__)
#define MediaDecoderStateMachine_h__

#include "mozilla/Attributes.h"
#include "mozilla/ReentrantMonitor.h"
#include "mozilla/StateMirroring.h"

#include "nsAutoPtr.h"
#include "nsThreadUtils.h"
#include "MediaDecoder.h"
#include "MediaDecoderReader.h"
#include "MediaDecoderOwner.h"
#include "MediaEventSource.h"
#include "MediaMetadataManager.h"
#include "MediaStatistics.h"
#include "MediaTimer.h"
#include "ImageContainer.h"
#include "SeekJob.h"
#include "SeekTask.h"

namespace mozilla {

namespace media {
class MediaSink;
}

class AudioSegment;
class DecodedStream;
class MediaDecoderReaderWrapper;
class OutputStreamManager;
class TaskQueue;

extern LazyLogModule gMediaDecoderLog;
extern LazyLogModule gMediaSampleLog;

enum class MediaEventType : int8_t {
  PlaybackStarted,
  PlaybackStopped,
  PlaybackEnded,
  SeekStarted,
  DecodeError,
  Invalidate,
  EnterVideoSuspend,
  ExitVideoSuspend
};

/*
  The state machine class. This manages the decoding and seeking in the
  MediaDecoderReader on the decode task queue, and A/V sync on the shared
  state machine thread, and controls the audio "push" thread.

  All internal state is synchronised via the decoder monitor. State changes
  are propagated by scheduling the state machine to run another cycle on the
  shared state machine thread.

  See MediaDecoder.h for more details.
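
  A typical (simplified) lifecycle, as driven by the owning MediaDecoder,
  looks roughly like this; error handling and promise chaining are omitted:

    RefPtr<MediaDecoderStateMachine> mdsm =
      new MediaDecoderStateMachine(decoder, reader);
    mdsm->Init(decoder);          // dispatches InitializationTask to the task queue
    mdsm->InvokeSeek(target);     // returns a MediaDecoder::SeekPromise
    mdsm->BeginShutdown();        // returns a ShutdownPromise when tearing down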
*/

class MediaDecoderStateMachine
{
  NS_INLINE_DECL_THREADSAFE_REFCOUNTING(MediaDecoderStateMachine)

  using TrackSet = MediaDecoderReader::TrackSet;

public:
  typedef MediaDecoderOwner::NextFrameStatus NextFrameStatus;
  typedef mozilla::layers::ImageContainer::FrameID FrameID;

  MediaDecoderStateMachine(MediaDecoder* aDecoder,
                           MediaDecoderReader* aReader);

  nsresult Init(MediaDecoder* aDecoder);

  void SetMediaDecoderReaderWrapperCallback();
  void CancelMediaDecoderReaderWrapperCallback();

  // Enumeration for the valid decoding states
  enum State {
    DECODER_STATE_DECODING_METADATA,
    DECODER_STATE_WAIT_FOR_CDM,
    DECODER_STATE_DORMANT,
    DECODER_STATE_DECODING_FIRSTFRAME,
    DECODER_STATE_DECODING,
    DECODER_STATE_SEEKING,
    DECODER_STATE_BUFFERING,
    DECODER_STATE_COMPLETED,
    DECODER_STATE_SHUTDOWN
  };

  void DumpDebugInfo();

  void AddOutputStream(ProcessedMediaStream* aStream, bool aFinishWhenEnded);
  // Remove an output stream added with AddOutputStream.
  void RemoveOutputStream(MediaStream* aStream);

  // Seeks the decoder to aTarget asynchronously.
  RefPtr<MediaDecoder::SeekPromise> InvokeSeek(SeekTarget aTarget);

  // Set/Unset dormant state.
  void DispatchSetDormant(bool aDormant);

  RefPtr<ShutdownPromise> BeginShutdown();

  // Notifies the state machine that it should minimize the number of decoded
  // samples we preroll, until playback starts. The first time playback starts
  // the state machine is free to return to prerolling normally. Note
  // "prerolling" in this context refers to when we decode and buffer decoded
  // samples in advance of when they're needed for playback.
  void DispatchMinimizePrerollUntilPlaybackStarts()
  {
    RefPtr<MediaDecoderStateMachine> self = this;
    nsCOMPtr<nsIRunnable> r = NS_NewRunnableFunction([self] () -> void
    {
      MOZ_ASSERT(self->OnTaskQueue());
      self->mMinimizePreroll = true;

      // Make sure that this arrives before playback starts, otherwise this won't
      // have the intended effect.
      MOZ_DIAGNOSTIC_ASSERT(self->mPlayState == MediaDecoder::PLAY_STATE_LOADING);
    });
    OwnerThread()->Dispatch(r.forget());
  }

  // Set the media fragment end time. aEndTime is in microseconds.
  void DispatchSetFragmentEndTime(int64_t aEndTime)
  {
    RefPtr<MediaDecoderStateMachine> self = this;
    nsCOMPtr<nsIRunnable> r = NS_NewRunnableFunction([self, aEndTime] () {
      self->mFragmentEndTime = aEndTime;
    });
    OwnerThread()->Dispatch(r.forget());
  }

  void DispatchAudioOffloading(bool aAudioOffloading)
  {
    RefPtr<MediaDecoderStateMachine> self = this;
    nsCOMPtr<nsIRunnable> r = NS_NewRunnableFunction([=] () {
      if (self->mAudioOffloading != aAudioOffloading) {
        self->mAudioOffloading = aAudioOffloading;
        self->ScheduleStateMachine();
      }
    });
    OwnerThread()->Dispatch(r.forget());
  }

  // Drop reference to mResource. Only called during shutdown dance.
  void BreakCycles() {
    MOZ_ASSERT(NS_IsMainThread());
    mResource = nullptr;
  }

  TimedMetadataEventSource& TimedMetadataEvent() {
    return mMetadataManager.TimedMetadataEvent();
  }

  MediaEventSource<void>& OnMediaNotSeekable() const;

  MediaEventSourceExc<nsAutoPtr<MediaInfo>,
                      nsAutoPtr<MetadataTags>,
                      MediaDecoderEventVisibility>&
  MetadataLoadedEvent() { return mMetadataLoadedEvent; }

  MediaEventSourceExc<nsAutoPtr<MediaInfo>,
                      MediaDecoderEventVisibility>&
  FirstFrameLoadedEvent() { return mFirstFrameLoadedEvent; }

  MediaEventSource<MediaEventType>&
  OnPlaybackEvent() { return mOnPlaybackEvent; }

  size_t SizeOfVideoQueue() const;

  size_t SizeOfAudioQueue() const;

private:
  static const char* ToStateStr(State aState);
  const char* ToStateStr();

  // Functions used by assertions to ensure we're calling things
  // on the appropriate threads.
  bool OnTaskQueue() const;

  // Initialization that needs to happen on the task queue. This is the first
  // task that gets run on the task queue, and is dispatched from the MDSM
  // constructor immediately after the task queue is created.
  void InitializationTask(MediaDecoder* aDecoder);

  void SetDormant(bool aDormant);

  void SetAudioCaptured(bool aCaptured);

  void ReadMetadata();

  RefPtr<MediaDecoder::SeekPromise> Seek(SeekTarget aTarget);

  RefPtr<ShutdownPromise> Shutdown();

  RefPtr<ShutdownPromise> FinishShutdown();

  // Update the playback position. This can result in a timeupdate event
  // and an invalidate of the frame being dispatched asynchronously if
  // there is no such event currently queued.
  // Only called on the decoder thread. Must be called with
  // the decode monitor held.
  void UpdatePlaybackPosition(int64_t aTime);

  // Causes the state machine to switch to buffering state, and to
  // immediately stop playback and buffer downloaded data. Called on
  // the state machine thread.
  void StartBuffering();

  bool CanPlayThrough();

  MediaStatistics GetStatistics();

  // This is called on the state machine thread and audio thread.
  // The decoder monitor must be obtained before calling this.
  bool HasAudio() const {
    MOZ_ASSERT(OnTaskQueue());
    return mInfo.HasAudio();
  }

  // This is called on the state machine thread and audio thread.
  // The decoder monitor must be obtained before calling this.
  bool HasVideo() const {
    MOZ_ASSERT(OnTaskQueue());
    return mInfo.HasVideo();
  }

  // Should be called by main thread.
  bool HaveNextFrameData();

  // Returns the state machine task queue.
  TaskQueue* OwnerThread() const { return mTaskQueue; }

  // Schedules the shared state machine thread to run the state machine.
  void ScheduleStateMachine();

  // Invokes ScheduleStateMachine to run in |aMicroseconds| microseconds,
  // unless it's already scheduled to run earlier, in which case the
  // request is discarded.
  void ScheduleStateMachineIn(int64_t aMicroseconds);

  // Discard audio/video data that are already played by MSG.
  void DiscardStreamData();
  bool HaveEnoughDecodedAudio();
  bool HaveEnoughDecodedVideo();

  // True if shutdown process has begun.
  bool IsShutdown() const;

  // Returns true if we're currently playing. The decoder monitor must
  // be held.
  bool IsPlaying() const;

  // TODO: These callback functions may receive demuxed-only data.
  // Need to figure out a suitable API name for this case.
  void OnAudioDecoded(MediaData* aAudioSample);
  void OnVideoDecoded(MediaData* aVideoSample, TimeStamp aDecodeStartTime);
  void OnNotDecoded(MediaData::Type aType, MediaDecoderReader::NotDecodedReason aReason);

  // Resets all state related to decoding and playback, emptying all buffers
  // and aborting all pending operations on the decode task queue.
  void Reset(TrackSet aTracks = TrackSet(TrackInfo::kAudioTrack,
                                         TrackInfo::kVideoTrack));

protected:
  virtual ~MediaDecoderStateMachine();

  void SetState(State aState);
  void ExitState();
  void EnterState();

  void BufferedRangeUpdated();

  void ReaderSuspendedChanged();

  // Inserts MediaData* samples into their respective MediaQueues.
  // aSample must not be null.
  void Push(MediaData* aSample, MediaData::Type aSampleType);

  void OnAudioPopped(const RefPtr<MediaData>& aSample);
  void OnVideoPopped(const RefPtr<MediaData>& aSample);

  void AudioAudibleChanged(bool aAudible);

  void VolumeChanged();
  void LogicalPlaybackRateChanged();
  void PreservesPitchChanged();

  MediaQueue<MediaData>& AudioQueue() { return mAudioQueue; }
  MediaQueue<MediaData>& VideoQueue() { return mVideoQueue; }

  // True if our buffers of decoded audio are not full, and we should
  // decode more.
  bool NeedToDecodeAudio();

  // True if our buffers of decoded video are not full, and we should
  // decode more.
  bool NeedToDecodeVideo();

  // Returns true if we've got less than aAudioUsecs microseconds of decoded
  // and playable data. The decoder monitor must be held.
  //
  // May not be invoked when mReader->UseBufferingHeuristics() is false.
  bool HasLowDecodedData(int64_t aAudioUsecs);

  bool OutOfDecodedAudio();

  bool OutOfDecodedVideo()
  {
    MOZ_ASSERT(OnTaskQueue());
    return IsVideoDecoding() && VideoQueue().GetSize() <= 1;
  }

  // Returns true if we're running low on data which is not yet decoded.
  // The decoder monitor must be held.
  bool HasLowUndecodedData();

  // Returns true if we have less than aUsecs of undecoded data available.
  bool HasLowUndecodedData(int64_t aUsecs);

  // Returns true when there's decoded audio waiting to play.
  // The decoder monitor must be held.
  bool HasFutureAudio();

  // Returns true if we recently exited "quick buffering" mode.
  bool JustExitedQuickBuffering();

  // Recomputes mNextFrameStatus, possibly dispatching notifications to interested
  // parties.
  void UpdateNextFrameStatus();

  // Return the current time, either the audio clock if available (if the media
  // has audio, and playback is possible), or a clock for the video.
  // Called on the state machine thread.
  // If aTimeStamp is non-null, set *aTimeStamp to the TimeStamp corresponding
  // to the returned stream time.
  int64_t GetClock(TimeStamp* aTimeStamp = nullptr) const;

  void SetStartTime(int64_t aStartTimeUsecs);

  // Update only the state machine's current playback position (and duration,
  // if unknown). Does not update the playback position on the decoder or
  // media element -- use UpdatePlaybackPosition for that. Called on the state
  // machine thread, caller must hold the decoder lock.
  void UpdatePlaybackPositionInternal(int64_t aTime);

  // Update playback position and trigger next update by default time period.
  // Called on the state machine thread.
  void UpdatePlaybackPositionPeriodically();

  media::MediaSink* CreateAudioSink();

  // Always create mediasink which contains an AudioSink or StreamSink inside.
  already_AddRefed<media::MediaSink> CreateMediaSink(bool aAudioCaptured);

  // Stops the media sink and shuts it down.
  // The decoder monitor must be held with exactly one lock count.
  // Called on the state machine thread.
  void StopMediaSink();

  // Create and start the media sink.
  // The decoder monitor must be held with exactly one lock count.
  // Called on the state machine thread.
  void StartMediaSink();

  // Notification method invoked when mPlayState changes.
  void PlayStateChanged();

  // Notification method invoked when mIsVisible changes.
  void VisibilityChanged();

  // Sets internal state which causes playback of media to pause.
  // The decoder monitor must be held.
  void StopPlayback();

  // If the conditions are right, sets internal state which causes playback
  // of media to begin or resume.
  // Must be called with the decode monitor held.
  void MaybeStartPlayback();

  // Check to see if we don't have enough data to play up to the next frame.
  // If we don't, switch to buffering mode.
  void MaybeStartBuffering();

  // The entry action of DECODER_STATE_DECODING_FIRSTFRAME.
  void DecodeFirstFrame();

  // The entry action of DECODER_STATE_DECODING.
  void StartDecoding();

  // Moves the decoder into the shutdown state, and dispatches an error
  // event to the media element. This begins shutting down the decoder.
  // The decoder monitor must be held. This is only called on the
  // decode thread.
  void DecodeError();

  // Dispatches a LoadedMetadataEvent.
  // This is threadsafe and can be called on any thread.
  // The decoder monitor must be held.
  void EnqueueLoadedMetadataEvent();

  void EnqueueFirstFrameLoadedEvent();

  // Clears any previous seeking state and initiates a new seek on the decoder.
  RefPtr<MediaDecoder::SeekPromise> InitiateSeek(SeekJob aSeekJob);

  void DispatchAudioDecodeTaskIfNeeded();
  void DispatchVideoDecodeTaskIfNeeded();

  // Dispatch a task to decode audio if there is not one already queued.
  void EnsureAudioDecodeTaskQueued();

  // Dispatch a task to decode video if there is not one already queued.
  void EnsureVideoDecodeTaskQueued();

  // Start a task to decode audio.
  // The decoder monitor must be held.
  void RequestAudioData();

  // Start a task to decode video.
  // The decoder monitor must be held.
  void RequestVideoData();

  // Re-evaluates the state and determines whether we need to dispatch
  // events to run the decode, or, if not, whether we should set the reader
  // to idle mode. This is threadsafe, and can be called from any thread.
  // The decoder monitor must be held.
  void DispatchDecodeTasksIfNeeded();

  // Returns the "media time". This is the absolute time which the media
  // playback has reached. i.e. this returns values in the range
  // [mStartTime, mEndTime], and mStartTime will not be 0 if the media does
  // not start at 0. Note this is different than the "current playback position",
  // which is in the range [0,duration].
  int64_t GetMediaTime() const {
    MOZ_ASSERT(OnTaskQueue());
    return mCurrentPosition;
  }

  // Returns an upper bound on the number of microseconds of audio that is
  // decoded and playable. This is the sum of the number of usecs of audio which
  // is decoded and in the reader's audio queue, and the usecs of unplayed audio
  // which has been pushed to the audio hardware for playback. Note that after
  // calling this, the audio hardware may play some of the audio pushed to
  // hardware, so this can only be used as an upper bound. The decoder monitor
  // must be held when calling this. Called on the decode thread.
  int64_t GetDecodedAudioDuration();

  // Promise callbacks for metadata reading.
  void OnMetadataRead(MetadataHolder* aMetadata);
  void OnMetadataNotRead(ReadMetadataFailureReason aReason);

  // Notify FirstFrameLoaded if we have decoded the first frames, and
  // transition to SEEKING if there is any pending seek, or DECODING otherwise.
  void MaybeFinishDecodeFirstFrame();

  void FinishDecodeFirstFrame();

  // Completes the seek operation, moves onto the next appropriate state.
  void SeekCompleted();

  // Queries our state to see whether the decode has finished for all streams.
  bool CheckIfDecodeComplete();

  // Performs one "cycle" of the state machine.
  void RunStateMachine();
  // Perform one cycle of the DECODING state.
  void StepDecoding();
  // Perform one cycle of the BUFFERING state.
  void StepBuffering();
  // Perform one cycle of the COMPLETED state.
  void StepCompleted();

  bool IsStateMachineScheduled() const;

  // These return true if the respective stream's decode has not yet reached
  // the end of stream.
  bool IsAudioDecoding();
  bool IsVideoDecoding();

private:
  // Resolved by the MediaSink to signal that all audio/video outstanding
  // work is complete and identify which part (a/v) of the sink is shutting down.
  void OnMediaSinkAudioComplete();
  void OnMediaSinkVideoComplete();

  // Rejected by the MediaSink to signal errors for audio/video.
  void OnMediaSinkAudioError();
  void OnMediaSinkVideoError();

  // Return true if the video decoder's decode speed cannot catch up with the
  // playback time.
  bool NeedToSkipToNextKeyframe();

  void* const mDecoderID;
  const RefPtr<FrameStatistics> mFrameStats;
  const RefPtr<VideoFrameContainer> mVideoFrameContainer;
  const dom::AudioChannel mAudioChannel;

  // Task queue for running the state machine.
  RefPtr<TaskQueue> mTaskQueue;

  // State-watching manager.
  WatchManager<MediaDecoderStateMachine> mWatchManager;

  // True if we've dispatched a task to run the state machine but the task has
  // yet to run.
  bool mDispatchedStateMachine;

  // Used to dispatch another round of scheduling with a specific target time.
  DelayedScheduler mDelayedScheduler;

  // Queue of audio frames. This queue is threadsafe, and is accessed from
  // the audio, decoder, state machine, and main threads.
  MediaQueue<MediaData> mAudioQueue;
  // Queue of video frames. This queue is threadsafe, and is accessed from
  // the decoder, state machine, and main threads.
  MediaQueue<MediaData> mVideoQueue;

  // The decoder monitor must be obtained before modifying this state.
  // Accessed on state machine, audio, main, and AV thread.
  Watchable<State> mState;

  // Time that buffering started. Used for buffering timeout and only
  // accessed on the state machine thread. This is null while we're not
  // buffering.
  TimeStamp mBufferingStart;

  media::TimeUnit Duration() const { MOZ_ASSERT(OnTaskQueue()); return mDuration.Ref().ref(); }

  // Recomputes the canonical duration from various sources.
  void RecomputeDuration();

  // FrameID which increments every time a frame is pushed to our queue.
  FrameID mCurrentFrameID;

  // The highest timestamp that our position has reached. Monotonically
  // increasing.
  Watchable<media::TimeUnit> mObservedDuration;

  // Returns true if we're logically playing, that is, if Play() has
  // been called and Pause() has not or we have not yet reached the end
  // of media. This is irrespective of the seeking state; if the owner
  // calls Play() and then Seek(), we still count as logically playing.
  // The decoder monitor must be held.
  bool IsLogicallyPlaying()
  {
    MOZ_ASSERT(OnTaskQueue());
    return mPlayState == MediaDecoder::PLAY_STATE_PLAYING ||
           mNextPlayState == MediaDecoder::PLAY_STATE_PLAYING;
  }

  // Queued seek - moves to mCurrentSeek when DecodeFirstFrame completes.
  SeekJob mQueuedSeek;
  SeekJob mCurrentSeek;

  // mSeekTask is responsible for executing the current seek request.
  RefPtr<SeekTask> mSeekTask;
  MozPromiseRequestHolder<SeekTask::SeekTaskPromise> mSeekTaskRequest;

  void OnSeekTaskResolved(SeekTaskResolveValue aValue);
  void OnSeekTaskRejected(SeekTaskRejectValue aValue);

  // This method discards the seek task and then gets the ownership of
  // MediaDecoderReaderWrapper back via registering MDSM's callback into it.
  void DiscardSeekTaskIfExist();

  // Media Fragment end time in microseconds. Access controlled by decoder monitor.
  int64_t mFragmentEndTime;

  // The media sink resource. Used on the state machine thread.
  RefPtr<media::MediaSink> mMediaSink;

  const RefPtr<MediaDecoderReaderWrapper> mReader;

  // The end time of the last audio frame that's been pushed onto the media sink
  // in microseconds. This will approximately be the end time
  // of the audio stream, unless another frame is pushed to the hardware.
  int64_t AudioEndTime() const;

  // The end time of the last rendered video frame that's been sent to the
  // compositor.
  int64_t VideoEndTime() const;

  // The end time of the last decoded audio frame. This signifies the end of
  // decoded audio data. Used to check if we are low in decoded data.
  int64_t mDecodedAudioEndTime;

  // The end time of the last decoded video frame. Used to check if we are low
  // on decoded video data.
  int64_t mDecodedVideoEndTime;

  // Playback rate. 1.0 : normal speed, 0.5 : two times slower.
  double mPlaybackRate;

  // Time at which we started decoding. Synchronised via decoder monitor.
  TimeStamp mDecodeStartTime;

  // The maximum number of seconds we spend buffering when we are short on
  // unbuffered data.
  uint32_t mBufferingWait;
  int64_t mLowDataThresholdUsecs;

  // If we've got more than this number of decoded video frames waiting in
  // the video queue, we will not decode any more video frames until some have
  // been consumed by the play state machine thread.
  // Must hold monitor.
  uint32_t GetAmpleVideoFrames() const;

  // Low audio threshold. If we've decoded less than this much audio we
  // consider our audio decode "behind", and we may skip video decoding
  // in order to allow our audio decoding to catch up. We favour audio
  // decoding over video. We increase this threshold if we're slow to
  // decode video frames, in order to reduce the chance of audio underruns.
  // Note that we don't ever reset this threshold, it only ever grows as
  // we detect that the decode can't keep up with rendering.
  int64_t mLowAudioThresholdUsecs;

  // Our "ample" audio threshold. Once we've decoded this much audio, we
  // pause decoding. If we increase mLowAudioThresholdUsecs, we'll also
  // increase this too appropriately (we don't want mLowAudioThresholdUsecs
  // to be greater than ampleAudioThreshold, else we'd stop decoding!).
  // Note that we don't ever reset this threshold, it only ever grows as
  // we detect that the decode can't keep up with rendering.
  int64_t mAmpleAudioThresholdUsecs;

  // If we're quick buffering, we'll remain in buffering mode while we have less than
  // QUICK_BUFFERING_LOW_DATA_USECS of decoded data available.
  int64_t mQuickBufferingLowDataThresholdUsecs;

  // At the start of decoding we want to "preroll" the decode until we've
  // got a few frames decoded before we consider whether decode is falling
  // behind. Otherwise our "we're falling behind" logic will trigger
  // unnecessarily if we start playing as soon as the first sample is
  // decoded. These two fields store how many video frames and audio
  // samples we must consume before we are considered to be finished prerolling.
  uint32_t AudioPrerollUsecs() const
  {
    MOZ_ASSERT(OnTaskQueue());
    return mAmpleAudioThresholdUsecs / 2;
  }

  uint32_t VideoPrerollFrames() const
  {
    MOZ_ASSERT(OnTaskQueue());
    return GetAmpleVideoFrames() / 2;
  }

  bool DonePrerollingAudio()
  {
    MOZ_ASSERT(OnTaskQueue());
    return !IsAudioDecoding() ||
           GetDecodedAudioDuration() >= AudioPrerollUsecs() * mPlaybackRate;
  }

  bool DonePrerollingVideo()
  {
    MOZ_ASSERT(OnTaskQueue());
    return !mIsVisible ||
           !IsVideoDecoding() ||
           static_cast<uint32_t>(VideoQueue().GetSize()) >=
             VideoPrerollFrames() * mPlaybackRate + 1;
  }

  void StopPrerollingAudio()
  {
    MOZ_ASSERT(OnTaskQueue());
    if (mIsAudioPrerolling) {
      mIsAudioPrerolling = false;
      ScheduleStateMachine();
    }
  }

  void StopPrerollingVideo()
  {
    MOZ_ASSERT(OnTaskQueue());
    if (mIsVideoPrerolling) {
      mIsVideoPrerolling = false;
      ScheduleStateMachine();
    }
  }

  // When we start decoding (either for the first time, or after a pause)
  // we may be low on decoded data. We don't want our "low data" logic to
  // kick in and decide that we're low on decoded data because the download
  // can't keep up with the decode, and cause us to pause playback. So we
  // have a "preroll" stage, where we ignore the results of our "low data"
  // logic during the first few frames of our decode. This occurs during
  // playback. The flags below are true when the corresponding stream is
  // being "prerolled".
  bool mIsAudioPrerolling;
  bool mIsVideoPrerolling;

  // Only one of a given pair of ({Audio,Video}DataPromise, WaitForDataPromise)
  // should exist at any given moment.

  MediaEventListener mAudioCallback;
  MediaEventListener mVideoCallback;
  MediaEventListener mAudioWaitCallback;
  MediaEventListener mVideoWaitCallback;

  const char* AudioRequestStatus() const;
  const char* VideoRequestStatus() const;

  void OnSuspendTimerResolved();
  void OnSuspendTimerRejected();

  // True if we shouldn't play our audio (but still write it to any capturing
  // streams). When this is true, the audio thread will never start again after
  // it has stopped.
  bool mAudioCaptured;

  // True if the audio playback thread has finished. It is finished
  // when either all the audio frames have completed playing, or we've moved
  // into shutdown state, and the threads are to be
  // destroyed. Written by the audio playback thread and read and written by
  // the state machine thread. Synchronised via decoder monitor.
  // When data is being sent to a MediaStream, this is true when all data has
  // been written to the MediaStream.
  Watchable<bool> mAudioCompleted;

  // True if all video frames are already rendered.
  Watchable<bool> mVideoCompleted;

  // True if we need to enter dormant state after reading metadata. Note that
  // we can't enter dormant state until reading metadata is done, due to some
  // limitations of the reader.
  bool mPendingDormant = false;

  // Flag whether we notify metadata before decoding the first frame or after.
  //
  // Note that the odd semantics here are designed to replicate the current
  // behavior where we notify the decoder each time we come out of dormant, but
  // send suppressed event visibility for those cases. This code can probably be
  // simplified.
  bool mNotifyMetadataBeforeFirstFrame;

  // If this is true while we're in buffering mode, we can exit early,
  // as it's likely we may be able to play back. This happens when we enter
  // buffering mode soon after the decode starts, because the decode-ahead
  // ran fast enough to exhaust all data while the download is starting up.
  // Synchronised via decoder monitor.
  bool mQuickBuffering;

  // True if we should not decode/preroll unnecessary samples, unless we're
  // played. "Prerolling" in this context refers to when we decode and
  // buffer decoded samples in advance of when they're needed for playback.
  // This flag is set for preload=metadata media, and means we won't
  // decode more than the first video frame and first block of audio samples
  // for that media when we startup, or after a seek. When Play() is called,
  // we reset this flag, as we assume the user is playing the media, so
  // prerolling is appropriate then. This flag is used to reduce the overhead
  // of prerolling samples for media elements that may not play, both
  // memory and CPU overhead.
  bool mMinimizePreroll;

  // True if the decode thread has filled its buffers and is now
  // waiting to be awakened before it continues decoding. Synchronized
  // by the decoder monitor.
  bool mDecodeThreadWaiting;

  // Track our request for metadata from the reader.
  MozPromiseRequestHolder<MediaDecoderReader::MetadataPromise> mMetadataRequest;

  // Stores presentation info required for playback. The decoder monitor
  // must be held when accessing this.
  MediaInfo mInfo;

  nsAutoPtr<MetadataTags> mMetadataTags;

  mozilla::MediaMetadataManager mMetadataManager;

  // Track our request to update the buffered ranges.
  MozPromiseRequestHolder<MediaDecoderReader::BufferedUpdatePromise> mBufferedUpdateRequest;

  // True if we are back from DECODER_STATE_DORMANT state and
  // LoadedMetadataEvent was already sent.
  bool mSentLoadedMetadataEvent;

  // True if we've decoded first frames (thus having the start time) and
  // notified the FirstFrameLoaded event. Note we can't initiate seek until the
  // start time is known, which happens when the first frames are decoded or we
  // are playing an MSE stream (the start time is always assumed 0).
  bool mSentFirstFrameLoadedEvent;

  bool mSentPlaybackEndedEvent;

  // True if video decoding is suspended.
  bool mVideoDecodeSuspended;

  // Track enabling video decode suspension via timer.
  DelayedScheduler mVideoDecodeSuspendTimer;

  // Data about MediaStreams that are being fed by the decoder.
  const RefPtr<OutputStreamManager> mOutputStreamManager;

  // Media data resource from the decoder.
  RefPtr<MediaResource> mResource;

  // Track the complete & error for audio/video separately.
  MozPromiseRequestHolder<GenericPromise> mMediaSinkAudioPromise;
  MozPromiseRequestHolder<GenericPromise> mMediaSinkVideoPromise;

  MediaEventListener mAudioQueueListener;
  MediaEventListener mVideoQueueListener;
  MediaEventListener mAudibleListener;

  MediaEventProducerExc<nsAutoPtr<MediaInfo>,
                        nsAutoPtr<MetadataTags>,
                        MediaDecoderEventVisibility> mMetadataLoadedEvent;
  MediaEventProducerExc<nsAutoPtr<MediaInfo>,
                        MediaDecoderEventVisibility> mFirstFrameLoadedEvent;

  MediaEventProducer<MediaEventType> mOnPlaybackEvent;

  // True if audio is offloading.
  // Playback will not start when audio is offloading.
  bool mAudioOffloading;

#ifdef MOZ_EME
  void OnCDMProxyReady(RefPtr<CDMProxy> aProxy);
  void OnCDMProxyNotReady();
  RefPtr<CDMProxy> mCDMProxy;
  MozPromiseRequestHolder<MediaDecoder::CDMProxyPromise> mCDMProxyPromise;
#endif

private:
  // The buffered range. Mirrored from the decoder thread.
  Mirror<media::TimeIntervals> mBuffered;

  Mirror<bool> mIsReaderSuspended;

  // The duration according to the demuxer's current estimate, mirrored from the main thread.
  Mirror<media::NullableTimeUnit> mEstimatedDuration;

  // The duration explicitly set by JS, mirrored from the main thread.
  Mirror<Maybe<double>> mExplicitDuration;

  // The current play state and next play state, mirrored from the main thread.
  Mirror<MediaDecoder::PlayState> mPlayState;
  Mirror<MediaDecoder::PlayState> mNextPlayState;

  // Volume of playback. 0.0 = muted. 1.0 = full volume.
  Mirror<double> mVolume;

  // TODO: The separation between mPlaybackRate and mLogicalPlaybackRate is a
  // kludge to preserve existing fragile logic while converting this setup to
  // state-mirroring. Some hero should clean this up.
  Mirror<double> mLogicalPlaybackRate;

  // Pitch preservation for the playback rate.
  Mirror<bool> mPreservesPitch;

  // True if the media is same-origin with the element. Data can only be
  // passed to MediaStreams when this is true.
  Mirror<bool> mSameOriginMedia;

  // An identifier for the principal of the media. Used to track when
  // main-thread induced principal changes get reflected on MSG thread.
  Mirror<PrincipalHandle> mMediaPrincipalHandle;

  // Estimate of the current playback rate (bytes/second).
  Mirror<double> mPlaybackBytesPerSecond;

  // True if mPlaybackBytesPerSecond is a reliable estimate.
  Mirror<bool> mPlaybackRateReliable;

  // Current decoding position in the stream.
  Mirror<int64_t> mDecoderPosition;

  // True if the media is seekable (i.e. supports random access).
  Mirror<bool> mMediaSeekable;

  // True if the media is seekable only in buffered ranges.
  Mirror<bool> mMediaSeekableOnlyInBufferedRanges;

  // IsVisible, mirrored from the media decoder.
  Mirror<bool> mIsVisible;

  // Duration of the media. This is guaranteed to be non-null after we finish
  // decoding the first frame.
  Canonical<media::NullableTimeUnit> mDuration;

  // Whether we're currently in or transitioning to shutdown state.
  Canonical<bool> mIsShutdown;

  // The status of our next frame. Mirrored on the main thread and used to
  // compute ready state.
  Canonical<NextFrameStatus> mNextFrameStatus;

  // The time of the current frame in microseconds, corresponding to the "current
  // playback position" in HTML5. This is referenced from 0, which is the initial
  // playback position.
  Canonical<int64_t> mCurrentPosition;

  // Current playback position in the stream in bytes.
  Canonical<int64_t> mPlaybackOffset;

  // Used to distinguish whether the audio is producing sound.
  Canonical<bool> mIsAudioDataAudible;

public:
  AbstractCanonical<media::TimeIntervals>* CanonicalBuffered() const;

  AbstractCanonical<media::NullableTimeUnit>* CanonicalDuration() {
    return &mDuration;
  }
  AbstractCanonical<bool>* CanonicalIsShutdown() {
    return &mIsShutdown;
  }
  AbstractCanonical<NextFrameStatus>* CanonicalNextFrameStatus() {
    return &mNextFrameStatus;
  }
  AbstractCanonical<int64_t>* CanonicalCurrentPosition() {
    return &mCurrentPosition;
  }
  AbstractCanonical<int64_t>* CanonicalPlaybackOffset() {
    return &mPlaybackOffset;
  }
  AbstractCanonical<bool>* CanonicalIsAudioDataAudible() {
    return &mIsAudioDataAudible;
  }
};

} // namespace mozilla

#endif