/* -*- Mode: C++; tab-width: 2; indent-tabs-mode: nil; c-basic-offset: 2 -*- */
/* vim:set ts=2 sw=2 sts=2 et cindent: */
/* This Source Code Form is subject to the terms of the Mozilla Public
 * License, v. 2.0. If a copy of the MPL was not distributed with this
 * file, You can obtain one at http://mozilla.org/MPL/2.0/. */

/*
Each media element for a media file has one thread called the "audio thread".

The audio thread writes the decoded audio data to the audio
hardware. This is done in a separate thread to ensure that the
audio hardware gets a constant stream of data without
interruption due to decoding or display. At some point
AudioStream will be refactored to have a callback interface
where it asks for data and this thread will no longer be
needed.

The element/state machine also has a MediaTaskQueue which runs in a
SharedThreadPool that is shared with all other elements/decoders. The state
machine dispatches tasks to this to call into the MediaDecoderReader to
request decoded audio or video data. The Reader will call back with decoded
samples when it has them available, and the state machine places the decoded
samples into its queues for the consuming threads to pull from.

The MediaDecoderReader can choose to decode asynchronously, or synchronously
and return requested samples synchronously inside its Request*Data()
functions via callback. Asynchronous decoding is preferred, and should be
used for any new readers.
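
For example, an asynchronous audio request roughly follows this shape (a
hedged sketch only; the exact promise-chaining method names are assumptions,
not a copy of the implementation):

  // Runs on the decode task queue: ask the reader for one more audio sample
  // and register the state machine's existing callbacks for the result.
  mAudioDataRequest.Begin(
    mReader->RequestAudioData()
      ->Then(DecodeTaskQueue(), __func__, this,
             &MediaDecoderStateMachine::OnAudioDecoded,      // sample arrived
             &MediaDecoderStateMachine::OnAudioNotDecoded)); // EOS or error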

Synchronisation of state between the threads is done via a monitor owned
by MediaDecoder.

The lifetime of the audio thread is controlled by the state machine when
it runs on the shared state machine thread. When playback needs to occur
the audio thread is created and an event dispatched to run it. The audio
thread exits when audio playback is completed or no longer required.

A/V synchronisation is handled by the state machine. It examines the audio
playback time and compares this to the next frame in the queue of video
frames. If it is time to play the video frame it is then displayed, otherwise
it schedules the state machine to run again at the time of the next frame.
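
In pseudocode, one cycle of that A/V sync check looks roughly like this (an
illustrative sketch only; the real logic lives in AdvanceFrame() and handles
many more cases, such as buffering and dropped frames):

  // Runs on the shared state machine thread with the decoder monitor held.
  int64_t clock = GetClock();                    // audio clock when audio exists
  VideoData* frame = VideoQueue().PeekFront();
  if (frame && frame->mTime <= clock) {
    RenderVideoFrame(frame, TimeStamp::Now());   // frame is due: display it
    VideoQueue().PopFront();
  } else if (frame) {
    ScheduleStateMachine(frame->mTime - clock);  // wake up when it is due
  }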

Frame skipping is done in the following ways:

  1) The state machine will skip all frames in the video queue whose
     display time is less than the current audio time. This ensures
     the correct frame for the current time is always displayed.

  2) The decode tasks will stop decoding interframes and read to the
     next keyframe if it determines that decoding the remaining
     interframes will cause playback issues. It detects this by:
       a) If the amount of audio data in the audio queue drops
          below a threshold whereby audio may start to skip.
       b) If the video queue drops below a threshold where it
          will be decoding video data that won't be displayed due
          to the decode thread dropping the frame immediately.
     TODO: In future we should only do this when the Reader is decoding
           synchronously.
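
A condensed sketch of that skip-ahead decision (illustrative only; the actual
heuristic lives in NeedToSkipToNextKeyframe() and uses more inputs):

  // Decide whether the next video decode should jump ahead to a keyframe.
  bool skipToNextKeyframe =
      HasLowDecodedData(mLowAudioThresholdUsecs) ||  // audio about to skip
      static_cast<uint32_t>(VideoQueue().GetSize()) < mAmpleVideoFrames / 2;
  // The value is then passed along with the next video decode request.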

When hardware accelerated graphics is not available, YCbCr conversion
is done on the decode task queue when video frames are decoded.

The decode task queue pushes decoded audio and video frames into two
separate queues - one for audio and one for video. These are kept
separate to make it easy to constantly feed audio data to the audio
hardware while allowing frame skipping of video data. These queues are
threadsafe, and neither the decode, audio, nor state machine threads
should be able to monopolize them and cause starvation of the other threads.

Both queues are bounded by a maximum size. When this size is reached
the decode tasks will no longer request video or audio depending on the
queue that has reached the threshold. If both queues are full, no more
decode tasks will be dispatched to the decode task queue, so other
decoders will have an opportunity to run.
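
The back-pressure this creates is roughly the following (a sketch, not the
literal code; the real checks also account for prerolling and for requests
that are already in flight):

  // Before dispatching another decode task, check the queue bounds.
  if (NeedToDecodeAudio()) {
    DispatchAudioDecodeTaskIfNeeded();  // audio queue is below its maximum
  }
  if (NeedToDecodeVideo()) {
    DispatchVideoDecodeTaskIfNeeded();  // video queue is below its maximum
  }
  // If neither predicate holds, no task is queued, and the shared
  // SharedThreadPool is free to run other decoders.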

During playback the audio thread will be idle (via a Wait() on the
monitor) if the audio queue is empty. Otherwise it constantly pops
audio data off the queue and plays it with a blocking write to the audio
hardware (via AudioStream).
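
Conceptually the audio loop is (simplified sketch; 'mon', 'audioStream' and
'done' are stand-ins, and the real loop in AudioSink also handles draining,
errors and playback rate changes):

  // Audio thread: block while there is nothing to play, otherwise push
  // decoded frames to the hardware with a blocking write.
  while (!done) {
    if (AudioQueue().GetSize() == 0) {
      mon.Wait();                      // woken when the decoder pushes data
      continue;
    }
    nsRefPtr<AudioData> sample = AudioQueue().PopFront();
    audioStream->Write(sample->mAudioData.get(), sample->mFrames);  // blocks
  }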
*/

#if !defined(MediaDecoderStateMachine_h__)
#define MediaDecoderStateMachine_h__

#include "mozilla/Attributes.h"
#include "nsThreadUtils.h"
#include "MediaDecoder.h"
#include "mozilla/ReentrantMonitor.h"
#include "MediaDecoderReader.h"
#include "MediaDecoderOwner.h"
#include "MediaMetadataManager.h"
#include "MediaDecoderStateMachineScheduler.h"

class nsITimer;

namespace mozilla {

class AudioSegment;
class VideoSegment;
class MediaTaskQueue;
class SharedThreadPool;
class AudioSink;
class MediaDecoderStateMachineScheduler;

// GetCurrentTime is defined in winbase.h as zero argument macro forwarding to
// GetTickCount() and conflicts with MediaDecoderStateMachine::GetCurrentTime
// implementation.
#ifdef GetCurrentTime
#undef GetCurrentTime
#endif

/*
  The state machine class. This manages the decoding and seeking in the
  MediaDecoderReader on the decode task queue, and A/V sync on the shared
  state machine thread, and controls the audio "push" thread.

  All internal state is synchronised via the decoder monitor. State changes
  are either propagated by NotifyAll on the monitor (typically when state
  changes need to be propagated to non-state machine threads) or by scheduling
  the state machine to run another cycle on the shared state machine thread.

  See MediaDecoder.h for more details.
*/
class MediaDecoderStateMachine
{
  friend class AudioSink;
  NS_INLINE_DECL_THREADSAFE_REFCOUNTING(MediaDecoderStateMachine)
public:
  typedef MediaDecoder::DecodedStreamData DecodedStreamData;
  MediaDecoderStateMachine(MediaDecoder* aDecoder,
                           MediaDecoderReader* aReader,
                           bool aRealTime = false);

  nsresult Init(MediaDecoderStateMachine* aCloneDonor);

  // Enumeration for the valid decoding states
  enum State {
    DECODER_STATE_DECODING_NONE,
    DECODER_STATE_DECODING_METADATA,
    DECODER_STATE_WAIT_FOR_RESOURCES,
    DECODER_STATE_DECODING_FIRSTFRAME,
    DECODER_STATE_DORMANT,
    DECODER_STATE_DECODING,
    DECODER_STATE_SEEKING,
    DECODER_STATE_BUFFERING,
    DECODER_STATE_COMPLETED,
    DECODER_STATE_SHUTDOWN
  };

  State GetState() {
    AssertCurrentThreadInMonitor();
    return mState;
  }

  // Set the audio volume. The decoder monitor must be obtained before
  // calling this.
  void SetVolume(double aVolume);
  void SetAudioCaptured();

  // Check if the decoder needs to enter the dormant state.
  bool IsDormantNeeded();
  // Set/Unset dormant state.
  void SetDormant(bool aDormant);
  void Shutdown();
  void ShutdownReader();
  void FinishShutdown();

  bool IsRealTime() const;

  // Called from the main thread to get the duration. The decoder monitor
  // must be obtained before calling this. It is in units of microseconds.
  int64_t GetDuration();

  // Time of the last frame in the media, in microseconds or INT64_MAX if
  // media has an infinite duration.
  // Accessed on state machine, decode, and main threads.
  // Access controlled by decoder monitor.
  int64_t GetEndTime();

  // Called from the main thread to set the duration of the media resource
  // if it is able to be obtained via HTTP headers. Called from the
  // state machine thread to set the duration if it is obtained from the
  // media metadata. The decoder monitor must be obtained before calling this.
  // aDuration is in microseconds.
  // A value of INT64_MAX will be treated as infinity.
  void SetDuration(int64_t aDuration);

  // Called while decoding metadata to set the end time of the media
  // resource. The decoder monitor must be obtained before calling this.
  // aEndTime is in microseconds.
  void SetMediaEndTime(int64_t aEndTime);

  // Called from the main thread to update the duration with an estimated value.
  // The duration is only changed if it's significantly different from the
  // current duration, as the incoming duration is an estimate and so is
  // often unstable as more data is read and the estimate is updated.
  // Can result in a durationchange event. aDuration is in microseconds.
  void UpdateEstimatedDuration(int64_t aDuration);

  // Functions used by assertions to ensure we're calling things
  // on the appropriate threads.
  bool OnDecodeThread() const;
  bool OnStateMachineThread() const;

  MediaDecoderOwner::NextFrameStatus GetNextFrameStatus();

  // Cause state transitions. These methods obtain the decoder monitor
  // to synchronise the change of state, and to notify other threads
  // that the state has changed.
  void Play()
  {
    MOZ_ASSERT(NS_IsMainThread());
    nsRefPtr<nsRunnable> r = NS_NewRunnableMethod(this, &MediaDecoderStateMachine::PlayInternal);
    GetStateMachineThread()->Dispatch(r, NS_DISPATCH_NORMAL);
  }

private:
  // The actual work for the above, which happens asynchronously on the state
  // machine thread.
  void PlayInternal();

public:
  // Seeks the decoder to aTarget asynchronously.
  // Must be called from the main thread.
  void Seek(const SeekTarget& aTarget);

  // Dispatches a task to the main thread to seek to mQueuedSeekTarget.
  // This is threadsafe and can be called on any thread.
  void EnqueueStartQueuedSeekTask();

  // Seeks the decoder to mQueuedSeekTarget asynchronously.
  // Must be called from the main thread.
  void StartQueuedSeek();

  // Seeks the decoder to aTarget asynchronously.
  // Must be called from the main thread.
  // The decoder monitor must be held with exactly one lock count.
  void StartSeek(const SeekTarget& aTarget);

  // Returns the current playback position in seconds.
  // Called from the main thread to get the current frame time. The decoder
  // monitor must be obtained before calling this.
  double GetCurrentTime() const;
  int64_t GetCurrentTimeUs() const;

  // Clear the flag indicating that a playback position change event
  // is currently queued. This is called from the main thread and must
  // be called with the decode monitor held.
  void ClearPositionChangeFlag();

  // Update the playback position. This can result in a timeupdate event
  // and an invalidate of the frame being dispatched asynchronously if
  // there is no such event currently queued.
  // Only called on the decoder thread. Must be called with
  // the decode monitor held.
  void UpdatePlaybackPosition(int64_t aTime);

  // Causes the state machine to switch to buffering state, and to
  // immediately stop playback and buffer downloaded data. Must be called
  // with the decode monitor held. Called on the state machine thread and
  // the main thread.
  void StartBuffering();

  // This is called on the state machine thread and audio thread.
  // The decoder monitor must be obtained before calling this.
  bool HasAudio() const {
    AssertCurrentThreadInMonitor();
    return mInfo.HasAudio();
  }

  // This is called on the state machine thread and audio thread.
  // The decoder monitor must be obtained before calling this.
  bool HasVideo() const {
    AssertCurrentThreadInMonitor();
    return mInfo.HasVideo();
  }

  // Should be called by main thread.
  bool HaveNextFrameData();

  // Must be called with the decode monitor held.
  bool IsBuffering() const {
    AssertCurrentThreadInMonitor();
    return mState == DECODER_STATE_BUFFERING;
  }

  // Must be called with the decode monitor held.
  bool IsSeeking() const {
    AssertCurrentThreadInMonitor();
    return mState == DECODER_STATE_SEEKING;
  }

  nsresult GetBuffered(dom::TimeRanges* aBuffered) {
    // It's possible for JS to query .buffered before we've determined the start
    // time from metadata, in which case the reader isn't ready to be asked this
    // question.
    ReentrantMonitorAutoEnter mon(mDecoder->GetReentrantMonitor());
    if (mStartTime < 0) {
      return NS_OK;
    }

    return mReader->GetBuffered(aBuffered);
  }

  void SetPlaybackRate(double aPlaybackRate);
  void SetPreservesPitch(bool aPreservesPitch);

  size_t SizeOfVideoQueue() {
    if (mReader) {
      return mReader->SizeOfVideoQueueInBytes();
    }
    return 0;
  }

  size_t SizeOfAudioQueue() {
    if (mReader) {
      return mReader->SizeOfAudioQueueInBytes();
    }
    return 0;
  }

  void NotifyDataArrived(const char* aBuffer, uint32_t aLength, int64_t aOffset);

  // Returns the shared state machine thread.
  nsIEventTarget* GetStateMachineThread() const;

  // Calls ScheduleStateMachine() after taking the decoder lock. Also
  // notifies the decoder thread in case it's waiting on the decoder lock.
  void ScheduleStateMachineWithLockAndWakeDecoder();

  // Schedules the shared state machine thread to run the state machine
  // in aUsecs microseconds from now, if it's not already scheduled to run
  // earlier, in which case the request is discarded.
  nsresult ScheduleStateMachine(int64_t aUsecs = 0);

  // Callback function registered with MediaDecoderStateMachineScheduler
  // to run state machine cycles.
  static nsresult TimeoutExpired(void* aClosure);

  // Set the media fragment end time. aEndTime is in microseconds.
  void SetFragmentEndTime(int64_t aEndTime);

  // Drop reference to decoder. Only called during shutdown dance.
  void BreakCycles() {
    if (mReader) {
      mReader->BreakCycles();
    }
    mDecoder = nullptr;
  }

  // If we're playing into a MediaStream, record the current point in the
  // MediaStream and the current point in our media resource so later we can
  // convert MediaStream playback positions to media resource positions. Best to
  // call this while we're not playing (while the MediaStream is blocked). Can
  // be called on any thread with the decoder monitor held.
  void SetSyncPointForMediaStream();

  // Called when the decoded stream is destroyed. |mPlayStartTime| and
  // |mPlayDuration| are updated to provide a good base for calculating video
  // stream time using the system clock.
  void ResyncMediaStreamClock();
  int64_t GetCurrentTimeViaMediaStreamSync() const;

  // Copy queued audio/video data in the reader to any output MediaStreams that
  // need it.
  void SendStreamData();
  void FinishStreamData();
  bool HaveEnoughDecodedAudio(int64_t aAmpleAudioUSecs);
  bool HaveEnoughDecodedVideo();

  // Returns true if the state machine has shut down or is in the process of
  // shutting down. The decoder monitor must be held while calling this.
  bool IsShutdown();

  void QueueMetadata(int64_t aPublishTime,
                     nsAutoPtr<MediaInfo> aInfo,
                     nsAutoPtr<MetadataTags> aTags);

  // Returns true if we're currently playing. The decoder monitor must
  // be held.
  bool IsPlaying() const;

  // Dispatch DoNotifyWaitingForResourcesStatusChanged task to the task queue.
  // Called when the reader may have acquired the hardware resources required
  // to begin decoding. The decoder monitor must be held while calling this.
  void NotifyWaitingForResourcesStatusChanged();

  // Notifies the state machine that it should minimize the number of samples
  // we decode and preroll until playback starts. The first time playback starts
  // the state machine is free to return to prerolling normally. Note
  // "prerolling" in this context refers to when we decode and buffer decoded
  // samples in advance of when they're needed for playback.
  void SetMinimizePrerollUntilPlaybackStarts();

  void OnAudioDecoded(AudioData* aSample);
  void OnVideoDecoded(VideoData* aSample);
  void OnNotDecoded(MediaData::Type aType, MediaDecoderReader::NotDecodedReason aReason);
  void OnAudioNotDecoded(MediaDecoderReader::NotDecodedReason aReason)
  {
    OnNotDecoded(MediaData::AUDIO_DATA, aReason);
  }
  void OnVideoNotDecoded(MediaDecoderReader::NotDecodedReason aReason)
  {
    OnNotDecoded(MediaData::VIDEO_DATA, aReason);
  }

  void OnSeekCompleted(int64_t aTime);
  void OnSeekFailed(nsresult aResult);

  void OnWaitForDataResolved(MediaData::Type aType)
  {
    ReentrantMonitorAutoEnter mon(mDecoder->GetReentrantMonitor());
    WaitRequestRef(aType).Complete();
    DispatchDecodeTasksIfNeeded();
  }

  void OnWaitForDataRejected(WaitForDataRejectValue aRejection)
  {
    ReentrantMonitorAutoEnter mon(mDecoder->GetReentrantMonitor());
    WaitRequestRef(aRejection.mType).Complete();
  }

private:
  void AcquireMonitorAndInvokeDecodeError();

protected:
  virtual ~MediaDecoderStateMachine();

  void AssertCurrentThreadInMonitor() const { mDecoder->GetReentrantMonitor().AssertCurrentThreadIn(); }

  void SetState(State aState);

  // Inserts MediaData* samples into their respective MediaQueues.
  // aSample must not be null.
  void Push(AudioData* aSample);
  void Push(VideoData* aSample);

  class WakeDecoderRunnable : public nsRunnable {
  public:
    explicit WakeDecoderRunnable(MediaDecoderStateMachine* aSM)
      : mMutex("WakeDecoderRunnable"), mStateMachine(aSM) {}
    NS_IMETHOD Run() MOZ_OVERRIDE
    {
      nsRefPtr<MediaDecoderStateMachine> stateMachine;
      {
        // Don't let Run() (called by media stream graph thread) race with
        // Revoke() (called by decoder state machine thread)
        MutexAutoLock lock(mMutex);
        if (!mStateMachine)
          return NS_OK;
        stateMachine = mStateMachine;
      }
      stateMachine->ScheduleStateMachineWithLockAndWakeDecoder();
      return NS_OK;
    }
    void Revoke()
    {
      MutexAutoLock lock(mMutex);
      mStateMachine = nullptr;
    }

    Mutex mMutex;
    // Protected by mMutex.
    // We don't use an owning pointer here, because keeping mStateMachine alive
    // would mean in some cases we'd have to destroy mStateMachine from this
    // object, which would be problematic since MediaDecoderStateMachine can
    // only be destroyed on the main thread whereas this object can be destroyed
    // on the media stream graph thread.
    MediaDecoderStateMachine* mStateMachine;
  };
  WakeDecoderRunnable* GetWakeDecoderRunnable();

  MediaQueue<AudioData>& AudioQueue() { return mAudioQueue; }
  MediaQueue<VideoData>& VideoQueue() { return mVideoQueue; }

  nsresult FinishDecodeFirstFrame();

  nsAutoPtr<MetadataTags> mMetadataTags;

  // True if our buffers of decoded audio are not full, and we should
  // decode more.
  bool NeedToDecodeAudio();

  // True if our buffers of decoded video are not full, and we should
  // decode more.
  bool NeedToDecodeVideo();

  // Returns true if we've got less than aAudioUsecs microseconds of decoded
  // and playable data. The decoder monitor must be held.
  //
  // May not be invoked when mReader->UseBufferingHeuristics() is false.
  bool HasLowDecodedData(int64_t aAudioUsecs);

  bool OutOfDecodedAudio()
  {
    return IsAudioDecoding() && !AudioQueue().IsFinished() && AudioQueue().GetSize() == 0;
  }

  bool OutOfDecodedVideo()
  {
    // In buffering mode, we keep the last already-played frame in the queue.
    int emptyVideoSize = mState == DECODER_STATE_BUFFERING ? 1 : 0;
    return IsVideoDecoding() && !VideoQueue().IsFinished() && VideoQueue().GetSize() <= emptyVideoSize;
  }

  // Returns true if we're running low on data which is not yet decoded.
  // The decoder monitor must be held.
  bool HasLowUndecodedData();

  // Returns true if we have less than aUsecs of undecoded data available.
  bool HasLowUndecodedData(int64_t aUsecs);

  // Returns the number of unplayed usecs of audio we've got decoded and/or
  // pushed to the hardware waiting to play. This is how much audio we can
  // play without having to run the audio decoder. The decoder monitor
  // must be held.
  int64_t AudioDecodedUsecs();

  // Returns true when there's decoded audio waiting to play.
  // The decoder monitor must be held.
  bool HasFutureAudio();

  // Returns true if we recently exited "quick buffering" mode.
  bool JustExitedQuickBuffering();

  // Dispatches an asynchronous event to update the media element's ready state.
  void UpdateReadyState();

  // Resets playback timing data. Called when we seek, on the decode thread.
  void ResetPlayback();

  // Orders the Reader to stop decoding, and blocks until the Reader
  // has stopped decoding and finished delivering samples, then calls
  // ResetPlayback() to discard all enqueued data.
  void FlushDecoding();

  // Called when AudioSink reaches the end. |mPlayStartTime| and
  // |mPlayDuration| are updated to provide a good base for calculating video
  // stream time.
  void ResyncAudioClock();

  // Returns the audio clock, if we have audio, or -1 if we don't.
  // Called on the state machine thread.
  int64_t GetAudioClock() const;

  // Get the video stream position, taking the |playbackRate| change into
  // account. This is a position in the media, not the duration of the playback
  // so far.
  int64_t GetVideoStreamPosition() const;

  // Return the current time, either the audio clock if available (if the media
  // has audio, and the playback is possible), or a clock for the video.
  // Called on the state machine thread.
  int64_t GetClock() const;

  nsresult DropAudioUpToSeekTarget(AudioData* aSample);
  nsresult DropVideoUpToSeekTarget(VideoData* aSample);

  void SetStartTime(int64_t aStartTimeUsecs);

  // Update only the state machine's current playback position (and duration,
  // if unknown). Does not update the playback position on the decoder or
  // media element -- use UpdatePlaybackPosition for that. Called on the state
  // machine thread, caller must hold the decoder lock.
  void UpdatePlaybackPositionInternal(int64_t aTime);

  // Pushes the image down the rendering pipeline. Called on the shared state
  // machine thread. The decoder monitor must *not* be held when calling this.
  void RenderVideoFrame(VideoData* aData, TimeStamp aTarget);

  // If we have video, display a video frame if its time for display has
  // arrived, otherwise sleep until it's time for the next frame. Update the
  // current frame time as appropriate, and trigger ready state update. The
  // decoder monitor must be held with exactly one lock count. Called on the
  // state machine thread.
  void AdvanceFrame();

  // Stops the audio thread. The decoder monitor must be held with exactly
  // one lock count. Called on the state machine thread.
  void StopAudioThread();

  // Starts the audio thread. The decoder monitor must be held with exactly
  // one lock count. Called on the state machine thread.
  nsresult StartAudioThread();

  // Sets internal state which causes playback of media to pause.
  // The decoder monitor must be held.
  void StopPlayback();

  // If the conditions are right, sets internal state which causes playback
  // of media to begin or resume.
  // Must be called with the decode monitor held.
  void MaybeStartPlayback();

  // Moves the decoder into decoding state. Called on the state machine
  // thread. The decoder monitor must be held.
  void StartDecoding();

  // Moves the decoder into the shutdown state, and dispatches an error
  // event to the media element. This begins shutting down the decoder.
  // The decoder monitor must be held. This is only called on the
  // decode thread.
  void DecodeError();

  void StartWaitForResources();

  // Dispatches a task to the decode task queue to begin decoding metadata.
  // This is threadsafe and can be called on any thread.
  // The decoder monitor must be held.
  nsresult EnqueueDecodeMetadataTask();

  // Dispatches a LoadedMetadataEvent.
  // This is threadsafe and can be called on any thread.
  // The decoder monitor must be held.
  void EnqueueLoadedMetadataEvent();

  void EnqueueFirstFrameLoadedEvent();

  // Dispatches a task to the decode task queue to begin decoding content.
  // This is threadsafe and can be called on any thread.
  // The decoder monitor must be held.
  nsresult EnqueueDecodeFirstFrameTask();

  // Dispatches a task to the decode task queue to seek the decoder.
  // The decoder monitor must be held.
  nsresult EnqueueDecodeSeekTask();

  nsresult DispatchAudioDecodeTaskIfNeeded();

  // Ensures a task to decode audio has been dispatched to the decode task queue.
  // If a task to decode has already been dispatched, this does nothing,
  // otherwise this dispatches a task to do the decode.
  // This is called on the state machine or decode threads.
  // The decoder monitor must be held.
  nsresult EnsureAudioDecodeTaskQueued();

  nsresult DispatchVideoDecodeTaskIfNeeded();

  // Ensures a task to decode video has been dispatched to the decode task queue.
  // If a task to decode has already been dispatched, this does nothing,
  // otherwise this dispatches a task to do the decode.
  // The decoder monitor must be held.
  nsresult EnsureVideoDecodeTaskQueued();

  // Calls the reader's SetIdle(). This is only called in a task dispatched to
  // the decode task queue, don't call it directly.
  void SetReaderIdle();

  // Re-evaluates the state and determines whether we need to dispatch
  // events to run the decode, or if not whether we should set the reader
  // to idle mode. This is threadsafe, and can be called from any thread.
  // The decoder monitor must be held.
  void DispatchDecodeTasksIfNeeded();

  // Returns the "media time". This is the absolute time which the media
  // playback has reached. i.e. this returns values in the range
  // [mStartTime, mEndTime], and mStartTime will not be 0 if the media does
  // not start at 0. Note this is different to the value returned
  // by GetCurrentTime(), which is in the range [0,duration].
  int64_t GetMediaTime() const {
    AssertCurrentThreadInMonitor();
    return mStartTime + mCurrentFrameTime;
  }

  // Returns an upper bound on the number of microseconds of audio that is
  // decoded and playable. This is the sum of the number of usecs of audio which
  // is decoded and in the reader's audio queue, and the usecs of unplayed audio
  // which has been pushed to the audio hardware for playback. Note that after
  // calling this, the audio hardware may play some of the audio pushed to
  // hardware, so this can only be used as an upper bound. The decoder monitor
  // must be held when calling this. Called on the decode thread.
  int64_t GetDecodedAudioDuration();

  // Load metadata. Called on the decode thread. The decoder monitor
  // must be held with exactly one lock count.
  nsresult DecodeMetadata();

  // Wraps the call to DecodeMetadata(), signals a DecodeError() on failure.
  void CallDecodeMetadata();

  // Initiate first content decoding. Called on the decode thread.
  // The decoder monitor must be held with exactly one lock count.
  nsresult DecodeFirstFrame();

  // Wraps the call to DecodeFirstFrame(), signals a DecodeError() on failure.
  void CallDecodeFirstFrame();

  // Checks whether we're finished decoding first audio and/or video packets,
  // and switches to DECODING state if so.
  void MaybeFinishDecodeFirstFrame();

  // Seeks to mSeekTarget. Called on the decode thread. The decoder monitor
  // must be held with exactly one lock count.
  void DecodeSeek();

  void CheckIfSeekComplete();
  bool IsAudioSeekComplete();
  bool IsVideoSeekComplete();

  // Completes the seek operation, moves onto the next appropriate state.
  void SeekCompleted();

  // Queries our state to see whether the decode has finished for all streams.
  // If so, we move into DECODER_STATE_COMPLETED and schedule the state machine
  // to run.
  // The decoder monitor must be held.
  void CheckIfDecodeComplete();

  // Copy audio from an AudioData packet to aOutput. This may require
  // inserting silence depending on the timing of the audio packet.
  void SendStreamAudio(AudioData* aAudio, DecodedStreamData* aStream,
                       AudioSegment* aOutput);

  // State machine thread run function. Defers to RunStateMachine().
  nsresult CallRunStateMachine();

  // Performs one "cycle" of the state machine. Polls the state, and may send
  // a video frame to be displayed, and generally manages the decode. Called
  // periodically via timer to ensure the video stays in sync.
  nsresult RunStateMachine();

  bool IsStateMachineScheduled() const;

  // Returns true if we're not playing and the decode thread has filled its
  // decode buffers and is waiting. We can shut the decode thread down in this
  // case as it may not be needed again.
  bool IsPausedAndDecoderWaiting();

  // These return true if the respective stream's decode has not yet reached
  // the end of stream.
  bool IsAudioDecoding();
  bool IsVideoDecoding();

  // Set the time that playback started from the system clock.
  // Can only be called on the state machine thread.
  void SetPlayStartTime(const TimeStamp& aTimeStamp);

  // Update mAudioEndTime.
  void OnAudioEndTimeUpdate(int64_t aAudioEndTime);

  // Update mDecoder's playback offset.
  void OnPlaybackOffsetUpdate(int64_t aPlaybackOffset);

  // Called by the AudioSink to signal that all outstanding work is complete
  // and the sink is shutting down.
  void OnAudioSinkComplete();

  // Called by the AudioSink to signal errors.
  void OnAudioSinkError();

  // The state machine may move into DECODING_METADATA if we are in
  // DECODER_STATE_WAIT_FOR_RESOURCES.
  void DoNotifyWaitingForResourcesStatusChanged();

  // Returns true if the video decoder's decode speed cannot catch up with
  // the play time.
  bool NeedToSkipToNextKeyframe();

  // The decoder object that created this state machine. The state machine
  // holds a strong reference to the decoder to ensure that the decoder stays
  // alive once the media element has started the decoder shutdown process, and
  // has dropped its reference to the decoder. This enables the state machine to
  // keep using the decoder's monitor until the state machine has finished
  // shutting down, without fear of the monitor being destroyed. After
  // shutting down, the state machine will then release this reference,
  // causing the decoder to be destroyed. This is accessed on the decode,
  // state machine, audio and main threads.
  nsRefPtr<MediaDecoder> mDecoder;

  // Used to schedule state machine cycles. This should never outlive
  // the life cycle of the state machine.
  const nsAutoPtr<MediaDecoderStateMachineScheduler> mScheduler;

  // Time at which the last video sample was requested. If it takes too long
  // before the sample arrives, we will increase the amount of audio we buffer.
  // This is necessary for legacy synchronous decoders to prevent underruns.
  TimeStamp mVideoDecodeStartTime;

  // Queue of audio frames. This queue is threadsafe, and is accessed from
  // the audio, decoder, state machine, and main threads.
  MediaQueue<AudioData> mAudioQueue;

  // Queue of video frames. This queue is threadsafe, and is accessed from
  // the decoder, state machine, and main threads.
  MediaQueue<VideoData> mVideoQueue;

  // The decoder monitor must be obtained before modifying this state.
  // NotifyAll on the monitor must be called when the state is changed so
  // that interested threads can wake up and alter behaviour if appropriate.
  // Accessed on state machine, audio, main, and AV thread.
  State mState;

  // The task queue in which we run decode tasks. This is referred to as
  // the "decode thread", though in practice tasks can run on a different
  // thread every time they're called.
  MediaTaskQueue* DecodeTaskQueue() const { return mReader->GetTaskQueue(); }

  // The time that playback started from the system clock. This is used for
  // timing the presentation of video frames when there's no audio.
  // Accessed only via the state machine thread. Must be set via SetPlayStartTime.
  TimeStamp mPlayStartTime;

  // When we start writing decoded data to a new DecodedDataStream, or we
  // restart writing due to PlaybackStarted(), we record where we are in the
  // MediaStream and what that corresponds to in the media.
  int64_t mSyncPointInMediaStream; // microseconds
  int64_t mSyncPointInDecodedStream; // microseconds

  // The amount of time we've already spent playing the media. The current
  // playback position is therefore |Now() - mPlayStartTime +
  // mPlayDuration|, which must be adjusted by mStartTime if used with media
  // timestamps. Accessed on state machine and main threads. Access controlled
  // by decoder monitor.
  int64_t mPlayDuration;

  // Time that buffering started. Used for buffering timeout and only
  // accessed on the state machine thread. This is null while we're not
  // buffering.
  TimeStamp mBufferingStart;

  // Start time of the media, in microseconds. This is the presentation
  // time of the first frame decoded from the media, and is used to calculate
  // duration and as a bounds for seeking. Accessed on state machine, decode,
  // and main threads. Access controlled by decoder monitor.
  int64_t mStartTime;

  // Time of the last frame in the media, in microseconds. This is the
  // end time of the last frame in the media. Accessed on state
  // machine, decode, and main threads. Access controlled by decoder monitor.
  // It will be set to -1 if the duration is infinite.
  int64_t mEndTime;

  // Will be set when SetDuration has been called with a value != -1.
  // mDurationSet being false doesn't indicate that we do not have a valid
  // duration, as mStartTime and mEndTime could have been set separately.
  bool mDurationSet;

  // Position to seek to in microseconds when the seek state transition occurs.
  // The decoder monitor lock must be obtained before reading or writing
  // this value. Accessed on main and decode thread.
  SeekTarget mSeekTarget;

  // Position to seek to in microseconds when DecodeFirstFrame completes.
  // The decoder monitor lock must be obtained before reading or writing
  // this value. Accessed on main and decode thread.
  SeekTarget mQueuedSeekTarget;

  // The position that we're currently seeking to. This differs from
  // mSeekTarget, as mSeekTarget is the target we'll seek to next, whereas
  // mCurrentSeekTarget is the position that the decode is in the process
  // of seeking to.
  // The decoder monitor lock must be obtained before reading or writing
  // this value.
  SeekTarget mCurrentSeekTarget;

  // Media Fragment end time in microseconds. Access controlled by decoder monitor.
  int64_t mFragmentEndTime;

  // The audio sink resource. Used on state machine and audio threads.
  RefPtr<AudioSink> mAudioSink;

  // The reader, don't call its methods with the decoder monitor held.
  // This is created in the state machine's constructor.
  nsRefPtr<MediaDecoderReader> mReader;

  // Accessed only on the state machine thread.
  // Not an nsRevocableEventPtr since we must Revoke() it well before
  // this object is destroyed, anyway.
  // Protected by decoder monitor except during the SHUTDOWN state after the
  // decoder thread has been stopped.
  nsRevocableEventPtr<WakeDecoderRunnable> mPendingWakeDecoder;

  // The time of the current frame in microseconds. This is referenced from
  // 0 which is the initial playback position. Set by the state machine
  // thread, and read-only from the main thread to get the current
  // time value. Synchronised via decoder monitor.
  int64_t mCurrentFrameTime;

  // The presentation time of the first audio frame that was played in
  // microseconds. We can add this to the audio stream position to determine
  // the current audio time. Accessed on audio and state machine thread.
  // Synchronized by decoder monitor.
  int64_t mAudioStartTime;

  // The end time of the last audio frame that's been pushed onto the audio
  // hardware in microseconds. This will approximately be the end time of the
  // audio stream, unless another frame is pushed to the hardware.
  int64_t mAudioEndTime;

  // The end time of the last decoded audio frame. This signifies the end of
  // decoded audio data. Used to check if we are low in decoded data.
  int64_t mDecodedAudioEndTime;

  // The presentation end time of the last video frame which has been displayed
  // in microseconds. Accessed from the state machine thread.
  int64_t mVideoFrameEndTime;

  // The end time of the last decoded video frame. Used to check if we are low
  // on decoded video data.
  int64_t mDecodedVideoEndTime;

  // Volume of playback. 0.0 = muted. 1.0 = full volume. Read/Written
  // from the state machine and main threads. Synchronised via decoder
  // monitor.
  double mVolume;

  // Playback rate. 1.0 : normal speed, 0.5 : two times slower. Synchronized via
  // decoder monitor.
  double mPlaybackRate;

  // Pitch preservation for the playback rate. Synchronized via decoder monitor.
  bool mPreservesPitch;

  // Time at which we started decoding. Synchronised via decoder monitor.
  TimeStamp mDecodeStartTime;

  // The maximum number of seconds we spend buffering when we are short on
  // unbuffered data.
  uint32_t mBufferingWait;
  int64_t mLowDataThresholdUsecs;

  // If we've got more than mAmpleVideoFrames decoded video frames waiting in
  // the video queue, we will not decode any more video frames until some have
  // been consumed by the play state machine thread.
  uint32_t mAmpleVideoFrames;

  // Low audio threshold. If we've decoded less than this much audio we
  // consider our audio decode "behind", and we may skip video decoding
  // in order to allow our audio decoding to catch up. We favour audio
  // decoding over video. We increase this threshold if we're slow to
  // decode video frames, in order to reduce the chance of audio underruns.
  // Note that we don't ever reset this threshold, it only ever grows as
  // we detect that the decode can't keep up with rendering.
  int64_t mLowAudioThresholdUsecs;

  // Our "ample" audio threshold. Once we've decoded this much audio, we
  // pause decoding. If we increase mLowAudioThresholdUsecs, we'll also
  // increase this too appropriately (we don't want mLowAudioThresholdUsecs
  // to be greater than ampleAudioThreshold, else we'd stop decoding!).
  // Note that we don't ever reset this threshold, it only ever grows as
  // we detect that the decode can't keep up with rendering.
  int64_t mAmpleAudioThresholdUsecs;

  // If we're quick buffering, we'll remain in buffering mode while we have less than
  // QUICK_BUFFERING_LOW_DATA_USECS of decoded data available.
  int64_t mQuickBufferingLowDataThresholdUsecs;

  // At the start of decoding we want to "preroll" the decode until we've
  // got a few frames decoded before we consider whether decode is falling
  // behind. Otherwise our "we're falling behind" logic will trigger
  // unnecessarily if we start playing as soon as the first sample is
  // decoded. These two fields store how many video frames and audio
  // samples we must consume before we are considered to be finished prerolling.
  uint32_t AudioPrerollUsecs() const
  {
    if (mScheduler->IsRealTime()) {
      return 0;
    }

    uint32_t result = mLowAudioThresholdUsecs * 2;
    MOZ_ASSERT(result <= mAmpleAudioThresholdUsecs, "Prerolling will never finish");
    return result;
  }
  uint32_t VideoPrerollFrames() const { return mScheduler->IsRealTime() ? 0 : mAmpleVideoFrames / 2; }

  bool DonePrerollingAudio()
  {
    AssertCurrentThreadInMonitor();
    return !IsAudioDecoding() ||
           GetDecodedAudioDuration() >= AudioPrerollUsecs() * mPlaybackRate;
  }

  bool DonePrerollingVideo()
  {
    AssertCurrentThreadInMonitor();
    return !IsVideoDecoding() ||
           static_cast<uint32_t>(VideoQueue().GetSize()) >=
             VideoPrerollFrames() * mPlaybackRate;
  }

  void StopPrerollingAudio()
  {
    AssertCurrentThreadInMonitor();
    if (mIsAudioPrerolling) {
      mIsAudioPrerolling = false;
      ScheduleStateMachine();
    }
  }

  void StopPrerollingVideo()
  {
    AssertCurrentThreadInMonitor();
    if (mIsVideoPrerolling) {
      mIsVideoPrerolling = false;
      ScheduleStateMachine();
    }
  }

  // This temporarily stores the first frame we decode after we seek.
  // This is so that if we hit end of stream while we're decoding to reach
  // the seek target, we will still have a frame that we can display as the
  // last frame in the media.
  nsRefPtr<VideoData> mFirstVideoFrameAfterSeek;

  // When we start decoding (either for the first time, or after a pause)
  // we may be low on decoded data. We don't want our "low data" logic to
  // kick in and decide that we're low on decoded data because the download
  // can't keep up with the decode, and cause us to pause playback. So we
  // have a "preroll" stage, where we ignore the results of our "low data"
  // logic during the first few frames of our decode. This occurs during
  // playback. The flags below are true when the corresponding stream is
  // being "prerolled".
  bool mIsAudioPrerolling;
  bool mIsVideoPrerolling;
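
  // Illustrative use of these flags (a sketch; the actual transitions happen in
  // the state machine's decode handling): once enough data has been decoded,
  // the corresponding flag is cleared via the StopPrerolling* helpers above so
  // the normal "low data" logic takes effect again.
  //
  //   if (mIsAudioPrerolling && DonePrerollingAudio()) {
  //     StopPrerollingAudio();
  //   }
  //   if (mIsVideoPrerolling && DonePrerollingVideo()) {
  //     StopPrerollingVideo();
  //   }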

  // Only one of a given pair of ({Audio,Video}DataPromise, WaitForDataPromise)
  // should exist at any given moment.
  MediaPromiseConsumerHolder<MediaDecoderReader::AudioDataPromise> mAudioDataRequest;
  MediaPromiseConsumerHolder<MediaDecoderReader::WaitForDataPromise> mAudioWaitRequest;
  const char* AudioRequestStatus()
  {
    if (mAudioDataRequest.Exists()) {
      MOZ_DIAGNOSTIC_ASSERT(!mAudioWaitRequest.Exists());
      return "pending";
    } else if (mAudioWaitRequest.Exists()) {
      return "waiting";
    }
    return "idle";
  }

  MediaPromiseConsumerHolder<MediaDecoderReader::WaitForDataPromise> mVideoWaitRequest;
  MediaPromiseConsumerHolder<MediaDecoderReader::VideoDataPromise> mVideoDataRequest;
  const char* VideoRequestStatus()
  {
    if (mVideoDataRequest.Exists()) {
      MOZ_DIAGNOSTIC_ASSERT(!mVideoWaitRequest.Exists());
      return "pending";
    } else if (mVideoWaitRequest.Exists()) {
      return "waiting";
    }
    return "idle";
  }

  MediaPromiseConsumerHolder<MediaDecoderReader::WaitForDataPromise>&
  WaitRequestRef(MediaData::Type aType)
  {
    return aType == MediaData::AUDIO_DATA ? mAudioWaitRequest : mVideoWaitRequest;
  }
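
  // The "only one of each pair" invariant above, spelled out directly; this is
  // just a restatement of the checks already made in the *RequestStatus()
  // helpers:
  //
  //   MOZ_DIAGNOSTIC_ASSERT(!(mAudioDataRequest.Exists() && mAudioWaitRequest.Exists()));
  //   MOZ_DIAGNOSTIC_ASSERT(!(mVideoDataRequest.Exists() && mVideoWaitRequest.Exists()));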

  // True if we shouldn't play our audio (but still write it to any capturing
  // streams). When this is true, mStopAudioThread is always true and
  // the audio thread will never start again after it has stopped.
  bool mAudioCaptured;

  // True if an event to notify about a change in the playback
  // position has been queued, but not yet run. It is set to false when
  // the event is run. This allows coalescing of these events as they can be
  // produced many times per second. Synchronised via decoder monitor.
  // Accessed on main and state machine threads.
  bool mPositionChangeQueued;
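
  // Illustrative sketch of the coalescing described above (hypothetical code;
  // the actual dispatch site is elsewhere in the state machine): a new
  // position-changed event is only dispatched when one isn't already queued,
  // and the event itself resets the flag when it runs.
  //
  //   if (!mPositionChangeQueued) {
  //     mPositionChangeQueued = true;
  //     // ... dispatch the playback-position-changed runnable ...
  //   }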

  // True if the audio playback thread has finished. It is finished
  // when either all the audio frames have completed playing, or we've moved
  // into shutdown state, and the threads are to be
  // destroyed. Written by the audio playback thread and read and written by
  // the state machine thread. Synchronised via decoder monitor.
  // When data is being sent to a MediaStream, this is true when all data has
  // been written to the MediaStream.
  bool mAudioCompleted;

  // True if mDuration has a value obtained from an HTTP header, or from
  // the media index/metadata. Accessed on the state machine thread.
  bool mGotDurationFromMetaData;

  // True if we've dispatched an event to the decode task queue to call
  // DecodeThreadRun(). We use this flag to prevent us from dispatching
  // unnecessary runnables, since the decode thread runs in a loop.
  bool mDispatchedEventToDecode;

  // False while the audio thread should be running. Accessed on the state
  // machine and audio threads. Synchronised by the decoder monitor.
  bool mStopAudioThread;

  // If this is true while we're in buffering mode, we can exit early,
  // as it's likely we may be able to resume playback. This happens when we
  // enter buffering mode soon after the decode starts, because the
  // decode-ahead ran fast enough to exhaust all data while the download is
  // starting up. Synchronised via decoder monitor.
  bool mQuickBuffering;
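
  // Illustrative sketch only (hypothetical condition, using decoded audio
  // duration as a stand-in for "decoded data available"): while quick
  // buffering, we remain in buffering mode only while decoded data stays below
  // mQuickBufferingLowDataThresholdUsecs.
  //
  //   bool remainInQuickBuffering =
  //     mQuickBuffering &&
  //     GetDecodedAudioDuration() < mQuickBufferingLowDataThresholdUsecs;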

  // True if we should not decode/preroll unnecessary samples, unless we're
  // playing. "Prerolling" in this context refers to when we decode and
  // buffer decoded samples in advance of when they're needed for playback.
  // This flag is set for preload=metadata media, and means we won't
  // decode more than the first video frame and first block of audio samples
  // for that media when we start up, or after a seek. When Play() is called,
  // we reset this flag, as we assume the user is playing the media, so
  // prerolling is appropriate then. This flag is used to reduce the memory
  // and CPU overhead of prerolling samples for media elements that may
  // never play.
  bool mMinimizePreroll;
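
  // As noted above, Play() resets this flag; a minimal sketch of that reset
  // (the rest of the Play() body is elided):
  //
  //   void Play()
  //   {
  //     // ... existing playback start logic ...
  //     mMinimizePreroll = false;
  //   }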

  // True if the decode thread has filled its buffers and is now
  // waiting to be awakened before it continues decoding. Synchronized
  // by the decoder monitor.
  bool mDecodeThreadWaiting;

  // These two flags are true when we need to drop decoded samples that
  // we receive up to the next discontinuity. We do this when we seek;
  // the first sample in each stream after the seek is marked as being
  // a "discontinuity".
  bool mDropAudioUntilNextDiscontinuity;
  bool mDropVideoUntilNextDiscontinuity;
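
  // Illustrative sketch only (assumes the decoded sample exposes a
  // discontinuity marker, here called mDiscontinuity): after a seek, incoming
  // samples are dropped until the first one flagged as a discontinuity.
  //
  //   if (mDropAudioUntilNextDiscontinuity) {
  //     if (!aAudioSample->mDiscontinuity) {
  //       return; // drop this sample
  //     }
  //     mDropAudioUntilNextDiscontinuity = false;
  //   }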

  // True if we need to decode forwards to the seek target inside
  // mCurrentSeekTarget.
  bool mDecodeToSeekTarget;

  // True if we've issued Seek() to the reader, but haven't yet received
  // OnSeekCompleted. We should avoid trying to decode more audio/video
  // until this completes.
  bool mWaitingForDecoderSeek;

  // True if we're in the process of canceling a seek. This allows us to avoid
  // invoking CancelSeek() multiple times.
  bool mCancelingSeek;

  // We record the playback position before we seek in order to
  // determine where the seek terminated relative to the playback position
  // we were at before the seek.
  int64_t mCurrentTimeBeforeSeek;

  // Stores presentation info required for playback. The decoder monitor
  // must be held when accessing this.
  MediaInfo mInfo;

  mozilla::MediaMetadataManager mMetadataManager;

  MediaDecoderOwner::NextFrameStatus mLastFrameStatus;

  // mDecodingFrozenAtStateDecoding: turned on/off by SetDormant(), Seek() and
  // Play().
  bool mDecodingFrozenAtStateDecoding;

  // True if we are back from DECODER_STATE_DORMANT state and
  // LoadedMetadataEvent was already sent.
  bool mSentLoadedMetadataEvent;
  // True if we are back from DECODER_STATE_DORMANT state and
  // FirstFrameLoadedEvent was already sent. In that case we can skip
  // SetStartTime because mStartTime was already set before. We also don't need
  // to decode any audio/video, since the MediaDecoder will trigger a seek
  // operation soon.
  bool mSentFirstFrameLoadedEvent;
};

} // namespace mozilla

#endif