gecko-dev/dom/media/webaudio/test/mochitest.ini

[DEFAULT]
tags = msg webaudio
skip-if = ((buildapp == 'mulet' || buildapp == 'b2g') && (toolkit != 'gonk' || debug)) || (os == 'win' && strictContentSandbox) #b2g-debug,b2g-desktop(bug 916135); strictContentSandbox(Bug 1042735)
support-files =
  audio-expected.wav
  audio-mono-expected-2.wav
  audio-mono-expected.wav
  audio-quad.wav
  audio.ogv
  audioBufferSourceNodeNeutered_worker.js
  corsServer.sjs
  invalid.txt
  layouttest-glue.js
  noaudio.webm
  small-shot-expected.wav
  small-shot-mono-expected.wav
  small-shot.ogg
  small-shot.mp3
  ting-44.1k-1ch.ogg
  ting-44.1k-2ch.ogg
  ting-48k-1ch.ogg
  ting-48k-2ch.ogg
  ting-44.1k-1ch.wav
  ting-44.1k-2ch.wav
  ting-48k-1ch.wav
  ting-48k-2ch.wav
  webaudio.js
[test_analyserNode.html]
[test_analyserScale.html]
[test_analyserNodeOutput.html]
[test_analyserNodePassThrough.html]
[test_analyserNodeWithGain.html]
[test_AudioBuffer.html]
[test_audioBufferSourceNode.html]
[test_audioBufferSourceNodeEnded.html]
[test_audioBufferSourceNodeLazyLoopParam.html]
[test_audioBufferSourceNodeLoop.html]
[test_audioBufferSourceNodeLoopStartEnd.html]
[test_audioBufferSourceNodeLoopStartEndSame.html]
[test_audioBufferSourceNodeNeutered.html]
skip-if = (toolkit == 'android' && (processor == 'x86' || debug)) || os == 'win' # bug 1127845, bug 1138468
[test_audioBufferSourceNodeNoStart.html]
[test_audioBufferSourceNodeNullBuffer.html]
[test_audioBufferSourceNodeOffset.html]
skip-if = (toolkit == 'gonk') || (toolkit == 'android') || debug #bug 906752
[test_audioBufferSourceNodePassThrough.html]
[test_audioBufferSourceNodeRate.html]
[test_AudioContext.html]
skip-if = android_version == '10' # bug 1138462
[test_audioContextSuspendResumeClose.html]
tags=capturestream
[test_audioDestinationNode.html]
[test_AudioListener.html]
[test_audioParamExponentialRamp.html]
[test_audioParamGain.html]
[test_audioParamLinearRamp.html]
[test_audioParamSetCurveAtTime.html]
[test_audioParamSetCurveAtTimeZeroDuration.html]
[test_audioParamSetTargetAtTime.html]
[test_audioParamSetValueAtTime.html]
[test_audioParamTimelineDestinationOffset.html]
[test_badConnect.html]
[test_biquadFilterNode.html]
[test_biquadFilterNodePassThrough.html]
[test_biquadFilterNodeWithGain.html]
[test_bug808374.html]
[test_bug827541.html]
[test_bug839753.html]
[test_bug845960.html]
[test_bug856771.html]
[test_bug866570.html]
[test_bug866737.html]
[test_bug867089.html]
[test_bug867104.html]
[test_bug867174.html]
[test_bug867203.html]
[test_bug875221.html]
[test_bug875402.html]
[test_bug894150.html]
[test_bug956489.html]
[test_bug964376.html]
[test_bug966247.html]
tags=capturestream
[test_bug972678.html]
[test_bug1118372.html]
[test_bug1056032.html]
skip-if = toolkit == 'android' # bug 1056706
[test_channelMergerNode.html]
[test_channelMergerNodeWithVolume.html]
[test_channelSplitterNode.html]
[test_channelSplitterNodeWithVolume.html]
skip-if = (android_version == '18' && debug) # bug 1158417
[test_convolverNode.html]
[test_convolverNode_mono_mono.html]
[test_convolverNodeChannelCount.html]
[test_convolverNodePassThrough.html]
[test_convolverNodeWithGain.html]
[test_currentTime.html]
[test_decodeMultichannel.html]
[test_decodeAudioDataPromise.html]
[test_delayNode.html]
[test_delayNodeAtMax.html]
[test_delayNodeChannelChanges.html]
skip-if = toolkit == 'android' # bug 1056706
[test_delayNodeCycles.html]
[test_delayNodePassThrough.html]
[test_delayNodeSmallMaxDelay.html]
[test_delayNodeTailIncrease.html]
[test_delayNodeTailWithDisconnect.html]
[test_delayNodeTailWithGain.html]
[test_delayNodeTailWithReconnect.html]
[test_delayNodeWithGain.html]
[test_dynamicsCompressorNode.html]
[test_dynamicsCompressorNodePassThrough.html]
[test_gainNode.html]
[test_gainNodeInLoop.html]
[test_gainNodePassThrough.html]
[test_maxChannelCount.html]
[test_mediaDecoding.html]
[test_mediaElementAudioSourceNode.html]
tags=capturestream
[test_mediaElementAudioSourceNodePassThrough.html]
tags=capturestream
skip-if = toolkit == 'android' # bug 1145816
[test_mediaElementAudioSourceNodeCrossOrigin.html]
tags=capturestream
skip-if = toolkit == 'android' # bug 1145816
[test_mediaStreamAudioDestinationNode.html]
[test_mediaStreamAudioSourceNode.html]
[test_mediaStreamAudioSourceNodeCrossOrigin.html]
tags=capturestream
[test_mediaStreamAudioSourceNodePassThrough.html]
[test_mediaStreamAudioSourceNodeResampling.html]
tags=capturestream
[test_mixingRules.html]
skip-if = android_version == '10' || android_version == '18' # bug 1091965
[test_mozaudiochannel.html]
# Android: bug 1061675; OSX 10.6: bug 1097721
skip-if = (toolkit == 'gonk' && !debug) || android_version == '10' || android_version == '18' || (os == 'mac' && os_version == '10.6')
[test_nodeToParamConnection.html]
[test_OfflineAudioContext.html]
[test_offlineDestinationChannelCountLess.html]
[test_offlineDestinationChannelCountMore.html]
[test_oscillatorNode.html]
[test_oscillatorNode2.html]
[test_oscillatorNodeNegativeFrequency.html]
[test_oscillatorNodePassThrough.html]
[test_oscillatorNodeStart.html]
[test_oscillatorTypeChange.html]
[test_pannerNode.html]
[test_pannerNode_equalPower.html]
[test_pannerNodeAbove.html]
[test_pannerNodeChannelCount.html]
[test_pannerNodeHRTFSymmetry.html]
[test_pannerNodeTail.html]
[test_pannerNode_maxDistance.html]
[test_stereoPannerNode.html]
[test_stereoPannerNodePassThrough.html]
[test_periodicWave.html]
[test_scriptProcessorNode.html]
[test_scriptProcessorNodeChannelCount.html]
[test_scriptProcessorNodePassThrough.html]
[test_scriptProcessorNode_playbackTime1.html]
[test_scriptProcessorNodeZeroInputOutput.html]
[test_scriptProcessorNodeNotConnected.html]
[test_singleSourceDest.html]
[test_stereoPanningWithGain.html]
[test_waveDecoder.html]
[test_waveShaper.html]
[test_waveShaperNoCurve.html]
[test_waveShaperPassThrough.html]
[test_waveShaperInvalidLengthCurve.html]