The CDM process can't allocate shmems itself because it's sandboxed, so we
pre-allocate shmems in the content process and send them to the GMP process for
the CDM to use. Some sites seem to have encoded their content in a way that
causes the CDM to allocate and hang onto more frames than we pre-allocate
shmems for, so we run out of shmems in the GMP process, the CDM can't allocate
further buffers to return output, and we fail.
So change the ChromiumCDMChild to allocate non-shmem-backed buffers if it runs
out of shmems, and return the result to the content process as an nsTArray.
Upon receiving that, the parent will send an extra shmem to the child, to
hopefully avoid the slow path again.
Also increase media.eme.chromium-api.video-shmems to 4, so that we're less
likely to hit this slow path in the wild. I've seen that Lightbox.co.nz and
Microsoft's EME test-drive require 4 shmems.
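To illustrate the intended fallback, a minimal sketch (assuming a pool of
gifted shmems and hypothetical SharedMemoryBuffer/HeapBuffer wrappers; the
real ChromiumCDMChild code differs in detail):

    cdm::Buffer* ChromiumCDMChild::Allocate(uint32_t aCapacity) {
      // Fast path: reuse a shmem the parent pre-allocated for us.
      // mGiftedShmems is an illustrative name for the pool.
      for (size_t i = 0; i < mGiftedShmems.Length(); i++) {
        if (mGiftedShmems[i].Size<uint8_t>() >= aCapacity) {
          ipc::Shmem shmem = mGiftedShmems[i];
          mGiftedShmems.RemoveElementAt(i);
          return new SharedMemoryBuffer(shmem);
        }
      }
      // Slow path: the CDM holds every pre-allocated shmem, so hand out
      // a heap-backed buffer. Its contents go back to the content process
      // as an nsTArray, and the parent responds by sending an extra shmem
      // so the fast path works next time.
      return new HeapBuffer(aCapacity);
    }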
MozReview-Commit-ID: ISQYDkTj5uY
--HG--
extra : rebase_source : 92870f1adc7ae68e58b15443e4223012bdf0e39a
In bug 1353123 it was decided not to extend this telemetry, so it will
expire in this release. Given that, and the fact that bug 1329336 has
already disabled the feature, it's time to remove the probe.
clang's -Wcomma warning flags suspicious uses of the comma operator, such as between two statements, or calling a function for its side effects within an expression.
modules/libjar/nsZipArchive.cpp:651:25 [-Wcomma] possible misuse of comma operator here
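For illustration, a self-contained example of the pattern -Wcomma flags (not
the actual nsZipArchive.cpp code) and the usual fixes:

    #include <cstdio>

    static int Log() { return std::puts("logged"); }

    int Example() {
      // -Wcomma: "possible misuse of comma operator here". The result of
      // Log() is silently discarded, which is often a typo for ';' or '&&'.
      int value = (Log(), 42);

      // Typical fixes: split into separate statements, or cast the first
      // operand to void to make the discard explicit.
      (void)Log();
      int other = 42;
      return value + other;
    }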
MozReview-Commit-ID: 9PjB915D81f
--HG--
extra : rebase_source : c494c2f4a8291d6c08f02765e988c1fd14079e54
extra : source : 3643e37b615c4461b6a9359856731252acc36465
This rolls browser.tabs.animate, browser.fullscreen.animate, and
alerts.disableSlidingEffect into a single pref,
toolkit.cosmeticAnimations.enabled; if the user has disabled any of these
animations, we'll disable the new pref too. Most future animations will also
be subject to this pref.
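A minimal sketch of what the rollup could look like via the C++ pref API (the
migration helper is hypothetical; the actual patch may do this elsewhere,
e.g. in JS):

    #include "mozilla/Preferences.h"

    using mozilla::Preferences;

    // Hypothetical one-time migration: if the user disabled any of the old
    // animation prefs, disable the new umbrella pref too.
    static void MigrateCosmeticAnimationPrefs() {
      bool animate =
          Preferences::GetBool("browser.tabs.animate", true) &&
          Preferences::GetBool("browser.fullscreen.animate", true) &&
          !Preferences::GetBool("alerts.disableSlidingEffect", false);
      if (!animate) {
        Preferences::SetBool("toolkit.cosmeticAnimations.enabled", false);
      }
    }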
MozReview-Commit-ID: 77pLMtERDna
--HG--
extra : rebase_source : 8939e453c2277caa4a90123ae09bb542aaa421ed
Per discussion on dev.platform, this is risk-prone, and we want more
time baking on nightlies and early betas to see if it causes problems.
MozReview-Commit-ID: CaprjWxYfbC
--HG--
extra : rebase_source : d8adf27d78ea258af88d4f05d87cd0caf9c5bdb9
The media prefs "media.decoder-doctor.decode-{errors,warnings}-allowed" list
the decode issue codes for which the "Report Site Issue" notification is
shown. Default: NS_ERROR_DOM_MEDIA_DEMUXER_ERR and
NS_ERROR_DOM_MEDIA_METADATA_ERR, which are the kinds of errors we currently
care most about and have the best chance of fixing.
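A hedged sketch of how such a comma-separated allow list could be consulted
(illustrative helper in plain C++, not the actual DecoderDoctor code):

    #include <sstream>
    #include <string>

    // Returns true if aErrorName appears in aAllowList, e.g. the value of
    // "media.decoder-doctor.decode-errors-allowed".
    static bool IsDecodeIssueAllowed(const std::string& aAllowList,
                                     const std::string& aErrorName) {
      std::istringstream stream(aAllowList);
      std::string token;
      while (std::getline(stream, token, ',')) {
        if (token == aErrorName) {
          return true;
        }
      }
      return false;
    }

    // e.g. IsDecodeIssueAllowed(prefValue, "NS_ERROR_DOM_MEDIA_DEMUXER_ERR")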
MozReview-Commit-ID: JbY2pwv4TXP
--HG--
extra : rebase_source : 65809f5eb3ace85e25b01c0d1cdfdd068c467ca0
Modern compression algorithms are better than zlib both in terms of
space and time. The jar format, used for e.g. omni.ja, addons, etc.
could benefit from using such modern algorithms, but the format only
allows a limited set of compression algorithms.
However, the format in itself is flexible, in that it can be extended
with arbitrary compression algorithms. This breaks compatibility with
programs like unzip, obviously, but we've never promised the files
shipped with Firefox will always remain "valid" zips (which they already
aren't, but they currently work with most zip readers).
With this change, we allow those archives to contain brotli streams, using an
arbitrarily large value for the compression type in the Zip local file header.
This only allows reading such archives, not producing them, and, for now,
support for brotli streams is kept Nightly-only until everything is pieced
together and we're happy to ship it.
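A sketch of the reader-side dispatch, assuming a nonstandard method id (the
0x81 value is illustrative, not necessarily what the patch uses):

    #include <cstdint>

    // Standard zip compression methods use small ids (0 = stored,
    // 8 = deflate); a large, unassigned value can mark brotli streams.
    enum class ZipMethod : uint16_t {
      Stored  = 0,
      Deflate = 8,
      Brotli  = 0x81,  // hypothetical nonstandard id
    };

    // Dispatch on the method field read from the local file header.
    // Brotli is read-only and, for now, Nightly-only.
    bool IsSupportedMethod(uint16_t aMethod, bool aIsNightly) {
      switch (static_cast<ZipMethod>(aMethod)) {
        case ZipMethod::Stored:
        case ZipMethod::Deflate:
          return true;
        case ZipMethod::Brotli:
          return aIsNightly;
        default:
          return false;
      }
    }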
--HG--
extra : rebase_source : fa637251f460ad0d91d5f5bec392c6e59555e80d
This appears to have been "broken" since bug 510844, for some value of
"broken" where it doesn't actually cause any problem in practice because
of how zlib behaves.
That is, in practice, we always still have input to process when there's
pending output. But while that's true with zlib, that's not necessarily
true for other decompressors (e.g. brotli).
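The loop shape that matters, sketched against the zlib API: drain while
output is pending, instead of gating the loop on remaining input.

    #include <zlib.h>
    #include <cstdint>

    int DrainPendingOutput(z_stream* aZs, uint8_t* aOut, size_t aOutSize) {
      int ret;
      do {
        aZs->next_out = aOut;
        aZs->avail_out = static_cast<uInt>(aOutSize);
        ret = inflate(aZs, Z_NO_FLUSH);
        if (ret != Z_OK && ret != Z_STREAM_END) {
          return ret;  // real error handling elided
        }
        // ... hand (aOutSize - aZs->avail_out) bytes to the consumer ...
      } while (ret == Z_OK && aZs->avail_out == 0);
      // A full output buffer means more output may be pending, even when
      // avail_in has already dropped to 0 (as brotli allows).
      return ret;
    }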
--HG--
extra : rebase_source : 7572139f8e2b3df8c6b68123c0a14524dddb3faf
Since the allow list contains both hostnames and certificate hashes, it makes sense
to use it on all platforms.
MozReview-Commit-ID: 1icRFYhhnAY
--HG--
extra : rebase_source : bcb6113090546cdca2b4f2dd120db98ffb511b0d
This is regression-prone, so dev.platform discussion concluded we want
it behind a pref. We might turn the pref off for beta and/or release
for now as well.
MozReview-Commit-ID: 2H2et3RElZx
--HG--
extra : rebase_source : baf7e679ca1ee16016666d7697990c4f64ecf83e
After this patch, the url-classifier will return a merged classify result
from V2 and V4.
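A minimal sketch of one plausible merge, assuming results are de-duplicated
lists of matched table names (hypothetical types, not the actual
url-classifier interfaces):

    #include <set>
    #include <string>
    #include <vector>

    std::vector<std::string> MergeClassifyResults(
        const std::vector<std::string>& aV2Tables,
        const std::vector<std::string>& aV4Tables) {
      std::set<std::string> seen;
      std::vector<std::string> merged;
      auto addAll = [&](const std::vector<std::string>& aTables) {
        for (const auto& table : aTables) {
          if (seen.insert(table).second) {
            merged.push_back(table);
          }
        }
      };
      addAll(aV2Tables);  // V2 results first...
      addAll(aV4Tables);  // ...then V4, skipping duplicates.
      return merged;
    }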
--HG--
extra : rebase_source : 1940f8617547878b87a5808479ef9d46e6722a92