162 reftests that were previously failing are now passing, and one that was
previously passing is now failing. Sounds like a good tradeoff.
MozReview-Commit-ID: Imw9m2NcD1c
This patch was generated using a script and failure logs from a try push, so
the whitespace formatting may not match what a human would write. The intent
is to fix many of these failures before merging back to m-c.
MozReview-Commit-ID: Etdx9LlWkLX
"random" is used where the bug is likely in the test, or the test was already
random on other platforms because the feature was not complete.
"fails" is used where it looks like the bug is in the product, so that we know
to re-enable the test with the product is fixed.
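For illustration, the two kinds of annotations look roughly like this in a
reftest manifest (hypothetical test names, not lines from the actual patch):

    fails  == product-bug.html product-bug-ref.html  # re-enable when the product bug is fixed
    random == flaky-test.html  flaky-test-ref.html   # the test itself is suspect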
MozReview-Commit-ID: FenLhua7tm8
--HG--
extra : rebase_source : f93e5a67036174fcc20ef9209c731b311190f362
We want the maximum scroll position to be aligned with layer pixels. That way
we don't have to re-rasterize the scrolled contents once scrolling hits the
edge of the scrollable area.
Here's how we determine the maximum scroll position: We get the scroll port
rect, snapped to layer pixels. Then we get the scrolled rect and also snap
that to layer pixels. The maximum scroll position is set to the difference
between right/bottom edges of these rectangles.
Now the scrollable area is computed by adding this maximum scroll position
to the unsnapped scroll port size.
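As a rough sketch of this computation (illustrative names and a single uniform
CSS-to-layer scale; the actual Gecko code works on the real frame geometry and
is not reproduced here):

    #include <algorithm>
    #include <cmath>

    struct Rect {
      float x, y, width, height;
      float XMost() const { return x + width; }
      float YMost() const { return y + height; }
    };
    struct Point { float x, y; };
    struct Size { float width, height; };

    // Snap a rect's edges to whole layer pixels, given the number of layer
    // pixels per CSS pixel.
    static Rect SnapToLayerPixels(const Rect& aRect, float aScale) {
      float l = std::round(aRect.x * aScale) / aScale;
      float t = std::round(aRect.y * aScale) / aScale;
      float r = std::round(aRect.XMost() * aScale) / aScale;
      float b = std::round(aRect.YMost() * aScale) / aScale;
      return Rect{l, t, r - l, b - t};
    }

    // Maximum scroll position: difference between the right/bottom edges of
    // the snapped scrolled rect and the snapped scroll port rect.
    static Point ComputeMaxScrollPosition(const Rect& aScrollPort,
                                          const Rect& aScrolledRect,
                                          float aScale) {
      Rect port = SnapToLayerPixels(aScrollPort, aScale);
      Rect scrolled = SnapToLayerPixels(aScrolledRect, aScale);
      return Point{std::max(0.0f, scrolled.XMost() - port.XMost()),
                   std::max(0.0f, scrolled.YMost() - port.YMost())};
    }

    // Scrollable area: the unsnapped scroll port size plus the maximum
    // scroll position.
    static Size ComputeScrollableArea(const Rect& aScrollPort,
                                      const Point& aMaxScrollPos) {
      return Size{aScrollPort.width + aMaxScrollPos.x,
                  aScrollPort.height + aMaxScrollPos.y};
    }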
The underlying idea here is: Pretend we have overflow:visible so that the
scrolled contents start at (0, 0) relative to the scroll port and spill over
the scroll port edges. When these contents are rendered, their rendering is
snapped to layer pixels. We want those exact pixels to be accessible by
scrolling.
This way of computing the snapped scrollable area ensures that, if you scroll
to the maximum scroll position, the right/bottom edges of the rendered
scrolled contents line up exactly with the right/bottom edges of the scroll
port. The scrolled contents are neither cut off nor are they moved too far.
(This is something that no other browser engine gets completely right; see the
testcase in bug 1012752.)
There are also a few disadvantages to this solution. We snap to layer pixels,
and the size of a layer pixel can depend on the zoom level, the document
resolution, the current screen's scale factor, and CSS transforms. The snap
origin is the position of the reference frame. So a change to any of these
things can influence the scrollable area and the maximum scroll position.
This patch does not make us adjust the current scroll position in the event
that the maximum scroll position changes such that the current scroll position
would be out of range, unless there's a reflow of the scrolled contents. This
means that we can sometimes render a slightly inconsistent state where the
current scroll position exceeds the maximum scroll position. We can fix this
if it turns out to be a problem; I doubt that it will be, because none of the
other browsers seem to prevent it either.
The size of the scrollable area is exposed through the DOM properties
scrollWidth and scrollHeight. At the moment, these are integer properties, so
their value is rounded to the nearest CSS pixel. Before this patch, the
returned value would always be within 0.5 CSS pixels of the value that layout
computed for the content's scrollable overflow based on the CSS styles of the
contents.
Now that scrollWidth and scrollHeight also depend on pixel snapping, their
values can deviate by up to one layer pixel from what the page might expect
based on the styles of the contents. This change requires a few changes to
existing tests.
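To put a number on that deviation (made-up values, and a simplified model of
the snapping rather than the actual computation):

    #include <cmath>
    #include <cstdio>

    int main() {
      // Assume 1.25 layer pixels per CSS pixel (e.g. 125% zoom), i.e. one
      // layer pixel is 0.8 CSS px.
      const float layerPerCss = 1.25f;
      // Scrollable overflow width computed from the CSS styles.
      const float styleWidth = 500.3f;
      // Offset of the scrolled contents from the reference frame (the snap
      // origin); this is what makes the result position-dependent.
      const float offset = 0.3f;

      // Snap both edges of the scrolled rect to layer pixels and take the
      // width: 0.3 snaps to 0.0, 500.6 snaps to 500.8.
      float left = std::round(offset * layerPerCss) / layerPerCss;
      float right = std::round((offset + styleWidth) * layerPerCss) / layerPerCss;
      float snappedWidth = right - left;  // 500.8

      // scrollWidth is an integer DOM property, rounded to the nearest CSS px.
      std::printf("before: %ld  after: %ld\n",
                  std::lround(styleWidth),     // 500, within 0.5 CSS px of the styles
                  std::lround(snappedWidth));  // 501, up to one layer pixel off
      return 0;
    }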
The fact that scrollWidth and scrollHeight can change based on the position of
the scrollable element and the zoom level / resolution may surprise some web
pages. However, this also seems to happen in Edge, which appears to always
round scrollWidth and scrollHeight upwards, possibly to its equivalent of
layout device pixels.
MozReview-Commit-ID: 3LFV7Lio4tG
--HG--
extra : rebase_source : 3e4e0b60493397e61283aa1d7fd93d7c197dec29
extra : source : d43c2d5e87f31ff47d7f3ada66c3f5f27cef84a9
With APZ, we always layerize the first scrollable element of the page,
if the page itself is not scrollable. These additional layers can cause
fuzzy reftest failures in two ways: differences in blending, and by
disabling sub-pixel test anti-aliasing. The latter is only a problem
with unaccelerated drawing, because we don't support component alpha
layers with BasicLayers. (We also don't support them with
BasicCompositor, but "Reftest unaccelerated" only tests BasicLayers).
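So some of the affected tests end up with annotations along these lines
(illustrative fuzz values and test names, not lines from the actual patch):

    fuzzy-if(!layersGPUAccelerated,2,1200) == text-aa.html text-aa-ref.html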