In order to do that, move buffer-type-specific code into
`NativeSurfaceWayland` and create subclasses for SHM and EGL
buffers.
This should help identify bugs, improve the code structure for
additional buffer types (e.g. YUV), and bring us closer
to the CA backend.
Also includes some minor unrelated cleanups.
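As a rough illustration of that split, a minimal sketch might look like
the following (class and member names are illustrative, not the actual
code):

  // Hypothetical sketch of the buffer-type split; names are illustrative.
  class NativeSurfaceWayland {
   public:
    virtual ~NativeSurfaceWayland() = default;
    // Buffer-type-agnostic state (wl_surface handling, positioning, ...)
    // lives in the base class.
    virtual void Attach() = 0;
  };

  // Software rendering: wl_shm-backed buffers.
  class NativeSurfaceWaylandSHM final : public NativeSurfaceWayland {
   public:
    void Attach() override { /* attach a wl_shm_pool-backed wl_buffer */ }
  };

  // GPU rendering: EGL/dmabuf-backed buffers.
  class NativeSurfaceWaylandEGL final : public NativeSurfaceWayland {
   public:
    void Attach() override { /* attach via EGLSurface / dmabuf */ }
  };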
Differential Revision: https://phabricator.services.mozilla.com/D115938
This implements a mostly working native backend for Wayland. It can
be enabled via `gfx.webrender.compositor.force-enabled`.
The focus here was to get a basic structure in place while
minimising changes in shared code.
Known issues and limitations:
- No readback - this will likely require an internal compositor
again, as Wayland doesn't easily allow readback of the composited
image, at least not without asking for permission. Alternatively,
a new Wayland extension could be written for it.
- Frame-call related issues when using a compositor that optimizes
them (e.g. Gnome-Shell). This will be fixed in a follow-up; in
the meantime, disabling `widget.wayland.opaque-region.enabled`
and `widget.wayland.vsync.enabled` works around the issues.
- Only works on Weston or very recent versions of Gnome-Shell; see
bug 1699754
Differential Revision: https://phabricator.services.mozilla.com/D111662
This code doesn't seem to be working correctly and was broken
accidentally before. Intentionally break it for now to avoid the
regression.
Differential Revision: https://phabricator.services.mozilla.com/D110347
Using `dlsym` for `gdk_wayland_display_get_type` is a cleaner solution
to bug 1696319, allowing running with a GTK that lacks the Wayland
backend.
Also adds a symmetric implementation for `gdk_x11_display_get_type`,
which should help when running without X11.
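A minimal sketch of the `dlsym` approach, assuming a helper along these
lines (helper name and error handling are illustrative):

  #include <dlfcn.h>
  #include <gdk/gdk.h>

  // Look up gdk_wayland_display_get_type() at runtime so the code still
  // loads and runs when GTK was built without the Wayland backend.
  static bool IsWaylandDisplay(GdkDisplay* aDisplay) {
    using GetTypeFn = GType (*)();
    static auto sGetType = reinterpret_cast<GetTypeFn>(
        dlsym(RTLD_DEFAULT, "gdk_wayland_display_get_type"));
    if (!sGetType) {
      // No Wayland backend in this GTK build.
      return false;
    }
    return G_TYPE_CHECK_INSTANCE_TYPE(aDisplay, sGetType());
  }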
Differential Revision: https://phabricator.services.mozilla.com/D107406
Xwayland will give us a 60Hz timer for `glXWaitVideoSyncSGI` anyway,
but an optimization in Xwayland to reduce that to 1Hz if a window is
occluded can cause issues for us in multi-window cases.
In unaffected (i.e. single window) cases this will make us consume
more resources, as rendering will not get throttled to 1Hz anymore
when hidden. The native Wayland backend supports this, however.
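For context, the GLX vsync source essentially blocks on SGI_video_sync;
a rough sketch of that wait (entry points resolved via glXGetProcAddress,
with a current GLX context required) looks like:

  // Function pointer types for the SGI_video_sync extension entry points.
  using GetVideoSyncFn = int (*)(unsigned int*);
  using WaitVideoSyncFn = int (*)(int, int, unsigned int*);

  void WaitForNextVBlank(GetVideoSyncFn aGetVideoSyncSGI,
                         WaitVideoSyncFn aWaitVideoSyncSGI) {
    unsigned int count = 0;
    aGetVideoSyncSGI(&count);
    // Block until the retrace counter advances past its current value.
    // Under Xwayland this ticks at 60Hz, or only 1Hz for occluded windows.
    aWaitVideoSyncSGI(2, (count + 1) % 2, &count);
  }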
Differential Revision: https://phabricator.services.mozilla.com/D106723
As we make the transition to using EGL over GLX, we will need our
detection code to be sufficient without EGL to determine the device in
use. This patch makes us always use the EGL testing code over the GLX
testing code, regardless of the pref/envvar setting.
At the very least, we need to know the vendor ID of the device in use.
We can determine this if there is only one GPU on the PCI list, if we
get a driver name from Mesa, or if it is a proprietary driver (i.e.
NVIDIA) which includes its name in the vendor ID. If we know the vendor
ID, we can usually derive the device ID from the PCI list.
We now also track which path glxtest took. If we successfully did the
test via EGL, then we will allow the pref/envvar to use EGL instead of
GLX. If the test reverted to GLX, then it will use GLX regardless of the
pref/envvar. This is necessary because we need to know if the libraries
are available or not -- some systems may be missing one or the other.
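A minimal sketch of the resulting gating, with hypothetical names for the
recorded test path and the pref check:

  // Hypothetical sketch: only honor a request for EGL if glxtest actually
  // succeeded through the EGL path; if it had to fall back to GLX, the EGL
  // libraries may simply not be present on the system.
  enum class GlxTestPath { EGL, GLX, None };

  bool ShouldUseEGL(GlxTestPath aTestPath, bool aPrefOrEnvWantsEGL) {
    if (aTestPath == GlxTestPath::EGL) {
      return aPrefOrEnvWantsEGL;
    }
    // glxtest reverted to GLX (or failed); stay on GLX regardless of the
    // pref/envvar.
    return false;
  }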
Differential Revision: https://phabricator.services.mozilla.com/D102933
Fetch the DRM device in the EGL version of glxtest, set it in gfxInfo and pass
it to gfxVars. Finally, use it in nsDMABufDevice::Configure().
While at it, also clean up EGL typedefs and defines a bit to match how it's
done for GLX.
Inspired by and copied from wlroots and Xwayland. Thanks to emersion!
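A rough sketch of that query, assuming EGL_EXT_device_query and
EGL_EXT_device_drm are available (extension checks and error handling
abbreviated):

  #include <EGL/egl.h>
  #include <EGL/eglext.h>

  // Query the DRM device file backing an EGLDisplay; the result (e.g.
  // "/dev/dri/card0") is what gets stored in gfxInfo/gfxVars.
  static const char* GetDrmDeviceFile(EGLDisplay aDisplay) {
    auto queryDisplayAttrib = reinterpret_cast<PFNEGLQUERYDISPLAYATTRIBEXTPROC>(
        eglGetProcAddress("eglQueryDisplayAttribEXT"));
    auto queryDeviceString = reinterpret_cast<PFNEGLQUERYDEVICESTRINGEXTPROC>(
        eglGetProcAddress("eglQueryDeviceStringEXT"));
    if (!queryDisplayAttrib || !queryDeviceString) {
      return nullptr;
    }
    EGLAttrib device = 0;
    if (!queryDisplayAttrib(aDisplay, EGL_DEVICE_EXT, &device)) {
      return nullptr;
    }
    return queryDeviceString(reinterpret_cast<EGLDeviceEXT>(device),
                             EGL_DRM_DEVICE_FILE_EXT);
  }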
Differential Revision: https://phabricator.services.mozilla.com/D98108
1. On Wayland, use `get_egl_status()` by default
2. On X11/EGL use `x11_egltest()`, avoiding runtime dependencies on GLX
3. Avoid dlopening libgl/libgles on EGL if not needed
4. Some Wayland/X11 `ifdef` cleanups
5. Don't emit warnings when on Mesa and using PCI device detection
Depends on D101383
Differential Revision: https://phabricator.services.mozilla.com/D100638
gfxPlatformGtk needs gtk_init(), so we can't use it in all places where dmabuf is used; remove the dmabuf handling from it.
Leave only the WebGL config there, as it can use gfxPlatform.
- Remove UseDMABufTextures()
- Remove UseDMABufVideoTextures()
- Remove UseHardwareVideoDecoding()
- Remove UseDRMVAAPIDisplay()
Depends on D95992
Differential Revision: https://phabricator.services.mozilla.com/D95993
On Linux, Firefox listens on notify::scale-factor to detect DPI
changes. However, scale-factor is an int, and at the lower end of
the DPI scale some devices use fractional scale factors encoded
into the Xft/DPI setting. Changing from ×1 to ×1.5 scale is
therefore undetected.
The proposed change is twofold:
- remove use of a cached sDPI value and rely on GTK being the cache
- listen on notify::gtk-xft-dpi to trigger a DPI change (see the sketch below)
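A minimal sketch of the gtk-xft-dpi listener (callback name illustrative):

  #include <gtk/gtk.h>

  // React to fractional DPI changes by watching the gtk-xft-dpi setting
  // instead of the integer scale-factor property.
  static void OnXftDpiChanged(GtkSettings* aSettings, GParamSpec*, gpointer) {
    gint xftDpi = -1;
    g_object_get(aSettings, "gtk-xft-dpi", &xftDpi, nullptr);
    // gtk-xft-dpi is dots-per-inch * 1024; -1 means "use the default".
    double dpi = xftDpi > 0 ? xftDpi / 1024.0 : 96.0;
    // ... propagate the new DPI instead of relying on a cached sDPI ...
    (void)dpi;
  }

  void ListenForDpiChanges() {
    g_signal_connect(gtk_settings_get_default(), "notify::gtk-xft-dpi",
                     G_CALLBACK(OnXftDpiChanged), nullptr);
  }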
What is missing:
- performance evaluation of not caching sDPI (on a 10s session
loading 2 pages, there is an "overhead" of 6ms on my setup;
nothing visible from my point of view)
- when changing Xft/DPI and scale, the change is applied twice;
this seems harmless
Differential Revision: https://phabricator.services.mozilla.com/D92095
This allows Mesa to continue using the existing vsync implementation
and NVIDIA to use the new EGL xvisual logic.
It is an intermediate solution until the issues are fixed. However,
assuming it will take a while to do so, it's probably worth it.
Differential Revision: https://phabricator.services.mozilla.com/D92466
When a GLX vsync source is created alongside EGL contexts, NVIDIA drivers refuse to make any EGL context current.
So disable GLX vsync source creation when an EGL context is used.
Differential Revision: https://phabricator.services.mozilla.com/D87634