On Android, we don't want touches that are pans/scrolls to gesture activate.
So we don't want to gesture activate on touchstart (analogous to
mouse/pointerdown on desktop), since we can't tell in advance whether a
touchstart will turn into a touchmove. Instead, gesture activate on touchend,
and only if the touchend finishes no further than the drag threshold away from
where the touch started.
This means we won't gesture activate for touches that turn into scroll
gestures, but we will for touches that are taps.
The tests are disabled on Android, so I don't have a test for this yet.
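For illustration only (not part of the patch), a minimal self-contained sketch
of the touchend decision, assuming a simple Euclidean distance check; the type
names and the threshold value are made up:

    // Sketch of the touchend decision; illustrative names and threshold.
    #include <cmath>

    struct TouchPoint {
      float x;
      float y;
    };

    // Hypothetical drag threshold in device pixels; the real value comes from
    // the platform/prefs.
    static const float kDragThresholdPx = 8.0f;

    // Return true if the gesture should activate on touchend, i.e. the touch
    // ended close enough to where it started to count as a tap rather than a
    // pan/scroll.
    bool ShouldGestureActivateOnTouchEnd(const TouchPoint& aStart,
                                         const TouchPoint& aEnd) {
      float dx = aEnd.x - aStart.x;
      float dy = aEnd.y - aStart.y;
      return std::sqrt(dx * dx + dy * dy) <= kDragThresholdPx;
    }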
MozReview-Commit-ID: C8YgAfVqUa7
--HG--
extra : rebase_source : 76380871b1aa2a5e1bb04882104ada75f6bb3f5a
This commit implements the auto-dir scrolling functionality in non-APZ, based on
parts 1 to 3. However, the functionality of mousewheel.autodir.honourroot
remains unimplemented in this commit.
MozReview-Commit-ID: 2vYABOx4RkK
--HG--
extra : rebase_source : 7dc45e6747a101c1a2c3a22bc695b2a0b2494b50
This commit only introduces the concept and adds the functionality for
retrieving the values of the two new related prefs. There is still no actual
change in behavior.
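As a rough illustration (not the actual patch), reading the two prefs inside
the Gecko tree could look like the sketch below; "mousewheel.autodir.honourroot"
is the pref named earlier in this series, while "mousewheel.autodir.enabled" is
an assumption here:

    // Illustrative sketch: reading the two autodir prefs with
    // mozilla::Preferences (only builds inside the Gecko tree).
    #include "mozilla/Preferences.h"

    static bool IsAutoDirEnabled() {
      // Assumed pref name for the main switch.
      return mozilla::Preferences::GetBool("mousewheel.autodir.enabled", false);
    }

    static bool HonoursRootForAutoDir() {
      return mozilla::Preferences::GetBool("mousewheel.autodir.honourroot", false);
    }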
MozReview-Commit-ID: 2Gl3Wqdo6jL
--HG--
extra : rebase_source : bf30483e3e32829a5d6fd927740471ba348448f9
Do some work in preparation for implementing the actual functionality for this
bug. No functional change is involved in this commit.
MozReview-Commit-ID: 5aLhr38n1N4
--HG--
extra : rebase_source : 15cfc2cea5b7668367dd3bd4a0746ae8c61b7d20
Currently, the scrollable direction with a mouse wheel for a single-line text
control is hard-coded: only horizontal wheel scrolls take effect, while
vertical ones don't. This isn't the desired behavior in vertical writing modes,
where the opposite clearly suits better.
This commit makes the hard-coded scrollable direction for a single-line text
control writing-mode-adaptive.
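A self-contained sketch of the idea, with made-up enum and function names
rather than the real layout code:

    // Sketch: the wheel-scrollable direction follows the writing mode instead
    // of being hard-coded to "horizontal only".
    enum class ScrollDirection { eHorizontal, eVertical };

    // aVerticalWritingMode is true for vertical writing modes (e.g. vertical-rl).
    ScrollDirection ScrollableDirectionForSingleLineTextControl(
        bool aVerticalWritingMode) {
      // A single-line text control only overflows along its inline axis, so the
      // wheel-scrollable direction is the physical axis the inline axis maps to.
      return aVerticalWritingMode ? ScrollDirection::eVertical
                                  : ScrollDirection::eHorizontal;
    }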
MozReview-Commit-ID: 4Zkoe2ExPCZ
--HG--
extra : rebase_source : 113b2ea80b6bbbcd2d8379b438de97eedd616551
There may be some pending input events in the thread's event queue when content starts a dnd operation. The spec says that input events should be suppressed while a dnd operation is in progress. Add a flag to ESM and turn it on/off when a dnd operation starts/finishes. Check the flag in PresShell::HandleEvent, because we may start a dnd operation with a pointermove and we want to suppress the corresponding mousemove as well.
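A minimal sketch of the flag and the check, with simplified stand-in types; the
real member and method names may differ:

    // Sketch of the suppression flag and the PresShell-side check.
    #include <cstdint>

    enum class EventMessage : uint8_t { eMouseMove, ePointerMove, eOther };

    class DndSuppressionSketch {
     public:
      void OnDragDropStarted() { mSuppressingInputEvents = true; }
      void OnDragDropFinished() { mSuppressingInputEvents = false; }
      bool IsSuppressingInputEvents() const { return mSuppressingInputEvents; }

     private:
      bool mSuppressingInputEvents = false;
    };

    // In PresShell::HandleEvent-like code: drop the move events while a dnd
    // operation started by content is in progress, including the mousemove
    // that follows a pointermove-initiated drag.
    bool ShouldHandleEvent(const DndSuppressionSketch& aESM, EventMessage aMsg) {
      if (aESM.IsSuppressingInputEvents() &&
          (aMsg == EventMessage::eMouseMove ||
           aMsg == EventMessage::ePointerMove)) {
        return false;
      }
      return true;
    }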
MozReview-Commit-ID: 43NZrA7SW4c
This flag will be used to help us decide whether a website should be allowed to
autoplay. The media-related implementation will be done in bug 1382574.
MozReview-Commit-ID: GGIauBufs5A
--HG--
extra : rebase_source : c407ceed311fc3fb0140e5da44fd69e457a5098f
Currently, widget doesn't show the VKB when an input context change is caused
by JS. However, if it's caused by an event handler of a user input, the user
may expect the VKB to open. For example, if a touch event in a fake editor
moves focus to an actual editable node, the user expects the VKB to be shown.
Therefore, InputContextAction should declare two new causes: one is "unknown,
but occurred while handling a non-keyboard event"; the other is "unknown, but
occurred while handling a keyboard event".
However, EventStateManager doesn't have an API to check whether it's currently
handling a keyboard event. Therefore, this patch adds one first.
AutoHandlingUserInputStatePusher passes the event type to StartHandlingUserInput()
and StopHandlingUserInput() of EventStateManager, and sUserKeyboardEventDepth
tracks the number of nested keyboard event handlings. Therefore,
EventStateManager::IsHandlingKeyboardInput() can return whether a keyboard
event is being handled.
IMEStateManager uses this new API to adjust the cause of input context changes.
Finally, InputContextAction::IsUserInput() is renamed to IsHandlingUserInput()
for consistency with EventStateManager, and it now returns true when the input
context change is caused by script while a user input is being handled.
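A simplified, self-contained sketch of the depth tracking; the real methods
take the event (or its type) rather than a bool, and this class is only a
stand-in for EventStateManager:

    // Sketch of nested user-input / keyboard-input depth tracking.
    #include <cstdint>

    class UserInputDepthSketch {
     public:
      // Called by AutoHandlingUserInputStatePusher with the event type.
      static void StartHandlingUserInput(bool aIsKeyboardEvent) {
        ++sUserInputEventDepth;
        if (aIsKeyboardEvent) {
          ++sUserKeyboardEventDepth;
        }
      }
      static void StopHandlingUserInput(bool aIsKeyboardEvent) {
        --sUserInputEventDepth;
        if (aIsKeyboardEvent) {
          --sUserKeyboardEventDepth;
        }
      }

      static bool IsHandlingUserInput() { return sUserInputEventDepth > 0; }
      // True while a keyboard event is being handled (possibly nested).
      static bool IsHandlingKeyboardInput() { return sUserKeyboardEventDepth > 0; }

     private:
      static int32_t sUserInputEventDepth;
      static int32_t sUserKeyboardEventDepth;
    };

    int32_t UserInputDepthSketch::sUserInputEventDepth = 0;
    int32_t UserInputDepthSketch::sUserKeyboardEventDepth = 0;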
MozReview-Commit-ID: 5JsLqdqeGah
--HG--
extra : rebase_source : 9fcf7687d1bf90eeebbf6eac62d4488ff64b083c
This patch declares a new default action, "horizontal scroll", which scrolls
content horizontally using the deltaY of wheel events and ignores deltaX and
deltaZ. This is used as the default action with the Shift key in the default
settings, except on macOS. On macOS, a legacy mouse's vertical wheel operation
with the Shift key already causes a native horizontal wheel event, so we don't
need the new default action there. Additionally, the old default action with
the Shift key, navigating history, is moved to the Alt key. This makes the
settings the same on macOS and the other platforms, which is better for users
who use both macOS and another OS and for web app developers who check wheel
events only on macOS or only on other platform(s).
For a simpler implementation, default action handlers move deltaY values to
deltaX values temporarily, *only* while they handle the wheel event. This is
performed by AutoWheelDeltaAdjuster, and the deltas are restored automatically
after handling.
In other words, even if the default action is "horizontal scroll", web apps
receive wheel events whose deltaY is not zero, but their content will be
scrolled horizontally. This is the same as Chromium's behavior, so it shouldn't
cause any incompatibility with it.
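A simplified sketch of the swap-and-restore idea behind AutoWheelDeltaAdjuster;
the real class works on WidgetWheelEvent, this stand-in only shows the RAII
pattern:

    // Sketch: temporarily treat deltaY as horizontal while the default action
    // handler runs, then restore the original deltas.
    struct WheelDeltaSketch {
      double deltaX = 0.0;
      double deltaY = 0.0;
      double deltaZ = 0.0;
    };

    class AutoWheelDeltaAdjusterSketch {
     public:
      explicit AutoWheelDeltaAdjusterSketch(WheelDeltaSketch& aDelta)
          : mDelta(aDelta),
            mOldDeltaX(aDelta.deltaX),
            mOldDeltaY(aDelta.deltaY),
            mOldDeltaZ(aDelta.deltaZ) {
        // Scroll horizontally with the vertical delta; ignore deltaX/deltaZ.
        mDelta.deltaX = mDelta.deltaY;
        mDelta.deltaY = 0.0;
        mDelta.deltaZ = 0.0;
      }
      ~AutoWheelDeltaAdjusterSketch() {
        // Restore the original deltas so web content still sees deltaY != 0.
        mDelta.deltaX = mOldDeltaX;
        mDelta.deltaY = mOldDeltaY;
        mDelta.deltaZ = mOldDeltaZ;
      }

     private:
      WheelDeltaSketch& mDelta;
      double mOldDeltaX;
      double mOldDeltaY;
      double mOldDeltaZ;
    };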
MozReview-Commit-ID: E4X3yZzLEAl
--HG--
extra : rebase_source : e20d854c6b0a181ad4c9e7304bd9ad14256481ff
When triggering an iframe load or starting to parse a document for an iframe, the main thread often has some spare time before the new page has been created. Try to trigger a CC/GC slice at that point, in order to avoid running the collectors later when the page is already executing its JS.
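A rough sketch of the idea with hypothetical helper names (the real collector
entry points and idle heuristics differ):

    // Sketch only: hypothetical stand-ins for collector entry points and the
    // idle check.
    #include <cstdint>

    static void RunGCSliceWithBudget(int64_t /* aBudgetMs */) {}
    static void RunCCSliceWithBudget(int64_t /* aBudgetMs */) {}
    static bool MainThreadLikelyIdleUntilPageCreated() { return true; }

    // Called when an iframe load is triggered or its document starts to parse:
    // if the main thread probably has spare time before the new page exists,
    // spend it on small collector slices now instead of paying for them later,
    // while the page is already running its JS.
    void MaybeRunCollectorSlicesBeforePageLoad() {
      if (!MainThreadLikelyIdleUntilPageCreated()) {
        return;
      }
      RunGCSliceWithBudget(2);  // a couple of milliseconds per slice (illustrative)
      RunCCSliceWithBudget(2);
    }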
--HG--
extra : rebase_source : 806df0af1dbaefb1761134eca0bb7c6ade6ac1a9
The protected EventStateManager::HandleAccessKey() walks ESMs to handle an access key, and EventStateManager::ExecuteAccessKey() looks for an access key which matches the given char code values and executes it if it finds a target. These names make it hard to understand what the methods do, and we need an option to look for a target without executing the access key. Therefore, this patch renames the former to WalkESMTreeToHandleAccessKey() and the latter to LookForAccessKeyAndExecute().
They also take a new bool argument, aExecute. When it's true, LookForAccessKeyAndExecute() executes the access key it finds. When it's false, they only return true if they find an access key target for the given event in the process.
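A simplified, self-contained sketch of the renamed API; the real methods take
the keyboard event, char codes, modifiers, and more, and the lookup/activation
here is stubbed out:

    // Sketch of the aExecute behavior of the renamed access key methods.
    #include <cstdint>
    #include <vector>

    class AccessKeyESMSketch {
     public:
      // Walks this ESM (and, in the real code, the other ESMs in the process)
      // looking for an access key target matching aCharCodes. Executes it only
      // when aExecute is true; either way, returns true if a target was found
      // in this process.
      bool WalkESMTreeToHandleAccessKey(const std::vector<uint32_t>& aCharCodes,
                                        bool aExecute) {
        return LookForAccessKeyAndExecute(aCharCodes, aExecute);
      }

     private:
      bool LookForAccessKeyAndExecute(const std::vector<uint32_t>& aCharCodes,
                                      bool aExecute) {
        for (uint32_t charCode : aCharCodes) {
          if (HasAccessKeyTargetFor(charCode)) {
            if (aExecute) {
              ActivateAccessKeyTargetFor(charCode);
            }
            return true;
          }
        }
        return false;
      }

      // Stubs standing in for the real target lookup and activation.
      bool HasAccessKeyTargetFor(uint32_t) const { return false; }
      void ActivateAccessKeyTargetFor(uint32_t) {}
    };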
MozReview-Commit-ID: ETYbNmtTMGj
--HG--
extra : rebase_source : 553676384db5f795de1087a439f30eded11d22fe
EventStateManager checks whether every keypress event's modifiers match the access key modifiers configured in prefs. Moving the related methods to WidgetKeyboardEvent makes EventStateManager simpler, and we can hide the NS_MODIFIER_* constants (which developers may confuse with the Modifiers of WidgetInputEvent) in WidgetEventImpl.cpp.
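A self-contained sketch of the kind of check that moves to WidgetKeyboardEvent;
the method name loosely mirrors the real one, but the modifier handling is
simplified and the types are stand-ins:

    // Sketch: does the event's modifier state match the configured access key
    // modifiers?
    #include <cstdint>

    using Modifiers = uint16_t;
    constexpr Modifiers MODIFIER_ALT     = 0x0001;
    constexpr Modifiers MODIFIER_CONTROL = 0x0002;
    constexpr Modifiers MODIFIER_SHIFT   = 0x0004;
    constexpr Modifiers MODIFIER_META    = 0x0008;

    struct KeyboardEventSketch {
      Modifiers mModifiers = 0;

      // Returns true if the pressed modifiers match the access key modifiers
      // from prefs, comparing only Alt/Control/Shift/Meta. (The real matching
      // rules are more nuanced.)
      bool ModifiersMatchWithAccessKey(Modifiers aAccessKeyModifiers) const {
        const Modifiers relevant =
            MODIFIER_ALT | MODIFIER_CONTROL | MODIFIER_SHIFT | MODIFIER_META;
        return (mModifiers & relevant) == (aAccessKeyModifiers & relevant);
      }
    };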
MozReview-Commit-ID: 23NUQ51lJ1M
--HG--
extra : rebase_source : 341f3764ef62575577572d8b349159e2d5512b26
Currently, whether an event has been posted to at least one remote process isn't tracked anywhere. To manage this state, BaseEventFlags should have a new bool flag, and WidgetEvent and BaseEventFlags should have helper methods for it.
Additionally, this fixes a bug in nsGUIEventIPC.h. In a lot of ParamTraits, static_cast<Foo> is used to reach the base class's ParamTraits. However, that creates a temporary instance via the copy constructor. Therefore, the WidgetEvent::MarkAsPostedToRemoteProcess() call in ParamTraits<mozilla::WidgetEvent>::Write() didn't work as expected.
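A self-contained sketch of the pitfall and the fix; the types are stand-ins for
the real event and IPC code, and the mutable/const handling is simplified:

    // Sketch: casting to the base class *by value* copies the event, so a flag
    // set on the copy is lost. Casting to a const reference marks the original.
    struct WidgetEventSketch {
      mutable bool mPostedToRemoteProcesses = false;
      void MarkAsPostedToRemoteProcess() const { mPostedToRemoteProcesses = true; }
    };

    struct WidgetMouseEventSketch : WidgetEventSketch {};

    void WriteBaseClass(const WidgetEventSketch& aEvent) {
      // Pretend this serializes the base class and marks it as posted.
      aEvent.MarkAsPostedToRemoteProcess();
    }

    void WriteParam(const WidgetMouseEventSketch& aEvent) {
      // Buggy: static_cast<WidgetEventSketch>(aEvent) creates a temporary copy,
      // so the mark is applied to the copy and thrown away:
      //   WriteBaseClass(static_cast<WidgetEventSketch>(aEvent));

      // Fixed: cast to a const reference so the original event is marked.
      WriteBaseClass(static_cast<const WidgetEventSketch&>(aEvent));
    }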
MozReview-Commit-ID: DdafsbVfrya
--HG--
extra : rebase_source : 94205f3a7b36455c3c9f607c35866be033e627c1
This patch adds a new non-DOM event type, "mousetouchdrag". The name is horrible, I
know. The "mouse" comes from the fact that it's a WidgetMouseEvent, and the
"touchdrag" comes from the fact that this event is fired at the start of a touch
gesture for drag-and-drop. Right now this event is only fired from the Windows
widget code, when we receive a touch-source doubleclick event from the OS. This
event is sent to us from the OS when it detects the sequence "touchstart, touchend,
touchstart" within certain time/distance constraints. Eventually we may detect
similar gestures for other platforms in the APZ GestureEventListener and dispatch
the "mousetouchdrag" event for those as well.
The only effect of this event is that it begins tracking a drag gesture in the
EventStateManager. Subsequent touchmove events can begin the actual drag-and-drop
operation by calling ::DoDragDrop. See the discussion in bug 1147335 for some
important caveats about DoDragDrop and how it only works with left-mouse-button
events (real or synthetic).
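A minimal sketch of that effect, with made-up names and an illustrative
threshold; it is not the actual EventStateManager code:

    // Sketch: "mousetouchdrag" arms drag-gesture tracking; a later touchmove
    // that travels far enough starts the real drag-and-drop (::DoDragDrop on
    // Windows).
    #include <cmath>

    struct PointSketch {
      float x = 0;
      float y = 0;
    };

    class DragGestureTrackerSketch {
     public:
      void OnMouseTouchDrag(const PointSketch& aPoint) {
        mTrackingDragGesture = true;
        mGestureStart = aPoint;
      }

      // Returns true if this touchmove should kick off the platform drag loop.
      bool OnTouchMove(const PointSketch& aPoint) {
        if (!mTrackingDragGesture) {
          return false;
        }
        const float kDragThresholdPx = 8.0f;  // illustrative
        float dx = aPoint.x - mGestureStart.x;
        float dy = aPoint.y - mGestureStart.y;
        if (std::sqrt(dx * dx + dy * dy) > kDragThresholdPx) {
          mTrackingDragGesture = false;
          return true;
        }
        return false;
      }

     private:
      bool mTrackingDragGesture = false;
      PointSketch mGestureStart;
    };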
MozReview-Commit-ID: bGyOk6dRoJ
After click events with button 2 or 3 are fired, fire auxclick, a new
event intended to represent a non-primary mouse click. Because this
event, based on the design examples and Blink's implementation, is
intended to be used with content listeners, always dispatch it to content
listeners, not just to those that force all events to be dispatched (i.e.
document/window). This diverges from the behavior of our click events
for non-primary buttons.
Eventually, we hope this will replace click events for non-primary
buttons. For now, leave those events in place for compatibility reasons.
Additionally, add handling of this new event where necessary.
MozReview-Commit-ID: 8osozM4h6Ya
--HG--
extra : rebase_source : 558261dd0d0b9241efa84ca168c50455850af03a