The 5s timeout was not enough for debug builds. I don't really see a
reason to use something other than the default socket timeout here.
MozReview-Commit-ID: Fm5lgSI3lFb
The primary change here is to increase the number of times a failed
download of test_packages.json and test zip files is retried, in hopes
of recovering from more temporary service interruptions.
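As a rough illustration of the retry idea (this is not the actual
harness code; the URL, attempt count, and backoff values are made up for
the example):

    import time
    import urllib.request

    def fetch_with_retries(url, dest, attempts=5, backoff=2.0):
        """Download url to dest, retrying on transient failures."""
        for attempt in range(1, attempts + 1):
            try:
                urllib.request.urlretrieve(url, dest)
                return dest
            except OSError as exc:  # URLError/HTTPError are OSError subclasses
                if attempt == attempts:
                    raise
                delay = backoff ** attempt
                print("download failed (%s); retrying in %.0fs" % (exc, delay))
                time.sleep(delay)

    # fetch_with_retries("https://example.com/test_packages.json",
    #                    "test_packages.json")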
Increases WPT wdspec timeouts to more realistic values. Because wdspec
tests interact with the browser from an OOP program, they require more
time to run. Interactive browser tests are also known for generally
being more expensive to run.
25 seconds for the default timeout and 120 seconds for the long timeout
are values picked out of the air and will likely need further refinement
in the future. It is, however, our current belief that this moves us in
the right direction.
Further improvements to this approach may involve letting wdspec tests
define timeouts on a per-file or a per-test function level through the
use of pytest-timeouts, but this is purely speculative at this point.
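Purely as a sketch of what such per-test timeouts could look like
(assuming a pytest-timeout style marker; the plugin choice and the
values are speculative):

    import time
    import pytest

    @pytest.mark.timeout(25)
    def test_quick_round_trip():
        time.sleep(0.1)  # stands in for a short WebDriver command exchange

    @pytest.mark.timeout(120)
    def test_long_interactive_sequence():
        time.sleep(0.1)  # stands in for a longer interactive browser session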
It is the current recommendation to adjust the number of tests and the
runtime duration of the tests in a file according to these new defaults.
MozReview-Commit-ID: 4I3Xz9G6lzv
--HG--
extra : rebase_source : 5ec7439e736dc9978828e420bd31195e63130fed
Certain test types need different defaults than the
wpttest.DEFAULT_TIMEOUT and wpttest.LONG_TIMEOUT values. This patch
changes wptrunner to define default and long timeouts at the test type
level. This allows a test type to override the default durations defined
in the abstract Test.default_timeout and Test.long_timeout.
Concrete classes, such as ReftestTest and WdspecTest, may override these
class properties.
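A rough Python illustration of the shape of this change (not wptrunner's
actual classes; the generic 10s/60s defaults are assumptions, the wdspec
values follow the numbers above):

    class Test:
        default_timeout = 10  # seconds (assumed generic default)
        long_timeout = 60     # seconds (assumed generic default)

        def __init__(self, timeout_type="normal"):
            self.timeout_type = timeout_type

        @property
        def timeout(self):
            if self.timeout_type == "long":
                return self.long_timeout
            return self.default_timeout

    class ReftestTest(Test):
        pass  # keeps the generic defaults

    class WdspecTest(Test):
        # Out-of-process WebDriver tests need more headroom.
        default_timeout = 25
        long_timeout = 120

    assert WdspecTest("long").timeout == 120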
MozReview-Commit-ID: IS6df5vuIDC
--HG--
extra : rebase_source : a3f37d4524902f2b0d54e14126b57da327f0ec06
There were three issues here: the first is that TPS's history module didn't
wait for PlacesUtils.history.remove's promise to resolve.
The second is that the SYNC_WIPE_REMOTE in the previous client would cause a
write to the clients collection, which would cause the next client to get a
"sync:collection_changed" push. This caused a sync of *only* the clients
collection upon receipt, which would prevent TPS from explicitly syncing all
engines.
The third is that TPS wasn't correctly handling the cases where logIn would
trigger a sync, which would cause a failure during the first sync of a session.
This would cause failures on other TPS tests as well.
MozReview-Commit-ID: LpqZ7Kt9fyy
--HG--
extra : rebase_source : f1d3c40e2ef4e09cce4d2ce8ae25f2c86ddfee45
This rolls browser.tabs.animate, browser.fullscreen.animate, and
alerts.disableSlidingEffect into a single pref; if any of these are disabled,
we'll disable the new pref too (toolkit.cosmeticAnimations.enabled). Most
future animations will also be subject to this pref.
MozReview-Commit-ID: 77pLMtERDna
--HG--
extra : rebase_source : 8939e453c2277caa4a90123ae09bb542aaa421ed
Because it tests some Gecko-specific things as well, I'm making two
copies, as advised by bz and jgraham. One is to be submitted upstream,
and a second one has local changes. This means most of the test is run
twice.
This overwrites the preexisting Element-classlist.html test upstream. I
think I took the useful bits out of it (particularly replace() testing),
but it had some things that I didn't think were necessary, including:
* things that belong in idlharness
* .className testing
* testing .contains() and stringification and hasAttribute() and such
  after add/remove/etc. (instead of just testing getAttribute())
* CSS class selector matching
MozReview-Commit-ID: JxPK7OyVLXa
--HG--
rename : dom/base/test/test_classList.html => testing/web-platform/mozilla/tests/dom/classList.html
extra : rebase_source : 31c63fd709a7385e0ed7f4f4ea960f5ccff6e187
Everything depending on the widget being gonk can go away, as well as
everything depending on MOZ_AUDIO_CHANNEL_MANAGER, which was only
defined on gonk builds under b2g/ (which goes away in bug 1357326).
--HG--
extra : rebase_source : 9f0aeeb7eea8417fa4e06d662d566d67ecaf2a24
Content listeners that are using the old IPC dispatching technique can
cause Marionette to hang when errors are thrown but not handled. To
ensure errors are returned to the chrome process, all the code has to
be placed in try/catch blocks.
MozReview-Commit-ID: J6fwnFUURl7
--HG--
extra : rebase_source : ade78c8839e58ccb1e603c8e92cba1938519d2f4
When defaultParagraphSeparator is not "br", and we hit Enter on a line
that is not contained in any block element, we first create a new <div>
(or <p>) wrapper to hold the line's contents. If creating this wrapper
fails for some reason, we go ahead and insert a <br> instead.
In some cases, we would get confused and think we didn't create the
block element when really we did. We would insert a <br>, and
afterwards something would get rid of the empty block element. In a
corner case where the line only consisted of a <br> to start with, this
would result in nothing happening, because the original <br> was removed
when creating the block element, and only one <br> was inserted to
replace it.
The correct fix is to just not get confused!
MozReview-Commit-ID: 1U8KHC71bfw
--HG--
extra : rebase_source : 50640615a3a652c3a74c1aef5412eb82daf8c5fb
Injected scripts in the Marionette caret selection API used the "default"
immutable sandbox, but we want to run them in the mutable sandbox.
MozReview-Commit-ID: BpbHdDhDtg4
--HG--
extra : rebase_source : 4193a01370e903874aa5da1634bdfe5c2b9fd577
Testing the return value is misleading in this case. What we want to
test is that it does not throw due to a permissions issue.
MozReview-Commit-ID: 2Wbwou9opyF
--HG--
extra : rebase_source : f2a95ba66999ee430f58b7aa9de70742a209defd
The Components.classes constructor should throw an error in both the
mutable and the "default" sandbox.
MozReview-Commit-ID: C40nZNaVWwz
--HG--
extra : rebase_source : ced5ccba9108f2cd0c37cf799e83913bf19afac6
We accidentally only ran them in "default" and "system" before, and also
one of the arguments in the system globals test was wrong.
MozReview-Commit-ID: DmBYGsZaIVP
--HG--
extra : rebase_source : 75b2f87a6c9f1b425607e0a743669b985b8f3072
We were previously missing a test for the arguments variable that is
implicitly exposed to functions.
MozReview-Commit-ID: IC6aJcUsyhd
--HG--
extra : rebase_source : d94cdf0a0f4c74b0bb3240b32ad53da107931183
The Python standard library uses tuples to define arguments for functions
whenever they are invoked through metaprogramming.
The Marionette client accepts the list type only for backwards
compatibility, so we prefer tuples in this case.
Another good argument for tuples is that they are immutable.
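A simplified stand-in (not the real client method) for the convention
being followed here:

    def execute_script(script, script_args=()):
        """Accept a tuple (preferred) or a list (backwards compatibility)."""
        if not isinstance(script_args, (tuple, list)):
            raise TypeError("script_args must be a tuple or a list")
        return {"script": script, "args": list(script_args)}

    # Preferred: an immutable tuple.
    execute_script("return arguments[0] + arguments[1];", (1, 2))
    # Still accepted, but only for backwards compatibility.
    execute_script("return arguments[0];", [1])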
MozReview-Commit-ID: 72zPzYvBz7Q
--HG--
extra : rebase_source : f9338db8adacbccd82f23a3b7a38194d747e27a1
Marionette does not protect the unloadHandler in
testing/marionette/evaluate.js from content introspection or
modification, which can happen when web frameworks override
window.addEventListener/window.removeEventListener.
The script evaluation module used in Marionette relies on
sandbox.window.addEventListener/removeEventListener to throw an error when
script execution is aborted due to the document unloading itself. If the
window.addEventListener/removeEventListener functions have been overridden
to introspect the objects that are passed, they may inadvertently touch
objects originating from chrome space, such as the unloadHandler.
Because the Gecko sandboxing system has put in place strict security measures
to prevent accidental chrome-space modification from content, inspecting
the unloadHandler will throw a permission denied error once the script
has finished executing.
We have found examples in the wild of this in particular with the Angular
web framework. This patch makes the unloadHandler safe for introspection
from web content.
Fixes: https://github.com/mozilla/geckodriver/issues/515
MozReview-Commit-ID: E2LgPhLLuDT
--HG--
extra : rebase_source : c7431630d24c42ebfd7ded3cf204c1ef245211d0
To ensure that we correctly restart Firefox for update tests, the restart
button in the about window or in the old software update window has to be
clicked.
MozReview-Commit-ID: 7acl1DcA85d
--HG--
extra : rebase_source : 8af6c300ae34befc2c05e801ea4b5901659c1c2a
Puppeteer enforces an in_app restart unless the clean argument is set to
True. In either case, all passed-in arguments have to be forwarded to
Marionette.
MozReview-Commit-ID: ADPRvuXhyXh
--HG--
extra : rebase_source : cfc65ae082664a93abbda35b9d3a09e8af2784f0
To ease the investigation of possible page load issues, debug logging
output is added to the page load listener.
MozReview-Commit-ID: 18itxTHtnBf
--HG--
extra : rebase_source : 7d5f64125453e57113aa565ca09b4eb61a14ec9a
If the click command triggers a page load, it should not return before
the page has been fully loaded. With this patch we allow commands to opt
in to an unload check. It is set to 200ms right now, and will cancel an
ongoing waitForPageLoad() if no page activity is detected.
MozReview-Commit-ID: DWV53sckBS2
--HG--
extra : rebase_source : 2c4d2a19a006645ecd44e08a28309367bf4f8d32
Tests for page timeout durations have to use an HTTPD handler that delays
or slows down the document load. Otherwise there is a risk that the
timeout error is not returned before the document finishes loading.
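For illustration only, a stand-alone handler built on Python's
http.server (not the harness's actual fixture) that trickles a document
out slowly so a page load timeout can fire first:

    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class SlowHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<!doctype html><title>slow</title>")
            for _ in range(10):
                time.sleep(1)  # keep the document load pending
                self.wfile.write(b"<p>still loading</p>")
                self.wfile.flush()

    # HTTPServer(("127.0.0.1", 0), SlowHandler).serve_forever()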
MozReview-Commit-ID: HGGcXfCuaSH
--HG--
extra : rebase_source : 42f60ad9864d87601cd528a0f1accffb768ba438
When the trigger callback inside navigate() uses a generator, the code
has to be synchronized and needs to wait until the contained command has
completed.
MozReview-Commit-ID: 8qKUMvH7HpS
--HG--
extra : rebase_source : 3bc63d130c370354dab27bf40bbf13ec441bd423
This patch does a few things:
a) Adds the resources location from the .app directory to the read whitelist.
b) For non-packaged builds, mach run (and various mach tests) sets an environment variable for the repo location from which we allow reads (see the sketch below).
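A hedged sketch of the non-packaged case; the variable name
MOZ_DEVELOPER_REPO_DIR and the helper are assumptions for illustration,
not necessarily the exact mach code:

    import os

    def export_repo_dir(topsrcdir, env=None):
        """Return an environment that advertises the source checkout
        location so the content sandbox can whitelist reads from it."""
        env = dict(env if env is not None else os.environ)
        env["MOZ_DEVELOPER_REPO_DIR"] = topsrcdir  # assumed variable name
        return env

    # run_env = export_repo_dir("/path/to/mozilla-central")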
r=haik,froydnj
MozReview-Commit-ID: KNvAoUs5Ati
--HG--
extra : rebase_source : 81ba8bfee0ca96979cf8e30d75cdd47f06bc10ea
This commit makes sccache dump JSON stats at the end of the build; the
stats are then read in `BuildScript.generate_build_stats` and added to the
build_metrics we submit to Perfherder. The stats dumping is done in
Makefile.in, where we currently dump verbose sccache stats, because
sccache doesn't persist stats to disk right now and also shuts down its
server process after 5 minutes; when the post-build automation steps take
more than 5 minutes the server shuts down and the stats are lost.
Currently it's collecting:
* Cache hit rate
* Cache write errors
* Non-cacheable requests (compiler invocations that sccache can't cache)
We can always grow this list later.
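A hedged sketch of the reading side; the file name and JSON keys here
are illustrative rather than sccache's actual schema:

    import json

    def sccache_metrics(stats_path="sccache-stats.json"):
        """Turn a dumped sccache stats file into Perfherder-style metrics."""
        with open(stats_path) as f:
            stats = json.load(f)
        requests = stats.get("requests_executed", 0) or 1
        return [
            {"name": "sccache hit rate",
             "value": stats.get("cache_hits", 0) / float(requests)},
            {"name": "sccache cache_write_errors",
             "value": stats.get("cache_write_errors", 0)},
            {"name": "sccache requests_not_cacheable",
             "value": stats.get("non_cacheable_requests", 0)},
        ]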
MozReview-Commit-ID: J9CwU7XB05I
--HG--
extra : rebase_source : 084b09c3b0621330ac331a99b1bca9a15cf833b7
The current setup is confusing and is, I guess, an inheritance from when
the same mozharness configs were used for buildbot and taskcluster jobs.
When mozharness calls tooltool_wrapper.sh, it doesn't set the
TOOLTOOL_CACHE environment variable from its configuration, like it does
for other commands. Instead, it passes the -c flag with the path from
its configuration. Then tooltool_wrapper.sh proceeds to ignore the
-c flag and to use TOOLTOOL_CACHE from the original environment,
inherited from the taskcluster setup script.
The upcoming new wrapper for tooltool in bug 1355731 doesn't keep this
confusing behavior, and respects the cache directory it's given on the
command line.
Now that most jobs run on taskcluster, and few share the same mozharness
config between buildbot and taskcluster, we can go ahead and change
the TOOLTOOL_CACHE path in the mozharness config to match reality.
The list of files modified was generated from looking at
MOZHARNESS_CONFIG values in the full-task-graph.json file coming from
the Gecko decision task.
--HG--
extra : rebase_source : fe2baee48baffae52f738b4168f862c81370fcef
Those jobs, running on taskcluster, don't have a persistent cache
anyway, so we might as well not pretend that setting a tooltool cache does
anything, especially considering the configured directory is not
writeable anyway.
--HG--
extra : rebase_source : a435bce42b4c181a6690a42e068bb2e0e875d5e8
We'd like to ensure that both parallel and serial traversal in Stylo are
tested on automation. Since e10s is the future, we've chosen to force
parallel traversal on during e10s tests, and force serial traversal on
during non-e10s tests.
The 5s timeout was not enough for debug builds. I don't really see a
reason to use something other than the default socket timeout here.
MozReview-Commit-ID: Fm5lgSI3lFb
This eliminates a 2 minute timeout seen at the end of Android mochitests
and reftests. Attempts to shut down the web server were failing because
they were directed at IP 10.0.2.2 -- the Android emulator's alias for the
host's loopback interface.
Screenshots is a system add-on imported from https://github.com/mozilla-services/screenshots/.
This is the initial build system patch for building Screenshots. ESLint is ignored since Screenshots currently uses its own version (this may change in the future).
MozReview-Commit-ID: 4OEcaduaeWE
We don't currently support H.264 on Android in automation, but we can improve
our test coverage by running the VP8 portion of this test in the meantime.
MozReview-Commit-ID: 3SPCTaqlfMk
--HG--
extra : rebase_source : cae3251f489e45f56b04378074083d6b4fd24666
The devicemanager killProcess() is updated to use force-stop first, then
use kill if force-stop does not work.
Browser test harnesses are updated to check if killProcess() worked, and
warn if it failed.
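A simplified sketch of that ordering (not the actual devicemanager code;
the adb invocation details are assumptions):

    import subprocess

    def adb_shell(*args):
        return subprocess.run(["adb", "shell"] + list(args),
                              capture_output=True, text=True)

    def process_exists(app_name):
        return app_name in adb_shell("ps").stdout

    def kill_process(app_name, pid=None):
        """Try `am force-stop` first; fall back to kill if it didn't work."""
        adb_shell("am", "force-stop", app_name)
        if pid is not None and process_exists(app_name):
            adb_shell("kill", str(pid))
        return not process_exists(app_name)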
This is from changeset 249a47720ddcf896a9f07600c429a1b4492b805e from
the version-control-tools repo. It contains a fix to restore
compatibility with Mercurial 3.7, the lack of which caused mozharness
tests to fail because that test pins Mercurial 3.7.3 in a requirements
file. That pin is a bug, but getting a modern robustcheckout deployed
is more important than upgrading that test.
File copied verbatim from changeset e0d30b04dac6bcd36b57c711d9c5b1c280f63390
from the version-control-tools repository.
The updated extension now detects and retries after network failures
where it didn't before. This should cut down on the number of
intermittent failures.
MozReview-Commit-ID: 2bFLcGEARTJ
--HG--
extra : rebase_source : ac43b1925713ce33e1d0d835323efc02c30aed74
Our configs as well as the artifact code are not equipped to do this.
MozReview-Commit-ID: BDkI3Peo8Md
--HG--
extra : rebase_source : 66a68737e080decd0f53c265553eacb1237e3194
Our configs and the artifact machinery aren't equipped to handle this.
MozReview-Commit-ID: 68DYmWEdGnA
--HG--
extra : rebase_source : fa79eeab616412acab773b6d6bd46a58399699dd
The error message returned when unmarshalling the timeout configuration
object with invalid input is misleading, because it checks the typing
of the value before the field name.
This patch changes Marionette to run the type assertion for the value
after each case in the switch statement has been evaluated, ensuring
that the field is valid before asserting its value.
It also adds a few unit tests to verify this behaviour.
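The change itself is in Marionette's JS unmarshalling code; this Python
sketch only mirrors the described ordering, rejecting unknown field names
before asserting the value's type:

    def unmarshal_timeouts(json_data):
        timeouts = {}
        for field, value in json_data.items():
            if field not in ("implicit", "pageLoad", "script"):
                raise ValueError("unknown timeout field: %s" % field)
            if isinstance(value, bool) or not isinstance(value, (int, float)):
                raise ValueError("%s must be a number, got %r" % (field, value))
            timeouts[field] = value
        return timeouts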
Fixes: https://github.com/mozilla/geckodriver/issues/633
MozReview-Commit-ID: LVjTyUacD0s
--HG--
extra : rebase_source : f8a215aedfa5edf8ddbd037cae583ec07626de27
As long as update tests do not support the new simplified update UI,
it has to be kept disabled.
MozReview-Commit-ID: 4fC0CYhp7Pc
--HG--
extra : rebase_source : f3558973b0153fe2104f0e612120298d711fc491
To ensure better failure messages, the checks need to be refactored. This
includes the following changes:
* No further checks for a follow-up (watershed) update. It's not supported
and as such doesn't need assertions (bug 1353717)
* Checks for fallback updates have to be made to ensure that an invalidated
partial/complete update does not cause an upgrade of Firefox during the restart.
MozReview-Commit-ID: CLb0aXoIur2
--HG--
extra : rebase_source : e41f2463cef695f6f9984ea2ee6f7d43196a9138
There was never a need to run a multiple-update step in the past, and as
we agreed a while ago it is not something we want to do in the future.
It means that watershed releases will have to be tested by issuing
multiple update tests.
MozReview-Commit-ID: 7cmK3gEOkv1
--HG--
extra : rebase_source : dac6bcf5d4505febc9b6ffb6333cf77b41d4dc4a
Automation is now largely using Mercurial 4.1 with some lagging
components still on 3.9 and a very small sliver of random parts
still on 3.7.3. Let's update the mozharness tests to match what
automation is using.
FWIW, the Mercurial tests still pass on 3.9.
MozReview-Commit-ID: BgZVDcx29mf
--HG--
extra : rebase_source : edb8516491fe9ef616b1ad797be2fc02a89c2829
Tests based on this code fail in Mercurial 3.8+ due to `hg rebase`
changing its behavior for selecting default revisions. I was going
to update the code to work with 3.8+. However, I could find no actual
consumers of this method! Annotate says this was added back in
mozharness 0.4. However, I can't find a reason for it being added.
Nor can I find any consumers of this method in the mozharness repo
outside the tests. This smells like dead code to me.
MozReview-Commit-ID: 3Q6MTjQJT1p
--HG--
extra : rebase_source : 2c488b70136bc9856f6075297d7c32e3b69079ad
A future patch will move symbolstore.py to be invoked as a py_action, making
the unix path found in the environment cause failures on some Windows builds.
MozReview-Commit-ID: Hp9AMTqWd3E
--HG--
extra : rebase_source : b9797ddb4417cfe95c193a126dd06b633bfa3d1f
There are quite a few changes in here. At a high level, all we're trying to do is to replace the old update popup with a less intrusive and more modern doorhanger (set of doorhangers) for various update success and failure conditions.
The gecko messages are now in the "process_output" action, rather than
in the "log" action (except for a few legacy cases), so examine both
when looking for LSAN messages.
MozReview-Commit-ID: 82r1p8WLwFa
--HG--
extra : rebase_source : 5af1c529e58f5ba90a3fd222e3cbbc67a850a08c
Testing the return value is misleading in this case. What we want to
test is that it does not throw due to a permissions issue.
MozReview-Commit-ID: 2Wbwou9opyF
--HG--
extra : rebase_source : cd056ed38b9cf7b9eb095635209fbe6b090721fd
The Components.classes constructor should throw an error in both the
mutable and the "default" sandbox.
MozReview-Commit-ID: C40nZNaVWwz
--HG--
extra : rebase_source : f02506f16ff409761ba09ae0f32ff2902cdf07a3
We accidentally only ran them in "default" and "system" before, and also
one of the arguments in the system globals test was wrong.
MozReview-Commit-ID: DmBYGsZaIVP
--HG--
extra : rebase_source : c9f1493ce3faed0ec2e5ad6125a4f7811a1fef03
We were previously missing a test for the arguments variable that is
implicitly exposed to functions.
MozReview-Commit-ID: IC6aJcUsyhd
--HG--
extra : rebase_source : 9039dcb7fcea681d8c9cd729cca8c55701631a5c
The Python standard library uses tuples to define arguments for functions
whenever they are invoked through metaprogramming.
The Marionette client accepts the list type only for backwards
compatibility, so we prefer tuples in this case.
Another good argument for tuples is that they are immutable.
MozReview-Commit-ID: 72zPzYvBz7Q
--HG--
extra : rebase_source : 2406e92b8bec8a965df6457bb166fd9761513b1e
Marionette does not protect the unloadHandler in
testing/marionette/evaluate.js from content introspection or
modification, which can happen when web frameworks override
window.addEventListener/window.removeEventListener.
The script evaluation module used in Marionette relies on
sandbox.window.addEventListener/removeEventListener to throw an error when
script execution is aborted due to the document unloading itself. If the
window.addEventListener/removeEventListener functions have been overridden
to introspect the objects that are passed, they may inadvertently touch
objects originating from chrome space, such as the unloadHandler.
Because the Gecko sandboxing system has put in place strict security measures
to prevent accidental chrome-space modification from content, inspecting
the unloadHandler will throw a permission denied error once the script
has finished executing.
We have found examples in the wild of this in particular with the Angular
web framework. This patch makes the unloadHandler safe for introspection
from web content.
Fixes: https://github.com/mozilla/geckodriver/issues/515
MozReview-Commit-ID: E2LgPhLLuDT
--HG--
extra : rebase_source : 6fe4f61fd18f42fb5332a664189f3ea919db28c5
There are quite a few changes in here. At a high level, all we're trying to do
is to replace the old update popup with a less intrusive and more modern
doorhanger (set of doorhangers) for various update failure conditions.
MozReview-Commit-ID: 24sESMTosNX
--HG--
extra : rebase_source : ee0c1e00fe3f99e16388f0de17274ff97a3b9fcf
Without the use of an upstream the first repo cloned on a machine will
be cached. If a subsequent job references a different repo, it may have
to pull thousands of changesets because those changesets aren't part of
the initially-cloned repo. This is why the --upstream feature of
robustcheckout exists and why it uses the mozilla-unified repo.
The mozilla-unified repo is a superset of central, aurora, beta, release,
etc. So by cloning it, you get changesets for all of the repos at the
time of the clone. When a subsequent job comes along and requests
a changeset from a different repo, you likely only need to fetch data
for a handful of changesets, not thousands.
This change adds the upstream url config for all fx_desktop_build jobs,
ensuring it is used. A redundant config entry for the try repo has been
removed as well.
MozReview-Commit-ID: 3EL7aSXS4AG
--HG--
extra : rebase_source : 19fc7373da56ad879b4b813a379dd8d9798909e4
I think I initially defined these and I think I know why I used
mozilla-central (it had to do with try not advertising bundles and
mozilla-unified being generaldelta when other repos were not). Those
reasons are no longer valid and we should be using mozilla-unified
everywhere.
MozReview-Commit-ID: CFaZspU6A5M
--HG--
extra : rebase_source : 79b74038307a3faa9b150a7ea1d449cad472e748
We want to remove Marionette's dependency on
Selenium JS fragment atoms entirely (work is tracked in
https://bugzilla.mozilla.org/show_bug.cgi?id=1354203), but in the interim
we ought to link to where the original, unminified source code of the
fragments can be found.
MozReview-Commit-ID: 4tTbn3RVJ5q
--HG--
extra : rebase_source : 0c869866b3d5a475378fbc9dd9e80e9eb037dd5b
Tests which have to wait for a page to load should always use the timeout
set via self.marionette.timeout.page_load.
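An example of the pattern, written against the Marionette Python client
(the surrounding test scaffolding is assumed):

    from marionette_driver import Wait

    def wait_for_url(marionette, url):
        timeout = marionette.timeout.page_load
        Wait(marionette, timeout=timeout).until(
            lambda m: m.get_url() == url,
            message="page not loaded within %ss" % timeout)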
MozReview-Commit-ID: HFTOYy6WYNk
--HG--
extra : rebase_source : cc2981595e2a62fd761baec8a3c15486832cc0ed
We add a new "on-off" protocol, PURLClassifierLocal, which calls
nsIURIClassifier.asyncClassifyLocalWithTables on construction and
calls back on destruction. It is pretty much the same design as
PURLClassifier.
In order to avoid code duplication, the actor implementation is templatized
and |MaybeInfo| in PURLClassifier.ipdl is moved around.
A test case is included. The custom event target for labelling is not yet
in place; that will be done in bug 1353701.
MozReview-Commit-ID: IdHYgdnBV7S
--HG--
extra : rebase_source : ab1c896305b9f76cab13a92c9bd88c2d356aacb7
Errors thrown are printed to the console, so there is no point in having a
custom catch block to print them.
This also makes it possible to start Marionette programmatically without
worrying about disappearing errors.
MozReview-Commit-ID: GGhyCyYqJg
--HG--
extra : rebase_source : 28937d94c8688c05887dcbf7dcf862e7bdc3a6c7
Port 666 is in the protected port range and cannot normally be bound
to unless the process is running with sudo permissions.
We can instead bind to port 0, which will give us a system-defined port
in the ephemeral range.
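A minimal demonstration of why binding to port 0 works: the OS picks a
free ephemeral port, which can be read back afterwards.

    import socket

    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind(("127.0.0.1", 0))  # 0 = let the kernel choose the port
    s.listen(1)
    host, port = s.getsockname()
    print("listening on %s:%d" % (host, port))
    s.close()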
MozReview-Commit-ID: Ld6BDMhtbck
--HG--
extra : rebase_source : 0c2ae692dd675c664898e004c29a0e342fd9755b
This is a follow-up to address fallout caused by bug 1344748, whereby
deprecated preferences relevant to Marionette are no longer being
picked up. This is preventing trace logs from being emitted in CI.
The old logic for falling back to a deprecated preference is faulty in
that the preferred, new preference always exists through the power of
testing/marionette/prefs.js. This patch introduces a new
helper method getPref that first looks at whether the preferred pref
is set, and only falls back to the deprecated if it isn't set and the
deprecation preference exists.
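The real helper lives in Marionette's JS component; this Python sketch
only mirrors the described lookup order, under the assumption that
prefs maps all defined preferences and user_set names the user-overridden
ones:

    def get_pref(prefs, user_set, preferred, deprecated, default=None):
        """Prefer `preferred` when the user set it, otherwise fall back
        to `deprecated` if that preference exists at all."""
        if preferred in user_set:
            return prefs[preferred]
        if deprecated in prefs:
            return prefs[deprecated]
        return prefs.get(preferred, default)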
MozReview-Commit-ID: 8DeawLAELyK
--HG--
extra : rebase_source : d75ff2eff3941c2cb074d4f3983a70ebd66b8043
The Marionette component ships in Firefox, but is not enabled by default.
We want to facilitate activating Marionette at runtime by flipping
the marionette.enabled preference, and showing the Marionette related
preferences in about:config helps discoverability.
It is also useful to rely on the preferences' default values so that
they do not have to be hardcoded in the component.
When Marionette is enabled by setting marionette.enabled to true, a set of
recommended automation preferences found in testing/marionette/server.js
are set if the user has not overridden/user-defined one of them and
marionette.prefs.recommended is true (default). When Marionette is
stopped, the altered preferences are reset.
MozReview-Commit-ID: 3HLnEI0TEBB
--HG--
extra : rebase_source : 8be91ed46c443dd120cbc4b42c729cf3ae250b5f
Files appended to JS_PREFERENCE_FILES are moved into the
objdir/dist/bin/defaults/pref directory, shared with default global
preferences from other parts of Gecko.
To ensure Marionette's preference file ends up in this directory with
a sensible name, we put it in testing/marionette/prefs/marionette.js so
that it ends up in the objdir as dist/bin/defaults/pref/marionette.js.
MozReview-Commit-ID: 9YJ7vysDjSJ
--HG--
rename : testing/marionette/prefs.js => testing/marionette/prefs/marionette.js
extra : rebase_source : a5f275ed051eac659e89b55e8dfe950b67885618
Verify that there is no hang when we use pointer actions to
perform a click that results in navigation and document unload.
MozReview-Commit-ID: EO5FClnxML5
--HG--
extra : rebase_source : e4ea787bd7ad583d8141aef9ec2bf63df33ef912
This fixes the reported hang that occurs after a pointer click
action resulting in navigation.
MozReview-Commit-ID: A9SBhextVLH
--HG--
extra : rebase_source : 7de7f06a1c05e0e52a03f1850187926aa13a4b08
Errors thrown are printed to the console, so there is no point in having a
custom catch block to print them.
This also makes it possible to start Marionette programmatically without
worrying about disappearing errors.
MozReview-Commit-ID: GGhyCyYqJg
--HG--
extra : rebase_source : 0a55dc87a2e3a3dab5da59800e421562b9385c51
Port 666 is in the protected port range and cannot normally be bound
to unless the process is running with sudo permissions.
We can instead bind to port 0, which will give us a system-defined port
in the ephemeral range.
MozReview-Commit-ID: Ld6BDMhtbck
--HG--
extra : rebase_source : 1ccbefb4829ba7d493a576e8339bae9174441484
This is a follow-up to address fallout caused by bug 1344748, whereby
deprecated preferences relevant to Marionette are no longer being
picked up. This is preventing trace logs from being emitted in CI.
The old logic for falling back to a deprecated preference is faulty in
that the preferred, new preference always exists through the power of
testing/marionette/prefs.js. This patch introduces a new
helper method getPref that first looks at whether the preferred pref
is set, and only falls back to the deprecated if it isn't set and the
deprecation preference exists.
MozReview-Commit-ID: 8DeawLAELyK
--HG--
extra : rebase_source : b26a992ad9bda2423cb9033edbc1cb0ddfe12cfc
The Marionette component ships in Firefox, but is not enabled by default.
We want to facilitate activating Marionette at runtime by flipping
the marionette.enabled preference, and showing the Marionette related
preferences in about:config helps discoverability.
It is also useful to rely on the preferences' default values so that
they do not have to be hardcoded in the component.
When Marionette is enabled by setting marionette.enabled to true, a set of
recommended automation preferences found in testing/marionette/server.js
are set if the user has not overridden/user-defined one of them and
marionette.prefs.recommended is true (default). When Marionette is
stopped, the altered preferences are reset.
MozReview-Commit-ID: 3HLnEI0TEBB
--HG--
extra : rebase_source : 6557962c8dbd91002bbf22690ef03cd4cbcbbe38
Files appended to JS_PREFERENCE_FILES are moved into the
objdir/dist/bin/defaults/pref directory, shared with default global
preferences from other parts of Gecko.
To ensure Marionette's preference file ends up in this directory with
a sensible name, we put it in testing/marionette/prefs/marionette.js so
that it ends up in the objdir as dist/bin/defaults/pref/marionette.js.
MozReview-Commit-ID: 9YJ7vysDjSJ
--HG--
rename : testing/marionette/prefs.js => testing/marionette/prefs/marionette.js
extra : rebase_source : 54e084700d1ae691a0395531156626f56190f0fe
This matches Blink/WebKit, and is more similar to Edge than before, but
may cause compat problems for Gecko-only code or code paths. Sites can
revert to old behavior with:
document.execCommand("defaultparagraphseparator", false, "br").
This regresses test_bug430392.html on one test ("adding returns") and I
don't know why. The test involves embedded non-editable content and we
already have a lot of todos in that file, so I think it's tolerable.
MozReview-Commit-ID: Dml0bXxgu87
*aHandled was previously always set to true in WillMakeBasicBlock, which
was probably a bug but is not addressed in this commit.
MozReview-Commit-ID: 41JSmptVc0l
I don't personally agree with this behavior (although I did spec it some
years ago), but it's the behavior of all other UAs, so we should do it
anyway.
MozReview-Commit-ID: IiIg41kMJIU
These expectations were just wrong. I don't know how they got here. I
think Chrome matches the tests and not the spec, fortunately.
MozReview-Commit-ID: ENiO34GiZ0Y
Update the refresh command to make it synchronous, and let it return
once the target page has been loaded. This can be accomplished by using
the loadListener object in listener.js.
MozReview-Commit-ID: Lc8QoGFeQrY
--HG--
extra : rebase_source : 1fd914aec1c55fe91a0de773cfd7ff22b5d12167
This refactoring allows us to re-use the same load algorithm for
each command which could trigger a page load. It also takes remoteness
changes into account, and waits until the load has been done.
With this change we no longer check for readyState only, but observe
the necessary DOM events as fired for page unloads and loads. This will
help us to implement the page loading strategy later.
By observing the DOM events, I also expect a small performance increase
for any kind of page load, given that we now return immediately and no
longer have a delay of up to 100ms.
MozReview-Commit-ID: IVtO6KgJFES
--HG--
extra : rebase_source : 40f90e3b9d1bf0a2f9123271cd08513769616e41
To delay the page load for our slowly served example page when using the
back or forward commands, the page must not end up in the browser cache.
MozReview-Commit-ID: 4xMjR3SakZn
--HG--
extra : rebase_source : 024b8e702d95689defcee7e12496ce531e72d651
The key dispatch functions now pass the raw key to event.js,
which determines the keyCode for the event.
Note the change in Normalized Key Value for Enter versus Return.
The browser throws an exception when the event key attribute is
set to "Return" and KEY_NON_PRINTABLE_KEY is set, which implies
that the key value isn't valid. Changing it to Enter fixes the
issue.
MozReview-Commit-ID: 831f4EcqI1P
--HG--
extra : rebase_source : 6045b6199c72bcc7a971907d6e1513687d8ed3f9