Bug 1584979 [wpt PR 19401] - Create a decision task for taskcluster, a=testonly

Automatic update from web-platform-tests
Implement a decision task for taskcluster jobs

A decision task is the first task run, and ensures that we only
schedule subsequent tasks which are relevant to the current push or
pull request. This dynamic scheduling helps reduce load since we avoid
spinning up workers for tasks that ultimately don't run, and unlocks
new possibilities since we are able to schedule tasks that are
dependent on other tasks.

The tasks, their scheduling criteria and their dependencies are
specified in a YAML-format configuration file in
tools/ci/tc/tasks/test.yml. This has a bespoke format, adopting some
ideas from Azure and Gecko's taskcluster integration. The format is
documented in `tools/ci/tc/README.md`. The data in that file undergoes
transformations to produce a set of tasks, which are then filtered
according to the event that caused the decision task to run.

To initially prove out the implementation of dependent tasks, we make
the Firefox tasks depend on a download task. This does not yet include
the work to actually make the dependent tasks download Firefox from
the parent task.

--

wpt-commits: f57ead1924c978dc9256d8f572614731a1a51678
wpt-pr: 19401


--HG--
rename : testing/web-platform/tests/tools/taskcluster/testdata/master_push_event.json => testing/web-platform/tests/tools/ci/tc/testdata/master_push_event.json
rename : testing/web-platform/tests/tools/taskcluster/testdata/pr_event.json => testing/web-platform/tests/tools/ci/tc/testdata/pr_event.json
This commit is contained in:
James Graham 2019-11-26 11:31:33 +00:00 committed by moz-wptsync-bot
Parent 710c4f8c8a
Commit e90b4029cd
21 changed files: 1550 additions and 512 deletions


@ -4,359 +4,62 @@ policy:
tasks:
$let:
event_str: {$json: {$eval: event}}
provisionerId:
$if: 'taskcluster_root_url == "https://taskcluster.net"'
then: aws-provisioner-v1
else: proj-wpt
workerType:
$if: 'taskcluster_root_url == "https://taskcluster.net"'
scopes:
$if: 'tasks_for == "github-push"'
then:
$if: event.repository.full_name == 'web-platform-tests/wpt'
$let:
branch:
$if: "event.ref[:11] == 'refs/heads/'"
then: "${event.ref[11:]}"
else: "${event.ref}"
in: "assume:repo:github.com/${event.repository.full_name}:branch:${branch}"
else: "assume:repo:github.com/${event.repository.full_name}:pull-request"
run_task:
$if: 'tasks_for == "github-push"'
then:
$if: 'event.ref in ["refs/heads/master", "refs/heads/epochs/daily", "refs/heads/epochs/weekly", "refs/heads/triggers/chrome_stable", "refs/heads/triggers/chrome_beta", "refs/heads/triggers/chrome_dev", "refs/heads/triggers/firefox_stable", "refs/heads/triggers/firefox_beta", "refs/heads/triggers/firefox_nightly", "refs/heads/triggers/webkitgtk_minibrowser_stable", "refs/heads/triggers/webkitgtk_minibrowser_nightly"]'
then: true
else: false
else:
$if: 'tasks_for == "github-pull-request"'
then:
wpt-docker-worker
else:
github-worker
else: ci
in:
$flattenDeep:
- $if: tasks_for == "github-push"
then:
$map:
$flatten:
$match: {
event.ref == "refs/heads/master": [{name: firefox, channel: nightly}, {name: chrome, channel: dev}],
event.ref == "refs/heads/epochs/daily": [{name: firefox, channel: stable}, {name: chrome, channel: stable}, {name: webkitgtk_minibrowser, channel: nightly}],
event.ref == "refs/heads/epochs/weekly": [{name: firefox, channel: beta}, {name: chrome, channel: beta}, {name: webkitgtk_minibrowser, channel: stable}],
event.ref == "refs/heads/triggers/chrome_stable": [{name: chrome, channel: stable}],
event.ref == "refs/heads/triggers/chrome_beta": [{name: chrome, channel: beta}],
event.ref == "refs/heads/triggers/chrome_dev": [{name: chrome, channel: dev}],
event.ref == "refs/heads/triggers/firefox_stable": [{name: firefox, channel: stable}],
event.ref == "refs/heads/triggers/firefox_beta": [{name: firefox, channel: beta}],
event.ref == "refs/heads/triggers/firefox_nightly": [{name: firefox, channel: nightly}],
event.ref == "refs/heads/triggers/webkitgtk_minibrowser_stable": [{name: webkitgtk_minibrowser, channel: stable}],
event.ref == "refs/heads/triggers/webkitgtk_minibrowser_nightly": [{name: webkitgtk_minibrowser, channel: nightly}]
}
each(browser):
$map:
- [testharness, 1, 16]
- [testharness, 2, 16]
- [testharness, 3, 16]
- [testharness, 4, 16]
- [testharness, 5, 16]
- [testharness, 6, 16]
- [testharness, 7, 16]
- [testharness, 8, 16]
- [testharness, 9, 16]
- [testharness, 10, 16]
- [testharness, 11, 16]
- [testharness, 12, 16]
- [testharness, 13, 16]
- [testharness, 14, 16]
- [testharness, 15, 16]
- [testharness, 16, 16]
- [reftest, 1, 5]
- [reftest, 2, 5]
- [reftest, 3, 5]
- [reftest, 4, 5]
- [reftest, 5, 5]
- [wdspec, 1, 1]
each(chunk):
taskId: {$eval: 'as_slugid(browser.name + browser.channel + chunk[0] + str(chunk[1]))'}
taskGroupId: {$eval: 'as_slugid("task group")'}
created: {$fromNow: ''}
deadline: {$fromNow: '24 hours'}
provisionerId: ${provisionerId}
workerType: ${workerType}
metadata:
name: wpt-${browser.name}-${browser.channel}-${chunk[0]}-${chunk[1]}
description: >-
A subset of WPT's "${chunk[0]}" tests (chunk number ${chunk[1]}
of ${chunk[2]}), run in the ${browser.channel} release of
${browser.name}.
owner:
# event.pusher.email is null when it comes from a GitHub action, so it has to be null-checked,
# and using the "in" operator causes an evaluation error when the right variable is null if it's
# done in the same "if" statement as the null-check (with &&), therefore we use a nested "if" here.
$if: 'event.pusher.email'
then:
$if: '"@" in event.pusher.email'
then: ${event.pusher.email}
else: web-platform-tests@users.noreply.github.com
else: web-platform-tests@users.noreply.github.com
source: ${event.repository.url}
payload:
image:
harjgam/web-platform-tests:0.33
maxRunTime: 7200
artifacts:
public/results:
path: /home/test/artifacts
type: directory
command:
- /bin/bash
- --login
- -c
- set -ex;
echo "wpt-${browser.name}-${browser.channel}-${chunk[0]}-${chunk[1]}";
~/start.sh
${event.repository.url}
${event.ref};
cd ~/web-platform-tests;
sudo cp tools/certs/cacert.pem
/usr/local/share/ca-certificates/cacert.crt;
sudo update-ca-certificates;
./tools/ci/run_tc.py
--checkout=${event.after}
--oom-killer
--hosts
--browser=${browser.name}
--channel=${browser.channel}
--xvfb
run-all
./tools/ci/taskcluster-run.py
${browser.name}
--
--channel=${browser.channel}
--log-wptreport=../artifacts/wpt_report.json
--log-wptscreenshot=../artifacts/wpt_screenshot.txt
--no-fail-on-unexpected
--test-type=${chunk[0]}
--this-chunk=${chunk[1]}
--total-chunks=${chunk[2]};
extra:
github_event: "${event_str}"
- $if: tasks_for == "github-pull-request"
# PR tasks that run the tests in various configurations
then:
# Taskcluster responds to a number of events issued by the GitHub API
# which should not trigger re-validation.
$if: event.action in ['opened', 'reopened', 'synchronize']
then:
$map: [{name: firefox, channel: nightly}, {name: chrome, channel: dev}]
each(browser):
$map:
# This is the main place to define new stability checks
- name: wpt-${browser.name}-${browser.channel}-stability
job_name: stability
checkout: task_head
diff_base: base_head
description: >-
Verify that all tests affected by a pull request are stable
when executed in ${browser.name}.
extra_args: '--verify'
- name: wpt-${browser.name}-${browser.channel}-results
job_name: affected_tests
checkout: task_head
diff_base: base_head
description: >-
Collect results for all tests affected by a pull request in
${browser.name}.
extra_args: >-
--no-fail-on-unexpected
--log-wptreport=../artifacts/wpt_report.json
--log-wptscreenshot=../artifacts/wpt_screenshot.txt
- name: wpt-${browser.name}-${browser.channel}-results-without-changes
job_name: affected_tests
checkout: base_head
diff_base: task_head
description: >-
Collect results for all tests affected by a pull request in
${browser.name} but without the changes in the PR.
extra_args: >-
--no-fail-on-unexpected
--log-wptreport=../artifacts/wpt_report.json
--log-wptscreenshot=../artifacts/wpt_screenshot.txt
each(operation):
taskId: {$eval: 'as_slugid(operation.name)'}
taskGroupId: {$eval: 'as_slugid("task group")'}
created: {$fromNow: ''}
deadline: {$fromNow: '24 hours'}
provisionerId: ${provisionerId}
workerType: ${workerType}
metadata:
name: ${operation.name}
description: ${operation.description}
owner: ${event.pull_request.user.login}@users.noreply.github.com
source: ${event.repository.url}
payload:
image: harjgam/web-platform-tests:0.33
maxRunTime: 7200
artifacts:
public/results:
path: /home/test/artifacts
type: directory
# Fetch the GitHub-provided merge commit (rather than the pull
# request branch) so that the tasks simulate the behavior of the
# submitted patch after it is merged. Using the merge commit also
# simplifies detection of modified files because the first parent
# of the merge commit can consistently be used to summarize the
# changes.
command:
- /bin/bash
- --login
- -c
- set -ex;
echo "${operation.name}";
~/start.sh
${event.repository.clone_url}
refs/pull/${event.number}/merge;
cd web-platform-tests;
./tools/ci/run_tc.py
--checkout=${operation.checkout}
--oom-killer
--browser=${browser.name}
--channel=${browser.channel}
--xvfb
${operation.job_name}
./tools/ci/taskcluster-run.py
--commit-range ${operation.diff_base}
${browser.name}
--
--channel=${browser.channel}
${operation.extra_args};
extra:
github_event: "${event_str}"
then: true
else: false
else: false
in:
- $if: run_task
then:
created: {$fromNow: ''}
deadline: {$fromNow: '24 hours'}
provisionerId: proj-wpt
workerType: ci
metadata:
name: "wpt-decision-task"
description: "The task that creates all of the other tasks in the task graph"
owner: "${event.sender.login}@users.noreply.github.com"
source: ${event.repository.clone_url}
payload:
image: harjgam/web-platform-tests:0.33
maxRunTime: 7200
artifacts:
public/results:
path: /home/test/artifacts
type: directory
command:
- /bin/bash
- --login
- -c
- set -ex;
~/start.sh
${event.repository.clone_url}
${event.after};
cd ~/web-platform-tests;
./wpt tc-decision --tasks-path=/home/test/artifacts/tasks.json
features:
taskclusterProxy: true
scopes:
- ${scopes}
extra:
github_event: "${event_str}"
- $map:
# This is the main point to define new CI checks other than stability checks
- name: lint
description: >-
Lint for wpt-specific requirements
script: >-
./tools/ci/run_tc.py \
--no-hosts \
lint \
./wpt lint --all
conditions:
push
pull-request
- name: update built tests
description: >-
Ensure test suites that require a build step are updated
script: >-
./tools/ci/run_tc.py \
--no-hosts \
update_built \
tools/ci/ci_built_diff.sh
conditions:
pull-request
- name: tools/ unittests (Python 2)
description: >-
Unit tests for tools running under Python 2.7, excluding wptrunner
script: >-
export TOXENV=py27;
export HYPOTHESIS_PROFILE=ci;
export PY_COLORS=0;
./tools/ci/run_tc.py \
tools_unittest \
tools/ci/ci_tools_unittest.sh
conditions:
push
pull-request
- name: tools/ unittests (Python 3)
description: >-
Unit tests for tools running under Python 3, excluding wptrunner
script: >-
export TOXENV=py36;
export HYPOTHESIS_PROFILE=ci;
export PY_COLORS=0;
sudo apt update -qqy;
sudo apt install -qqy python3-pip;
./tools/ci/run_tc.py \
tools_unittest \
tools/ci/ci_tools_unittest.sh
conditions:
push
pull-request
- name: tools/wpt/ tests
description: >-
Integration tests for wpt commands
script: >-
export TOXENV=py27;
sudo apt update -qqy;
sudo apt install -qqy libnss3-tools;
./tools/ci/run_tc.py \
--oom-killer \
--browser=firefox \
--browser=chrome \
--channel=experimental \
--xvfb \
wpt_integration \
tools/ci/ci_wpt.sh
conditions:
pull-request
- name: resources/ tests
description: >-
Tests for testharness.js and other files in resources/
script: >-
export TOXENV=py27;
./tools/ci/run_tc.py \
--browser=firefox \
--xvfb \
resources_unittest \
tools/ci/ci_resources_unittest.sh
conditions:
pull-request
- name: infrastructure/ tests
description: >-
Smoketests for wptrunner
script: >-
sudo apt update -qqy;
sudo apt install -qqy libnss3-tools libappindicator1 fonts-liberation;
./tools/ci/run_tc.py \
--oom-killer \
--browser=firefox \
--browser=chrome \
--channel=experimental \
--no-hosts \
--xvfb \
wptrunner_infrastructure \
tools/ci/ci_wptrunner_infrastructure.sh
conditions:
pull-request
each(operation):
# Note: jsone doesn't short-circuit evaluation so all parts of the conditional are evaluated
# Accessing properties using the [] notation allows them to evaluate as null in case they're undefined
# TODO: Allow running pushes on branches other than master
- $if: ("push" in operation.conditions && tasks_for == "github-push" && event['ref'] == "refs/heads/master") || ("pull-request" in operation.conditions && tasks_for == "github-pull-request" && event['action'] in ['opened', 'reopened', 'synchronize'])
then:
$let:
checkout_ref:
$if: tasks_for == "github-push"
then:
${event.ref}
else:
refs/pull/${event.number}/merge
in:
taskId: {$eval: 'as_slugid(operation.name)'}
taskGroupId: {$eval: 'as_slugid("task group")'}
created: {$fromNow: ''}
deadline: {$fromNow: '24 hours'}
provisionerId: ${provisionerId}
workerType: ${workerType}
metadata:
name: ${operation.name}
description: ${operation.description}
owner: ${event.sender.login}@users.noreply.github.com
source: ${event.repository.url}
payload:
image: harjgam/web-platform-tests:0.33
maxRunTime: 7200
artifacts:
public/results:
path: /home/test/artifacts
type: directory
command:
- /bin/bash
- --login
- -c
- set -ex;
echo "${operation.name}";
~/start.sh
${event.repository.clone_url}
${checkout_ref};
cd ~/web-platform-tests;
${operation.script};
extra:
github_event: "${event_str}"


@ -14,7 +14,7 @@
"virtualenv": false
},
"tc-download": {
"path": "tcdownload.py",
"path": "tc/download.py",
"script": "run",
"parser": "get_parser",
"parse_known": true,
@ -24,5 +24,27 @@
"requests",
"pygithub"
]
},
"tc-taskgraph": {
"path": "tc/taskgraph.py",
"script": "run",
"help": "Build the taskgraph",
"virtualenv": true,
"install": [
"requests",
"pyyaml"
]
},
"tc-decision": {
"path": "tc/decision.py",
"parser": "get_parser",
"script": "run",
"help": "Run the decision task",
"virtualenv": true,
"install": [
"requests",
"pyyaml",
"taskcluster"
]
}
}


@ -38,7 +38,6 @@ the serialization of a GitHub event payload.
import argparse
import json
import os
import re
import subprocess
import sys
import tempfile
@ -99,8 +98,12 @@ def get_parser():
help="Start xvfb")
p.add_argument("--checkout",
help="Revision to checkout before starting job")
p.add_argument("job",
help="Name of the job associated with the current event")
p.add_argument("--install-certificates", action="store_true", default=None,
help="Install web-platform.test certificates to UA store")
p.add_argument("--no-install-certificates", action="store_false", default=None,
help="Don't install web-platform.test certificates to UA store")
p.add_argument("--rev",
help="Revision that the task_head ref is expected to point to")
p.add_argument("script",
help="Script to run for the job")
p.add_argument("script_args",
@ -123,6 +126,12 @@ def checkout_revision(rev):
subprocess.check_call(["git", "checkout", "--quiet", rev])
def install_certificates():
subprocess.check_call(["sudo", "cp", "tools/certs/cacert.pem",
"/usr/local/share/ca-certificates/cacert.crt"])
subprocess.check_call(["sudo", "update-ca-certificates"])
def install_chrome(channel):
if channel in ("experimental", "dev", "nightly"):
deb_archive = "google-chrome-unstable_current_amd64.deb"
@ -213,29 +222,6 @@ def start_xvfb():
start(["sudo", "fluxbox", "-display", os.environ["DISPLAY"]])
def get_extra_jobs(event):
body = None
jobs = set()
if "commits" in event and event["commits"]:
body = event["commits"][0]["message"]
elif "pull_request" in event:
body = event["pull_request"]["body"]
if not body:
return jobs
regexp = re.compile(r"\s*tc-jobs:(.*)$")
for line in body.splitlines():
m = regexp.match(line)
if m:
items = m.group(1)
for item in items.split(","):
jobs.add(item.strip())
break
return jobs
def set_variables(event):
# Set some variables that we use to get the commits on the current branch
ref_prefix = "refs/heads/"
@ -256,23 +242,13 @@ def set_variables(event):
os.environ["GITHUB_BRANCH"] = branch
def include_job(job):
# Special case things that unconditionally run on pushes,
# assuming a higher layer is filtering the required list of branches
if (os.environ["GITHUB_PULL_REQUEST"] == "false" and
job == "run-all"):
return True
jobs_str = run([os.path.join(root, "wpt"),
"test-jobs"], return_stdout=True)
print(jobs_str)
return job in set(jobs_str.splitlines())
def setup_environment(args):
if args.hosts_file:
make_hosts_file()
if args.install_certificates:
install_certificates()
if "chrome" in args.browser:
assert args.channel is not None
install_chrome(args.channel)
@ -340,6 +316,13 @@ def fetch_event_data():
def main():
args = get_parser().parse_args()
if args.rev is not None:
task_head = subprocess.check_output(["git", "rev-parse", "task_head"]).strip()
if task_head != args.rev:
print("CRITICAL: task_head points at %s, expected %s. "
"This may be because the branch was updated" % (task_head, args.rev))
sys.exit(1)
if "TASK_EVENT" in os.environ:
event = json.loads(os.environ["TASK_EVENT"])
else:
@ -350,25 +333,6 @@ def main():
setup_repository()
extra_jobs = get_extra_jobs(event)
job = args.job
print("Job %s" % job)
run_if = [(lambda: job == "all", "job set to 'all'"),
(lambda:"all" in extra_jobs, "Manually specified jobs includes 'all'"),
(lambda:job in extra_jobs, "Manually specified jobs includes '%s'" % job),
(lambda:include_job(job), "CI required jobs includes '%s'" % job)]
for fn, msg in run_if:
if fn():
print(msg)
break
else:
print("Job not scheduled for this push")
return
# Run the job
setup_environment(args)
os.chdir(root)


@ -0,0 +1,235 @@
# Taskgraph Setup
The taskgraph is built from a YAML file. This file has two top-level
properties: `components` and `tasks`. The full list of tasks is
defined by the `tasks` list; each task is an object with a single
property whose name identifies the task and whose value is an object
holding the task properties. Each task requires the following
top-level properties:
* `provisionerId`: String. Name of Taskcluster provisioner
* `schedulerId`: String. Name of Taskcluster scheduler
* `deadline`: String. Time until the task expires
* `image`: String. Name of docker image to use for task
* `maxRunTime`: Number. Maximum time in seconds for which the task can
run.
* `artifacts`: Object. List of artifacts and directories to upload; see
Taskcluster documentation.
* `command`: String. Command to run. This is automatically wrapped in a
run_tc command.
* `options`: Optional Object. Options to pass into run_tc:
- xvfb: Boolean. Enable Xvfb for the run
- oom-killer: Boolean. Enable the OOM killer for the run
- hosts: Boolean. Update the hosts file with wpt hosts before the run
- install-certificates: Boolean. Install wpt certs into the OS
certificate store for the run
- browser: List. List of browser names for the run
- channel: String. Browser channel for the run
* `trigger`: Object. Conditions under which to consider the task. One or
more of the following properties:
- branch: List. List of branch names on which to trigger.
- pull-request: No value. Trigger for pull request actions.
* `schedule-if`: Optional Object. Conditions under which the task should
be scheduled, given that it meets the trigger conditions.
- `run-job`: List. Job names for which this task should be considered,
matching the output from `./wpt test-jobs`
* `env`: Optional Object. Environment variables to set when running task.
* `depends-on`: Optional list. List of task names that must be complete
before the current task is scheduled.
* `description`: String. Task description.
* `name`: Optional String. Name to use for the task overriding the
property name. This is useful in combination with substitutions
described below.
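For orientation, a minimal task definition using the properties above
might look like the following. This is an illustrative sketch, not a
definition from this commit; the real definitions live in
`tools/ci/tc/tasks/test.yml`, which also sets `workerType` (consumed by
the decision task when creating tasks) via a shared `wpt-base` component:
```
tasks:
  - example-lint:
      provisionerId: proj-wpt
      workerType: ci
      schedulerId: taskcluster-github
      deadline: "24 hours"
      image: harjgam/web-platform-tests:0.33
      maxRunTime: 7200
      artifacts:
        public/results:
          path: /home/test/artifacts
          type: directory
      command: "./wpt lint --all"
      trigger:
        pull-request:
      description: >-
        Example lint task
```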
## Task Expansions
Using the above syntax it's possible to describe each task
directly. But typically in a taskgraph there are many common
properties between tasks, so it's tedious and error-prone to repeat
information that's common to multiple tasks. Therefore the taskgraph
format provides several mechanisms to reuse partial task definitions
across multiple tasks.
### Components
The other top-level property in the taskgraph format is
`components`. The value of this property is an object containing named
partial task definitions. Each task definition may contain a property called
`use` which is a list of components to use as the basis for the task
definition. The components list is evaluated in order. If a property
is not previously defined in the output it is added to the output. If
it was previously defined, the value is updated according to the type:
* Strings and numbers are replaced with a new value
* Lists are extended with the additional values
* Objects are updated recursively following the above rules
This means that types must always match between components and the
final value.
For example
```
components:
  example-1:
    list_prop:
      - first
      - second
    object_prop:
      key1: value1
      key2: base_value
  example-2:
    list_prop:
      - third
      - fourth
    object_prop:
      key3:
        - value3-1

tasks:
  - example-task:
      use:
        - example-1
        - example-2
      object_prop:
        key2: value2
        key3:
          - value3-2
```
will evaluate to the following task:
```
example-task:
  list_prop:
    - first
    - second
    - third
    - fourth
  object_prop:
    key1: value1
    key2: value2
    key3:
      - value3-1
      - value3-2
```
Note that components cannot currently define `use` properties of their own.
## Substitutions
Components and tasks can define a property `vars` that holds variables
which are later substituted into the task definition using the syntax
`${vars.property-name}`. For example:
```
components:
  generic-component:
    prop: ${vars.value}

tasks:
  - first:
      use:
        - generic-component
      vars:
        value: value1
  - second:
      use:
        - generic-component
      vars:
        value: value2
```
Results in the following tasks:
```
first:
  prop: value1
second:
  prop: value2
```
## Maps
Instead of defining a task directly, an item in the tasks property may
be an object with a single property `$map`. This object itself has two
child properties: `for` and `do`. The value of `for` is a list of
objects, and the value of `do` is either an object or a list of
objects. For each object in the `for` property, a set of tasks is
created by taking a copy of that object for each task in the `do`
property, updating the object with the properties from the
corresponding `do` object, using the same rules as for components
above, and then processing as for a normal task. `$map` rules can also
be nested.
Note: Although `$map` shares a name with the `$map` used in json-e
(used in `.taskcluster.yml`), the semantics are different.
For example
```
components: {}
tasks:
  - $map:
      for:
        - vars:
            example: value1
        - vars:
            example: value2
      do:
        example-${vars.example}:
          prop: ${vars.example}
```
Results in the tasks
```
example-value1:
  prop: value1
example-value2:
  prop: value2
```
Note that in combination with `$map`, variable substitutions are
applied *twice*; once after the `$map` is evaluated and once after the
`use` statements are evaluated.
## Chunks
A common requirement for tasks is that they are "chunked" into N
partial tasks. This is handled specially in the syntax. A top-level
property `chunks` can be used to define the number of individual
chunks to create for a specific task. Each chunked task is created
with a `chunks` property set to an object containing an `id` property
containing the one-based index of the chunk and a `total` property
containing the total number of chunks. These can be substituted into
the task definition using the same syntax as for `vars` above,
e.g. `${chunks.id}`. Note that because task names must be unique, it's
common to specify a `name` property on the task that will override the
property name e.g.
```
components: {}
tasks:
  - chunked-task:
      chunks: 2
      command: "task-run --chunk=${chunks.id} --totalChunks=${chunks.total}"
      name: task-chunk-${chunks.id}
```
creates tasks:
```
task-chunk-1:
  command: "task-run --chunk=1 --totalChunks=2"
task-chunk-2:
  command: "task-run --chunk=2 --totalChunks=2"
```
# Overall processing model
The overall processing model for tasks is as follows:
* Evaluate maps
* Perform substitutions
* Evaluate use statements
* Expand chunks
* Perform substitutions
At each point after maps are evaluated, tasks must have a unique name.
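This ordering mirrors `load_tasks` in `tools/ci/tc/taskgraph.py`
(included later in this commit). A condensed sketch of the pipeline
that reuses that module's helpers, omitting the duplicate-name checks
and the substitution applied to task names themselves:
```
from tools.ci.tc.taskgraph import (expand_maps, first, resolve_chunks,
                                   resolve_name, resolve_use,
                                   substitute_variables)

def process(tasks_data):
    resolved = {}
    for task in tasks_data["tasks"]:
        # 1. Evaluate maps, turning each $map into a list of plain tasks
        for expanded in expand_maps(task):
            name = first(expanded.keys())
            # 2. First substitution pass, using each task's own vars
            resolved[name] = substitute_variables(expanded[name])
    tasks = []
    for default_name, data in resolved.items():
        # 3. Evaluate use statements against the shared components
        task = resolve_use(data, tasks_data["components"])
        task = resolve_name(task, default_name)
        # 4. Expand chunked tasks into one task per chunk
        tasks.extend(resolve_chunks(task))
    # 5. Second substitution pass, now that chunk ids are known
    tasks = [substitute_variables(t) for t in tasks]
    return {t["name"]: t for t in tasks}
```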


@ -0,0 +1,314 @@
import argparse
import json
import logging
import os
import re
import subprocess
from collections import OrderedDict
import taskcluster
from six import iteritems, itervalues
from . import taskgraph
here = os.path.abspath(os.path.dirname(__file__))
logging.basicConfig()
logger = logging.getLogger()
def get_triggers(event):
# Set some variables that we use to get the commits on the current branch
ref_prefix = "refs/heads/"
is_pr = "pull_request" in event
branch = None
if not is_pr and "ref" in event:
branch = event["ref"]
if branch.startswith(ref_prefix):
branch = branch[len(ref_prefix):]
return is_pr, branch
def fetch_event_data(queue):
try:
task_id = os.environ["TASK_ID"]
except KeyError:
logger.warning("Missing TASK_ID environment variable")
# For example under local testing
return None
task_data = queue.task(task_id)
return task_data.get("extra", {}).get("github_event")
def filter_triggers(event, all_tasks):
is_pr, branch = get_triggers(event)
triggered = {}
for name, task in iteritems(all_tasks):
if "trigger" in task:
if is_pr and "pull-request" in task["trigger"]:
triggered[name] = task
elif branch is not None and "branch" in task["trigger"]:
for trigger_branch in task["trigger"]["branch"]:
if (trigger_branch == branch or
trigger_branch.endswith("*") and branch.startswith(trigger_branch[:-1])):
triggered[name] = task
logger.info("Triggers match tasks:\n * %s" % "\n * ".join(triggered.keys()))
return triggered
def get_run_jobs(event):
from tools.ci import jobs
revish = "%s..%s" % (event["pull_request"]["base"]["sha"]
if "pull_request" in event
else event["before"],
event["after"])
logger.info("Looking for changes in range %s" % revish)
paths = jobs.get_paths(revish=revish)
logger.info("Found changes in paths:%s" % "\n".join(paths))
path_jobs = jobs.get_jobs(paths)
all_jobs = path_jobs | get_extra_jobs(event)
logger.info("Including jobs:\n * %s" % "\n * ".join(all_jobs))
return all_jobs
def get_extra_jobs(event):
body = None
jobs = set()
if "commits" in event and event["commits"]:
body = event["commits"][0]["message"]
elif "pull_request" in event:
body = event["pull_request"]["body"]
if not body:
return jobs
regexp = re.compile(r"\s*tc-jobs:(.*)$")
for line in body.splitlines():
m = regexp.match(line)
if m:
items = m.group(1)
for item in items.split(","):
jobs.add(item.strip())
break
return jobs
def filter_schedule_if(event, tasks):
scheduled = {}
run_jobs = None
for name, task in iteritems(tasks):
if "schedule-if" in task:
if "run-job" in task["schedule-if"]:
if run_jobs is None:
run_jobs = get_run_jobs(event)
if "all" in run_jobs or any(item in run_jobs for item in task["schedule-if"]["run-job"]):
scheduled[name] = task
else:
scheduled[name] = task
logger.info("Scheduling rules match tasks:\n * %s" % "\n * ".join(scheduled.keys()))
return scheduled
def get_fetch_rev(event):
is_pr, _ = get_triggers(event)
if is_pr:
# Try to get the actual rev so that all non-decision tasks are pinned to that
ref = "refs/pull/%s/merge" % event["pull_request"]["number"]
try:
output = subprocess.check_output(["git", "ls-remote", "origin", ref])
except subprocess.CalledProcessError:
import traceback
logger.error(traceback.format_exc())
logger.error("Failed to get merge commit sha1")
return ref, None
if not output:
logger.error("Failed to get merge commit")
return ref, None
return ref, output.split()[0]
else:
return event["ref"], event["after"]
def build_full_command(event, task):
fetch_ref, fetch_sha = get_fetch_rev(event)
cmd_args = {
"task_name": task["name"],
"repo_url": event["repository"]["clone_url"],
"fetch_ref": fetch_ref,
"task_cmd": task["command"],
"install_str": "",
}
options = task.get("options", {})
options_args = []
if fetch_sha is not None:
options_args.append("--rev=%s" % fetch_sha)
if options.get("oom-killer"):
options_args.append("--oom-killer")
if options.get("xvfb"):
options_args.append("--xvfb")
if not options.get("hosts"):
options_args.append("--no-hosts")
else:
options_args.append("--hosts")
if options.get("checkout"):
options_args.append("--checkout=%s" % options["checkout"])
for browser in options.get("browser", []):
options_args.append("--browser=%s" % browser)
if options.get("channel"):
options_args.append("--channel=%s" % options["channel"])
if options.get("install-certificates"):
options_args.append("--install-certificates")
cmd_args["options_str"] = " ".join(str(item) for item in options_args)
install_packages = task.get("install")
if install_packages:
install_items = ["apt update -qqy"]
install_items.extend("apt install -qqy %s" % item
for item in install_packages)
cmd_args["install_str"] = "\n".join("sudo %s;" % item for item in install_items)
return ["/bin/bash",
"--login",
"-c",
"""
~/start.sh \
%(repo_url)s \
%(fetch_ref)s;
%(install_str)s
cd web-platform-tests;
./tools/ci/run_tc.py %(options_str)s -- %(task_cmd)s;
""" % cmd_args]
def get_owner(event):
pusher = event.get("pusher", {}).get("email", "")
if "@" in pusher:
return pusher
return "web-platform-tests@users.noreply.github.com"
def create_tc_task(event, task, taskgroup_id, depends_on_ids):
command = build_full_command(event, task)
task_id = taskcluster.slugId()
task_data = {
"taskGroupId": taskgroup_id,
"created": taskcluster.fromNowJSON(""),
"deadline": taskcluster.fromNowJSON(task["deadline"]),
"provisionerId": task["provisionerId"],
"schedulerId": task["schedulerId"],
"workerType": task["workerType"],
"metadata": {
"name": task["name"],
"description": task.get("description", ""),
"owner": get_owner(event),
"source": event["repository"]["clone_url"]
},
"payload": {
"artifacts": task.get("artifacts"),
"command": command,
"image": task.get("image"),
"maxRunTime": task.get("maxRunTime"),
"env": task.get("env", {}),
},
"extra": {
"github_event": json.dumps(event)
}
}
if depends_on_ids:
task_data["dependencies"] = depends_on_ids
task_data["requires"] = "all-completed"
return task_id, task_data
def build_task_graph(event, all_tasks, tasks):
task_id_map = OrderedDict()
taskgroup_id = os.environ.get("TASK_ID", taskcluster.slugId())
def add_task(task_name, task):
depends_on_ids = []
if "depends-on" in task:
for depends_name in task["depends-on"]:
if depends_name not in task_id_map:
add_task(depends_name,
all_tasks[depends_name])
depends_on_ids.append(task_id_map[depends_name][0])
task_id, task_data = create_tc_task(event, task, taskgroup_id, depends_on_ids)
task_id_map[task_name] = (task_id, task_data)
for task_name, task in iteritems(tasks):
add_task(task_name, task)
return task_id_map
def create_tasks(queue, task_id_map):
for (task_id, task_data) in itervalues(task_id_map):
queue.createTask(task_id, task_data)
def get_event(queue, event_path):
if event_path is not None:
try:
with open(event_path) as f:
event_str = f.read()
except IOError:
logger.error("Missing event file at path %s" % event_path)
raise
elif "TASK_EVENT" in os.environ:
event_str = os.environ["TASK_EVENT"]
else:
event_str = fetch_event_data(queue)
if not event_str:
raise ValueError("Can't find GitHub event definition; for local testing pass --event-path")
try:
return json.loads(event_str)
except ValueError:
logger.error("Event was not valid JSON")
raise
def decide(event):
all_tasks = taskgraph.load_tasks_from_path(os.path.join(here, "tasks", "test.yml"))
triggered_tasks = filter_triggers(event, all_tasks)
scheduled_tasks = filter_schedule_if(event, triggered_tasks)
task_id_map = build_task_graph(event, all_tasks, scheduled_tasks)
return task_id_map
def get_parser():
parser = argparse.ArgumentParser()
parser.add_argument("--event-path",
help="Path to file containing serialized GitHub event")
parser.add_argument("--dry-run", action="store_true",
help="Don't actually create the tasks, just output the tasks that "
"would be created")
parser.add_argument("--tasks-path",
help="Path to file in which to write payload for all scheduled tasks")
return parser
def run(venv, **kwargs):
queue = taskcluster.Queue({'rootUrl': os.environ['TASKCLUSTER_PROXY_URL']})
event = get_event(queue, event_path=kwargs["event_path"])
task_id_map = decide(event)
try:
if not kwargs["dry_run"]:
create_tasks(queue, task_id_map)
else:
print(json.dumps(task_id_map, indent=2))
finally:
if kwargs["tasks_path"]:
with open(kwargs["tasks_path"], "w") as f:
json.dump(task_id_map, f, indent=2)
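The decision logic can be exercised locally without contacting the
Taskcluster queue, in the spirit of `./wpt tc-decision --dry-run
--event-path=...`. A sketch, assuming it runs from the root of a wpt
checkout (since `decide` shells out to git to compute the affected jobs):
```
import json
from tools.ci.tc.decision import decide

# Load one of the recorded GitHub events shipped with the tests
with open("tools/ci/tc/testdata/pr_event.json") as f:
    event = json.load(f)

# Filter by trigger and schedule-if rules, then build the dependency graph
task_id_map = decide(event)
for name, (task_id, task_data) in task_id_map.items():
    print(name, task_id)
```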


@ -15,6 +15,7 @@ logger = logging.getLogger("tc-download")
# be https://community-tc.services.mozilla.com)
TASKCLUSTER_ROOT_URL = 'https://taskcluster.net'
def get_parser():
parser = argparse.ArgumentParser()
parser.add_argument("--ref", action="store", default="master",


@ -0,0 +1,170 @@
import json
import os
import re
from copy import deepcopy
import six
import yaml
from six import iteritems
here = os.path.dirname(__file__)
def first(iterable):
# First item from a list or iterator
if not hasattr(iterable, "next"):
if hasattr(iterable, "__iter__"):
iterable = iter(iterable)
else:
raise ValueError("Object isn't iterable")
return next(iterable)
def load_task_file(path):
with open(path) as f:
return yaml.safe_load(f)
def update_recursive(data, update_data):
for key, value in iteritems(update_data):
if key not in data:
data[key] = value
else:
initial_value = data[key]
if isinstance(value, dict):
if not isinstance(initial_value, dict):
raise ValueError("Variable %s has inconsistent types "
"(expected object)" % key)
update_recursive(initial_value, value)
elif isinstance(value, list):
if not isinstance(initial_value, list):
raise ValueError("Variable %s has inconsistent types "
"(expected list)" % key)
initial_value.extend(value)
else:
data[key] = value
def resolve_use(task_data, templates):
rv = {}
if "use" in task_data:
for template_name in task_data["use"]:
update_recursive(rv, deepcopy(templates[template_name]))
update_recursive(rv, task_data)
rv.pop("use", None)
return rv
def resolve_name(task_data, default_name):
if "name" not in task_data:
task_data["name"] = default_name
return task_data
def resolve_chunks(task_data):
if "chunks" not in task_data:
return [task_data]
rv = []
total_chunks = task_data["chunks"]
for i in range(1, total_chunks + 1):
chunk_data = deepcopy(task_data)
chunk_data["chunks"] = {"id": i,
"total": total_chunks}
rv.append(chunk_data)
return rv
def replace_vars(input_string, variables):
# TODO: support replacing as a non-string type?
variable_re = re.compile(r"(?<!\\)\${([^}]+)}")
def replacer(m):
var = m.group(1).split(".")
repl = variables
for part in var:
try:
repl = repl[part]
except Exception:
# Don't substitute
return m.group(0)
return str(repl)
return variable_re.sub(replacer, input_string)
def sub_variables(data, variables):
if isinstance(data, six.string_types):
return replace_vars(data, variables)
if isinstance(data, list):
return [sub_variables(item, variables) for item in data]
if isinstance(data, dict):
return {key: sub_variables(value, variables)
for key, value in iteritems(data)}
return data
def substitute_variables(task):
variables = {"vars": task.get("vars", {}),
"chunks": task.get("chunks", {})}
return sub_variables(task, variables)
def expand_maps(task):
name = first(task.keys())
if name != "$map":
return [task]
map_data = task["$map"]
if set(map_data.keys()) != set(["for", "do"]):
raise ValueError("$map objects must have exactly two properties named 'for' "
"and 'do' (got %s)" % ("no properties" if not map_data.keys()
else ", ". join(map_data.keys())))
rv = []
for for_data in map_data["for"]:
do_items = map_data["do"]
if not isinstance(do_items, list):
do_items = expand_maps(do_items)
for do_data in do_items:
task_data = deepcopy(for_data)
if len(do_data.keys()) != 1:
raise ValueError("Each item in the 'do' list must be an object "
"with a single property")
name = first(do_data.keys())
update_recursive(task_data, deepcopy(do_data[name]))
rv.append({name: task_data})
return rv
def load_tasks(tasks_data):
map_resolved_tasks = {}
tasks = []
for task in tasks_data["tasks"]:
if len(task.keys()) != 1:
raise ValueError("Each task must be an object with a single property")
for task in expand_maps(task):
if len(task.keys()) != 1:
raise ValueError("Each task must be an object with a single property")
name = first(task.keys())
data = task[name]
new_name = sub_variables(name, {"vars": data.get("vars", {})})
if new_name in map_resolved_tasks:
raise ValueError("Got duplicate task name %s" % new_name)
map_resolved_tasks[new_name] = substitute_variables(data)
for task_default_name, data in iteritems(map_resolved_tasks):
task = resolve_use(data, tasks_data["components"])
task = resolve_name(task, task_default_name)
tasks.extend(resolve_chunks(task))
tasks = [substitute_variables(task_data) for task_data in tasks]
return {task["name"]: task for task in tasks}
def load_tasks_from_path(path):
return load_tasks(load_task_file(path))
def run(venv, **kwargs):
print(json.dumps(load_tasks_from_path(os.path.join(here, "tasks", "test.yml")), indent=2))
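The fully expanded task set can be inspected with `./wpt tc-taskgraph`
(registered in tools.json above); a minimal sketch of the same thing
from Python, assuming it runs from the repository root:
```
import json
from tools.ci.tc import taskgraph

# One entry per fully expanded task, keyed by its final name
all_tasks = taskgraph.load_tasks_from_path("tools/ci/tc/tasks/test.yml")
print(json.dumps(sorted(all_tasks.keys()), indent=2))
```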


@ -0,0 +1,354 @@
components:
wpt-base:
provisionerId: proj-wpt
workerType: ci
schedulerId: taskcluster-github
deadline: "24 hours"
image: harjgam/web-platform-tests:0.33
maxRunTime: 7200
artifacts:
public/results:
path: /home/test/artifacts
type: directory
wpt-testharness:
chunks: 16
vars:
test-type: testharness
wpt-reftest:
chunks: 5
vars:
test-type: reftest
wpt-wdspec:
chunks: 1
vars:
test-type: wdspec
run-options:
options:
xvfb: true
oom-killer: true
hosts: true
install-certificates: true
wpt-run:
name: wpt-${vars.browser}-${vars.channel}-${vars.suite}-chunk-${chunks.id}
options:
browser:
- ${vars.browser}
channel: ${vars.channel}
command: >-
./tools/ci/taskcluster-run.py
${vars.browser}
--
--channel=${vars.channel}
--log-wptreport=../artifacts/wpt_report.json
--log-wptscreenshot=../artifacts/wpt_screenshot.txt
--no-fail-on-unexpected
--this-chunk=${chunks.id}
--total-chunks=${chunks.total}
trigger-master:
trigger:
branch:
- master
trigger-push:
trigger:
branch:
- triggers/${vars.browser}_${vars.channel}
trigger-daily:
trigger:
branch:
- epochs/daily
trigger-weekly:
trigger:
branch:
- epochs/weekly
trigger-pr:
trigger:
pull-request:
browser-firefox:
depends-on:
- download-firefox-${vars.channel}
browser-webkitgtk_minibrowser: {}
browser-chrome: {}
tox-python2:
env:
TOXENV: py27
PY_COLORS: 0
tox-python3:
env:
TOXENV: py36
PY_COLORS: 0
install:
- python3-pip
tasks:
# Run full suites on push
- $map:
for:
- vars:
suite: testharness
- vars:
suite: reftest
- vars:
suite: wdspec
do:
$map:
for:
- vars:
browser: firefox
channel: nightly
use:
- trigger-master
- trigger-push
- vars:
browser: firefox
channel: beta
use:
- trigger-weekly
- trigger-push
- vars:
browser: firefox
channel: stable
use:
- trigger-daily
- trigger-push
- vars:
browser: chrome
channel: dev
use:
- trigger-master
- trigger-push
- vars:
browser: chrome
channel: beta
use:
- trigger-weekly
- trigger-push
- vars:
browser: chrome
channel: stable
use:
- trigger-daily
- trigger-push
- vars:
browser: webkitgtk_minibrowser
channel: nightly
use:
- trigger-daily
- trigger-push
- vars:
browser: webkitgtk_minibrowser
channel: stable
use:
- trigger-weekly
- trigger-push
do:
- ${vars.browser}-${vars.channel}-${vars.suite}:
use:
- wpt-base
- run-options
- wpt-run
- browser-${vars.browser}
- wpt-${vars.suite}
description: >-
A subset of WPT's "${vars.suite}" tests (chunk number ${chunks.id}
of ${chunks.total}), run in the ${vars.channel} release of
${vars.browser}.
- $map:
for:
- vars:
browser: firefox
channel: nightly
- vars:
browser: chrome
channel: dev
do:
- wpt-${vars.browser}-${vars.channel}-stability:
use:
- wpt-base
- browser-${vars.browser}
description: >-
Verify that all tests affected by a pull request are stable
when executed in ${vars.browser}.
command: >-
./tools/ci/taskcluster-run.py
--commit-range base_head
${vars.browser}
--
--channel=${vars.channel}
--verify
- wpt-${vars.browser}-${vars.channel}-results:
use:
- wpt-base
- run-options
- browser-${vars.browser}
description: >-
Collect results for all tests affected by a pull request in
${vars.browser}.
command: >-
./tools/ci/taskcluster-run.py
--commit-range base_head
${vars.browser}
--
--channel=${vars.channel}
--no-fail-on-unexpected
--log-wptreport=../artifacts/wpt_report.json
--log-wptscreenshot=../artifacts/wpt_screenshot.txt
- wpt-${vars.browser}-${vars.channel}-results-without-changes:
use:
- wpt-base
- run-options
- browser-${vars.browser}
options:
checkout: base_head
description: >-
Collect results for all tests affected by a pull request in
${vars.browser} but without the changes in the PR.
command: >-
./tools/ci/taskcluster-run.py
--commit-range task_head
${vars.browser}
--
--channel=${vars.channel}
--no-fail-on-unexpected
--log-wptreport=../artifacts/wpt_report.json
--log-wptscreenshot=../artifacts/wpt_screenshot.txt
- $map:
for:
- vars:
channel: nightly
- vars:
channel: beta
- vars:
channel: stable
do:
download-firefox-${vars.channel}:
use:
- wpt-base
command: "./wpt install --download-only --destination /home/test/artifacts/ --channel=${vars.channel} firefox browser"
- lint:
use:
- wpt-base
- trigger-master
- trigger-pr
description: >-
Lint for wpt-specific requirements
command: "./wpt lint --all"
- update-built:
use:
- wpt-base
- trigger-pr
schedule-if:
run-job:
- update_built
command: "./tools/ci/ci_built_diff.sh"
- tools/ unittests (Python 2):
use:
- wpt-base
- trigger-pr
- tox-python2
description: >-
Unit tests for tools running under Python 2.7, excluding wptrunner
command: ./tools/ci/ci_tools_unittest.sh
env:
HYPOTHESIS_PROFILE: ci
schedule-if:
run-job:
- tools_unittest
- tools/ unittests (Python 3):
description: >-
Unit tests for tools running under Python 3, excluding wptrunner
use:
- wpt-base
- trigger-pr
- tox-python3
command: ./tools/ci/ci_tools_unittest.sh
env:
HYPOTHESIS_PROFILE: ci
schedule-if:
run-job:
- tools_unittest
- tools/wpt/ tests:
description: >-
Integration tests for wpt commands
use:
- wpt-base
- trigger-pr
- tox-python2
command: ./tools/ci/ci_wpt.sh
install:
- libnss3-tools
options:
oom-killer: true
browser:
- firefox
- chrome
channel: experimental
xvfb: true
hosts: true
schedule-if:
run-job:
- wpt_integration
- resources/ tests:
description: >-
Tests for testharness.js and other files in resources/
use:
- wpt-base
- trigger-pr
- tox-python2
command: ./tools/ci/ci_resources_unittest.sh
options:
browser:
- firefox
xvfb: true
hosts: true
schedule-if:
run-job:
- resources_unittest
- infrastructure/ tests:
description: >-
Smoketests for wptrunner
use:
- wpt-base
- trigger-pr
- tox-python2
command: ./tools/ci/ci_wptrunner_infrastructure.sh
install:
- libnss3-tools
- libappindicator1
- fonts-liberation
options:
oom-killer: true
browser:
- firefox
- chrome
channel: experimental
xvfb: true
hosts: false
schedule-if:
run-job:
- wptrunner_infrastructure
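For illustration, applying the expansion rules from
`tools/ci/tc/README.md` to the firefox/nightly/testharness combination
above yields 16 chunked tasks. A hand-expanded sketch of the first one,
with the `artifacts` and `description` inherited from `wpt-base` omitted
for brevity (hand-derived, not generated output):
```
wpt-firefox-nightly-testharness-chunk-1:
  name: wpt-firefox-nightly-testharness-chunk-1
  provisionerId: proj-wpt
  workerType: ci
  schedulerId: taskcluster-github
  deadline: "24 hours"
  image: harjgam/web-platform-tests:0.33
  maxRunTime: 7200
  chunks:
    id: 1
    total: 16
  trigger:
    branch:
      - master
      - triggers/firefox_nightly
  depends-on:
    - download-firefox-nightly
  options:
    xvfb: true
    oom-killer: true
    hosts: true
    install-certificates: true
    browser:
      - firefox
    channel: nightly
  command: >-
    ./tools/ci/taskcluster-run.py
    firefox
    --
    --channel=nightly
    --log-wptreport=../artifacts/wpt_report.json
    --log-wptscreenshot=../artifacts/wpt_screenshot.txt
    --no-fail-on-unexpected
    --this-chunk=1
    --total-chunks=16
```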


@ -0,0 +1,54 @@
import mock
import pytest
from tools.ci.tc import decision
from six import iteritems
@pytest.mark.parametrize("run_jobs,tasks,expected", [
([], {"task-no-schedule-if": {}}, ["task-no-schedule-if"]),
([], {"task-schedule-if-no-run-job": {"schedule-if": {}}}, []),
(["job"],
{"job-present": {"schedule-if": {"run-job": ["other-job", "job"]}}},
["job-present"]),
(["job"], {"job-missing": {"schedule-if": {"run-job": ["other-job"]}}}, []),
(["all"], {"job-all": {"schedule-if": {"run-job": ["other-job"]}}}, ["job-all"]),
(["job"],
{"job-1": {"schedule-if": {"run-job": ["job"]}},
"job-2": {"schedule-if": {"run-job": ["other-job"]}}},
["job-1"]),
])
def test_filter_schedule_if(run_jobs, tasks, expected):
with mock.patch("tools.ci.tc.decision.get_run_jobs",
return_value=run_jobs) as get_run_jobs:
assert (decision.filter_schedule_if({}, tasks) ==
{name: tasks[name] for name in expected})
assert get_run_jobs.call_count in (0, 1)
@pytest.mark.parametrize("msg,expected", [
("Some initial line\n\ntc-jobs:foo,bar", {"foo", "bar"}),
("Some initial line\n\ntc-jobs:foo, bar", {"foo", "bar"}),
("tc-jobs:foo, bar \nbaz", {"foo", "bar"}),
("tc-jobs:all", {"all"}),
("", set()),
("tc-jobs:foo\ntc-jobs:bar", {"foo"})])
@pytest.mark.parametrize("event", [
{"commits": [{"message": "<message>"}]},
{"pull_request": {"body": "<message>"}}
])
def test_extra_jobs_pr(msg, expected, event):
def sub(obj):
"""Copy obj, except if it's a string with the value <message>
replace it with the value of the msg argument"""
if isinstance(obj, dict):
return {key: sub(value) for (key, value) in iteritems(obj)}
elif isinstance(obj, list):
return [sub(value) for value in obj]
elif obj == "<message>":
return msg
return obj
event = sub(event)
assert decision.get_extra_jobs(event) == expected


@ -0,0 +1,146 @@
import pytest
import yaml
from tools.ci.tc import taskgraph
@pytest.mark.parametrize("data, update_data, expected", [
({"a": 1}, {"b": 2}, {"a": 1, "b": 2}),
({"a": 1}, {"a": 2}, {"a": 2}),
({"a": [1]}, {"a": [2]}, {"a": [1, 2]}),
({"a": {"b": 1, "c": 2}}, {"a": {"b": 2, "d": 3}}, {"a": {"b": 2, "c": 2, "d": 3}}),
({"a": {"b": [1]}}, {"a": {"b": [2]}}, {"a": {"b": [1, 2]}}),
]
)
def test_update_recursive(data, update_data, expected):
taskgraph.update_recursive(data, update_data)
assert data == expected
def test_use():
data = """
components:
component1:
a: 1
b: [1]
c: "c"
component2:
a: 2
b: [2]
d: "d"
tasks:
- task1:
use:
- component1
- component2
b: [3]
c: "e"
"""
tasks_data = yaml.safe_load(data)
assert taskgraph.load_tasks(tasks_data) == {
"task1": {
"a": 2,
"b": [1,2,3],
"c": "e",
"d": "d",
"name": "task1"
}
}
def test_var():
data = """
components:
component1:
a: ${vars.value}
tasks:
- task1:
use:
- component1
vars:
value: 1
"""
tasks_data = yaml.safe_load(data)
assert taskgraph.load_tasks(tasks_data) == {
"task1": {
"a": "1",
"vars": {"value": 1},
"name": "task1"
}
}
def test_map():
data = """
components: {}
tasks:
- $map:
for:
- vars:
a: 1
b: [1]
- vars:
a: 2
b: [2]
do:
- task1-${vars.a}:
a: ${vars.a}
b: [3]
- task2-${vars.a}:
a: ${vars.a}
b: [4]
"""
tasks_data = yaml.safe_load(data)
assert taskgraph.load_tasks(tasks_data) == {
"task1-1": {
"a": "1",
"b": [1, 3],
"vars": {"a": 1},
"name": "task1-1"
},
"task1-2": {
"a": "2",
"b": [2, 3],
"vars": {"a": 2},
"name": "task1-2"
},
"task2-1": {
"a": "1",
"b": [1, 4],
"vars": {"a": 1},
"name": "task2-1"
},
"task2-2": {
"a": "2",
"b": [2, 4],
"vars": {"a": 2},
"name": "task2-2"
},
}
def test_chunks():
data = """
components: {}
tasks:
- task1:
name: task1-${chunks.id}
chunks: 2
"""
tasks_data = yaml.safe_load(data)
assert taskgraph.load_tasks(tasks_data) == {
"task1-1": {
"name": "task1-1",
"chunks": {
"id": 1,
"total": 2
}
},
"task1-2": {
"name": "task1-2",
"chunks": {
"id": 2,
"total": 2
}
}
}


@ -0,0 +1,73 @@
import json
import os
import jsone
import mock
import pytest
import requests
import sys
import yaml
from jsonschema import validate
here = os.path.dirname(__file__)
root = os.path.abspath(os.path.join(here, "..", "..", "..", ".."))
def data_path(filename):
return os.path.join(here, "..", "testdata", filename)
@pytest.mark.xfail(sys.version_info.major == 2,
reason="taskcluster library has an encoding bug "
"https://github.com/taskcluster/json-e/issues/338")
def test_verify_taskcluster_yml():
"""Verify that the json-e in the .taskcluster.yml is valid"""
with open(os.path.join(root, ".taskcluster.yml")) as f:
template = yaml.safe_load(f)
events = [("pr_event.json", "github-pull-request", "Pull Request"),
("master_push_event.json", "github-push", "Push to master")]
for filename, tasks_for, title in events:
with open(data_path(filename)) as f:
event = json.load(f)
context = {"tasks_for": tasks_for,
"event": event,
"as_slugid": lambda x: x}
jsone.render(template, context)
def test_verify_payload():
"""Verify that the decision task produces tasks with a valid payload"""
from tools.ci.tc.decision import decide
create_task_schema = requests.get(
"https://raw.githubusercontent.com/taskcluster/taskcluster/blob/master/services/queue/schemas/v1/create-task-request.yml")
create_task_schema = yaml.safe_load(create_task_schema.content)
payload_schema = requests.get("https://raw.githubusercontent.com/taskcluster/docker-worker/master/schemas/v1/payload.json").json()
jobs = ["lint",
"manifest_upload",
"resources_unittest",
"tools_unittest",
"wpt_integration",
"wptrunner_infrastructure",
"wptrunner_unittest"]
for filename in ["pr_event.json", "master_push_event.json"]:
with open(data_path(filename)) as f:
event = json.load(f)
with mock.patch("tools.ci.tc.decision.get_fetch_rev", return_value=(event["after"], None)):
with mock.patch("tools.ci.tc.decision.get_run_jobs", return_value=set(jobs)):
task_id_map = decide(event)
for name, (task_id, task_data) in task_id_map.items():
try:
validate(instance=task_data, schema=create_task_schema)
validate(instance=task_data["payload"], schema=payload_schema)
except Exception as e:
print("Validation failed for task '%s':\n%s" % (name, json.dumps(task_data, indent=2)))
raise e


@ -1,33 +0,0 @@
import pytest
from six import iteritems
from tools.ci import run_tc
@pytest.mark.parametrize("msg,expected", [
("Some initial line\n\ntc-jobs:foo,bar", {"foo", "bar"}),
("Some initial line\n\ntc-jobs:foo, bar", {"foo", "bar"}),
("tc-jobs:foo, bar \nbaz", {"foo", "bar"}),
("tc-jobs:all", {"all"}),
("", set()),
("tc-jobs:foo\ntc-jobs:bar", {"foo"})])
@pytest.mark.parametrize("event", [
{"commits": [{"message": "<message>"}]},
{"pull_request": {"body": "<message>"}}
])
def test_extra_jobs_pr(msg, expected, event):
def sub(obj):
"""Copy obj, except if it's a string with the value <message>
replace it with the value of the msg argument"""
if isinstance(obj, dict):
return {key: sub(value) for (key, value) in iteritems(obj)}
elif isinstance(obj, list):
return [sub(value) for value in obj]
elif obj == "<message>":
return msg
return obj
event = sub(event)
assert run_tc.get_extra_jobs(event) == expected


@ -1,4 +0,0 @@
{
"tc-verify": {"path": "verify.py", "script": "run", "parser": "create_parser", "help": "Verify .taskcluster.yml file is parsable",
"virtualenv": true, "install": ["json-e", "pyyaml"]}
}


@ -1,37 +0,0 @@
import argparse
import json
import os
import jsone
import yaml
here = os.path.dirname(__file__)
root = os.path.abspath(os.path.join(here, "..", ".."))
def create_parser():
return argparse.ArgumentParser()
def run(venv, **kwargs):
with open(os.path.join(root, ".taskcluster.yml")) as f:
template = yaml.safe_load(f)
events = [("pr_event.json", "github-pull-request", "Pull Request"),
("master_push_event.json", "github-push", "Push to master")]
for filename, tasks_for, title in events:
with open(os.path.join(here, "testdata", filename)) as f:
event = json.load(f)
context = {"tasks_for": tasks_for,
"event": event,
"as_slugid": lambda x: x}
data = jsone.render(template, context)
heading = "Got %s tasks for %s" % (len(data["tasks"]), title)
print(heading)
print("=" * len(heading))
for item in data["tasks"]:
print(json.dumps(item, indent=2))
print("")


@ -8,8 +8,11 @@ deps =
pytest-cov
mock
hypothesis
# `requests` is required by `pr_preview.py`
requests
taskcluster
pyyaml
json-e
jsonschema
commands = pytest {posargs}


@ -47,6 +47,11 @@ class Browser(object):
def __init__(self, logger):
self.logger = logger
@abstractmethod
def download(self, dest=None, channel=None):
"""Download a package or installer for the browser"""
return NotImplemented
@abstractmethod
def install(self, dest=None):
"""Install the browser."""
@ -116,11 +121,19 @@ class Firefox(Browser):
return "%s%s" % (self.platform, bits)
def install(self, dest=None, channel="nightly"):
"""Install Firefox."""
def _get_dest(self, dest, channel):
if dest is None:
# os.getcwd() doesn't include the venv path
dest = os.path.join(os.getcwd(), "_venv")
import mozinstall
dest = os.path.join(dest, "browsers", channel)
if not os.path.exists(dest):
os.makedirs(dest)
return dest
def download(self, dest=None, channel="nightly"):
product = {
"nightly": "firefox-nightly-latest-ssl",
"beta": "firefox-beta-latest-ssl",
@ -136,21 +149,15 @@ class Firefox(Browser):
}
os_key = (self.platform, uname[4])
if dest is None:
dest = self._get_dest(None, channel)
if channel not in product:
raise ValueError("Unrecognised release channel: %s" % channel)
if os_key not in os_builds:
raise ValueError("Unsupported platform: %s %s" % os_key)
if dest is None:
# os.getcwd() doesn't include the venv path
dest = os.path.join(os.getcwd(), "_venv")
dest = os.path.join(dest, "browsers", channel)
if not os.path.exists(dest):
os.makedirs(dest)
url = "https://download.mozilla.org/?product=%s&os=%s&lang=en-US" % (product[channel],
os_builds[os_key])
self.logger.info("Downloading Firefox from %s" % url)
@ -175,6 +182,18 @@ class Firefox(Browser):
with open(installer_path, "wb") as f:
f.write(resp.content)
return installer_path
def install(self, dest=None, channel="nightly"):
"""Install Firefox."""
import mozinstall
dest = self._get_dest(dest, channel)
filename = os.path.basename(dest)
installer_path = self.download(dest, channel)
try:
mozinstall.install(installer_path, dest)
except mozinstall.mozinstall.InstallError:
@ -422,7 +441,7 @@ class FirefoxAndroid(Browser):
product = "firefox_android"
requirements = "requirements_firefox.txt"
def install(self, dest=None, channel=None):
def download(self, dest=None, channel=None):
if dest is None:
dest = os.getcwd()
@ -452,6 +471,9 @@ class FirefoxAndroid(Browser):
return apk_path
def install(self, dest=None, channel=None):
return self.download(dest, channel)
def install_prefs(self, binary, dest=None, channel=None):
fx_browser = Firefox(self.logger)
return fx_browser.install_prefs(binary, dest, channel)
@ -478,6 +500,9 @@ class Chrome(Browser):
product = "chrome"
requirements = "requirements_chrome.txt"
def download(self, dest=None, channel=None):
raise NotImplementedError
def install(self, dest=None, channel=None):
raise NotImplementedError
@ -633,6 +658,9 @@ class ChromeAndroidBase(Browser):
super(ChromeAndroidBase, self).__init__(logger)
self.device_serial = None
def download(self, dest=None, channel=None):
raise NotImplementedError
def install(self, dest=None, channel=None):
raise NotImplementedError
@ -724,6 +752,9 @@ class ChromeiOS(Browser):
product = "chrome_ios"
requirements = "requirements_chrome_ios.txt"
def download(self, dest=None, channel=None):
raise NotImplementedError
def install(self, dest=None, channel=None):
raise NotImplementedError
@ -757,6 +788,9 @@ class Opera(Browser):
self.logger.warning("Unable to find the browser binary.")
return None
def download(self, dest=None, channel=None):
raise NotImplementedError
def install(self, dest=None, channel=None):
raise NotImplementedError
@ -826,6 +860,9 @@ class EdgeChromium(Browser):
edgedriver_name = "msedgedriver"
requirements = "requirements_edge_chromium.txt"
def download(self, dest=None, channel=None):
raise NotImplementedError
def install(self, dest=None, channel=None):
raise NotImplementedError
@ -922,6 +959,9 @@ class Edge(Browser):
product = "edge"
requirements = "requirements_edge.txt"
def download(self, dest=None, channel=None):
raise NotImplementedError
def install(self, dest=None, channel=None):
raise NotImplementedError
@ -953,6 +993,9 @@ class InternetExplorer(Browser):
product = "ie"
requirements = "requirements_ie.txt"
def download(self, dest=None, channel=None):
raise NotImplementedError
def install(self, dest=None, channel=None):
raise NotImplementedError
@ -978,6 +1021,9 @@ class Safari(Browser):
product = "safari"
requirements = "requirements_safari.txt"
def download(self, dest=None, channel=None):
raise NotImplementedError
def install(self, dest=None, channel=None):
raise NotImplementedError
@ -1037,17 +1083,33 @@ class Servo(Browser):
return (platform, extension, decompress)
def install(self, dest=None, channel="nightly"):
"""Install latest Browser Engine."""
def _get(self, channel="nightly"):
if channel != "nightly":
raise ValueError("Only nightly versions of Servo are available")
platform, extension, _ = self.platform_components()
url = "https://download.servo.org/nightly/%s/servo-latest%s" % (platform, extension)
return get(url)
def download(self, dest=None, channel="nightly"):
if dest is None:
dest = os.getcwd()
platform, extension, decompress = self.platform_components()
url = "https://download.servo.org/nightly/%s/servo-latest%s" % (platform, extension)
resp = self._get(dest, channel)
_, extension, _ = self.platform_components()
decompress(get(url).raw, dest=dest)
with open(os.path.join(dest, "servo-latest%s" % (extension,)), "w") as f:
f.write(resp.content)
def install(self, dest=None, channel="nightly"):
"""Install latest Browser Engine."""
if dest is None:
dest = os.getcwd()
_, _, decompress = self.platform_components()
resp = self._get(dest, channel)
decompress(resp.raw, dest=dest)
path = find_executable("servo", os.path.join(dest, "servo"))
st = os.stat(path)
os.chmod(path, st.st_mode | stat.S_IEXEC)
@ -1083,6 +1145,9 @@ class Sauce(Browser):
product = "sauce"
requirements = "requirements_sauce.txt"
def download(self, dest=None, channel=None):
raise NotImplementedError
def install(self, dest=None, channel=None):
raise NotImplementedError
@ -1105,6 +1170,9 @@ class WebKit(Browser):
product = "webkit"
requirements = "requirements_webkit.txt"
def download(self, dest=None, channel=None):
raise NotImplementedError
def install(self, dest=None, channel=None):
raise NotImplementedError
@ -1168,6 +1236,9 @@ class Epiphany(Browser):
product = "epiphany"
requirements = "requirements_epiphany.txt"
def download(self, dest=None, channel=None):
raise NotImplementedError
def install(self, dest=None, channel=None):
raise NotImplementedError


@ -42,6 +42,8 @@ def get_parser():
'the latest available development release. For WebDriver installs, '
'we attempt to select an appropriate, compatible, version for the '
'latest browser release on the selected channel.')
parser.add_argument('--download-only', action="store_true",
help="Download the selected component but don't install it")
parser.add_argument('-d', '--destination',
help='filesystem directory to place the component')
return parser
@ -73,21 +75,22 @@ def run(venv, **kwargs):
raise argparse.ArgumentError(None,
"No --destination argument, and no default for the environment")
install(browser, kwargs["component"], destination, channel)
install(browser, kwargs["component"], destination, channel,
download_only=kwargs["download_only"])
def install(name, component, destination, channel="nightly", logger=None):
def install(name, component, destination, channel="nightly", logger=None, download_only=False):
if logger is None:
import logging
logger = logging.getLogger("install")
if component == 'webdriver':
method = 'install_webdriver'
else:
method = 'install'
prefix = "download" if download_only else "install"
suffix = "_webdriver" if component == 'webdriver' else ""
method = prefix + suffix
subclass = getattr(browser, name.title())
sys.stdout.write('Now installing %s %s...\n' % (name, component))
path = getattr(subclass(logger), method)(dest=destination, channel=channel)
if path:
sys.stdout.write('Binary installed as %s\n' % (path,))
sys.stdout.write('Binary %s as %s\n' % ("downloaded" if download_only else "installed", path,))


@ -3,5 +3,4 @@ tools/docker/
tools/lint/
tools/manifest/
tools/serve/
tools/taskcluster/
tools/wpt/