Merge pull request #37735 from github/copy-to-commonmark-compliant

Update copy syntax to CommonMark compliant
This commit is contained in:
Kevin Heis 2023-06-13 10:48:18 -07:00 committed by GitHub
Parent 92a43a6a37 05affb1f96
Commit 2bdba80bd7
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
239 changed files with 735 additions and 739 deletions
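For context, every change in this diff follows the same pattern: the Kramdown-style `{:copy}` attribute on a code fence's opening line is replaced by a plain `copy` keyword in the CommonMark info string, so `yaml{:copy}` becomes `yaml copy`, `shell{:copy}` becomes `shell copy`, and so on. The sketch below shows one way such a bulk rewrite could be applied and verified; the directory names and the use of GNU `sed` are assumptions for illustration, not the actual method used in this pull request.

````shell
# Hypothetical bulk rewrite of fence info strings from "lang{:copy}" to "lang copy".
# The "content" and "data" directories are assumptions about the repository layout;
# in-place editing with -i as written requires GNU sed.
find content data -name '*.md' -print0 \
  | xargs -0 sed -i -E 's/```([A-Za-z]+)[{]:copy[}]/```\1 copy/g'

# Confirm that no Kramdown-style markers remain.
grep -rnF '{:copy}' content data || echo "no {:copy} markers left"
````

Because the pattern is not anchored to the start of a line, it also covers fences preceded by Liquid tags such as `{% raw %}` and four-backtick fences, both of which appear in this diff.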

View File

@ -48,7 +48,7 @@ Alternatively, if you want to use the HTTPS protocol for both accounts, you can
{% data reusables.git.no-credential-manager %}
- If the output is `osxkeychain`, you're using the macOS keychain. To clear the credentials, enter the following command.
```shell{:copy}
```shell copy
git credential-osxkeychain erase https://github.com
```
{% data reusables.git.clear-stored-gcm-credentials %}
@ -67,7 +67,7 @@ Alternatively, if you want to use the HTTPS protocol for both accounts, you can
{% data reusables.git.clear-stored-gcm-credentials %}
- If the output is `wincred`, you're using the Windows Credential Manager. To clear the credentials, enter the following command.
```shell{:copy}
```shell copy
cmdkey /delete:LegacyGeneric:target=git:https://github.com
```
{% data reusables.git.cache-on-repository-path %}
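As an aside on the hunks above: the `{% data reusables.git.no-credential-manager %}` reusable presumably has the reader print the credential helper Git is configured to use, which is what "the output" in the steps above refers to. A minimal sketch of that check (an assumption about the reusable's content, not a quote from it):

```shell
# Print the configured Git credential helper.
# If this prints "osxkeychain" or "wincred", follow the matching step above.
git config --get credential.helper
```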

View File

@ -35,7 +35,7 @@ We recommend that you have a basic understanding of the Go language. For more in
To get started quickly, add the starter workflow to the `.github/workflows` directory of your repository.
```yaml{:copy}
```yaml copy
name: Go package
on: [push]
@ -69,7 +69,7 @@ The `setup-go` action is the recommended way of using Go with {% data variables.
### Using multiple versions of Go
```yaml{:copy}
```yaml copy
name: Go
on: [push]
@ -97,7 +97,7 @@ jobs:
You can configure your job to use a specific version of Go, such as `1.16.2`. Alternatively, you can use semantic version syntax to get the latest minor release. This example uses the latest patch release of Go 1.16:
```yaml{:copy}
```yaml copy
- name: Setup Go 1.16.x
uses: {% data reusables.actions.action-setup-go %}
with:
@ -109,7 +109,7 @@ You can configure your job to use a specific version of Go, such as `1.16.2`. Al
You can use `go get` to install dependencies:
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- name: Setup Go
@ -134,7 +134,7 @@ The `setup-go` action searches for the dependency file, `go.sum`, in the reposit
You can use the `cache-dependency-path` parameter for cases when multiple dependency files are used, or when they are located in different subdirectories.
```yaml{:copy}
```yaml copy
- name: Setup Go
- uses: {% data reusables.actions.action-setup-go %}
with:
@ -145,7 +145,7 @@ You can use the `cache-dependency-path` parameter for cases when multiple depend
When caching is enabled, the `setup-go` action searches for the dependency file, `go.sum`, in the repository root and uses the hash of the dependency file as a part of the cache key.
```yaml{:copy}
```yaml copy
- name: Setup Go
uses: {% data reusables.actions.action-setup-go %}
with:
@ -155,7 +155,7 @@ When caching is enabled, the `setup-go` action searches for the dependency file,
Alternatively, you can use the `cache-dependency-path` parameter for cases when multiple dependency files are used, or when they are located in different subdirectories.
```yaml{:copy}
```yaml copy
- uses: {% data reusables.actions.action-setup-go %}
with:
go-version: '1.17'
@ -172,7 +172,7 @@ If you have a custom requirement or need finer controls for caching, you can use
You can use the same commands that you use locally to build and test your code. This example workflow demonstrates how to use `go build` and `go test` in a job:
```yaml{:copy}
```yaml copy
name: Go
on: [push]
@ -200,7 +200,7 @@ After a workflow completes, you can upload the resulting artifacts for analysis.
For more information, see "[AUTOTITLE](/actions/using-workflows/storing-workflow-data-as-artifacts)."
```yaml{:copy}
```yaml copy
name: Upload Go test results
on: [push]

View File

@ -47,7 +47,7 @@ To get started quickly, you can choose the preconfigured Ant starter workflow wh
You can also add this workflow manually by creating a new file in the `.github/workflows` directory of your repository.
```yaml{:copy}
```yaml copy
name: Java CI
on: [push]
@ -87,7 +87,7 @@ The starter workflow will run the default target specified in your _build.xml_ f
If you use different commands to build your project, or you want to run a different target, you can specify those. For example, you may want to run the `jar` target that's configured in your _build-ci.xml_ file.
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- uses: {% data reusables.actions.action-setup-java %}
@ -104,7 +104,7 @@ After your build has succeeded and your tests have passed, you may want to uploa
Ant will usually create output files like JARs, EARs, or WARs in the `build/jar` directory. You can upload the contents of that directory using the `upload-artifact` action.
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- uses: {% data reusables.actions.action-setup-java %}

View File

@ -47,7 +47,7 @@ To get started quickly, you can choose the preconfigured Gradle starter workflow
You can also add this workflow manually by creating a new file in the `.github/workflows` directory of your repository.
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}
@ -96,7 +96,7 @@ The starter workflow will run the `build` task by default. In the default Gradle
If you use different commands to build your project, or you want to use a different task, you can specify those. For example, you may want to run the `package` task that's configured in your _ci.gradle_ file.
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- uses: {% data reusables.actions.action-setup-java %}
@ -127,7 +127,7 @@ After your build has succeeded and your tests have passed, you may want to uploa
Gradle will usually create output files like JARs, EARs, or WARs in the `build/libs` directory. You can upload the contents of that directory using the `upload-artifact` action.
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- uses: {% data reusables.actions.action-setup-java %}

View File

@ -47,7 +47,7 @@ To get started quickly, you can choose the preconfigured Maven starter workflow
You can also add this workflow manually by creating a new file in the `.github/workflows` directory of your repository.
```yaml{:copy}
```yaml copy
name: Java CI
on: [push]
@ -87,7 +87,7 @@ The starter workflow will run the `package` target by default. In the default Ma
If you use different commands to build your project, or you want to use a different target, you can specify those. For example, you may want to run the `verify` target that's configured in a _pom-ci.xml_ file.
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- uses: {% data reusables.actions.action-setup-java %}
@ -104,7 +104,7 @@ steps:
You can cache your dependencies to speed up your workflow runs. After a successful run, your local Maven repository will be stored in a cache. In future workflow runs, the cache will be restored so that dependencies don't need to be downloaded from remote Maven repositories. You can cache dependencies simply by using the [`setup-java` action](https://github.com/marketplace/actions/setup-java-jdk), or you can use the [`cache` action](https://github.com/actions/cache) for custom and more advanced configuration.
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- name: Set up JDK 11
@ -127,7 +127,7 @@ After your build has succeeded and your tests have passed, you may want to uploa
Maven will usually create output files like JARs, EARs, or WARs in the `target` directory. To upload those as artifacts, you can copy them into a new directory that contains artifacts to upload. For example, you can create a directory called `staging`. Then you can upload the contents of that directory using the `upload-artifact` action.
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- uses: {% data reusables.actions.action-setup-java %}

View File

@ -39,7 +39,7 @@ We recommend that you have a basic understanding of Node.js, YAML, workflow conf
To get started quickly, add the starter workflow to the `.github/workflows` directory of your repository. The workflow shown below assumes that the default branch for your repository is `main`.
```yaml{:copy}
```yaml copy
name: Node.js CI
on:
@ -80,7 +80,7 @@ The starter workflow includes a matrix strategy that builds and tests your code
Each job can access the value defined in the matrix `node-version` array using the `matrix` context. The `setup-node` action uses the context as the `node-version` input. The `setup-node` action configures each job with a different Node.js version before building and testing code. For more information about matrix strategies and contexts, see "[AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstrategymatrix)" and "[AUTOTITLE](/actions/learn-github-actions/contexts)."
```yaml{:copy}
```yaml copy
strategy:
matrix:
node-version: [14.x, 16.x, 18.x, 20.x]
@ -95,7 +95,7 @@ steps:
Alternatively, you can build and test with exact Node.js versions.
```yaml{:copy}
```yaml copy
strategy:
matrix:
node-version: [10.17.0, 17.9.0]
@ -103,7 +103,7 @@ strategy:
Or, you can build and test using a single version of Node.js too.
```yaml{:copy}
```yaml copy
name: Node.js CI
on: [push]
@ -139,7 +139,7 @@ If you don't specify a Node.js version, {% data variables.product.prodname_dotco
This example installs the dependencies defined in the *package.json* file. For more information, see [`npm install`](https://docs.npmjs.com/cli/install).
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- name: Use Node.js
@ -152,7 +152,7 @@ steps:
Using `npm ci` installs the versions in the *package-lock.json* or *npm-shrinkwrap.json* file and prevents updates to the lock file. Using `npm ci` is generally faster than running `npm install`. For more information, see [`npm ci`](https://docs.npmjs.com/cli/ci.html) and "[Introducing `npm ci` for faster, more reliable builds](https://blog.npmjs.org/post/171556855892/introducing-npm-ci-for-faster-more-reliable)."
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- name: Use Node.js
@ -167,7 +167,7 @@ steps:
This example installs the dependencies defined in the *package.json* file. For more information, see [`yarn install`](https://yarnpkg.com/en/docs/cli/install).
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- name: Use Node.js
@ -180,7 +180,7 @@ steps:
Alternatively, you can pass `--frozen-lockfile` to install the versions in the `yarn.lock` file and prevent updates to the `yarn.lock` file.
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- name: Use Node.js
@ -201,7 +201,7 @@ In the example below, the secret `NPM_TOKEN` stores the npm authentication token
Before installing dependencies, use the `setup-node` action to create the *.npmrc* file. The action has two input parameters. The `node-version` parameter sets the Node.js version, and the `registry-url` parameter sets the default registry. If your package registry uses scopes, you must use the `scope` parameter. For more information, see [`npm-scope`](https://docs.npmjs.com/misc/scope).
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- name: Use Node.js
@ -233,7 +233,7 @@ You can cache and restore the dependencies using the [`setup-node` action](https
The following example caches dependencies for npm.
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- uses: {% data reusables.actions.action-setup-node %}
@ -246,7 +246,7 @@ steps:
The following example caches dependencies for Yarn.
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- uses: {% data reusables.actions.action-setup-node %}
@ -259,7 +259,7 @@ steps:
The following example caches dependencies for pnpm (v6.10+).
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
# NOTE: pnpm caching support requires pnpm version >= 6.10.0
@ -285,7 +285,7 @@ If you have a custom requirement or need finer controls for caching, you can use
You can use the same commands that you use locally to build and test your code. For example, if you run `npm run build` to run build steps defined in your *package.json* file and `npm test` to run your test suite, you would add those commands in your workflow file.
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- name: Use Node.js

View File

@ -45,7 +45,7 @@ We recommend that you have a basic understanding of Python, PyPy, and pip. For m
To get started quickly, add the starter workflow to the `.github/workflows` directory of your repository.
```yaml{:copy}
```yaml copy
name: Python package
on: [push]
@ -104,7 +104,7 @@ If you are using a self-hosted runner, you can configure the runner to use the `
### Using multiple Python versions
```yaml{:copy}
```yaml copy
name: Python package
on: [push]
@ -134,7 +134,7 @@ jobs:
You can configure a specific version of Python. For example, 3.10. Alternatively, you can use semantic version syntax to get the latest minor release. This example uses the latest minor release of Python 3.
```yaml{:copy}
```yaml copy
name: Python package
on: [push]
@ -164,7 +164,7 @@ If you specify a version of Python that is not available, `setup-python` fails w
You can also use the `exclude` keyword in your workflow if there is a configuration of Python that you do not wish to run. For more information, see "[AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstrategy)."
```yaml{:copy}
```yaml copy
name: Python package
on: [push]
@ -200,7 +200,7 @@ We recommend using `setup-python` to configure the version of Python used in you
{% ifversion actions-caching %}You can also cache dependencies to speed up your workflow. For more information, see "[AUTOTITLE](/actions/using-workflows/caching-dependencies-to-speed-up-workflows)."{% endif %}
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- name: Set up Python
@ -215,7 +215,7 @@ steps:
After you update `pip`, a typical next step is to install dependencies from *requirements.txt*. For more information, see [pip](https://pip.pypa.io/en/stable/cli/pip_install/#example-requirements-file).
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- name: Set up Python
@ -236,7 +236,7 @@ You can cache and restore the dependencies using the [`setup-python` action](htt
The following example caches dependencies for pip.
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- uses: {% data reusables.actions.action-setup-python %}
@ -261,7 +261,7 @@ You can use the same commands that you use locally to build and test your code.
This example installs or upgrades `pytest` and `pytest-cov`. Tests are then run and output in JUnit format while code coverage results are output in Cobertura. For more information, see [JUnit](https://junit.org/junit5/) and [Cobertura](https://cobertura.github.io/cobertura/).
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- name: Set up Python
@ -282,7 +282,7 @@ steps:
The following example installs or upgrades `ruff` and uses it to lint all files. For more information, see [Ruff](https://beta.ruff.rs/docs).
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- name: Set up Python
@ -306,7 +306,7 @@ The linting step has `continue-on-error: true` set. This will keep the workflow
With {% data variables.product.prodname_actions %}, you can run tests with tox and spread the work across multiple jobs. You'll need to invoke tox using the `-e py` option to choose the version of Python in your `PATH`, rather than specifying a specific version. For more information, see [tox](https://tox.readthedocs.io/en/latest/).
```yaml{:copy}
```yaml copy
name: Python package
on: [push]
@ -338,7 +338,7 @@ You can upload artifacts to view after a workflow completes. For example, you ma
The following example demonstrates how you can use the `upload-artifact` action to archive test results from running `pytest`. For more information, see the [`upload-artifact` action](https://github.com/actions/upload-artifact).
```yaml{:copy}
```yaml copy
name: Python package
on: [push]
@ -379,7 +379,7 @@ You can configure your workflow to publish your Python package to a package regi
For this example, you will need to create two [PyPI API tokens](https://pypi.org/help/#apitoken). You can use secrets to store the access tokens or credentials needed to publish your package. For more information, see "[AUTOTITLE](/actions/security-guides/encrypted-secrets)."
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}

View File

@ -36,7 +36,7 @@ We recommend that you have a basic understanding of Swift packages. For more inf
To get started quickly, add the starter workflow to the `.github/workflows` directory of your repository.
```yaml{:copy}
```yaml copy
name: Swift
on: [push]
@ -66,7 +66,7 @@ The examples below demonstrate using the `swift-actions/setup-swift` action.
You can configure your job to use multiple versions of Swift in a matrix.
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
@ -101,7 +101,7 @@ jobs:
You can configure your job to use a single specific version of Swift, such as `5.3.3`.
{% raw %}
```yaml{:copy}
```yaml copy
steps:
- uses: swift-actions/setup-swift@65540b95f51493d65f5e59e97dcef9629ddf11bf
with:
@ -115,7 +115,7 @@ steps:
You can use the same commands that you use locally to build and test your code using Swift. This example demonstrates how to use `swift build` and `swift test` in a job:
```yaml{:copy}
```yaml copy
steps:
- uses: {% data reusables.actions.action-checkout %}
- uses: swift-actions/setup-swift@65540b95f51493d65f5e59e97dcef9629ddf11bf

View File

@ -34,24 +34,24 @@ Before you begin, you'll create a repository on {% ifversion ghae %}{% data vari
1. From your terminal, change directories into your new repository.
```shell{:copy}
```shell copy
cd hello-world-composite-action
```
2. In the `hello-world-composite-action` repository, create a new file called `goodbye.sh`, and add the following example code:
```bash{:copy}
```bash copy
echo "Goodbye"
```
3. From your terminal, make `goodbye.sh` executable.
```shell{:copy}
```shell copy
chmod +x goodbye.sh
```
1. From your terminal, check in your `goodbye.sh` file.
```shell{:copy}
```shell copy
git add goodbye.sh
git commit -m "Add goodbye script"
git push
@ -63,7 +63,7 @@ Before you begin, you'll create a repository on {% ifversion ghae %}{% data vari
{% raw %}
**action.yml**
```yaml{:copy}
```yaml copy
name: 'Hello World'
description: 'Greet someone'
inputs:
@ -101,7 +101,7 @@ Before you begin, you'll create a repository on {% ifversion ghae %}{% data vari
1. From your terminal, check in your `action.yml` file.
```shell{:copy}
```shell copy
git add action.yml
git commit -m "Add action"
git push
@ -109,7 +109,7 @@ Before you begin, you'll create a repository on {% ifversion ghae %}{% data vari
1. From your terminal, add a tag. This example uses a tag called `v1`. For more information, see "[AUTOTITLE](/actions/creating-actions/about-custom-actions#using-release-management-for-actions)."
```shell{:copy}
```shell copy
git tag -a -m "Description of this release" v1
git push --follow-tags
```
@ -121,7 +121,7 @@ The following workflow code uses the completed hello world action that you made
Copy the workflow code into a `.github/workflows/main.yml` file in another repository, but replace `actions/hello-world-composite-action@v1` with the repository and tag you created. You can also replace the `who-to-greet` input with your name.
**.github/workflows/main.yml**
```yaml{:copy}
```yaml copy
on: [push]
jobs:

View File

@ -49,7 +49,7 @@ Before you begin, you'll need to create a {% data variables.product.prodname_dot
1. From your terminal, change directories into your new repository.
```shell{:copy}
```shell copy
cd hello-world-docker-action
```
@ -58,7 +58,7 @@ Before you begin, you'll need to create a {% data variables.product.prodname_dot
In your new `hello-world-docker-action` directory, create a new `Dockerfile` file. Make sure that your filename is capitalized correctly (use a capital `D` but not a capital `f`) if you're having issues. For more information, see "[AUTOTITLE](/actions/creating-actions/dockerfile-support-for-github-actions)."
**Dockerfile**
```Dockerfile{:copy}
```Dockerfile copy
# Container image that runs your code
FROM alpine:3.10
@ -75,7 +75,7 @@ Create a new `action.yml` file in the `hello-world-docker-action` directory you
{% raw %}
**action.yml**
```yaml{:copy}
```yaml copy
# action.yml
name: 'Hello World'
description: 'Greet someone and record the time'
@ -110,7 +110,7 @@ Next, the script gets the current time and sets it as an output variable that ac
1. Add the following code to your `entrypoint.sh` file.
**entrypoint.sh**
```shell{:copy}
```shell copy
#!/bin/sh -l
echo "Hello $1"
@ -126,14 +126,14 @@ Next, the script gets the current time and sets it as an output variable that ac
1. Make your `entrypoint.sh` file executable. Git provides a way to explicitly change the permission mode of a file so that it doesn't get reset every time there is a clone/fork.
```shell{:copy}
```shell copy
$ git add entrypoint.sh
$ git update-index --chmod=+x entrypoint.sh
```
1. Optionally, to check the permission mode of the file in the git index, run the following command.
```shell{:copy}
```shell copy
$ git ls-files --stage entrypoint.sh
```
@ -153,7 +153,7 @@ In your `hello-world-docker-action` directory, create a `README.md` file that sp
- An example of how to use your action in a workflow.
**README.md**
```markdown{:copy}
```markdown copy
# Hello world docker action
This action prints "Hello World" or "Hello" + the name of a person to greet to the log.
@ -183,7 +183,7 @@ From your terminal, commit your `action.yml`, `entrypoint.sh`, `Dockerfile`, and
It's best practice to also add a version tag for releases of your action. For more information on versioning your action, see "[AUTOTITLE](/actions/creating-actions/about-custom-actions#using-release-management-for-actions)."
```shell{:copy}
```shell copy
git add action.yml entrypoint.sh Dockerfile README.md
git commit -m "My first action is ready"
git tag -a -m "My first action release" v1
@ -205,7 +205,7 @@ Now you're ready to test your action out in a workflow.
The following workflow code uses the completed _hello world_ action in the public [`actions/hello-world-docker-action`](https://github.com/actions/hello-world-docker-action) repository. Copy the following workflow example code into a `.github/workflows/main.yml` file, but replace the `actions/hello-world-docker-action` with your repository and action name. You can also replace the `who-to-greet` input with your name. {% ifversion fpt or ghec %}Public actions can be used even if they're not published to {% data variables.product.prodname_marketplace %}. For more information, see "[AUTOTITLE](/actions/creating-actions/publishing-actions-in-github-marketplace#publishing-an-action)." {% endif %}
**.github/workflows/main.yml**
```yaml{:copy}
```yaml copy
on: [push]
jobs:
@ -228,7 +228,7 @@ jobs:
Copy the following example workflow code into a `.github/workflows/main.yml` file in your action's repository. You can also replace the `who-to-greet` input with your name. {% ifversion fpt or ghec %}This private action can't be published to {% data variables.product.prodname_marketplace %}, and can only be used in this repository.{% endif %}
**.github/workflows/main.yml**
```yaml{:copy}
```yaml copy
on: [push]
jobs:

View File

@ -45,13 +45,13 @@ Before you begin, you'll need to download Node.js and create a public {% data va
1. From your terminal, change directories into your new repository.
```shell{:copy}
```shell copy
cd hello-world-javascript-action
```
1. From your terminal, initialize the directory with npm to generate a `package.json` file.
```shell{:copy}
```shell copy
npm init -y
```
@ -59,7 +59,7 @@ Before you begin, you'll need to download Node.js and create a public {% data va
Create a new file named `action.yml` in the `hello-world-javascript-action` directory with the following example code. For more information, see "[AUTOTITLE](/actions/creating-actions/metadata-syntax-for-github-actions)."
```yaml{:copy}
```yaml copy
name: 'Hello World'
description: 'Greet someone and record the time'
inputs:
@ -89,7 +89,7 @@ The toolkit offers more than the `core` and `github` packages. For more informat
At your terminal, install the actions toolkit `core` and `github` packages.
```shell{:copy}
```shell copy
npm install @actions/core
npm install @actions/github
```
@ -105,7 +105,7 @@ GitHub Actions provide context information about the webhook event, Git refs, wo
Add a new file called `index.js`, with the following code.
{% raw %}
```javascript{:copy}
```javascript copy
const core = require('@actions/core');
const github = require('@actions/github');
@ -139,7 +139,7 @@ In your `hello-world-javascript-action` directory, create a `README.md` file tha
- Environment variables the action uses.
- An example of how to use your action in a workflow.
````markdown{:copy}
````markdown copy
# Hello world javascript action
This action prints "Hello World" or "Hello" + the name of a person to greet to the log.
@ -173,7 +173,7 @@ From your terminal, commit your `action.yml`, `index.js`, `node_modules`, `packa
It's best practice to also add a version tag for releases of your action. For more information on versioning your action, see "[AUTOTITLE](/actions/creating-actions/about-custom-actions#using-release-management-for-actions)."
```shell{:copy}
```shell copy
git add action.yml index.js node_modules/* package.json package-lock.json README.md
git commit -m "My first action is ready"
git tag -a -m "My first action release" v1.1
@ -202,7 +202,7 @@ Checking in your `node_modules` directory can cause problems. As an alternative,
1. From your terminal, commit the updates to your `action.yml`, `dist/index.js`, and `node_modules` files.
```shell{:copy}
```shell copy
git add action.yml dist/index.js node_modules/*
git commit -m "Use vercel/ncc"
git tag -a -m "My first action release" v1.1
@ -224,7 +224,7 @@ This example demonstrates how your new public action can be run from within an e
Copy the following YAML into a new file at `.github/workflows/main.yml`, and update the `uses: octocat/hello-world-javascript-action@v1.1` line with your username and the name of the public repository you created above. You can also replace the `who-to-greet` input with your name.
{% raw %}
```yaml{:copy}
```yaml copy
on: [push]
jobs:
@ -250,7 +250,7 @@ When this workflow is triggered, the runner will download the `hello-world-javas
Copy the workflow code into a `.github/workflows/main.yml` file in your action's repository. You can also replace the `who-to-greet` input with your name.
**.github/workflows/main.yml**
```yaml{:copy}
```yaml copy
on: [push]
jobs:

View File

@ -36,7 +36,7 @@ The following script demonstrates how you can get a user-specified version as in
{% data variables.product.prodname_dotcom %} provides [`actions/toolkit`](https://github.com/actions/toolkit), which is a set of packages that helps you create actions. This example uses the [`actions/core`](https://github.com/actions/toolkit/tree/main/packages/core) and [`actions/tool-cache`](https://github.com/actions/toolkit/tree/main/packages/tool-cache) packages.
{% raw %}
```javascript{:copy}
```javascript copy
const core = require('@actions/core');
const tc = require('@actions/tool-cache');

View File

@ -43,7 +43,7 @@ Before creating your {% data variables.product.prodname_actions %} workflow, you
For example, using [the AWS CLI](https://aws.amazon.com/cli/):
{% raw %}```bash{:copy}
{% raw %}```bash copy
aws ecr create-repository \
--repository-name MY_ECR_REPOSITORY \
--region MY_AWS_REGION
@ -63,7 +63,7 @@ Before creating your {% data variables.product.prodname_actions %} workflow, you
The format of the file should be the same as the output generated by:
{% raw %}```bash{:copy}
{% raw %}```bash copy
aws ecs register-task-definition --generate-cli-skeleton
```{% endraw %}
@ -89,7 +89,7 @@ Ensure that you provide your own values for all the variables in the `env` key o
{% data reusables.actions.delete-env-key %}
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}

View File

@ -40,7 +40,7 @@ Before creating your {% data variables.product.prodname_actions %} workflow, you
For example, you can use the Azure CLI to create an Azure App Service web app:
```bash{:copy}
```bash copy
az webapp create \
--name MY_WEBAPP_NAME \
--plan MY_APP_SERVICE_PLAN \
@ -77,7 +77,7 @@ Ensure that you set `AZURE_WEBAPP_NAME` in the workflow `env` key to the name of
{% data reusables.actions.delete-env-key %}
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}

View File

@ -39,7 +39,7 @@ Before creating your {% data variables.product.prodname_actions %} workflow, you
For example, you can use the Azure CLI to create an Azure App Service web app with a Java runtime:
```bash{:copy}
```bash copy
az webapp create \
--name MY_WEBAPP_NAME \
--plan MY_APP_SERVICE_PLAN \
@ -63,7 +63,7 @@ Ensure that you set `AZURE_WEBAPP_NAME` in the workflow `env` key to the name of
{% data reusables.actions.delete-env-key %}
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}

View File

@ -38,7 +38,7 @@ Before creating your {% data variables.product.prodname_actions %} workflow, you
For example, you can use the Azure CLI to create an Azure App Service web app with a .NET runtime:
```bash{:copy}
```bash copy
az webapp create \
--name MY_WEBAPP_NAME \
--plan MY_APP_SERVICE_PLAN \
@ -62,7 +62,7 @@ Ensure that you set `AZURE_WEBAPP_NAME` in the workflow `env` key to the name of
{% data reusables.actions.delete-env-key %}
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}

View File

@ -44,7 +44,7 @@ Before creating your {% data variables.product.prodname_actions %} workflow, you
For example, you can use the Azure CLI to create an Azure App Service web app with a Node.js runtime:
```bash{:copy}
```bash copy
az webapp create \
--name MY_WEBAPP_NAME \
--plan MY_APP_SERVICE_PLAN \
@ -68,7 +68,7 @@ Ensure that you set `AZURE_WEBAPP_NAME` in the workflow `env` key to the name of
{% data reusables.actions.delete-env-key %}
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}

View File

@ -38,7 +38,7 @@ Before creating your {% data variables.product.prodname_actions %} workflow, you
For example, you can use the Azure CLI to create an Azure App Service web app with a PHP runtime:
```bash{:copy}
```bash copy
az webapp create \
--name MY_WEBAPP_NAME \
--plan MY_APP_SERVICE_PLAN \
@ -62,7 +62,7 @@ Ensure that you set `AZURE_WEBAPP_NAME` in the workflow `env` key to the name of
{% data reusables.actions.delete-env-key %}
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}

View File

@ -39,7 +39,7 @@ Before creating your {% data variables.product.prodname_actions %} workflow, you
For example, you can use the Azure CLI to create an Azure App Service web app with a Python runtime:
```bash{:copy}
```bash copy
az webapp create \
--name MY_WEBAPP_NAME \
--plan MY_APP_SERVICE_PLAN \
@ -65,7 +65,7 @@ Ensure that you set `AZURE_WEBAPP_NAME` in the workflow `env` key to the name of
{% data reusables.actions.delete-env-key %}
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}

View File

@ -51,7 +51,7 @@ Under the workflow `env` key, change the following values:
This workflow uses the `helm` render engine for the [`azure/k8s-bake` action](https://github.com/Azure/k8s-bake). If you will use the `helm` render engine, change the value of `CHART_PATH` to the path to your helm file. Change `CHART_OVERRIDE_PATH` to an array of override file paths. If you use a different render engine, update the input parameters sent to the `azure/k8s-bake` action.
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}

View File

@ -50,7 +50,7 @@ Under the workflow `env` key, change the following values:
For more information about these values, see "[Build configuration for Azure Static Web Apps](https://docs.microsoft.com/azure/static-web-apps/build-configuration?tabs=github-actions)" in the Azure documentation.
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}

View File

@ -49,7 +49,7 @@ To create the GKE cluster, you will first need to authenticate using the `gcloud
For example:
{% raw %}
```bash{:copy}
```bash copy
$ gcloud container clusters create $GKE_CLUSTER \
--project=$GKE_PROJECT \
--zone=$GKE_ZONE
@ -61,7 +61,7 @@ $ gcloud container clusters create $GKE_CLUSTER \
Enable the Kubernetes Engine and Container Registry APIs. For example:
{% raw %}
```bash{:copy}
```bash copy
$ gcloud services enable \
containerregistry.googleapis.com \
container.googleapis.com
@ -133,7 +133,7 @@ Under the `env` key, change the value of `GKE_CLUSTER` to the name of your clust
{% data reusables.actions.delete-env-key %}
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}

View File

@ -73,7 +73,7 @@ Create secrets in your repository or organization for the following items:
This example workflow includes a step that imports the Apple certificate and provisioning profile from the {% data variables.product.prodname_dotcom %} secrets, and installs them on the runner.
```yaml{:copy}
```yaml copy
name: App build
on: push

View File

@ -236,7 +236,7 @@ You could also use a `curl` command to request the JWT, using the following envi
For example:
```shell{:copy}
```shell copy
curl -H "Authorization: bearer $ACTIONS_ID_TOKEN_REQUEST_TOKEN" "$ACTIONS_ID_TOKEN_REQUEST_URL&audience=api://AzureADTokenExchange"
```

View File

@ -38,7 +38,7 @@ To configure the role and trust in IAM, see the AWS documentation for ["Assuming
Edit the trust policy to add the `sub` field to the validation conditions. For example:
```json{:copy}
```json copy
"Condition": {
"StringEquals": {
"{% ifversion ghes %}HOSTNAME/_services/token{% else %}token.actions.githubusercontent.com{% endif %}:aud": "sts.amazonaws.com",
@ -49,7 +49,7 @@ Edit the trust policy to add the `sub` field to the validation conditions. For e
In the following example, `StringLike` is used with a wildcard operator (`*`) to allow any branch, pull request merge branch, or environment from the `octo-org/octo-repo` organization and repository to assume a role in AWS.
```json{:copy}
```json copy
{
"Version": "2012-10-17",
"Statement": [
@ -91,7 +91,7 @@ The `aws-actions/configure-aws-credentials` action receives a JWT from the {% da
- `<role-to-assume>`: Replace the example with your AWS role.
- `<example-aws-region>`: Add the name of your AWS region here.
```yaml{:copy}
```yaml copy
# Sample workflow to access AWS resources when workflow is tied to branch
# The workflow Creates static website using aws s3
name: AWS example workflow

View File

@ -57,7 +57,7 @@ The [`azure/login`](https://github.com/Azure/login) action receives a JWT from t
The following example exchanges an OIDC ID token with Azure to receive an access token, which can then be used to access cloud resources.
{% raw %}
```yaml{:copy}
```yaml copy
name: Run Azure Login with OIDC
on: [push]

View File

@ -62,7 +62,7 @@ This example has a job called `Get_OIDC_ID_token` that uses actions to request a
This action exchanges a {% data variables.product.prodname_dotcom %} OIDC token for a Google Cloud access token, using [Workload Identity Federation](https://cloud.google.com/iam/docs/workload-identity-federation).
{% raw %}
```yaml{:copy}
```yaml copy
name: List services in GCP
on:
pull_request:

View File

@ -34,11 +34,11 @@ To configure your Vault server to accept JSON Web Tokens (JWT) for authenticatio
1. Enable the JWT `auth` method, and use `write` to apply the configuration to your Vault.
For `oidc_discovery_url` and `bound_issuer` parameters, use {% ifversion ghes %}`https://HOSTNAME/_services/token`{% else %}`https://token.actions.githubusercontent.com`{% endif %}. These parameters allow the Vault server to verify the received JSON Web Tokens (JWT) during the authentication process.
```sh{:copy}
```sh copy
vault auth enable jwt
```
```sh{:copy}
```sh copy
vault write auth/jwt/config \
bound_issuer="{% ifversion ghes %}https://HOSTNAME/_services/token{% else %}https://token.actions.githubusercontent.com{% endif %}" \
oidc_discovery_url="{% ifversion ghes %}https://HOSTNAME/_services/token{% else %}https://token.actions.githubusercontent.com{% endif %}"
@ -54,7 +54,7 @@ To configure your Vault server to accept JSON Web Tokens (JWT) for authenticatio
2. Configure a policy that only grants access to the specific paths your workflows will use to retrieve secrets. For more advanced policies, see the HashiCorp Vault [Policies documentation](https://www.vaultproject.io/docs/concepts/policies).
```sh{:copy}
```sh copy
vault policy write myproject-production - <<EOF
# Read-only permission on 'secret/data/production/*' path
@ -65,7 +65,7 @@ To configure your Vault server to accept JSON Web Tokens (JWT) for authenticatio
```
3. Configure roles to group different policies together. If the authentication is successful, these policies are attached to the resulting Vault access token.
```sh{:copy}
```sh copy
vault write auth/jwt/role/myproject-production -<<EOF
{
"role_type": "jwt",
@ -124,7 +124,7 @@ This example demonstrates how to create a job that requests a secret from HashiC
- `<Role name>`: Replace this with the role you've set in the HashiCorp Vault trust relationship.
- `<Secret-Path>`: Replace this with the path to the secret you're retrieving from HashiCorp Vault. For example: `secret/data/production/ci npmToken`.
```yaml{:copy}
```yaml copy
jobs:
retrieve-secret:
runs-on: ubuntu-latest
@ -162,7 +162,7 @@ By default, the Vault server will automatically revoke access tokens when their
1. Set the `exportToken` option to `true` (default: `false`). This exports the issued Vault access token as an environment variable: `VAULT_TOKEN`.
2. Add a step to call the [Revoke a Token (Self)](https://www.vaultproject.io/api/auth/token#revoke-a-token-self) Vault API to revoke the access token.
```yaml{:copy}
```yaml copy
jobs:
retrieve-secret:
runs-on: ubuntu-latest

View File

@ -39,7 +39,7 @@ During a workflow run, {% data variables.product.prodname_dotcom %}'s OIDC provi
For example, the following OIDC token is for a job that was part of a called workflow. The `workflow`, `ref`, and other attributes describe the caller workflow, while `job_workflow_ref` refers to the called workflow:
```yaml{:copy}
```yaml copy
{
"typ": "JWT",
"alg": "RS256",

View File

@ -49,7 +49,7 @@ topics:
{% data reusables.actions.note-understanding-example %}
```yaml{:copy}
```yaml copy
name: Node.js Tests
# **What it does**: Runs our tests.
@ -214,7 +214,7 @@ jobs:
<tr>
<td>
```yaml{:copy}
```yaml copy
name: Node.js Tests
```
</td>
@ -226,7 +226,7 @@ name: Node.js Tests
<tr>
<td>
```yaml{:copy}
```yaml copy
on:
```
</td>
@ -238,7 +238,7 @@ The `on` keyword lets you define the events that trigger when the workflow is ru
<tr>
<td>
```yaml{:copy}
```yaml copy
workflow_dispatch:
```
</td>
@ -250,7 +250,7 @@ Add the `workflow_dispatch` event if you want to be able to manually run this wo
<tr>
<td>
```yaml{:copy}
```yaml copy
pull_request:
```
</td>
@ -262,7 +262,7 @@ Add the `pull_request` event, so that the workflow runs automatically every time
<tr>
<td>
```yaml{:copy}
```yaml copy
push:
branches:
- main
@ -276,7 +276,7 @@ Add the `push` event, so that the workflow runs automatically every time a commi
<tr>
<td>
```yaml{:copy}
```yaml copy
permissions:
contents: read
pull-requests: read
@ -291,7 +291,7 @@ Modifies the default permissions granted to `GITHUB_TOKEN`. This will vary depen
<td>
```yaml{:copy}
```yaml copy
concurrency:
group: {% raw %}'${{ github.workflow }} @ ${{ github.event.pull_request.head.label || github.head_ref || github.ref }}'{% endraw %}
```
@ -304,7 +304,7 @@ Creates a concurrency group for specific events, and uses the `||` operator to d
<tr>
<td>
```yaml{:copy}
```yaml copy
cancel-in-progress: true
```
</td>
@ -316,7 +316,7 @@ Cancels any currently running job or workflow in the same concurrency group.
<tr>
<td>
```yaml{:copy}
```yaml copy
jobs:
```
</td>
@ -328,7 +328,7 @@ Groups together all the jobs that run in the workflow file.
<tr>
<td>
```yaml{:copy}
```yaml copy
test:
```
</td>
@ -340,7 +340,7 @@ Defines a job with the ID `test` that is stored within the `jobs` key.
<tr>
<td>
```yaml{:copy}
```yaml copy
runs-on: {% raw %}${{ fromJSON('["ubuntu-latest", "self-hosted"]')[github.repository == 'github/docs-internal'] }}{% endraw %}
```
</td>
@ -352,7 +352,7 @@ Configures the job to run on a {% data variables.product.prodname_dotcom %}-host
<tr>
<td>
```yaml{:copy}
```yaml copy
timeout-minutes: 60
```
</td>
@ -364,7 +364,7 @@ Sets the maximum number of minutes to let the job run before it is automatically
<tr>
<td>
```yaml{:copy}
```yaml copy
strategy:
```
</td>
@ -375,7 +375,7 @@ Sets the maximum number of minutes to let the job run before it is automatically
<tr>
<td>
```yaml{:copy}
```yaml copy
fail-fast: false
```
</td>
@ -387,7 +387,7 @@ Setting `fail-fast` to `false` prevents {% data variables.product.prodname_dotco
<tr>
<td>
```yaml{:copy}
```yaml copy
matrix:
test-group:
[
@ -410,7 +410,7 @@ Creates a matrix named `test-group`, with an array of test groups. These values
<tr>
<td>
```yaml{:copy}
```yaml copy
steps:
```
</td>
@ -422,7 +422,7 @@ Groups together all the steps that will run as part of the `test` job. Each job
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Check out repo
uses: {% data reusables.actions.action-checkout %}
with:
@ -438,7 +438,7 @@ The `uses` keyword tells the job to retrieve the action named `actions/checkout`
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Figure out which docs-early-access branch to checkout, if internal repo
if: {% raw %}${{ github.repository == 'github/docs-internal' }}{% endraw %}
id: check-early-access
@ -478,7 +478,7 @@ If the current repository is the `github/docs-internal` repository, this step us
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Check out docs-early-access too, if internal repo
if: {% raw %}${{ github.repository == 'github/docs-internal' }}{% endraw %}
uses: {% data reusables.actions.action-checkout %}
@ -496,7 +496,7 @@ If the current repository is the `github/docs-internal` repository, this step ch
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Merge docs-early-access repo's folders
if: {% raw %}${{ github.repository == 'github/docs-internal' }}{% endraw %}
run: |
@ -514,7 +514,7 @@ If the current repository is the `github/docs-internal` repository, this step us
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Checkout LFS objects
run: git lfs checkout
```
@ -528,7 +528,7 @@ This step runs a command to check out LFS objects from the repository.
<td>
```yaml{:copy}
```yaml copy
- name: Gather files changed
uses: trilom/file-changes-action@a6ca26c14274c33b15e6499323aac178af06ad4b
id: get_diff_files
@ -546,7 +546,7 @@ This step uses the `trilom/file-changes-action` action to gather the files chang
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Insight into changed files
run: |
echo {% raw %}"${{ steps.get_diff_files.outputs.files }}" > get_diff_files.txt{% endraw %}
@ -560,7 +560,7 @@ This step runs a shell command that uses an output from the previous step to cre
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Setup node
uses: {% data reusables.actions.action-setup-node %}
with:
@ -576,7 +576,7 @@ This step uses the `actions/setup-node` action to install the specified version
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Install dependencies
run: npm ci
```
@ -589,7 +589,7 @@ This step runs the `npm ci` shell command to install the npm software packages f
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Cache nextjs build
uses: {% data reusables.actions.action-cache %}
with:
@ -606,7 +606,7 @@ This step uses the `actions/cache` action to cache the Next.js build, so that th
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Run build script
run: npm run build
```
@ -620,7 +620,7 @@ This step runs the build script.
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Run tests
env:
DIFF_FILE: get_diff_files.txt

View File

@ -45,7 +45,7 @@ topics:
{% data reusables.actions.note-understanding-example %}
```yaml{:copy}
```yaml copy
name: 'Link Checker: All English'
# **What it does**: Renders the content of every page and checks all internal links.
@ -130,7 +130,7 @@ jobs:
<tr>
<td>
```yaml{:copy}
```yaml copy
name: 'Link Checker: All English'
```
</td>
@ -142,7 +142,7 @@ name: 'Link Checker: All English'
<tr>
<td>
```yaml{:copy}
```yaml copy
on:
```
</td>
@ -154,7 +154,7 @@ The `on` keyword lets you define the events that trigger when the workflow is ru
<tr>
<td>
```yaml{:copy}
```yaml copy
workflow_dispatch:
```
</td>
@ -166,7 +166,7 @@ Add the `workflow_dispatch` event if you want to be able to manually run this wo
<tr>
<td>
```yaml{:copy}
```yaml copy
push:
branches:
- main
@ -180,7 +180,7 @@ Add the `push` event, so that the workflow runs automatically every time a commi
<tr>
<td>
```yaml{:copy}
```yaml copy
pull_request:
```
</td>
@ -192,7 +192,7 @@ Add the `pull_request` event, so that the workflow runs automatically every time
<tr>
<td>
```yaml{:copy}
```yaml copy
permissions:
contents: read
pull-requests: read
@ -207,7 +207,7 @@ Modifies the default permissions granted to `GITHUB_TOKEN`. This will vary depen
<td>
{% raw %}
```yaml{:copy}
```yaml copy
concurrency:
group: '${{ github.workflow }} @ ${{ github.event.pull_request.head.label || github.head_ref || github.ref }}'
```
@ -221,7 +221,7 @@ Creates a concurrency group for specific events, and uses the `||` operator to d
<tr>
<td>
```yaml{:copy}
```yaml copy
cancel-in-progress: true
```
</td>
@ -233,7 +233,7 @@ Cancels any currently running job or workflow in the same concurrency group.
<tr>
<td>
```yaml{:copy}
```yaml copy
jobs:
```
</td>
@ -245,7 +245,7 @@ Groups together all the jobs that run in the workflow file.
<tr>
<td>
```yaml{:copy}
```yaml copy
check-links:
```
</td>
@ -258,7 +258,7 @@ Defines a job with the ID `check-links` that is stored within the `jobs` key.
<td>
{% raw %}
```yaml{:copy}
```yaml copy
runs-on: ${{ fromJSON('["ubuntu-latest", "self-hosted"]')[github.repository == 'github/docs-internal'] }}
```
{% endraw %}
@ -271,7 +271,7 @@ Configures the job to run on a {% data variables.product.prodname_dotcom %}-host
<tr>
<td>
```yaml{:copy}
```yaml copy
steps:
```
</td>
@ -283,7 +283,7 @@ Groups together all the steps that will run as part of the `check-links` job. Ea
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Checkout
uses: {% data reusables.actions.action-checkout %}
```
@ -296,7 +296,7 @@ The `uses` keyword tells the job to retrieve the action named `actions/checkout`
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Setup node
uses: {% data reusables.actions.action-setup-node %}
with:
@ -313,7 +313,7 @@ This step uses the `actions/setup-node` action to install the specified version
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Install
run: npm ci
```
@ -327,7 +327,7 @@ The `run` keyword tells the job to execute a command on the runner. In this case
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Gather files changed
uses: trilom/file-changes-action@a6ca26c14274c33b15e6499323aac178af06ad4b
with:
@ -343,7 +343,7 @@ Uses the `trilom/file-changes-action` action to gather all the changed files. Th
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Show files changed
run: cat $HOME/files.json
```
@ -356,7 +356,7 @@ Lists the contents of `files.json`. This will be visible in the workflow run's l
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Link check (warnings, changed files)
run: |
./script/rendered-content-link-checker.mjs \
@ -376,7 +376,7 @@ This step uses `run` command to execute a script that is stored in the repositor
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Link check (critical, all files)
run: |
./script/rendered-content-link-checker.mjs \

View File

@ -47,7 +47,7 @@ topics:
{% data reusables.actions.note-understanding-example %}
```yaml{:copy}
```yaml copy
name: Check all English links
# **What it does**: This script once a day checks all English links and reports in issues.
@ -178,7 +178,7 @@ jobs:
<tr>
<td>
```yaml{:copy}
```yaml copy
name: Check all English links
```
</td>
@ -190,7 +190,7 @@ name: Check all English links
<tr>
<td>
```yaml{:copy}
```yaml copy
on:
workflow_dispatch:
schedule:
@ -208,7 +208,7 @@ Defines the `workflow_dispatch` and `scheduled` as triggers for the workflow:
<tr>
<td>
```yaml{:copy}
```yaml copy
permissions:
contents: read
issues: write
@ -222,7 +222,7 @@ Modifies the default permissions granted to `GITHUB_TOKEN`. This will vary depen
<tr>
<td>
```yaml{:copy}
```yaml copy
jobs:
```
</td>
@ -234,7 +234,7 @@ Groups together all the jobs that run in the workflow file.
<tr>
<td>
```yaml{:copy}
```yaml copy
check_all_english_links:
name: Check all links
```
@ -247,7 +247,7 @@ Defines a job with the ID `check_all_english_links`, and the name `Check all lin
<tr>
<td>
```yaml{:copy}
```yaml copy
if: github.repository == 'github/docs-internal'
```
</td>
@ -259,7 +259,7 @@ Only run the `check_all_english_links` job if the repository is named `docs-inte
<tr>
<td>
```yaml{:copy}
```yaml copy
runs-on: ubuntu-latest
```
</td>
@ -271,7 +271,7 @@ Configures the job to run on an Ubuntu Linux runner. This means that the job wil
<tr>
<td>
```yaml{:copy}
```yaml copy
env:
GITHUB_TOKEN: {% raw %}${{ secrets.DOCUBOT_READORG_REPO_WORKFLOW_SCOPES }}{% endraw %}
REPORT_AUTHOR: docubot
@ -287,7 +287,7 @@ Creates custom environment variables, and redefines the built-in `GITHUB_TOKEN`
<tr>
<td>
```yaml{:copy}
```yaml copy
steps:
```
</td>
@ -299,7 +299,7 @@ Groups together all the steps that will run as part of the `check_all_english_li
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Check out repo's default branch
uses: {% data reusables.actions.action-checkout %}
```
@ -312,7 +312,7 @@ The `uses` keyword tells the job to retrieve the action named `actions/checkout`
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Setup Node
uses: {% data reusables.actions.action-setup-node %}
with:
@ -328,7 +328,7 @@ This step uses the `actions/setup-node` action to install the specified version
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Run the "npm ci" command
run: npm ci
- name: Run the "npm run build" command
@ -343,7 +343,7 @@ The `run` keyword tells the job to execute a command on the runner. In this case
<tr>
<td>
```yaml{:copy}
```yaml copy
- name: Run script
run: |
script/check-english-links.js > broken_links.md
@ -357,7 +357,7 @@ This `run` command executes a script that is stored in the repository at `script
<tr>
<td>
```yaml{:copy}
```yaml copy
- if: {% raw %}${{ failure() }}{% endraw %}
name: Get title for issue
id: check
@ -376,7 +376,7 @@ If the `check-english-links.js` script detects broken links and returns a non-ze
<tr>
<td>
```yaml{:copy}
```yaml copy
- if: {% raw %}${{ failure() }}{% endraw %}
name: Create issue from file
id: broken-link-report
@ -398,7 +398,7 @@ Uses the `peter-evans/create-issue-from-file` action to create a new {% data var
<tr>
<td>
```yaml{:copy}
```yaml copy
- if: {% raw %}${{ failure() }}{% endraw %}
name: Close and/or comment on old issues
env:
@ -428,7 +428,7 @@ Uses [`gh issue list`](https://cli.github.com/manual/gh_issue_list) to locate th
<tr>
<td>
```yaml{:copy}
```yaml copy
for issue_url in $(gh list-reports \
--json assignees,url \
--jq '.[] | select (.assignees != []) | .url'); do
@ -446,7 +446,7 @@ If an issue from a previous run is open and assigned to someone, then use [`gh i
<tr>
<td>
```yaml{:copy}
```yaml copy
for issue_url in $(gh list-reports \
--search 'no:assignee' \
--json url \

View File

@ -77,7 +77,7 @@ You can create your own runner image that meets your requirements. Your runner i
You can use the following example Dockerfile to start creating your own runner image.
```Dockerfile{:copy}
```Dockerfile copy
FROM mcr.microsoft.com/dotnet/runtime-deps:6.0 as build
# Replace value with the latest runner release version

View File

@ -70,7 +70,7 @@ ARC can use {% data variables.product.pat_v1_plural %} to register self-hosted r
{% endif %}
1. To create a Kubernetes secret with the value of your {% data variables.product.pat_v1 %}, use the following command.
```bash{:copy}
```bash copy
kubectl create secret generic pre-defined-secret \
--namespace=my_namespace \
--from-literal=github_token='<YOUR PAT>'

View File

@ -47,7 +47,7 @@ You can deploy runner scale sets with ARC's Helm charts or by deploying the nece
- Set the `GITHUB_CONFIG_URL` value to the URL of your repository, organization, or enterprise. This is the entity that the runners will belong to.
- This example command installs the latest version of the Helm chart. To install a specific version, you can pass the `--version` argument with the version of the chart you want to install. You can find the list of releases in the [`actions-runner-controller`](https://github.com/actions/actions-runner-controller/pkgs/container/actions-runner-controller-charts%2Fgha-runner-scale-set) repository.
```bash{:copy}
```bash copy
INSTALLATION_NAME="arc-runner-set"
NAMESPACE="arc-runners"
GITHUB_CONFIG_URL="https://github.com/<your_enterprise/org/repo>"
@ -64,7 +64,7 @@ You can deploy runner scale sets with ARC's Helm charts or by deploying the nece
2. To check your installation, run the following command in your terminal.
```bash{:copy}
```bash copy
helm list -A
```
@ -78,7 +78,7 @@ You can deploy runner scale sets with ARC's Helm charts or by deploying the nece
3. To check the manager pod, run the following command in your terminal.
```bash{:copy}
```bash copy
kubectl get pods -n arc-systems
```
@ -218,7 +218,7 @@ proxy:
ARC supports using anonymous or authenticated proxies. If you use authenticated proxies, you will need to set the `credentialSecretRef` value to reference a Kubernetes secret. You can create a secret with your proxy credentials with the following command.
```bash{:copy}
```bash copy
kubectl create secret generic proxy-auth \
--namespace=my_namespace \
--from-literal=username=proxyUsername \

View File

@ -47,7 +47,7 @@ In order to use ARC, ensure you have the following.
The following example installs the latest version of the chart. To install a specific version, you can pass the `--version` argument along with the version of the chart you wish to install. You can find the list of releases in the [GitHub Container Registry](https://github.com/actions/actions-runner-controller/pkgs/container/actions-runner-controller-charts%2Fgha-runner-scale-set-controller).
```bash{:copy}
```bash copy
NAMESPACE="arc-systems"
helm install arc \
--namespace "{% raw %}${NAMESPACE}{% endraw %}" \
@ -78,7 +78,7 @@ In order to use ARC, ensure you have the following.
{% endnote %}
```bash{:copy}
```bash copy
INSTALLATION_NAME="arc-runner-set"
NAMESPACE="arc-runners"
GITHUB_CONFIG_URL="https://github.com/<your_enterprise/org/repo>"
@ -95,7 +95,7 @@ In order to use ARC, ensure you have the following.
2. From your terminal, run the following command to check your installation.
```bash{:copy}
```bash copy
helm list -A
```
@ -109,7 +109,7 @@ In order to use ARC, ensure you have the following.
3. To check the manager pod, run the following command in your terminal.
```bash{:copy}
```bash copy
kubectl get pods -n arc-systems
```
@ -131,7 +131,7 @@ Now you will create and run a simple test workflow that uses the runner scale se
For more information on adding workflows to a repository, see "[AUTOTITLE](/actions/quickstart#creating-your-first-workflow)."
```yaml{:copy}
```yaml copy
name: Actions Runner Controller Demo
on:
workflow_dispatch:
@ -148,7 +148,7 @@ Now you will create and run a simple test workflow that uses the runner scale se
1. To view the runner pods being created while the workflow is running, run the following command from your terminal.
```bash{:copy}
```bash copy
kubectl get pods -n arc-runners
```


@ -68,13 +68,13 @@ app.kubernetes.io/version= # Chart version
To check the logs of the controller pod, you can use the following command.
```bash{:copy}
```bash copy
kubectl logs -n <CONTROLLER_NAMESPACE> -l app.kubernetes.io/name=gha-runner-scale-set-controller
```
To check the logs of the runner set listener, you can use the following command.
```bash{:copy}
```bash copy
kubectl logs -n <CONTROLLER_NAMESPACE> -l auto-scaling-runner-set-namespace=arc-systems -l auto-scaling-runner-set-name=arc-runner-set
```
@ -120,7 +120,7 @@ To fix this, you can do one of the following things.
- Use a volume type that supports `securityContext.fsGroup`. `hostPath` volumes do not support this property, whereas `local` volumes and other types of volumes do support it. Update the `fsGroup` of your runner pod to match the GID of the runner. You can do this by updating the `gha-runner-scale-set` helm chart values to include the following. Replace `VERSION` with the version of the `actions-runner` container image you want to use.
```yaml{:copy}
```yaml copy
spec:
securityContext:
fsGroup: 123
@ -132,7 +132,7 @@ To fix this, you can do one of the following things.
- If updating the `securityContext` of your runner pod is not a viable solution, you can work around the issue by using `initContainers` to change the mounted volume's ownership, as follows.
```yaml{:copy}
```yaml copy
template:
spec:
initContainers:


@ -89,7 +89,7 @@ The `prepare_job` command is called when a job is started. {% data variables.pro
#### Example input for `prepare_job`
```json{:copy}
```json copy
{
"command": "prepare_job",
"responseFile": "/users/octocat/runner/_work/{guid}.json",
@ -176,7 +176,7 @@ The `prepare_job` command is called when a job is started. {% data variables.pro
This example output is the contents of the `responseFile` defined in the input above.
```json{:copy}
```json copy
{
"state": {
"network": "example_network_53269bd575972817b43f7733536b200c",
@ -220,7 +220,7 @@ No arguments are provided for `cleanup_job`.
#### Example input for `cleanup_job`
```json{:copy}
```json copy
{
"command": "cleanup_job",
"responseFile": null,
@ -276,7 +276,7 @@ The `run_container_step` command is called once for each container action in you
If you're using a Docker image, you can specify the image name in the `"image":` parameter.
```json{:copy}
```json copy
{
"command": "run_container_step",
"responseFile": null,
@ -352,7 +352,7 @@ If you're using a Docker image, you can specify the image name in the `"image":`
If your container is defined by a Dockerfile, this example demonstrates how to specify the path to a `Dockerfile` in your input, using the `"dockerfile":` parameter.
```json{:copy}
```json copy
{
"command": "run_container_step",
"responseFile": null,
@ -445,7 +445,7 @@ No output is expected for `run_container_step`.
#### Example input for `run_script_step`
```json{:copy}
```json copy
{
"command": "run_script_step",
"responseFile": null,


@ -113,7 +113,7 @@ You can print the contents of contexts to the log for debugging. The [`toJSON` f
{% data reusables.actions.github-context-warning %}
{% raw %}
```yaml{:copy}
```yaml copy
name: Context testing
on: push
@ -259,7 +259,7 @@ The following example context is from a workflow run triggered by the `push` eve
This example workflow uses the `github.event_name` context to run a job only if the workflow run was triggered by the `pull_request` event.
```yaml{:copy}
```yaml copy
name: Run CI
on: [push, pull_request]
@ -311,7 +311,7 @@ This example workflow shows how the `env` context can be configured at the workf
{% data reusables.repositories.actions-env-var-note %}
{% raw %}
```yaml{:copy}
```yaml copy
name: Hi Mascot
on: push
env:
@ -403,7 +403,7 @@ This example `job` context uses a PostgreSQL service container with mapped ports
This example workflow configures a PostgreSQL service container, and automatically maps port 5432 in the service container to a randomly chosen available port on the host. The `job` context is used to access the number of the port that was assigned on the host.
```yaml{:copy}
```yaml copy
name: PostgreSQL Service Example
on: push
jobs:
@ -457,7 +457,7 @@ This example `jobs` context contains the result and outputs of a job from a reus
This example reusable workflow uses the `jobs` context to set outputs for the reusable workflow. Note how the outputs flow up from the steps, to the job, then to the `workflow_call` trigger. For more information, see "[AUTOTITLE](/actions/using-workflows/reusing-workflows#using-outputs-from-a-reusable-workflow)."
{% raw %}
```yaml{:copy}
```yaml copy
name: Reusable workflow
on:
@ -532,7 +532,7 @@ This example `steps` context shows two previous steps that had an [`id`](/action
This example workflow generates a random number as an output in one step, and a later step uses the `steps` context to read the value of that output.
```yaml{:copy}
```yaml copy
name: Generate random failure
on: push
jobs:
@ -594,7 +594,7 @@ The following example context is from a Linux {% data variables.product.prodname
This example workflow uses the `runner` context to set the path to the temporary directory to write logs, and if the workflow fails, it uploads those logs as an artifact.
```yaml{:copy}
```yaml copy
name: Build
on: push
@ -674,7 +674,7 @@ The following example contents of the `strategy` context is from a matrix with f
This example workflow uses the `strategy.job-index` property to set a unique name for a log file for each job in a matrix.
```yaml{:copy}
```yaml copy
name: Test matrix
on: push
@ -721,7 +721,7 @@ The following example contents of the `matrix` context is from a job in a matrix
This example workflow creates a matrix with `os` and `node` keys. It uses the `matrix.os` property to set the runner type for each job, and uses the `matrix.node` property to set the Node.js version for each job.
```yaml{:copy}
```yaml copy
name: Test matrix
on: push
@ -778,7 +778,7 @@ The following example contents of the `needs` context shows information for two
This example workflow has three jobs: a `build` job that does a build, a `deploy` job that requires the `build` job, and a `debug` job that requires both the `build` and `deploy` jobs and runs only if there is a failure in the workflow. The `deploy` job also uses the `needs` context to access an output from the `build` job.
```yaml{:copy}
```yaml copy
name: Build and deploy
on: push
@ -842,7 +842,7 @@ The following example contents of the `inputs` context is from a workflow that h
This example reusable workflow uses the `inputs` context to get the values of the `build_id`, `deploy_target`, and `perform_deploy` inputs that were passed to the reusable workflow from the caller workflow.
{% raw %}
```yaml{:copy}
```yaml copy
name: Reusable deploy workflow
on:
workflow_call:
@ -873,7 +873,7 @@ jobs:
This example workflow triggered by a `workflow_dispatch` event uses the `inputs` context to get the values of the `build_id`, `deploy_target`, and `perform_deploy` inputs that were passed to the workflow.
{% raw %}
```yaml{:copy}
```yaml copy
on:
workflow_dispatch:
inputs:


@ -52,7 +52,7 @@ To set a custom environment variable{% ifversion actions-configuration-variables
* A specific step within a job, by using [`jobs.<job_id>.steps[*].env`](/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsenv).
{% raw %}
```yaml{:copy}
```yaml copy
name: Greeting on variable day
on:
@ -222,7 +222,7 @@ In addition to runner environment variables, {% data variables.product.prodname_
Runner environment variables are always interpolated on the runner machine. However, parts of a workflow are processed by {% data variables.product.prodname_actions %} and are not sent to the runner. You cannot use environment variables in these parts of a workflow file. Instead, you can use contexts. For example, an `if` conditional, which determines whether a job or step is sent to the runner, is always processed by {% data variables.product.prodname_actions %}. You can use a context in an `if` conditional statement to access the value of a variable.
{% raw %}
```yaml{:copy}
```yaml copy
env:
DAY_OF_WEEK: Monday
@ -343,7 +343,7 @@ We strongly recommend that actions use variables to access the filesystem rather
You can write a single workflow file that can be used for different operating systems by using the `RUNNER_OS` default environment variable and the corresponding context property <span style="white-space: nowrap;">{% raw %}`${{ runner.os }}`{% endraw %}</span>. For example, the following workflow could be run successfully if you changed the operating system from `macos-latest` to `windows-latest` without having to alter the syntax of the environment variables, which differs depending on the shell being used by the runner.
{% raw %}
```yaml{:copy}
```yaml copy
jobs:
if-Windows-else:
runs-on: macos-latest


@ -31,7 +31,7 @@ In the tutorial, you will first make a workflow file that uses the [`actions/git
2. {% data reusables.actions.make-workflow-file %}
3. Copy the following YAML contents into your workflow file.
```yaml{:copy}
```yaml copy
name: Label issues
on:
issues:


@ -29,7 +29,7 @@ In the tutorial, you will first make a workflow file that uses the [`actions/sta
2. {% data reusables.actions.make-workflow-file %}
3. Copy the following YAML contents into your workflow file.
```yaml{:copy}
```yaml copy
name: Close inactive issues
on:
schedule:


@ -29,7 +29,7 @@ In the tutorial, you will first make a workflow file that uses the [`peter-evans
2. {% data reusables.actions.make-workflow-file %}
3. Copy the following YAML contents into your workflow file.
```yaml{:copy}
```yaml copy
{% indented_data_reference reusables.actions.actions-not-certified-by-github-comment spaces=4 %}
{% indented_data_reference reusables.actions.actions-use-sha-pinning-comment spaces=4 %}


@ -30,7 +30,7 @@ In the tutorial, you will first make a workflow file that uses the [`alex-page/g
3. {% data reusables.actions.make-workflow-file %}
4. Copy the following YAML contents into your workflow file.
```yaml{:copy}
```yaml copy
{% indented_data_reference reusables.actions.actions-not-certified-by-github-comment spaces=4 %}
{% indented_data_reference reusables.actions.actions-use-sha-pinning-comment spaces=4 %}


@ -30,7 +30,7 @@ In the tutorial, you will first make a workflow file that uses the [`actions/git
3. {% data reusables.actions.make-workflow-file %}
4. Copy the following YAML contents into your workflow file.
```yaml{:copy}
```yaml copy
name: Remove a label
on:
project_card:


@ -29,7 +29,7 @@ In the tutorial, you will first make a workflow file that uses the [`imjohnbo/is
2. {% data reusables.actions.make-workflow-file %}
3. Copy the following YAML contents into your workflow file.
```yaml{:copy}
```yaml copy
{% indented_data_reference reusables.actions.actions-not-certified-by-github-comment spaces=4 %}
{% indented_data_reference reusables.actions.actions-use-sha-pinning-comment spaces=4 %}


@ -65,7 +65,7 @@ You can create custom transformers that {% data variables.product.prodname_actio
The following example converts a build step that uses the "buildJavascriptApp" identifier to run various `npm` commands:
```ruby{:copy}
```ruby copy
transform "buildJavascriptApp" do |item|
command = ["build", "package", "deploy"].map do |script|
"npm run #{script}"
@ -109,19 +109,19 @@ You can customize the mapping between runners in your source CI/CD instance and
The following example shows a `runner` method that converts one runner label to one {% data variables.product.prodname_actions %} runner label in the resulting workflow.
```ruby{:copy}
```ruby copy
runner "linux", "ubuntu-latest"
```
You can also use the `runner` method to convert one runner label to multiple {% data variables.product.prodname_actions %} runner labels in the resulting workflow.
```ruby{:copy}
```ruby copy
runner "big-agent", ["self-hosted", "xl", "linux"]
```
{% data variables.product.prodname_actions_importer %} attempts to map the runner label as best it can. In cases where it cannot do this, the `ubuntu-latest` runner label is used as a default. You can use a special keyword with the `runner` method to control this default value. For example, the following custom transformer instructs {% data variables.product.prodname_actions_importer %} to use `macos-latest` as the default runner instead of `ubuntu-latest`.
```ruby{:copy}
```ruby copy
runner :default, "macos-latest"
```
@ -140,19 +140,19 @@ There are several ways you can set up custom transformers to map your environmen
- The following example sets the value of any existing environment variables named `OCTO` to `CAT` when transforming a pipeline.
```ruby{:copy}
```ruby copy
env "OCTO", "CAT"
```
You can also remove all instances of a specific environment variable so they are not transformed to an {% data variables.product.prodname_actions %} workflow. The following example removes all environment variables with the name `MONA_LISA`.
```ruby{:copy}
```ruby copy
env "MONA_LISA", nil
```
- You can also map your existing environment variables to secrets. For example, the following `env` method maps an environment variable named `MONALISA` to a secret named `OCTOCAT`.
```ruby{:copy}
```ruby copy
env "MONALISA", secret("OCTOCAT")
```
@ -160,13 +160,13 @@ There are several ways you can set up custom transformers to map your environmen
- You can also use regular expressions to update the values of multiple environment variables at once. For example, the following custom transformer removes all environment variables from the converted workflow:
```ruby{:copy}
```ruby copy
env /.*/, nil
```
The following example uses a regular expression match group to transform environment variable values to dynamically generated secrets.
```ruby{:copy}
```ruby copy
env /^(.+)_SSH_KEY/, secret("%s_SSH_KEY")
```


@ -101,7 +101,7 @@ The supported values for `--features` are:
- `ghes-<number>`, where `<number>` is the version of {% data variables.product.prodname_ghe_server %}, `3.0` or later. For example, `ghes-3.3`.
You can view the list of feature flags available in {% data variables.product.prodname_actions_importer %} by running the `list-features` command. For example:
```shell{:copy}
```shell copy
gh actions-importer list-features
```


@ -72,7 +72,7 @@ The `build-push-action` options required for Docker Hub are:
* `tags`: The tag of your new image in the format `DOCKER-HUB-NAMESPACE/DOCKER-HUB-REPOSITORY:VERSION`. You can set a single tag as shown below, or specify multiple tags in a list.
* `push`: If set to `true`, the image will be pushed to the registry if it is built successfully.
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}
@ -152,7 +152,7 @@ The above workflow is triggered by a push to the "release" branch. It checks out
{% else %}
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}
@ -203,7 +203,7 @@ In a single workflow, you can publish your Docker image to multiple registries b
The following example workflow uses the steps from the previous sections ("[Publishing images to Docker Hub](#publishing-images-to-docker-hub)" and "[Publishing images to {% data variables.product.prodname_registry %}](#publishing-images-to-github-packages)") to create a single workflow that pushes to both registries.
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}


@ -50,7 +50,7 @@ Each time you create a new release, you can trigger a workflow to publish your p
You can define a new Maven repository in the publishing block of your _build.gradle_ file that points to your package repository. For example, if you were deploying to the Maven Central Repository through the OSSRH hosting project, your _build.gradle_ could specify a repository with the name `"OSSRH"`.
{% raw %}
```groovy{:copy}
```groovy copy
plugins {
...
id 'maven-publish'
@ -75,7 +75,7 @@ publishing {
With this configuration, you can create a workflow that publishes your package to the Maven Central Repository by running the `gradle publish` command. In the deploy step, you'll need to set environment variables for the username and password or token that you use to authenticate to the Maven repository. For more information, see "[AUTOTITLE](/actions/security-guides/encrypted-secrets)."
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
@ -122,7 +122,7 @@ You can define a new Maven repository in the publishing block of your _build.gra
For example, if your organization is named "octocat" and your repository is named "hello-world", then the {% data variables.product.prodname_registry %} configuration in _build.gradle_ would look similar to the below example.
{% raw %}
```groovy{:copy}
```groovy copy
plugins {
...
id 'maven-publish'
@ -147,7 +147,7 @@ publishing {
With this configuration, you can create a workflow that publishes your package to {% data variables.product.prodname_registry %} by running the `gradle publish` command.
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
@ -195,7 +195,7 @@ For example, if you deploy to the Central Repository through the OSSRH hosting p
If your organization is named "octocat" and your repository is named "hello-world", then the configuration in _build.gradle_ would look similar to the below example.
{% raw %}
```groovy{:copy}
```groovy copy
plugins {
...
id 'maven-publish'
@ -228,7 +228,7 @@ publishing {
With this configuration, you can create a workflow that publishes your package to both the Maven Central Repository and {% data variables.product.prodname_registry %} by running the `gradle publish` command.
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}


@ -54,7 +54,7 @@ In this workflow, you can use the `setup-java` action. This action installs the
For example, if you were deploying to the Maven Central Repository through the OSSRH hosting project, your _pom.xml_ could specify a distribution management repository with the `id` of `ossrh`.
{% raw %}
```xml{:copy}
```xml copy
<project ...>
...
<distributionManagement>
@ -72,7 +72,7 @@ With this configuration, you can create a workflow that publishes your package t
In the deploy step, you'll need to set the environment variables to the username that you authenticate with to the repository, and to a secret that you've configured with the password or token to authenticate with. For more information, see "[AUTOTITLE](/actions/security-guides/encrypted-secrets)."
```yaml{:copy}
```yaml copy
name: Publish package to the Maven Central Repository
on:
release:
@ -118,7 +118,7 @@ For a Maven-based project, you can make use of these settings by creating a dist
For example, if your organization is named "octocat" and your repository is named "hello-world", then the {% data variables.product.prodname_registry %} configuration in _pom.xml_ would look similar to the below example.
{% raw %}
```xml{:copy}
```xml copy
<project ...>
...
<distributionManagement>
@ -134,7 +134,7 @@ For example, if your organization is named "octocat" and your repository is name
With this configuration, you can create a workflow that publishes your package to {% data variables.product.prodname_registry %} by making use of the automatically generated _settings.xml_.
```yaml{:copy}
```yaml copy
name: Publish package to GitHub Packages
on:
release:
@ -171,7 +171,7 @@ You can publish your packages to both the Maven Central Repository and {% data v
Ensure your _pom.xml_ file includes a distribution management repository for both your {% data variables.product.prodname_dotcom %} repository and your Maven Central Repository provider. For example, if you deploy to the Central Repository through the OSSRH hosting project, you might want to specify it in a distribution management repository with the `id` set to `ossrh`, and you might want to specify {% data variables.product.prodname_registry %} in a distribution management repository with the `id` set to `github`.
```yaml{:copy}
```yaml copy
name: Publish package to the Maven Central Repository and GitHub Packages
on:
release:


@ -60,7 +60,7 @@ If you're publishing a package that includes a scope prefix, include the scope i
This example stores the `NPM_TOKEN` secret in the `NODE_AUTH_TOKEN` environment variable. When the `setup-node` action creates an *.npmrc* file, it references the token from the `NODE_AUTH_TOKEN` environment variable.
```yaml{:copy}
```yaml copy
name: Publish Package to npmjs
on:
release:
@ -120,7 +120,7 @@ If you want to publish your package to a different repository, you must use a {%
This example stores the `GITHUB_TOKEN` secret in the `NODE_AUTH_TOKEN` environment variable. When the `setup-node` action creates an *.npmrc* file, it references the token from the `NODE_AUTH_TOKEN` environment variable.
```yaml{:copy}
```yaml copy
name: Publish package to GitHub Packages
on:
release:
@ -158,7 +158,7 @@ always-auth=true
If you use the Yarn package manager, you can install and publish packages using Yarn.
```yaml{:copy}
```yaml copy
name: Publish Package to npmjs
on:
release:


@ -29,7 +29,7 @@ The following example shows you how {% data variables.product.prodname_actions %
1. In the `.github/workflows` directory, create a file named `github-actions-demo.yml`. For more information, see "[AUTOTITLE](/repositories/working-with-files/managing-files/creating-new-files)."
1. Copy the following YAML contents into the `github-actions-demo.yml` file:
```yaml{:copy}
```yaml copy
name: GitHub Actions Demo
{%- ifversion actions-run-name %}
run-name: {% raw %}${{ github.actor }}{% endraw %} is testing out GitHub Actions 🚀


@ -51,7 +51,7 @@ You can use the `services` keyword to create service containers that are part of
This example creates a service called `redis` in a job called `container-job`. The Docker host in this example is the `node:16-bullseye` container.
{% raw %}
```yaml{:copy}
```yaml copy
name: Redis container example
on: push
@ -93,7 +93,7 @@ When you specify the Docker host port but not the container port, the container
This example maps the service container `redis` port 6379 to the Docker host port 6379.
{% raw %}
```yaml{:copy}
```yaml copy
name: Redis Service Example
on: push


@ -40,7 +40,7 @@ You may also find it helpful to have a basic understanding of YAML, the syntax f
{% data reusables.actions.copy-workflow-file %}
```yaml{:copy}
```yaml copy
name: PostgreSQL service example
on: push
@ -96,7 +96,7 @@ jobs:
{% data reusables.actions.postgres-label-description %}
```yaml{:copy}
```yaml copy
jobs:
# Label of the container job
container-job:
@ -126,7 +126,7 @@ jobs:
{% data reusables.actions.service-template-steps %}
```yaml{:copy}
```yaml copy
steps:
# Downloads a copy of the code in your repository before running CI tests
- name: Check out repository code
@ -160,7 +160,7 @@ When you run a job directly on the runner machine, you'll need to map the ports
{% data reusables.actions.copy-workflow-file %}
```yaml{:copy}
```yaml copy
name: PostgreSQL Service Example
on: push
@ -220,7 +220,7 @@ jobs:
The workflow maps port 5432 on the PostgreSQL service container to the Docker host. For more information about the `ports` keyword, see "[AUTOTITLE](/actions/using-containerized-services/about-service-containers#mapping-docker-host-and-service-container-ports)."
```yaml{:copy}
```yaml copy
jobs:
# Label of the runner job
runner-job:
@ -251,7 +251,7 @@ jobs:
{% data reusables.actions.service-template-steps %}
```yaml{:copy}
```yaml copy
steps:
# Downloads a copy of the code in your repository before running CI tests
- name: Check out repository code
@ -287,7 +287,7 @@ You can modify *client.js* to include any PostgreSQL operations needed by your w
{% data reusables.actions.service-container-add-script %}
```javascript{:copy}
```javascript copy
const { Client } = require('pg');
const pgclient = new Client({


@ -40,7 +40,7 @@ You may also find it helpful to have a basic understanding of YAML, the syntax f
{% data reusables.actions.copy-workflow-file %}
```yaml{:copy}
```yaml copy
name: Redis container example
on: push
@ -93,7 +93,7 @@ jobs:
{% data reusables.actions.redis-label-description %}
```yaml{:copy}
```yaml copy
jobs:
# Label of the container job
container-job:
@ -120,7 +120,7 @@ jobs:
{% data reusables.actions.service-template-steps %}
```yaml{:copy}
```yaml copy
steps:
# Downloads a copy of the code in your repository before running CI tests
- name: Check out repository code
@ -153,7 +153,7 @@ When you run a job directly on the runner machine, you'll need to map the ports
{% data reusables.actions.copy-workflow-file %}
```yaml{:copy}
```yaml copy
name: Redis runner example
on: push
@ -210,7 +210,7 @@ jobs:
The workflow maps port 6379 on the Redis service container to the Docker host. For more information about the `ports` keyword, see "[AUTOTITLE](/actions/using-containerized-services/about-service-containers#mapping-docker-host-and-service-container-ports)."
```yaml{:copy}
```yaml copy
jobs:
# Label of the runner job
runner-job:
@ -238,7 +238,7 @@ jobs:
{% data reusables.actions.service-template-steps %}
```yaml{:copy}
```yaml copy
steps:
# Downloads a copy of the code in your repository before running CI tests
- name: Check out repository code
@ -274,7 +274,7 @@ You can modify *client.js* to include any Redis operations needed by your workfl
{% data reusables.actions.service-container-add-script %}
```javascript{:copy}
```javascript copy
const redis = require("redis");
// Creates a new Redis client


@ -41,7 +41,7 @@ The following example workflow has two jobs, named `Run-npm-on-Ubuntu` and `Run-
- The job named `Run-npm-on-Ubuntu` is executed on a Linux VM, because the job's `runs-on:` specifies `ubuntu-latest`.
- The job named `Run-PSScriptAnalyzer-on-Windows` is executed on a Windows VM, because the job's `runs-on:` specifies `windows-latest`.
```yaml{:copy}
```yaml copy
name: Run commands on different operating systems
on:
push:


@ -119,7 +119,7 @@ For a more detailed explanation of the cache matching process, see "[Matching a
This example creates a new cache when the packages in the `package-lock.json` file change, or when the runner's operating system changes. The cache key uses contexts and expressions to generate a key that includes the runner's operating system and a SHA-256 hash of the `package-lock.json` file.
```yaml{:copy}
```yaml copy
name: Caching with npm
on: push
jobs:


@ -49,7 +49,7 @@ This procedure demonstrates how to create a starter workflow and metadata file.
For example, this file named `octo-organization-ci.yml` demonstrates a basic workflow.
```yaml{:copy}
```yaml copy
name: Octo Organization CI
on:
@ -69,7 +69,7 @@ This procedure demonstrates how to create a starter workflow and metadata file.
run: echo Hello from Octo Organization
```
4. Create a metadata file inside the `workflow-templates` directory. The metadata file must have the same name as the workflow file, but instead of the `.yml` extension, it must be appended with `.properties.json`. For example, this file named `octo-organization-ci.properties.json` contains the metadata for a workflow file named `octo-organization-ci.yml`:
```json{:copy}
```json copy
{
"name": "Octo Organization Workflow",
"description": "Octo Organization CI starter workflow.",


@ -163,7 +163,7 @@ You can define inputs and secrets, which can be passed from the caller workflow
This reusable workflow file named `workflow-B.yml` (we'll refer to this later in the [example caller workflow](#example-caller-workflow)) takes an input string and a secret from the caller workflow and uses them in an action.
{% raw %}
```yaml{:copy}
```yaml copy
name: Reusable workflow example
on:
@ -215,7 +215,7 @@ A matrix strategy lets you use variables in a single job definition to automatic
The example job below calls a reusable workflow and references the matrix context by defining the variable `target` with the values `[dev, stage, prod]`. It will run three jobs, one for each value in the variable.
{% raw %}
```yaml{:copy}
```yaml copy
jobs:
ReuseableMatrixJobForDeployment:
strategy:
@ -263,7 +263,7 @@ When you call a reusable workflow, you can only use the following keywords in th
This workflow file calls two workflow files. The second of these, `workflow-B.yml` (shown in the [example reusable workflow](#example-reusable-workflow)), is passed an input (`config-path`) and a secret (`token`).
{% raw %}
```yaml{:copy}
```yaml copy
name: Call a reusable workflow
on:
@ -295,7 +295,7 @@ You can connect a maximum of four levels of workflows - that is, the top-level c
From within a reusable workflow you can call another reusable workflow.
{% raw %}
```yaml{:copy}
```yaml copy
name: Reusable workflow
on:
@ -349,7 +349,7 @@ That means if the last successful completing reusable workflow sets an empty str
The following reusable workflow has a single job containing two steps. In each of these steps we set a single word as the output: "hello" and "world." In the `outputs` section of the job, we map these step outputs to job outputs called: `output1` and `output2`. In the `on.workflow_call.outputs` section we then define two outputs for the workflow itself, one called `firstword` which we map to `output1`, and one called `secondword` which we map to `output2`.
{% raw %}
```yaml{:copy}
```yaml copy
name: Reusable workflow
on:
@ -390,7 +390,7 @@ jobs:
We can now use the outputs in the caller workflow, in the same way you would use the outputs from a job within the same workflow. We reference the outputs using the names defined at the workflow level in the reusable workflow: `firstword` and `secondword`. In this workflow, `job1` calls the reusable workflow and `job2` prints the outputs from the reusable workflow ("hello world") to standard output in the workflow log.
{% raw %}
```yaml{:copy}
```yaml copy
name: Call a reusable workflow and use its outputs
on:


@ -91,7 +91,7 @@ This example shows you how to create a workflow for a Node.js project that build
The workflow uploads the production artifacts in the `dist` directory, but excludes any markdown files. It also uploads the `code-coverage.html` report as another artifact.
```yaml{:copy}
```yaml copy
name: Node CI
on: [push]
@ -125,7 +125,7 @@ jobs:
You can define a custom retention period for individual artifacts created by a workflow. When using a workflow to create a new artifact, you can use `retention-days` with the `upload-artifact` action. This example demonstrates how to set a custom retention period of 5 days for the artifact named `my-artifact`:
```yaml{:copy}
```yaml copy
- name: 'Upload Artifact'
uses: {% data reusables.actions.action-upload-artifact %}
with:
@ -193,7 +193,7 @@ Job 3 displays the result uploaded in the previous job:
The full math operation performed in this workflow example is `(3 + 7) x 9 = 90`.
```yaml{:copy}
```yaml copy
name: Share data between jobs
on: [push]


@ -23,7 +23,7 @@ type: how_to
You can execute any {% data variables.product.prodname_cli %} command. For example, this workflow uses the `gh issue comment` subcommand to add a comment when an issue is opened.
```yaml{:copy}
```yaml copy
name: Comment when opened
on:
issues:
@ -41,7 +41,7 @@ jobs:
You can also execute API calls through {% data variables.product.prodname_cli %}. For example, this workflow first uses the `gh api` subcommand to query the GraphQL API and parse the result. Then it stores the result in an environment variable that it can access in a later step. In the second step, it uses the `gh issue create` subcommand to create an issue containing the information from the first step.
```yaml{:copy}
```yaml copy
name: Report remaining open issues
on:
schedule:


@ -30,7 +30,7 @@ Most workflow commands use the `echo` command in a specific format, while others
{% bash %}
```bash{:copy}
```bash copy
echo "::workflow-command parameter1={data},parameter2={data}::{command value}"
```
@ -38,7 +38,7 @@ echo "::workflow-command parameter1={data},parameter2={data}::{command value}"
{% powershell %}
```pwsh{:copy}
```pwsh copy
Write-Output "::workflow-command parameter1={data},parameter2={data}::{command value}"
```
@ -63,7 +63,7 @@ The [actions/toolkit](https://github.com/actions/toolkit) includes a number of f
{%- ifversion actions-save-state-set-output-envs %}
For example, instead of using code to create an error annotation, as below:
```javascript{:copy}
```javascript copy
core.error('Missing semicolon', {file: 'app.js', startLine: 1})
```
@ -73,7 +73,7 @@ You can use the `error` command in your workflow to create the same error annota
{% bash %}
```yaml{:copy}
```yaml copy
- name: Create annotation for build error
run: echo "::error file=app.js,line=1::Missing semicolon"
```
@ -82,7 +82,7 @@ You can use the `error` command in your workflow to create the same error annota
{% powershell %}
```yaml{:copy}
```yaml copy
- name: Create annotation for build error
run: Write-Output "::error file=app.js,line=1::Missing semicolon"
```
@ -91,7 +91,7 @@ You can use the `error` command in your workflow to create the same error annota
{%- else %}
For example, instead of using code to set an output, as below:
```javascript{:copy}
```javascript copy
core.setOutput('SELECTED_COLOR', 'green');
```
@ -101,7 +101,7 @@ You can use the `set-output` command in your workflow to set the same value:
{% bash %}
```yaml{:copy}
```yaml copy
- name: Set selected color
run: echo '::set-output name=SELECTED_COLOR::green'
id: random-color-generator
@ -115,7 +115,7 @@ You can use the `set-output` command in your workflow to set the same value:
{% powershell %}
```yaml{:copy}
```yaml copy
- name: Set selected color
run: Write-Output "::set-output name=SELECTED_COLOR::green"
id: random-color-generator
@ -158,7 +158,7 @@ The following table shows which toolkit functions are available within a workflo
Sets an action's output parameter.
```{:copy}
```text copy
::set-output name={name}::{value}
```
@ -170,7 +170,7 @@ You can escape multiline strings for setting an output parameter by creating an
{% bash %}
```bash{:copy}
```bash copy
echo "::set-output name=action_fruit::strawberry"
```
@ -178,7 +178,7 @@ echo "::set-output name=action_fruit::strawberry"
{% powershell %}
```pwsh{:copy}
```pwsh copy
Write-Output "::set-output name=action_fruit::strawberry"
```
@ -189,7 +189,7 @@ Write-Output "::set-output name=action_fruit::strawberry"
Prints a debug message to the log. You must create a secret named `ACTIONS_STEP_DEBUG` with the value `true` to see the debug messages set by this command in the log. For more information, see "[AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/enabling-debug-logging)."
```{:copy}
```text copy
::debug::{message}
```
@ -197,7 +197,7 @@ Prints a debug message to the log. You must create a secret named `ACTIONS_STEP_
{% bash %}
```bash{:copy}
```bash copy
echo "::debug::Set the Octocat variable"
```
@ -205,7 +205,7 @@ echo "::debug::Set the Octocat variable"
{% powershell %}
```pwsh{:copy}
```pwsh copy
Write-Output "::debug::Set the Octocat variable"
```
@ -215,7 +215,7 @@ Write-Output "::debug::Set the Octocat variable"
Creates a notice message and prints the message to the log. {% data reusables.actions.message-annotation-explanation %}
```{:copy}
```text copy
::notice file={name},line={line},endLine={endLine},title={title}::{message}
```
@ -225,7 +225,7 @@ Creates a notice message and prints the message to the log. {% data reusables.ac
{% bash %}
```bash{:copy}
```bash copy
echo "::notice file=app.js,line=1,col=5,endColumn=7::Missing semicolon"
```
@ -233,7 +233,7 @@ echo "::notice file=app.js,line=1,col=5,endColumn=7::Missing semicolon"
{% powershell %}
```pwsh{:copy}
```pwsh copy
Write-Output "::notice file=app.js,line=1,col=5,endColumn=7::Missing semicolon"
```
@ -243,7 +243,7 @@ Write-Output "::notice file=app.js,line=1,col=5,endColumn=7::Missing semicolon"
Creates a warning message and prints the message to the log. {% data reusables.actions.message-annotation-explanation %}
```{:copy}
```text copy
::warning file={name},line={line},endLine={endLine},title={title}::{message}
```
@ -253,14 +253,14 @@ Creates a warning message and prints the message to the log. {% data reusables.a
{% bash %}
```bash{:copy}
```bash copy
echo "::warning file=app.js,line=1,col=5,endColumn=7::Missing semicolon"
```
{% endbash %}
{% powershell %}
```pwsh{:copy}
```pwsh copy
Write-Output "::warning file=app.js,line=1,col=5,endColumn=7::Missing semicolon"
```
@ -270,7 +270,7 @@ Write-Output "::warning file=app.js,line=1,col=5,endColumn=7::Missing semicolon"
Creates an error message and prints the message to the log. {% data reusables.actions.message-annotation-explanation %}
```{:copy}
```text copy
::error file={name},line={line},endLine={endLine},title={title}::{message}
```
@ -280,7 +280,7 @@ Creates an error message and prints the message to the log. {% data reusables.ac
{% bash %}
```bash{:copy}
```bash copy
echo "::error file=app.js,line=1,col=5,endColumn=7::Missing semicolon"
```
@ -288,7 +288,7 @@ echo "::error file=app.js,line=1,col=5,endColumn=7::Missing semicolon"
{% powershell %}
```pwsh{:copy}
```pwsh copy
Write-Output "::error file=app.js,line=1,col=5,endColumn=7::Missing semicolon"
```
@ -298,7 +298,7 @@ Write-Output "::error file=app.js,line=1,col=5,endColumn=7::Missing semicolon"
Creates an expandable group in the log. To create a group, use the `group` command and specify a `title`. Anything you print to the log between the `group` and `endgroup` commands is nested inside an expandable entry in the log.
```{:copy}
```text copy
::group::{title}
::endgroup::
```
@ -307,7 +307,7 @@ Creates an expandable group in the log. To create a group, use the `group` comma
{% bash %}
```yaml{:copy}
```yaml copy
jobs:
bash-example:
runs-on: ubuntu-latest
@ -323,7 +323,7 @@ jobs:
{% powershell %}
```yaml{:copy}
```yaml copy
jobs:
powershell-example:
runs-on: windows-latest
@ -341,7 +341,7 @@ jobs:
## Masking a value in a log
```{:copy}
```text copy
::add-mask::{value}
```
@ -353,7 +353,7 @@ When you print `"Mona The Octocat"` in the log, you'll see `"***"`.
{% bash %}
```bash{:copy}
```bash copy
echo "::add-mask::Mona The Octocat"
```
@ -361,7 +361,7 @@ echo "::add-mask::Mona The Octocat"
{% powershell %}
```pwsh{:copy}
```pwsh copy
Write-Output "::add-mask::Mona The Octocat"
```
@ -379,7 +379,7 @@ When you print the variable `MY_NAME` or the value `"Mona The Octocat"` in the l
{% bash %}
```yaml{:copy}
```yaml copy
jobs:
bash-example:
runs-on: ubuntu-latest
@ -394,7 +394,7 @@ jobs:
{% powershell %}
```yaml{:copy}
```yaml copy
jobs:
powershell-example:
runs-on: windows-latest
@ -427,7 +427,7 @@ If you do not need to pass your secret from one job to another job, you can:
{% bash %}
```yaml{:copy}
```yaml copy
on: push
jobs:
generate-a-secret-output:
@ -449,7 +449,7 @@ jobs:
{% powershell %}
```yaml{:copy}
```yaml copy
on: push
jobs:
generate-a-secret-output:
@ -489,7 +489,7 @@ If you want to pass a masked secret between jobs or workflows, you should store
{% bash %}
```yaml{:copy}
```yaml copy
on: push
jobs:
@ -528,7 +528,7 @@ jobs:
{% powershell %}
```yaml{:copy}
```yaml copy
on: push
jobs:
@ -569,7 +569,7 @@ jobs:
Stops processing any workflow commands. This special command allows you to log anything without accidentally running a workflow command. For example, you could stop logging to output an entire script that has comments.
```{:copy}
```text copy
::stop-commands::{endtoken}
```
@ -581,7 +581,7 @@ To stop the processing of workflow commands, pass a unique token to `stop-comman
{% endwarning %}
```{:copy}
```text copy
::{endtoken}::
```
@ -589,7 +589,7 @@ To stop the processing of workflow commands, pass a unique token to `stop-comman
{% bash %}
```yaml{:copy}
```yaml copy
jobs:
workflow-command-job:
runs-on: ubuntu-latest
@ -607,7 +607,7 @@ jobs:
{% powershell %}
```yaml{:copy}
```yaml copy
jobs:
workflow-command-job:
runs-on: windows-latest
@ -629,7 +629,7 @@ jobs:
Enables or disables echoing of workflow commands. For example, if you use the `set-output` command in a workflow, it sets an output parameter but the workflow run's log does not show the command itself. If you enable command echoing, then the log shows the command, such as `::set-output name={name}::{value}`.
```{:copy}
```text copy
::echo::on
::echo::off
```
@ -644,7 +644,7 @@ You can also enable command echoing globally by turning on step debug logging us
{% bash %}
```yaml{:copy}
```yaml copy
jobs:
workflow-command-job:
runs-on: ubuntu-latest
@ -662,7 +662,7 @@ jobs:
{% powershell %}
```yaml{:copy}
```yaml copy
jobs:
workflow-command-job:
runs-on: windows-latest
@ -680,7 +680,7 @@ jobs:
The example above prints the following lines to the log:
```{:copy}
```text copy
::set-output name=action_echo::enabled
::echo::off
```
@ -700,7 +700,7 @@ If you have multiple `pre:` or `post:` actions, you can only access the saved va
{% ifversion actions-save-state-set-output-envs %}
This example uses JavaScript to write to the `GITHUB_STATE` file. The resulting environment variable is named `STATE_processID` with the value of `12345`:
```javascript{:copy}
```javascript copy
import * as fs from 'fs'
import * as os from 'os'
@ -712,14 +712,14 @@ fs.appendFileSync(process.env.GITHUB_STATE, `processID=12345${os.EOL}`, {
{% else %}
This example uses JavaScript to run the `save-state` command. The resulting environment variable is named `STATE_processID` with the value of `12345`:
```javascript{:copy}
```javascript copy
console.log('::save-state name=processID::12345')
```
{% endif %}
The `STATE_processID` variable is then exclusively available to the cleanup script running under the `main` action. This example runs in `main` and uses JavaScript to display the value assigned to the `STATE_processID` environment variable:
```javascript{:copy}
```javascript copy
console.log("The running PID from the main action is: " + process.env.STATE_processID);
```
@ -735,7 +735,7 @@ Most commands in the following examples use double quotes for echoing strings, w
**Note:** PowerShell versions 5.1 and below (`shell: powershell`) do not use UTF-8 by default, so you must specify the UTF-8 encoding. For example:
```yaml{:copy}
```yaml copy
jobs:
legacy-powershell-example:
runs-on: windows-latest
@ -747,7 +747,7 @@ jobs:
PowerShell Core versions 6 and higher (`shell: pwsh`) use UTF-8 by default. For example:
```yaml{:copy}
```yaml copy
jobs:
powershell-core-example:
runs-on: windows-latest
@ -765,7 +765,7 @@ jobs:
{% bash %}
```bash{:copy}
```bash copy
echo "{environment_variable_name}={value}" >> "$GITHUB_ENV"
```
@ -775,13 +775,13 @@ echo "{environment_variable_name}={value}" >> "$GITHUB_ENV"
- Using PowerShell version 6 and higher:
```pwsh{:copy}
```pwsh copy
"{environment_variable_name}={value}" >> $env:GITHUB_ENV
```
- Using PowerShell version 5.1 and below:
```powershell{:copy}
```powershell copy
"{environment_variable_name}={value}" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
```
@ -793,7 +793,7 @@ You can make an environment variable available to any subsequent steps in a work
{% bash %}
```yaml{:copy}
```yaml copy
steps:
- name: Set the value
id: step_one
@ -811,7 +811,7 @@ steps:
{% powershell %}
```yaml{:copy}
```yaml copy
steps:
- name: Set the value
id: step_one
@ -831,7 +831,7 @@ steps:
For multiline strings, you may use a delimiter with the following syntax.
```{:copy}
```text copy
{name}<<{delimiter}
{value}
{delimiter}
@ -849,7 +849,7 @@ This example selects a random value for `$EOF` as a delimiter, and sets the `JSO
{% bash %}
```yaml{:copy}
```yaml copy
steps:
- name: Set the value in bash
id: step_one
@ -864,7 +864,7 @@ steps:
{% powershell %}
```yaml{:copy}
```yaml copy
steps:
- name: Set the value in pwsh
id: step_one
@ -885,14 +885,14 @@ Sets a step's output parameter. Note that the step will need an `id` to be defin
{% bash %}
```bash{:copy}
```bash copy
echo "{name}={value}" >> "$GITHUB_OUTPUT"
```
{% endbash %}
{% powershell %}
```pwsh{:copy}
```pwsh copy
"{name}=value" >> $env:GITHUB_OUTPUT
```
@ -904,7 +904,7 @@ echo "{name}={value}" >> "$GITHUB_OUTPUT"
This example demonstrates how to set the `SELECTED_COLOR` output parameter and later retrieve it:
```yaml{:copy}
```yaml copy
- name: Set color
id: random-color-generator
run: echo "SELECTED_COLOR=green" >> "$GITHUB_OUTPUT"
@ -920,7 +920,7 @@ This example demonstrates how to set the `SELECTED_COLOR` output parameter and l
This example demonstrates how to set the `SELECTED_COLOR` output parameter and later retrieve it:
```yaml{:copy}
```yaml copy
- name: Set color
id: random-color-generator
run: |
@ -940,7 +940,7 @@ This example demonstrates how to set the `SELECTED_COLOR` output parameter and l
{% bash %}
```bash{:copy}
```bash copy
echo "{markdown content}" >> $GITHUB_STEP_SUMMARY
```
@ -948,7 +948,7 @@ echo "{markdown content}" >> $GITHUB_STEP_SUMMARY
{% powershell %}
```pwsh{:copy}
```pwsh copy
"{markdown content}" >> $env:GITHUB_STEP_SUMMARY
```
@ -964,7 +964,7 @@ When a job finishes, the summaries for all steps in a job are grouped together i
{% bash %}
```bash{:copy}
```bash copy
echo "### Hello world! :rocket:" >> $GITHUB_STEP_SUMMARY
```
@ -972,7 +972,7 @@ echo "### Hello world! :rocket:" >> $GITHUB_STEP_SUMMARY
{% powershell %}
```pwsh{:copy}
```pwsh copy
"### Hello world! :rocket:" >> $env:GITHUB_STEP_SUMMARY
```
@ -1084,14 +1084,14 @@ Prepends a directory to the system `PATH` variable and automatically makes it av
{% bash %}
```bash{:copy}
```bash copy
echo "{path}" >> $GITHUB_PATH
```
{% endbash %}
{% powershell %}
```pwsh{:copy}
```pwsh copy
"{path}" >> $env:GITHUB_PATH
```
@ -1103,7 +1103,7 @@ echo "{path}" >> $GITHUB_PATH
This example demonstrates how to add the user `$HOME/.local/bin` directory to `PATH`:
```bash{:copy}
```bash copy
echo "$HOME/.local/bin" >> $GITHUB_PATH
```
@ -1113,7 +1113,7 @@ echo "$HOME/.local/bin" >> $GITHUB_PATH
This example demonstrates how to add the user `$env:HOMEPATH/.local/bin` directory to `PATH`:
```pwsh{:copy}
```pwsh copy
"$env:HOMEPATH/.local/bin" >> $env:GITHUB_PATH
```


@ -30,12 +30,12 @@ You can enable web commit signing, rotate the private key used for web commit si
{% data reusables.enterprise_site_admin_settings.update-commit-signing-service %}
1. Enable web commit signing.
```bash{:copy}
```bash copy
ghe-config app.github.web-commit-signing-enabled true
```
1. Apply the configuration, then wait for the configuration run to complete.
```bash{:copy}
```bash copy
ghe-config-apply
```
1. Create a new user on {% data variables.location.product_location %} via built-in authentication or external authentication. For more information, see "[AUTOTITLE](/admin/identity-and-access-management/managing-iam-for-your-enterprise/about-authentication-for-your-enterprise)."
@ -68,11 +68,11 @@ You can disable web commit signing for {% data variables.location.product_locati
1. In the administrative shell, run the following command.
```bash{:copy}
```bash copy
ghe-config app.github.web-commit-signing-enabled false
```
1. Apply the configuration.
```bash{:copy}
```bash copy
ghe-config-apply
```


@ -53,18 +53,18 @@ By default, {% data variables.product.prodname_nes %} is disabled. You can enabl
{% data reusables.enterprise_installation.ssh-into-cluster-node %}
1. To verify whether {% data variables.product.prodname_nes %} is currently enabled, run the following command.
```shell{:copy}
```shell copy
ghe-config app.nes.enabled
```
1. To enable {% data variables.product.prodname_nes %}, run the following command.
```shell{:copy}
```shell copy
ghe-config app.nes.enabled true
```
{% data reusables.enterprise.apply-configuration %}
1. To verify that {% data variables.product.prodname_nes %} is running, from any node, run the following command.
```shell{:copy}
```shell copy
nomad status nes
```
@ -75,17 +75,17 @@ To determine how {% data variables.product.prodname_nes %} notifies you, you can
{% data reusables.enterprise_installation.ssh-into-cluster-node %}
1. To verify the current TTL settings, run the following command.
```shell{:copy}
```shell copy
nes get-node-ttl all
```
1. To set the TTL for the `fail` state, run the following command. Replace MINUTES with the number of minutes to use for failures.
```shell{:copy}
```shell copy
nes set-node-ttl fail MINUTES
```
1. To set the TTL for the `warn` state, run the following command. Replace MINUTES with the number of minutes to use for warnings.
```shell{:copy}
```shell copy
nes set-node-ttl warn MINUTES
```
@ -101,12 +101,12 @@ To manage whether {% data variables.product.prodname_nes %} can take a node and
1. To configure whether {% data variables.product.prodname_nes %} can take a node offline, run one of the following commands.
- To allow the service to automatically take administrative action when a node goes offline, run the following command. Replace HOSTNAME with the node's hostname.
```shell{:copy}
```shell copy
nes set-node-adminaction approved HOSTNAME
```
- To revoke {% data variables.product.prodname_nes %}'s ability to take a node offline, run the following command. Replace HOSTNAME with the node's hostname.
```shell{:copy}
```shell copy
nes set-node-adminaction none HOSTNAME
```
@ -124,33 +124,33 @@ After {% data variables.product.prodname_nes %} detects that a node has exceeded
{% data reusables.enterprise_installation.ssh-into-cluster-node %}
1. To check the current `adminaction` state for the node, run the following command. Replace HOSTNAME with the hostname of the ineligible node.
```shell{:copy}
```shell copy
nes get-node-adminaction HOSTNAME
```
1. If the `adminaction` state is currently set to `approved`, change the state to `none` by running the following command. Replace HOSTNAME with the hostname of the ineligible node.
```shell{:copy}
```shell copy
nes set-node-adminaction none HOSTNAME
```
1. To ensure the node is in a healthy state, run the following command and confirm that the node's status is `ready`.
```shell{:copy}
```shell copy
nomad node status
```
- If the node's status is `ineligible`, make the node eligible by connecting to the node via SSH and running the following command.
```shell{:copy}
```shell copy
nomad node eligibility -enable -self
```
1. To update the node's eligibility in {% data variables.product.prodname_nes %}, run the following command. Replace HOSTNAME with the node's hostname.
```shell{:copy}
```shell copy
nes set-node-eligibility eligible HOSTNAME
```
1. Wait 30 seconds, then check the cluster's health to confirm the target node is eligible by running the following command.
```shell{:copy}
```shell copy
nes get-cluster-health
```
@ -161,19 +161,19 @@ You can view logs for {% data variables.product.prodname_nes %} from any node in
{% data reusables.enterprise_installation.ssh-into-cluster-node %}
1. To view logs for {% data variables.product.prodname_nes %} from any node in the cluster, run the following command.
```shell{:copy}
```shell copy
nomad alloc logs -job nes
```
1. Alternatively, you can view logs for {% data variables.product.prodname_nes %} on the node that runs the service. The service writes logs to the systemd journal.
- To determine which node runs {% data variables.product.prodname_nes %}, run the following command.
```shell{:copy}
```shell copy
nomad job status "nes" | grep running | grep "${nomad_node_id}" | awk 'NR==2{ print $1 }' | xargs nomad alloc status | grep "Node Name"
```
- To view logs on the node, connect to the node via SSH, then run the following command.
```shell{:copy}
```shell copy
journalctl -t nes
```


@ -28,12 +28,12 @@ In some cases, such as hardware failure, the underlying software that manag
1. To see a list of allocations, run the following command. The utility displays healthy allocations in green. If any jobs are not properly distributed, the utility displays the allocation's count in red.
```shell{:copy}
```shell copy
ghe-cluster-balance status
```
1. If a job is not properly distributed, inspect the allocations by running the following command. Replace JOB with a single job or comma-delimited list of jobs.
```shell{:copy}
```shell copy
ghe-cluster-balance status -j JOB
```
@ -45,14 +45,14 @@ After you determine which jobs are unbalanced across your cluster's nodes, you c
1. To perform a dry run and see the result of rebalancing without making changes, run the following command. Replace JOB with a single job or comma-delimited list of jobs.
```shell{:copy}
```shell copy
ghe-cluster-balance rebalance --dry-run -j JOB
```
For example, to perform a dry run of rebalancing jobs for your instance's HTTP server and authorization service, you can run `ghe-cluster-balance rebalance --dry-run -j github-unicorn,authzd`.
1. To rebalance, run the following command. Replace JOB with a single job or comma-delimited list of jobs.
```shell{:copy}
```shell copy
ghe-cluster-balance rebalance -j JOB
```
@ -68,12 +68,12 @@ You can schedule rebalancing of jobs on your cluster by setting and applying con
1. To configure automatic, hourly balancing of jobs, run the following command.
```shell{:copy}
```shell copy
ghe-config app.cluster-rebalance.enabled true
```
1. Optionally, you can override the default schedule by defining a cron expression. For example, run the following command to balance jobs every three hours.
```shell{:copy}
```shell copy
ghe-config app.cluster-rebalance.schedule '0 */3 * * *'
```
{% data reusables.enterprise.apply-configuration %}


@ -85,14 +85,14 @@ The following instructions are only intended for {% data variables.product.prod
{% data reusables.enterprise_installation.ssh-into-instance %}
1. To validate the current flushing method for InnoDB, run the following command.
```shell{:copy}
```shell copy
ghe-config mysql.innodb-flush-no-fsync
```
By default, the command returns `false`, indicating that your instance performs an `fsync()` system call after each write operation.
1. To configure InnoDB to skip the `fsync()` system call after each write operation, run the following command.
```shell{:copy}
```shell copy
ghe-config mysql.innodb-flush-no-fsync true
```
{% data reusables.enterprise.apply-configuration %}


@ -28,7 +28,7 @@ To restore a backup of {% data variables.location.product_location %} with {% da
1. Manually configure network settings on the replacement {% data variables.product.prodname_ghe_server %} instance. Network settings are excluded from the backup snapshot, and are not overwritten by `ghe-restore`. For more information, see "[AUTOTITLE](/admin/configuration/configuring-network-settings)."
1. SSH into the destination instance. For more information, see "[AUTOTITLE](/admin/configuration/configuring-your-enterprise/accessing-the-administrative-shell-ssh)."
```shell{:copy}
```shell copy
$ ssh -p 122 admin@HOSTNAME
```
1. Configure the destination instance to use the same external storage service for {% data variables.product.prodname_actions %} as the source instance by entering one of the following commands.
@ -36,7 +36,7 @@ To restore a backup of {% data variables.location.product_location %} with {% da
{% data reusables.actions.configure-storage-provider %}
1. To prepare to enable {% data variables.product.prodname_actions %} on the destination instance, enter the following command.
```shell{:copy}
```shell copy
ghe-config app.actions.enabled true
```
{% data reusables.actions.apply-configuration-and-enable %}

View file

@ -51,7 +51,7 @@ To configure {% data variables.product.prodname_ghe_server %} to use OIDC with a
1. Get the thumbprint for {% data variables.location.product_location_enterprise %}.
1. Use the following OpenSSL command to get the SHA1 thumbprint for {% data variables.location.product_location_enterprise %}, replacing `HOSTNAME` with the public hostname for {% data variables.location.product_location_enterprise %}.
```shell{:copy}
```shell copy
openssl s_client -connect HOSTNAME:443 < /dev/null 2>/dev/null | openssl x509 -fingerprint -noout -sha1 -in /dev/stdin
```
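If you want to reuse the value in the next step without copying it by hand, a minimal sketch along these lines may help. The `cut` and `tr` filters are an assumption about the `SHA1 Fingerprint=AA:BB:...` output format; they strip the label and the colons so that only the bare 40-character hex value remains.

```shell
# Hypothetical helper: capture the bare SHA1 thumbprint in a variable
THUMBPRINT=$(openssl s_client -connect HOSTNAME:443 < /dev/null 2>/dev/null \
  | openssl x509 -fingerprint -noout -sha1 -in /dev/stdin \
  | cut -d'=' -f2 | tr -d ':')
echo "$THUMBPRINT"
```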
@ -75,7 +75,7 @@ To configure {% data variables.product.prodname_ghe_server %} to use OIDC with a
```
1. Using the AWS CLI, run the following command to create an OIDC provider for {% data variables.location.product_location_enterprise %}. Replace `HOSTNAME` with the public hostname for {% data variables.location.product_location_enterprise %}, and `THUMBPRINT` with the thumbprint value from the previous step.
```shell{:copy}
```shell copy
aws iam create-open-id-connect-provider \
--url https://HOSTNAME/_services/token \
--client-id-list "sts.amazonaws.com" \
@ -84,7 +84,7 @@ To configure {% data variables.product.prodname_ghe_server %} to use OIDC with a
For example:
```shell{:copy}
```shell copy
aws iam create-open-id-connect-provider \
--url https://my-ghes-host.example.com/_services/token \
--client-id-list "sts.amazonaws.com" \

View file

@ -82,7 +82,7 @@ Optionally, if you use {% data variables.product.prodname_actions %} on your pro
{% data reusables.actions.configure-storage-provider %}
1. To prepare to enable {% data variables.product.prodname_actions %} on the staging instance, enter the following command.
```shell{:copy}
```shell copy
ghe-config app.actions.enabled true
```
@ -103,14 +103,14 @@ Optionally, if you use {% data variables.product.prodname_registry %} on your pr
1. Configure the external storage connection by entering the following commands, replacing the placeholder values with actual values for your connection.
- Azure Blob Storage:
```shell{:copy}
```shell copy
ghe-config secrets.packages.blob-storage-type "azure"
ghe-config secrets.packages.azure-container-name "AZURE CONTAINER NAME"
ghe-config secrets.packages.azure-connection-string "CONNECTION STRING"
```
- Amazon S3:
```shell{:copy}
```shell copy
ghe-config secrets.packages.blob-storage-type "s3"
ghe-config secrets.packages.service-url "S3 SERVICE URL"
ghe-config secrets.packages.s3-bucket "S3 BUCKET NAME"
@ -119,7 +119,7 @@ Optionally, if you use {% data variables.product.prodname_registry %} on your pr
```
1. To prepare to enable {% data variables.product.prodname_registry %} on the staging instance, enter the following command.
```shell{:copy}
```shell copy
ghe-config app.packages.enabled true
```
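As a recap of this section, the two feature flags set on the staging instance are shown below. Run them only after the corresponding storage settings have been configured, then apply the configuration as described above.

```shell
# Enable GitHub Actions and GitHub Packages on the staging instance
ghe-config app.actions.enabled true
ghe-config app.packages.enabled true
```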

View file

@ -26,7 +26,7 @@ For MacOS and Linux, you can use `echo -n TOKEN | openssl dgst -sha256 -binary |
For PowerShell, you can use the following script to return a SHA-256 hash for a given string.
```shell{:copy}
```shell copy
Param (
[Parameter(Mandatory=$true)]
[string]

View file

@ -74,7 +74,7 @@ You must install and import `octokit` in order to use the Octokit.js library. Th
You can also use the REST API to find the ID for an installation of your app. For example, you can get an installation ID with the `GET /users/{username}/installation`, `GET /repos/{owner}/{repo}/installation`, `GET /orgs/{org}/installation`, or `GET /app/installations` endpoints. For more information, see "[AUTOTITLE](/rest/apps/apps)".
1. Import `App` from `octokit`. Create a new instance of `App`. In the following example, replace `APP_ID` with a reference to your app's ID. Replace `PRIVATE_KEY` with a reference to your app's private key.
```javascript{:copy}
```javascript copy
import { App } from "octokit";
const app = new App({
@ -85,7 +85,7 @@ You must install and import `octokit` in order to use the Octokit.js library. Th
1. Use the `getInstallationOctokit` method to create an authenticated `octokit` instance. In the following example, replace `INSTALLATION_ID` with the ID of the installation of your app that you want to authenticate on behalf of.
```javascript{:copy}
```javascript copy
const octokit = await app.getInstallationOctokit(INSTALLATION_ID);
```
@ -95,7 +95,7 @@ You must install and import `octokit` in order to use the Octokit.js library. Th
For example, to make a request to the GraphQL API:
```javascript{:copy}
```javascript copy
await octokit.graphql(`
query {
viewer {
@ -107,7 +107,7 @@ You must install and import `octokit` in order to use the Octokit.js library. Th
For example, to make a request to the REST API:
```javascript{:copy}
```javascript copy
await octokit.request("GET /meta")
```
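If you also need the response payload, the same call can be destructured. A minimal sketch, assuming the authenticated `octokit` instance from the previous steps; Octokit requests resolve to an object whose `data` property holds the response body.

```javascript
// Log the response body of the GET /meta request
const { data } = await octokit.request("GET /meta");
console.log(data);
```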
@ -120,7 +120,7 @@ The Octokit.js SDK also passes a pre-authenticated `octokit` instance to webhook
1. Get the webhook secret that you specified in your app's settings. For more information about webhook secrets, see "[AUTOTITLE](/apps/creating-github-apps/setting-up-a-github-app/using-webhooks-with-github-apps#securing-your-webhooks-with-a-webhook-secret)."
1. Import `App` from `octokit`. Create a new instance of `App`. In the following example, replace `APP_ID` with a reference to your app's ID. Replace `PRIVATE_KEY` with a reference to your app's private key. Replace `WEBHOOK_SECRET` with your app's webhook secret.
```javascript{:copy}
```javascript copy
import { App } from "octokit";
const app = new App({

View file

@ -48,12 +48,12 @@ You can use {% data variables.product.company_short %}'s Octokit.js SDK to authe
1. Generate a private key. For more information, see "[AUTOTITLE](/apps/creating-github-apps/authenticating-with-a-github-app/managing-private-keys-for-github-apps)."
1. Import `App` from `octokit`.
```javascript{:copy}
```javascript copy
import { App } from "octokit";
```
1. Create a new instance of `App`. In the following example, replace `APP_ID` with a reference to your app's ID. Replace `PRIVATE_KEY` with a reference to the value of your app's private key.
```javascript{:copy}
```javascript copy
const app = new App({
appId: APP_ID,
privateKey: PRIVATE_KEY,
@ -62,6 +62,6 @@ You can use {% data variables.product.company_short %}'s Octokit.js SDK to authe
1. Use an `octokit` method to make a request to a REST API endpoint that requires a JWT. For example:
```javascript{:copy}
```javascript copy
await app.octokit.request("/app")
```
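For instance, a minimal sketch that logs the authenticated app's name; the `data.name` field assumes the standard response shape of the `GET /app` endpoint.

```javascript
// `app.octokit` signs this request with a JWT derived from the app ID and private key
const { data } = await app.octokit.request("/app");
console.log(`Authenticated as '${data.name}'`);
```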

View file

@ -86,7 +86,7 @@ puts jwt
{% endnote %}
```python{:copy}
```python copy
#!/usr/bin/env python3
import jwt
import time

View file

@ -35,7 +35,7 @@ In order to use a {% data variables.product.prodname_github_app %} to make authe
In the following workflow, replace `APP_ID` with the name of the secret where you stored your app ID. Replace `APP_PRIVATE_KEY` with the name of the secret where you stored your app private key.
```yaml{:copy}
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}

View file

@ -60,19 +60,19 @@ These steps lead you through building a CLI and using device flow to get a user
1. Create a Ruby file to hold the code that will generate a user access token. This tutorial will name the file `app_cli.rb`.
1. In your terminal, from the directory where `app_cli.rb` is stored, run the following command to make `app_cli.rb` executable:
```{:copy}
```text copy
chmod +x app_cli.rb
```
1. Add this line to the top of `app_cli.rb` to indicate that the Ruby interpreter should be used to run the script:
```ruby{:copy}
```ruby copy
#!/usr/bin/env ruby
```
1. Add these dependencies to the top of `app_cli.rb`, following `#!/usr/bin/env ruby`:
```ruby{:copy}
```ruby copy
require "net/http"
require "json"
require "uri"
@ -82,7 +82,7 @@ These steps lead you through building a CLI and using device flow to get a user
These are all part of the Ruby standard library, so you don't need to install any gems.
1. Add the following `main` function that will serve as an entry point. The function includes a `case` statement to take different actions depending on which command is specified. You will expand this `case` statement later.
```ruby{:copy}
```ruby copy
def main
case ARGV[0]
when "help"
@ -99,7 +99,7 @@ These steps lead you through building a CLI and using device flow to get a user
1. At the bottom of the file, add the following line to call the entry point function. This function call should remain at the bottom of your file as you add more functions to this file later in the tutorial.
```ruby{:copy}
```ruby copy
main
```
@ -107,7 +107,7 @@ These steps lead you through building a CLI and using device flow to get a user
`app_cli.rb` now looks like this:
```ruby{:copy}
```ruby copy
#!/usr/bin/env ruby
require "net/http"
@ -147,7 +147,7 @@ These steps lead you through building a CLI and using device flow to get a user
1. Add the following `help` function to `app_cli.rb`. Currently, the `help` function prints a line to tell users that this CLI takes one command, "help". You will expand this `help` function later.
```ruby{:copy}
```ruby copy
def help
puts "usage: app_cli <help>"
end
@ -155,7 +155,7 @@ These steps lead you through building a CLI and using device flow to get a user
1. Update the `main` function to call the `help` function when the `help` command is given:
```ruby{:copy}
```ruby copy
def main
case ARGV[0]
when "help"
@ -174,7 +174,7 @@ These steps lead you through building a CLI and using device flow to get a user
`app_cli.rb` now looks like this. The order of the functions doesn't matter as long as the `main` function call is at the end of the file.
```ruby{:copy}
```ruby copy
#!/usr/bin/env ruby
require "net/http"
@ -214,13 +214,13 @@ The `login` command will run the device flow to get a user access token. For mor
1. Near the top of your file, after the `require` statements, add the `CLIENT_ID` of your {% data variables.product.prodname_github_app %} as a constant in `app_cli.rb`. For more information about finding your app's client ID, see "[Get the client ID](#get-the-client-id)." Replace `YOUR_CLIENT_ID` with the client ID of your app:
```ruby{:copy}
```ruby copy
CLIENT_ID="YOUR_CLIENT_ID"
```
1. Add the following `parse_response` function to `app_cli.rb`. This function parses a response from the {% data variables.product.company_short %} REST API. When the response status is `200 OK` or `201 Created`, the function returns the parsed response body. Otherwise, the function prints the response and body and exits the program.
```ruby{:copy}
```ruby copy
def parse_response(response)
case response
when Net::HTTPOK, Net::HTTPCreated
@ -235,7 +235,7 @@ The `login` command will run the device flow to get a user access token. For mor
1. Add the following `request_device_code` function to `app_cli.rb`. This function makes a `POST` request to `{% data variables.product.oauth_host_code %}/login/device/code` and returns the response.
```ruby{:copy}
```ruby copy
def request_device_code
uri = URI("{% data variables.product.oauth_host_code %}/login/device/code")
parameters = URI.encode_www_form("client_id" => CLIENT_ID)
@ -248,7 +248,7 @@ The `login` command will run the device flow to get a user access token. For mor
1. Add the following `request_token` function to `app_cli.rb`. This function makes a `POST` request to `{% data variables.product.oauth_host_code %}/login/oauth/access_token` and returns the response.
```ruby{:copy}
```ruby copy
def request_token(device_code)
uri = URI("{% data variables.product.oauth_host_code %}/login/oauth/access_token")
parameters = URI.encode_www_form({
@ -264,7 +264,7 @@ The `login` command will run the device flow to get a user access token. For mor
1. Add the following `poll_for_token` function to `app_cli.rb`. This function polls `{% data variables.product.oauth_host_code %}/login/oauth/access_token` at the specified interval until {% data variables.product.company_short %} responds with an `access_token` parameter instead of an `error` parameter. Then, it writes the user access token to a file and restricts the permissions on the file.
```ruby{:copy}
```ruby copy
def poll_for_token(device_code, interval)
loop do
@ -316,7 +316,7 @@ The `login` command will run the device flow to get a user access token. For mor
1. Calls the `poll_for_token` function to poll {% data variables.product.company_short %} for an access token.
1. Lets the user know that authentication was successful.
```ruby{:copy}
```ruby copy
def login
verification_uri, user_code, device_code, interval = request_device_code.values_at("verification_uri", "user_code", "device_code", "interval")
@ -331,7 +331,7 @@ The `login` command will run the device flow to get a user access token. For mor
1. Update the `main` function to call the `login` function when the `login` command is given:
```ruby{:copy}
```ruby copy
def main
case ARGV[0]
when "help"
@ -348,7 +348,7 @@ The `login` command will run the device flow to get a user access token. For mor
1. Update the `help` function to include the `login` command:
```ruby{:copy}
```ruby copy
def help
puts "usage: app_cli <login | help>"
end
@ -358,7 +358,7 @@ The `login` command will run the device flow to get a user access token. For mor
`app_cli.rb` now looks something like this, where `YOUR_CLIENT_ID` is the client ID of your app. The order of the functions doesn't matter as long as the `main` function call is at the end of the file.
```ruby{:copy}
```ruby copy
#!/usr/bin/env ruby
require "net/http"
@ -489,7 +489,7 @@ Now that your app can generate a user access token, you can make API requests on
1. Add the following `whoami` function to `app_cli.rb`. This function gets information about the user with the `/user` REST API endpoint. It outputs the username that corresponds to the user access token. If the `.token` file was not found, it prompts the user to run the `login` function.
```ruby{:copy}
```ruby copy
def whoami
uri = URI("{% data variables.product.api_url_code %}/user")
@ -514,7 +514,7 @@ Now that your app can generate a user access token, you can make API requests on
1. Update the `parse_response` function to handle the case where the token has expired or been revoked. Now, if you get a `401 Unauthorized` response, the CLI will prompt the user to run the `login` command.
```ruby{:copy}
```ruby copy
def parse_response(response)
case response
when Net::HTTPOK, Net::HTTPCreated
@ -532,7 +532,7 @@ Now that your app can generate a user access token, you can make API requests on
1. Update the `main` function to call the `whoami` function when the `whoami` command is given:
```ruby{:copy}
```ruby copy
def main
case ARGV[0]
when "help"
@ -549,7 +549,7 @@ Now that your app can generate a user access token, you can make API requests on
1. Update the `help` function to include the `whoami` command:
```ruby{:copy}
```ruby copy
def help
puts "usage: app_cli <login | whoami | help>"
end
@ -561,7 +561,7 @@ Now that your app can generate a user access token, you can make API requests on
This is the full code example that was outlined in the previous section. Replace `YOUR_CLIENT_ID` with the client ID of your app.
```ruby{:copy}
```ruby copy
#!/usr/bin/env ruby
require "net/http"

View file

@ -131,7 +131,7 @@ Make sure that you are on a secure machine before performing these steps since y
1. Add `.env` to your `.gitignore` file. This will prevent you from accidentally committing your app's credentials.
1. Add the following contents to your `.env` file. {% ifversion ghes or ghae %}Replace `YOUR_HOSTNAME` with the name of {% data variables.location.product_location %}. You will update the other values in a later step.{% else %}You will update the values in a later step.{% endif %}
```{:copy}
```text copy
APP_ID="YOUR_APP_ID"
WEBHOOK_SECRET="YOUR_WEBHOOK_SECRET"
PRIVATE_KEY_PATH="YOUR_PRIVATE_KEY_PATH"{% ifversion ghes or ghae %}
@ -154,7 +154,7 @@ These steps lead you through writing code to make an API request in response to
1. At the top level of this directory, create a JavaScript file to hold the code for your app. This tutorial will name the file `app.js`.
1. To the `scripts` object in your `package.json` file, add a script called `server` that runs `node app.js`. For example:
```json{:copy}
```json copy
"scripts": {
"server": "node app.js"
}
@ -187,7 +187,7 @@ These steps lead you through writing code to make an API request in response to
1. At the top of `app.js`, add these dependencies:
```javascript{:copy}
```javascript copy
import dotenv from "dotenv";
import {App} from "octokit";
import {createNodeMiddleware} from "@octokit/webhooks";
@ -199,13 +199,13 @@ These steps lead you through writing code to make an API request in response to
1. Add the following code to `app.js`. This will read your `.env` file and add the variables from that file to the `process.env` object in Node.js.
```javascript{:copy}
```javascript copy
dotenv.config();
```
1. Add the following code to `app.js` to read the values of your environment variables.
```javascript{:copy}
```javascript copy
const appId = process.env.APP_ID;
const webhookSecret = process.env.WEBHOOK_SECRET;
const privateKeyPath = process.env.PRIVATE_KEY_PATH;{% ifversion ghes or ghae %}
@ -214,13 +214,13 @@ These steps lead you through writing code to make an API request in response to
1. Add the following code to `app.js` to read the contents of your private key file.
```javascript{:copy}
```javascript copy
const privateKey = fs.readFileSync(privateKeyPath, "utf8");
```
1. Add the following code to `app.js` to create a new instance of the Octokit App class.
```javascript{:copy}
```javascript copy
const app = new App({
appId: appId,
privateKey: privateKey,
@ -236,7 +236,7 @@ These steps lead you through writing code to make an API request in response to
1. Optionally, check your progress:
1. Add the following code to `app.js` to make an API request and log the app's name:
```javascript{:copy}
```javascript copy
try {
const {data} = await app.octokit.request("/app");
console.log(`Authenticated as '${data.name}'`);
@ -250,7 +250,7 @@ These steps lead you through writing code to make an API request in response to
1. Verify that `app.js` now looks like this:
```javascript{:copy}
```javascript copy
import dotenv from "dotenv";
import {App} from "octokit";
import {createNodeMiddleware} from "@octokit/webhooks";
@ -297,7 +297,7 @@ These steps lead you through writing code to make an API request in response to
1. Delete the following code from `app.js`, which you added for testing purposes:
```javascript{:copy}
```javascript copy
try {
const {data} = await app.octokit.request("/app");
console.log(`Authenticated as '${data.name}'`);
@ -311,7 +311,7 @@ These steps lead you through writing code to make an API request in response to
1. Add the following code to `app.js` to define the message that your app will post to pull requests.
```javascript{:copy}
```javascript copy
const messageForNewPRs = "Thanks for opening a new PR! Please follow our contributing guidelines to make your PR easier to review.";
```
@ -319,7 +319,7 @@ These steps lead you through writing code to make an API request in response to
This code adds an event handler that you will call later. When this event handler is called, it will log the event to the console. Then, it will use {% data variables.product.company_short %}'s REST API to add a comment to the pull request that triggered the event.
```javascript{:copy}
```javascript copy
async function handlePullRequestOpened({octokit, payload}) {
console.log(`Received a pull request event for #${payload.pull_request.number}`);
@ -346,13 +346,13 @@ These steps lead you through writing code to make an API request in response to
This code sets up a webhook event listener. When your app receives a webhook event from {% data variables.product.company_short %} with an `X-GitHub-Event` header value of `pull_request` and an `action` payload value of `opened`, it calls the `handlePullRequestOpened` event handler that you added in the previous step.
```javascript{:copy}
```javascript copy
app.webhooks.on("pull_request.opened", handlePullRequestOpened);
```
1. Add the following code to `app.js` to handle errors.
```javascript{:copy}
```javascript copy
app.webhooks.onError((error) => {
if (error.name === "AggregateError") {
console.error(`Error processing request: ${error.event}`);
@ -364,7 +364,7 @@ These steps lead you through writing code to make an API request in response to
1. Add the following code to `app.js` to determine where your server will listen:
```javascript{:copy}
```javascript copy
const port = 3000;
const host = 'localhost';
const path = "/api/webhook";
@ -375,7 +375,7 @@ These steps lead you through writing code to make an API request in response to
1. Add the following code to `app.js` to set up a middleware function to handle incoming webhook events:
```javascript{:copy}
```javascript copy
const middleware = createNodeMiddleware(app.webhooks, {path});
```
@ -387,7 +387,7 @@ These steps lead you through writing code to make an API request in response to
1. Add the following code to `app.js`:
```javascript{:copy}
```javascript copy
http.createServer(middleware).listen(port, () => {
console.log(`Server is listening for events at: ${localWebhookUrl}`);
console.log('Press Ctrl + C to quit.')
@ -402,7 +402,7 @@ These steps lead you through writing code to make an API request in response to
This is the full code example that was outlined in the previous section. In addition to this code, you must also create a `.env` file with your app's credentials. For more information, see "[Store your app's identifying information and credentials](#store-your-apps-identifying-information-and-credentials)."
```javascript{:copy}
```javascript copy
import dotenv from "dotenv";
import {App} from "octokit";
import {createNodeMiddleware} from "@octokit/webhooks";
@ -551,7 +551,7 @@ When you deploy your app, you will want to change the host and port where your s
For example, you can set a `PORT` environment variable on your server to indicate the port where your server should listen. You can set a `NODE_ENV` environment variable on your server to `production`. Then, you can update the place where your code defines the `port` and `host` constants so that your server listens to all available network interfaces (`0.0.0.0`) instead of the local network interface (`localhost`) on your deployment port:
```javascript{:copy}
```javascript copy
const port = process.env.PORT || 3000;
const host = process.env.NODE_ENV === 'production' ? '0.0.0.0' : 'localhost';
```
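Starting the server on the deployment host might then look like the following sketch. The port value is illustrative, and `npm run server` assumes the `server` script defined earlier in this tutorial.

```shell
# Example invocation on a deployment host
PORT=8080 NODE_ENV=production npm run server
```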

View file

@ -99,7 +99,7 @@ These steps lead you through writing code to generate a user access token. To sk
1. In the same directory as your `.env` file, create a Ruby file to hold the code that will generate a user access token. This tutorial will name the file `app.rb`.
1. At the top of `app.rb`, add these dependencies:
```ruby{:copy}
```ruby copy
require "sinatra"
require "dotenv/load"
require "net/http"
@ -110,14 +110,14 @@ These steps lead you through writing code to generate a user access token. To sk
1. Add the following code to `app.rb`, to get your app's client ID and client secret from your `.env` file.
```ruby{:copy}
```ruby copy
CLIENT_ID = ENV.fetch("CLIENT_ID")
CLIENT_SECRET = ENV.fetch("CLIENT_SECRET")
```
1. Add the following code to `app.rb` to display a link that will prompt users to authenticate your app.
```ruby{:copy}
```ruby copy
get "/" do
link = '<a href="{% data variables.product.oauth_host_code %}/login/oauth/authorize?client_id=<%= CLIENT_ID %>">Login with GitHub</a>'
erb link
@ -126,7 +126,7 @@ These steps lead you through writing code to generate a user access token. To sk
1. Add the following code to `app.rb` to handle requests to your app's callback URL and get the `code` parameter from the request. Replace `CALLBACK_URL` with the callback URL for your app, minus the domain. For example, if your callback URL is `http://localhost:4567/github/callback`, replace `CALLBACK_URL` with `/github/callback`.
```ruby{:copy}
```ruby copy
get "CALLBACK_URL" do
code = params["code"]
render = "Successfully authorized! Got code #{code}."
@ -140,7 +140,7 @@ These steps lead you through writing code to generate a user access token. To sk
`app.rb` now looks like this, where `CALLBACK_URL` is the callback URL for your app, minus the domain:
```ruby{:copy}
```ruby copy
require "sinatra"
require "dotenv/load"
require "net/http"
@ -179,7 +179,7 @@ These steps lead you through writing code to generate a user access token. To sk
- The handler for the callback URL request now calls `exchange_code` to exchange the code parameter for a user access token.
- The callback page now shows text to indicate that a token was generated. If the token generation was not successful, the page will indicate that failure.
```ruby{:copy}
```ruby copy
require "sinatra"
require "dotenv/load"
require "net/http"
@ -248,7 +248,7 @@ These steps lead you through writing code to generate a user access token. To sk
Add this function to `app.rb` to get information about the user with the `/user` REST API endpoint:
```ruby{:copy}
```ruby copy
def user_info(token)
uri = URI("{% data variables.product.api_url_code %}/user")
@ -267,7 +267,7 @@ These steps lead you through writing code to generate a user access token. To sk
Update the callback handler to call the `user_info` function and to display the user's name and GitHub login. Remember to replace `CALLBACK_URL` with the callback URL for your app, minus the domain.
```ruby{:copy}
```ruby copy
get "CALLBACK_URL" do
code = params["code"]
@ -297,7 +297,7 @@ This is the full code example that was outlined in the previous section.
Replace `CALLBACK_URL` with the callback URL for your app, minus the domain. For example, if your callback URL is `http://localhost:4567/github/callback`, replace `CALLBACK_URL` with `/github/callback`.
```ruby{:copy}
```ruby copy
require "sinatra"
require "dotenv/load"
require "net/http"

View file

@ -75,7 +75,7 @@ Your cloned repository includes `.env` in the `.gitignore` file. This will preve
1. Create a file called `.env` at the top level of this directory.
1. Add the following contents to your `.env` file. {% ifversion ghes or ghae %}Replace `YOUR_HOSTNAME` with the name of {% data variables.location.product_location %}. You will update the other values in a later step.{% else %}You will update the values in a later step.{% endif %}
```{:copy}
```text copy
APP_ID="YOUR_APP_ID"
WEBHOOK_SECRET="YOUR_WEBHOOK_SECRET"
PRIVATE_KEY_PATH="YOUR_PRIVATE_KEY_PATH"{% ifversion ghes or ghae %}

View file

@ -28,11 +28,11 @@ topics:
{% data reusables.command_line.open_the_multi_os_terminal %}
3. Generate a GPG key pair. Since there are multiple versions of GPG, you may need to consult the relevant [_man page_](https://en.wikipedia.org/wiki/Man_page) to find the appropriate key generation command.
- If you are on version 2.1.17 or greater, paste the text below to generate a GPG key pair.
```shell{:copy}
```shell copy
gpg --full-generate-key
```
- If you are not on version 2.1.17 or greater, the `gpg --full-generate-key` command doesn't work. Paste the text below and skip to step 6.
```shell{:copy}
```shell copy
gpg --default-new-key-algo rsa4096 --gen-key
```
4. At the prompt, specify the kind of key you want, or press `Enter` to accept the default.

View file

@ -101,7 +101,7 @@ Using the `pull_request` trigger, configured to scan the pull request's merge co
You might want to avoid a code scan being triggered on specific pull requests targeted against the default branch, irrespective of which files have been changed. You can configure this by specifying `on:pull_request:paths-ignore` or `on:pull_request:paths` in the {% data variables.product.prodname_code_scanning %} workflow. For example, if the only changes in a pull request are to files with the file extensions `.md` or `.txt`, you can use the following `paths-ignore` array.
``` yaml{:copy}
``` yaml copy
on:
push:
branches: [main, protected]
@ -138,7 +138,7 @@ If you use the default {% data variables.code-scanning.codeql_workflow %}, the w
The following example shows a {% data variables.code-scanning.codeql_workflow %} for a particular repository that has a default branch called `main` and one protected branch called `protected`.
``` yaml{:copy}
``` yaml copy
on:
push:
branches: [main, protected]
@ -165,7 +165,7 @@ This workflow scans:
{% endif %}
If your code requires a specific operating system to compile, you can configure the operating system in your {% data variables.code-scanning.codeql_workflow %}. Edit the value of `jobs.analyze.runs-on` to specify the operating system for the machine that runs your {% data variables.product.prodname_code_scanning %} actions. {% ifversion ghes %}You specify the operating system by using an appropriate label as the second element in a two-element array, after `self-hosted`.{% else %}
``` yaml{:copy}
``` yaml copy
jobs:
analyze:
name: Analyze
@ -174,7 +174,7 @@ jobs:
If you choose to use a self-hosted runner for code scanning, you can specify an operating system by using an appropriate label as the second element in a two-element array, after `self-hosted`.{% endif %}
``` yaml{:copy}
``` yaml copy
jobs:
analyze:
name: Analyze
@ -191,7 +191,7 @@ For recommended specifications (RAM, CPU cores, and disk) for running {% data va
In general, you do not need to worry about where the {% data variables.code-scanning.codeql_workflow %} places {% data variables.product.prodname_codeql %} databases since later steps will automatically find databases created by previous steps. However, if you are writing a custom workflow step that requires the {% data variables.product.prodname_codeql %} database to be in a specific disk location, for example to upload the database as a workflow artifact, you can specify that location using the `db-location` parameter under the `init` action.
``` yaml{:copy}
``` yaml copy
- uses: {% data reusables.actions.action-codeql-action-init %}
with:
db-location: {% raw %}'${{ github.workspace }}/codeql_dbs'{% endraw %}
@ -213,7 +213,7 @@ The default {% data variables.code-scanning.codeql_workflow %} file contains a m
If your workflow uses the `language` matrix then {% data variables.product.prodname_codeql %} is hardcoded to analyze only the languages in the matrix. To change the languages you want to analyze, edit the value of the matrix variable. You can remove a language to prevent it being analyzed or you can add a language that was not present in the repository when {% data variables.product.prodname_code_scanning %} was configured. For example, if the repository initially only contained JavaScript when {% data variables.product.prodname_code_scanning %} was configured, and you later added Python code, you will need to add `python` to the matrix.
```yaml{:copy}
```yaml copy
jobs:
analyze:
name: Analyze
@ -226,7 +226,7 @@ jobs:
If your workflow does not contain a matrix called `language`, then {% data variables.product.prodname_codeql %} is configured to run analysis sequentially. If you don't specify languages in the workflow, {% data variables.product.prodname_codeql %} automatically detects, and attempts to analyze, any supported languages in the repository. If you want to choose which languages to analyze, without using a matrix, you can use the `languages` parameter under the `init` action.
```yaml{:copy}
```yaml copy
- uses: {% data reusables.actions.action-codeql-action-init %}
with:
languages: cpp, csharp, python
@ -242,7 +242,7 @@ For GitHub-hosted runners that use Linux only, the {% data variables.code-scanni
Alternatively, you can install Python dependencies manually on any operating system. You will need to add `setup-python-dependencies` and set it to `false`, as well as set `CODEQL_PYTHON` to the Python executable that includes the dependencies, as shown in this workflow extract:
```yaml{:copy}
```yaml copy
jobs:
CodeQL-Build:
runs-on: ubuntu-latest
@ -292,7 +292,7 @@ Use `category` to distinguish between multiple analyses for the same tool and co
This parameter is particularly useful if you work with monorepos and have multiple SARIF files for different components of the monorepo.
``` yaml{:copy}
``` yaml copy
- name: Perform CodeQL Analysis
uses: {% data reusables.actions.action-codeql-action-analyze %}
with:
@ -333,7 +333,7 @@ In the example below, `scope` is the organization or personal account that publi
- The latest version of `pack3` that is compatible with version 3.2.1 is downloaded and all queries are run.
- Version 4.5.6 of `pack4` is downloaded and only the queries found in `path/to/queries` are run.
``` yaml{:copy}
``` yaml copy
- uses: {% data reusables.actions.action-codeql-action-init %}
with:
# Comma-separated list of packs to download
@ -359,7 +359,7 @@ For more information about pack compatibility, see "[AUTOTITLE](/code-security/c
If your workflow uses packs that are published on a {% data variables.product.prodname_ghe_server %} installation, you need to tell your workflow where to find them. You can do this by using the `registries` input of the {% data reusables.actions.action-codeql-action-init %} action. This input accepts a list of `url`, `packages`, and `token` properties as shown below.
```yaml{:copy}
```yaml copy
- uses: {% data reusables.actions.action-codeql-action-init %}
with:
registries: {% raw %}|
@ -393,7 +393,7 @@ To add one or more queries, add a `with: queries:` entry within the `uses: {% da
You can also specify query suites in the value of `queries`. Query suites are collections of queries, usually grouped by purpose or language.
``` yaml{:copy}
``` yaml copy
- uses: {% data reusables.actions.action-codeql-action-init %}
with:
# Comma-separated list of queries / packs / suites to run.
@ -414,7 +414,7 @@ If you also use a configuration file for custom settings, any additional {% ifve
In the following example, the `+` symbol ensures that the specified additional {% ifversion codeql-packs %}packs and {% endif %}queries are used together with any specified in the referenced configuration file.
``` yaml{:copy}
``` yaml copy
- uses: {% data reusables.actions.action-codeql-action-init %}
with:
config-file: ./.github/codeql/codeql-config.yml
@ -435,7 +435,7 @@ A custom configuration file is an alternative way to specify additional {% ifver
In the workflow file, use the `config-file` parameter of the `init` action to specify the path to the configuration file you want to use. This example loads the configuration file _./.github/codeql/codeql-config.yml_.
``` yaml{:copy}
``` yaml copy
- uses: {% data reusables.actions.action-codeql-action-init %}
with:
config-file: ./.github/codeql/codeql-config.yml
@ -445,7 +445,7 @@ In the workflow file, use the `config-file` parameter of the `init` action to sp
If the configuration file is located in an external private repository, use the `external-repository-token` parameter of the `init` action to specify a token that has access to the private repository.
```yaml{:copy}
```yaml copy
- uses: {% data reusables.actions.action-codeql-action-init %}
with:
external-repository-token: {% raw %}${{ secrets.ACCESS_TOKEN }}{% endraw %}
@ -461,7 +461,7 @@ The settings in the configuration file are written in YAML format.
You specify {% data variables.product.prodname_codeql %} query packs in an array. Note that the format is different from the format used by the workflow file.
{% raw %}
``` yaml{:copy}
``` yaml copy
packs:
# Use the latest version of 'pack1' published by 'scope'
- scope/pack1
@ -483,7 +483,7 @@ The full format for specifying a query pack is `scope/name[@version][:path]`. Bo
If you have a workflow that generates more than one {% data variables.product.prodname_codeql %} database, you can specify any {% data variables.product.prodname_codeql %} query packs to run in a custom configuration file using a nested map of packs.
{% raw %}
``` yaml{:copy}
``` yaml copy
packs:
# Use these packs for JavaScript and TypeScript analysis
javascript:
@ -501,7 +501,7 @@ packs:
You specify additional queries in a `queries` array. Each element of the array contains a `uses` parameter with a value that identifies a single query file, a directory containing query files, or a query suite definition file.
``` yaml{:copy}
``` yaml copy
queries:
- uses: ./my-basic-queries/example-query.ql
- uses: ./my-advanced-queries
@ -526,7 +526,7 @@ This is useful if you want to exclude, for example:
You can use `exclude` filters similar to those in the configuration file below to exclude queries that you want to remove from the default analysis. In the example configuration file below, both the `js/redundant-assignment` and the `js/useless-assignment-to-local` queries are excluded from analysis.
```yaml{:copy}
```yaml copy
query-filters:
- exclude:
id: js/redundant-assignment
@ -553,7 +553,7 @@ For more information about using `exclude` and `include` filters in your custom
For the interpreted languages that {% data variables.product.prodname_codeql %} supports (Python, Ruby, and JavaScript/TypeScript), you can restrict {% data variables.product.prodname_code_scanning %} to files in specific directories by adding a `paths` array to the configuration file. You can exclude the files in specific directories from analysis by adding a `paths-ignore` array.
``` yaml{:copy}
``` yaml copy
paths:
- src
paths-ignore:

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql bqrs decode [--output=<file>] [--result-set=<name>] [--sort-key=<col>[,<col>...]] <options>... -- <file>
```
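For example, a hedged usage sketch based on the synopsis above; the file names are placeholders.

```shell
# Decode the results in a BQRS file and write them to a text file
codeql bqrs decode --output=results.txt -- results.bqrs
```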

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql bqrs diff <options>... -- <file1> <file2>
```

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql bqrs hash <options>... -- <file>
```

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql bqrs info <options>... -- <file>
```

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql bqrs interpret --format=<format> --output=<output> -t=<String=String> [--threads=<num>] [--source-archive=<sourceArchive>] [--source-location-prefix=<sourceLocationPrefix>] <options>... -- <bqrs-file>
```

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database add-diagnostic --source-id=<id> --source-name=<name> <options>... -- <database>
```

View file

@ -26,7 +26,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database analyze --format=<format> --output=<output> [--threads=<num>] [--ram=<MB>] <options>... -- <database> <query|dir|suite|pack>...
```
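For example, a hedged usage sketch based on the synopsis above; the database path and query pack name are placeholders, and `sarif-latest` is assumed to be an accepted `--format` value.

```shell
# Run a query pack against an existing CodeQL database and write SARIF output
codeql database analyze --format=sarif-latest --output=results.sarif -- my-codeql-db codeql/javascript-queries
```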

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database bundle --output=<output> [--mode=<mode>] <options>... -- <database>
```

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database cleanup [--mode=<mode>] <options>... -- <database>
```

View file

@ -26,7 +26,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database create [--language=<lang>[,<lang>...]] [--github-auth-stdin] [--github-url=<url>] [--source-root=<dir>] [--threads=<num>] [--ram=<MB>] [--command=<command>] [--mode=<mode>] [--extractor-option=<extractor-option-name=value>] <options>... -- <database>
```
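For example, a hedged usage sketch based on the synopsis above; the database name, language, and source root are placeholders.

```shell
# Create a CodeQL database for a JavaScript project rooted in the current directory
codeql database create --language=javascript --source-root=. -- my-codeql-db
```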

View file

@ -26,7 +26,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database export-diagnostics --format=<format> [--output=<output>] <options>... -- <database>
```

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database finalize [--dbscheme=<file>] [--threads=<num>] [--ram=<MB>] [--mode=<mode>] <options>... -- <database>
```

View file

@ -26,7 +26,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database import [--dbscheme=<file>] [--threads=<num>] [--ram=<MB>] <options>... -- <database> <additionalDbs>...
```

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database index-files --language=<lang> [--threads=<num>] [--ram=<MB>] [--extractor-option=<extractor-option-name=value>] <options>... -- <database>
```

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database init --source-root=<dir> [--language=<lang>[,<lang>...]] [--github-auth-stdin] [--github-url=<url>] [--extractor-option=<extractor-option-name=value>] <options>... -- <database>
```

View file

@ -26,7 +26,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database interpret-results --format=<format> --output=<output> [--threads=<num>] <options>... -- <database> <file|dir|suite>...
```

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database print-baseline <options>... -- <database>
```

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database run-queries [--threads=<num>] [--ram=<MB>] <options>... -- <database> <query|dir|suite|pack>...
```

View file

@ -24,7 +24,7 @@ redirect_from:
## Synopsis
```shell{:copy}
```shell copy
codeql database trace-command [--threads=<num>] [--ram=<MB>] [--extractor-option=<extractor-option-name=value>] <options>... -- <database> <command>...
```

Some files were not shown because too many files have changed in this diff