Merge branch 'main' of github.com:github/docs-internal into make-developer-redirects-static

This commit is contained in:
Sarah Schneider 2021-01-20 15:08:35 -05:00
Parent 0d7e04f399 aee3e7aced
Commit 8198dd4ce7
79 changed files: 9111 additions and 25625 deletions


@ -21,8 +21,10 @@ jobs:
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
-      - name: npm ci
+      - name: Install dependencies
         run: npm ci
+      - name: Run build scripts
+        run: npm run build
- name: (Dry run) sync indices
env:
ALGOLIA_APPLICATION_ID: ${{ secrets.ALGOLIA_APPLICATION_ID }}
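For reference, the changed steps in this hunk amount to the following sequence (a sketch assuming the usual workflow indentation; the cache step above and the rest of the sync step's configuration are elided):

```yaml
steps:
  # ...cache restore step shown above...
  - name: Install dependencies
    run: npm ci
  - name: Run build scripts
    run: npm run build
  - name: (Dry run) sync indices
    env:
      ALGOLIA_APPLICATION_ID: ${{ secrets.ALGOLIA_APPLICATION_ID }}
```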


@ -24,8 +24,10 @@ jobs:
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
-      - name: npm ci
+      - name: Install dependencies
         run: npm ci
+      - name: Run build scripts
+        run: npm run build
- name: sync indices
env:
ALGOLIA_APPLICATION_ID: ${{ secrets.ALGOLIA_APPLICATION_ID }}

.gitignore (vendored)

@ -1,4 +1,5 @@
.algolia-cache
.search-cache
.DS_Store
.env
/node_modules/


@ -39,7 +39,7 @@ Before you begin, you'll need to create a GitHub repository.
1. From your terminal, change directories into your new repository.
-```shell
+```shell{:copy}
cd hello-world-docker-action
```
@ -48,7 +48,7 @@ Before you begin, you'll need to create a GitHub repository.
In your new `hello-world-docker-action` directory, create a new `Dockerfile` file. For more information, see "[Dockerfile support for {% data variables.product.prodname_actions %}](/actions/creating-actions/dockerfile-support-for-github-actions)."
**Dockerfile**
-```dockerfile
+```dockerfile{:copy}
# Container image that runs your code
FROM alpine:3.10
@ -65,7 +65,7 @@ Create a new `action.yml` file in the `hello-world-docker-action` directory you
{% raw %}
**action.yml**
-```yaml
+```yaml{:copy}
# action.yml
name: 'Hello World'
description: 'Greet someone and record the time'
@ -93,29 +93,28 @@ This metadata defines one `who-to-greet` input and one `time` output parameter.
You can choose any base Docker image and, therefore, any language for your action. The following shell script example uses the `who-to-greet` input variable to print "Hello [who-to-greet]" in the log file.
Next, the script gets the current time and sets it as an output variable that actions running later in a job can use. In order for {% data variables.product.prodname_dotcom %} to recognize output variables, you must use a workflow command in a specific syntax: `echo "::set-output name=<output name>::<value>"`. For more information, see "[Workflow commands for {% data variables.product.prodname_actions %}](/actions/reference/workflow-commands-for-github-actions#setting-an-output-parameter)."
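For context, the workflow command described here is plain `echo` output in a fixed format; a minimal standalone sketch (the `time` name mirrors the `entrypoint.sh` example, but any output name works):

```shell
# Illustration only: the workflow command that tells the Actions runner
# to record a step output. Outside of a workflow run it simply prints to stdout.
output_name="time"
value="$(date)"
echo "::set-output name=${output_name}::${value}"
```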
1. Create a new `entrypoint.sh` file in the `hello-world-docker-action` directory.
-1. Make your `entrypoint.sh` file executable:
-   ```shell
-   chmod +x entrypoint.sh
-   ```
1. Add the following code to your `entrypoint.sh` file.
**entrypoint.sh**
-```shell
+```shell{:copy}
#!/bin/sh -l
echo "Hello $1"
time=$(date)
echo "::set-output name=time::$time"
```
If `entrypoint.sh` executes without any errors, the action's status is set to `success`. You can also explicitly set exit codes in your action's code to provide an action's status. For more information, see "[Setting exit codes for actions](/actions/creating-actions/setting-exit-codes-for-actions)."
+1. Make your `entrypoint.sh` file executable by running the following command on your system.
+   ```shell{:copy}
+   $ chmod +x entrypoint.sh
+   ```
### Creating a README
To let people know how to use your action, you can create a README file. A README is most helpful when you plan to share your action publicly, but is also a great way to remind you or your team how to use the action.
@ -130,7 +129,7 @@ In your `hello-world-docker-action` directory, create a `README.md` file that sp
- An example of how to use your action in a workflow.
**README.md**
-```markdown
+```markdown{:copy}
# Hello world docker action
This action prints "Hello World" or "Hello" + the name of a person to greet to the log.
@ -160,7 +159,7 @@ From your terminal, commit your `action.yml`, `entrypoint.sh`, `Dockerfile`, and
It's best practice to also add a version tag for releases of your action. For more information on versioning your action, see "[About actions](/actions/automating-your-workflow-with-github-actions/about-actions#using-release-management-for-actions)."
-```shell
+```shell{:copy}
git add action.yml entrypoint.sh Dockerfile README.md
git commit -m "My first action is ready"
git tag -a -m "My first action release" v1
@ -175,11 +174,11 @@ Now you're ready to test your action out in a workflow. When an action is in a p
#### Example using a public action
-The following workflow code uses the completed hello world action in the public [`actions/hello-world-docker-action`](https://github.com/actions/hello-world-docker-action) repository. Copy the following workflow example code into a `.github/workflows/main.yml` file, but replace the `actions/hello-world-docker-action` with your repository and action name. You can also replace the `who-to-greet` input with your name.
+The following workflow code uses the completed _hello world_ action in the public [`actions/hello-world-docker-action`](https://github.com/actions/hello-world-docker-action) repository. Copy the following workflow example code into a `.github/workflows/main.yml` file, but replace the `actions/hello-world-docker-action` with your repository and action name. You can also replace the `who-to-greet` input with your name. {% if currentVersion == "free-pro-team@latest" %}Public actions can be used even if they're not published to {% data variables.product.prodname_marketplace %}. For more information, see "[Publishing an action](/actions/creating-actions/publishing-actions-in-github-marketplace#publishing-an-action)." {% endif %}
{% raw %}
**.github/workflows/main.yml**
-```yaml
+```yaml{:copy}
on: [push]
jobs:
@ -200,11 +199,11 @@ jobs:
#### Example using a private action
-Copy the following example workflow code into a `.github/workflows/main.yml` file in your action's repository. You can also replace the `who-to-greet` input with your name.
+Copy the following example workflow code into a `.github/workflows/main.yml` file in your action's repository. You can also replace the `who-to-greet` input with your name. {% if currentVersion == "free-pro-team@latest" %}This private action can't be published to {% data variables.product.prodname_marketplace %}, and can only be used in this repository.{% endif %}
{% raw %}
**.github/workflows/main.yml**
-```yaml
+```yaml{:copy}
on: [push]
jobs:


@ -47,7 +47,7 @@ You can use the `services` keyword to create service containers that are part of
This example creates a service called `redis` in a job called `container-job`. The Docker host in this example is the `node:10.18-jessie` container.
{% raw %}
-```yaml
+```yaml{:copy}
name: Redis container example
on: push
@ -89,7 +89,7 @@ When you specify the Docker host port but not the container port, the container
This example maps the service container `redis` port 6379 to the Docker host port 6379.
{% raw %}
-```yaml
+```yaml{:copy}
name: Redis Service Example
on: push


@ -38,7 +38,7 @@ To get started quickly, you can choose the preconfigured Ant template when you c
You can also add this workflow manually by creating a new file in the `.github/workflows` directory of your repository.
{% raw %}
-```yaml
+```yaml{:copy}
name: Java CI
on: [push]
@ -79,7 +79,7 @@ The starter workflow will run the default target specified in your _build.xml_ f
If you use different commands to build your project, or you want to run a different target, you can specify those. For example, you may want to run the `jar` target that's configured in your _build-ci.xml_ file.
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-java@v1
@ -97,7 +97,7 @@ After your build has succeeded and your tests have passed, you may want to uploa
Ant will usually create output files like JARs, EARs, or WARs in the `build/jar` directory. You can upload the contents of that directory using the `upload-artifact` action.
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-java@v1


@ -38,7 +38,7 @@ To get started quickly, you can choose the preconfigured Gradle template when yo
You can also add this workflow manually by creating a new file in the `.github/workflows` directory of your repository.
{% raw %}
-```yaml
+```yaml{:copy}
name: Java CI
on: [push]
@ -79,7 +79,7 @@ The starter workflow will run the `build` task by default. In the default Gradle
If you use different commands to build your project, or you want to use a different task, you can specify those. For example, you may want to run the `package` task that's configured in your _ci.gradle_ file.
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-java@v1
@ -95,7 +95,7 @@ steps:
When using {% data variables.product.prodname_dotcom %}-hosted runners, you can cache your dependencies to speed up your workflow runs. After a successful run, your local Gradle package cache will be stored on GitHub Actions infrastructure. In future workflow runs, the cache will be restored so that dependencies don't need to be downloaded from remote package repositories. For more information, see "<a href="/actions/guides/caching-dependencies-to-speed-up-workflows" class="dotcom-only">Caching dependencies to speed up workflows</a>" and the [`cache` action](https://github.com/marketplace/actions/cache).
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Set up JDK 1.8
@ -122,7 +122,7 @@ After your build has succeeded and your tests have passed, you may want to uploa
Gradle will usually create output files like JARs, EARs, or WARs in the `build/libs` directory. You can upload the contents of that directory using the `upload-artifact` action.
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-java@v1


@ -38,7 +38,7 @@ To get started quickly, you can choose the preconfigured Maven template when you
You can also add this workflow manually by creating a new file in the `.github/workflows` directory of your repository.
{% raw %}
-```yaml
+```yaml{:copy}
name: Java CI
on: [push]
@ -79,7 +79,7 @@ The starter workflow will run the `package` target by default. In the default Ma
If you use different commands to build your project, or you want to use a different target, you can specify those. For example, you may want to run the `verify` target that's configured in a _pom-ci.xml_ file.
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-java@v1
@ -95,7 +95,7 @@ steps:
When using {% data variables.product.prodname_dotcom %}-hosted runners, you can cache your dependencies to speed up your workflow runs. After a successful run, your local Maven repository will be stored on GitHub Actions infrastructure. In future workflow runs, the cache will be restored so that dependencies don't need to be downloaded from remote Maven repositories. For more information, see "<a href="/actions/guides/caching-dependencies-to-speed-up-workflows" class="dotcom-only">Caching dependencies to speed up workflows</a>" and the [`cache` action](https://github.com/marketplace/actions/cache).
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Set up JDK 1.8
@ -122,7 +122,7 @@ After your build has succeeded and your tests have passed, you may want to uploa
Maven will usually create output files like JARs, EARs, or WARs in the `target` directory. To upload those as artifacts, you can copy them into a new directory that contains artifacts to upload. For example, you can create a directory called `staging`. Then you can upload the contents of that directory using the `upload-artifact` action.
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-java@v1


@ -77,7 +77,7 @@ The template includes a matrix strategy that builds and tests your code with fou
Each job can access the value defined in the matrix `node-version` array using the `matrix` context. The `setup-node` action uses the context as the `node-version` input. The `setup-node` action configures each job with a different Node.js version before building and testing code. For more information about matrix strategies and contexts, see "[Workflow syntax for {% data variables.product.prodname_actions %}](/actions/automating-your-workflow-with-github-actions/workflow-syntax-for-github-actions#jobsjob_idstrategymatrix)" and "[Context and expression syntax for {% data variables.product.prodname_actions %}](/actions/reference/context-and-expression-syntax-for-github-actions)."
{% raw %}
-```yaml
+```yaml{:copy}
strategy:
matrix:
node-version: [10.x, 12.x, 14.x, 15.x]
@ -93,7 +93,7 @@ steps:
Alternatively, you can build and test with exact Node.js versions.
-```yaml
+```yaml{:copy}
strategy:
matrix:
node-version: [8.16.2, 10.17.0]
@ -102,7 +102,7 @@ strategy:
Or, you can build and test using a single version of Node.js too.
{% raw %}
-```yaml
+```yaml{:copy}
name: Node.js CI
on: [push]
@ -136,7 +136,7 @@ When using {% data variables.product.prodname_dotcom %}-hosted runners, you can
This example installs the dependencies defined in the *package.json* file. For more information, see [`npm install`](https://docs.npmjs.com/cli/install).
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Use Node.js
@ -150,7 +150,7 @@ steps:
Using `npm ci` installs the versions in the *package-lock.json* or *npm-shrinkwrap.json* file and prevents updates to the lock file. Using `npm ci` is generally faster than running `npm install`. For more information, see [`npm ci`](https://docs.npmjs.com/cli/ci.html) and "[Introducing `npm ci` for faster, more reliable builds](https://blog.npmjs.org/post/171556855892/introducing-npm-ci-for-faster-more-reliable)."
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Use Node.js
@ -166,7 +166,7 @@ steps:
This example installs the dependencies defined in the *package.json* file. For more information, see [`yarn install`](https://yarnpkg.com/en/docs/cli/install).
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Use Node.js
@ -179,7 +179,7 @@ steps:
Alternatively, you can pass `--frozen-lockfile` to install the versions in the *yarn.lock* file and prevent updates to the *yarn.lock* file.
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Use Node.js
@ -201,7 +201,7 @@ In the example below, the secret `NPM_TOKEN` stores the npm authentication token
Before installing dependencies, use the `setup-node` action to create the *.npmrc* file. The action has two input parameters. The `node-version` parameter sets the Node.js version, and the `registry-url` parameter sets the default registry. If your package registry uses scopes, you must use the `scope` parameter. For more information, see [`npm-scope`](https://docs.npmjs.com/misc/scope).
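As an illustration of those `setup-node` inputs (the Node.js version, registry URL, and scope value below are placeholders, not taken from this diff):

```yaml
- name: Use Node.js
  uses: actions/setup-node@v1
  with:
    node-version: '12.x'
    # Default registry, written into the generated .npmrc
    registry-url: 'https://registry.npmjs.org'
    # Only required when your package registry uses scopes
    scope: '@octocat'
```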
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Use Node.js
@ -231,7 +231,7 @@ always-auth=true
When using {% data variables.product.prodname_dotcom %}-hosted runners, you can cache dependencies using a unique key, and restore the dependencies when you run future workflows using the `cache` action. For more information, see "<a href="/actions/guides/caching-dependencies-to-speed-up-workflows" class="dotcom-only">Caching dependencies to speed up workflows</a>" and the [`cache` action](https://github.com/marketplace/actions/cache).
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Use Node.js
@ -256,7 +256,7 @@ steps:
You can use the same commands that you use locally to build and test your code. For example, if you run `npm run build` to run build steps defined in your *package.json* file and `npm test` to run your test suite, you would add those commands in your workflow file.
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Use Node.js


@ -37,7 +37,7 @@ We recommend that you have a basic understanding of Python, PyPy, and pip. For m
To get started quickly, add the template to the `.github/workflows` directory of your repository.
{% raw %}
-```yaml
+```yaml{:copy}
name: Python package
on: [push]
@ -94,7 +94,7 @@ If you are using a self-hosted runner, you can configure the runner to use the `
#### Using multiple Python versions
{% raw %}
-```yaml
+```yaml{:copy}
name: Python package
on: [push]
@ -126,7 +126,7 @@ jobs:
You can configure a specific version of Python. For example, 3.8. Alternatively, you can use semantic version syntax to get the latest minor release. This example uses the latest minor release of Python 3.
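A sketch of both styles in one `setup-python` step (the exact versions are placeholders):

```yaml
- uses: actions/setup-python@v2
  with:
    # Exact version:
    # python-version: 3.8
    # Semantic version syntax (latest minor release of Python 3):
    python-version: '3.x'
```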
{% raw %}
-```yaml
+```yaml{:copy}
name: Python package
on: [push]
@ -158,7 +158,7 @@ If you specify a version of Python that is not available, `setup-python` fails w
You can also use the `exclude` keyword in your workflow if there is a configuration of Python that you do not wish to run. For more information, see "[Workflow syntax for {% data variables.product.prodname_actions %}](/actions/automating-your-workflow-with-github-actions/workflow-syntax-for-github-actions#jobsjob_idstrategy)."
{% raw %}
-```yaml
+```yaml{:copy}
name: Python package
on: [push]
@ -196,7 +196,7 @@ We recommend using `setup-python` to configure the version of Python used in you
When using {% data variables.product.prodname_dotcom %}-hosted runners, you can also cache dependencies to speed up your workflow. For more information, see "<a href="/actions/guides/caching-dependencies-to-speed-up-workflows" class="dotcom-only">Caching dependencies to speed up workflows</a>."
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Set up Python
@ -213,7 +213,7 @@ steps:
After you update `pip`, a typical next step is to install dependencies from *requirements.txt*.
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Set up Python
@ -234,7 +234,7 @@ When using {% data variables.product.prodname_dotcom %}-hosted runners, you can
Pip caches dependencies in different locations, depending on the operating system of the runner. The path you'll need to cache may differ from the Ubuntu example below depending on the operating system you use. For more information, see [Python caching examples](https://github.com/actions/cache/blob/main/examples.md#python---pip).
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Setup Python
@ -271,7 +271,7 @@ You can use the same commands that you use locally to build and test your code.
This example installs or upgrades `pytest` and `pytest-cov`. Tests are then run and output in JUnit format while code coverage results are output in Cobertura. For more information, see [JUnit](https://junit.org/junit5/) and [Cobertura](https://cobertura.github.io/cobertura/).
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Set up Python
@ -295,7 +295,7 @@ steps:
The following example installs or upgrades `flake8` and uses it to lint all files. For more information, see [Flake8](http://flake8.pycqa.org/en/latest/).
{% raw %}
-```yaml
+```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Set up Python
@ -318,7 +318,7 @@ steps:
With {% data variables.product.prodname_actions %}, you can run tests with tox and spread the work across multiple jobs. You'll need to invoke tox using the `-e py` option to choose the version of Python in your `PATH`, rather than specifying a specific version. For more information, see [tox](https://tox.readthedocs.io/en/latest/).
{% raw %}
-```yaml
+```yaml{:copy}
name: Python package
on: [push]
@ -352,7 +352,7 @@ You can upload artifacts to view after a workflow completes. For example, you ma
The following example demonstrates how you can use the `upload-artifact` action to archive test results from running `pytest`. For more information, see the [`upload-artifact` action](https://github.com/actions/upload-artifact).
{% raw %}
-```yaml
+```yaml{:copy}
name: Python package
on: [push]
@ -395,7 +395,7 @@ You can configure your workflow to publish your Python package to any package re
You can store any access tokens or credentials needed to publish your package using secrets. The following example creates and publishes a package to PyPI using `twine` and `dist`. For more information, see "[Creating and using encrypted secrets](/github/automating-your-workflow-with-github-actions/creating-and-using-encrypted-secrets)."
{% raw %}
-```yaml
+```yaml{:copy}
name: Upload Python Package
on:


@ -68,7 +68,7 @@ For more information, see [`actions/cache`](https://github.com/actions/cache).
This example creates a new cache when the packages in `package-lock.json` file change, or when the runner's operating system changes. The cache key uses contexts and expressions to generate a key that includes the runner's operating system and a SHA-256 hash of the `package-lock.json` file.
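The key described here follows the same pattern as the workflow hunks earlier in this diff; a sketch (the `path` assumes npm's default cache location on a Linux runner):

```yaml
- uses: actions/cache@v2
  with:
    path: ~/.npm
    # A new cache is created when package-lock.json or the runner OS changes
    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
    restore-keys: |
      ${{ runner.os }}-node-
```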
{% raw %}
-```yaml
+```yaml{:copy}
name: Caching with npm
on: push


@ -37,7 +37,7 @@ You may also find it helpful to have a basic understanding of YAML, the syntax f
{% data reusables.github-actions.copy-workflow-file %}
{% raw %}
-```yaml
+```yaml{:copy}
name: PostgreSQL service example
on: push
@ -94,7 +94,7 @@ jobs:
{% data reusables.github-actions.postgres-label-description %}
-```yaml
+```yaml{:copy}
jobs:
# Label of the container job
container-job:
@ -124,7 +124,7 @@ jobs:
{% data reusables.github-actions.service-template-steps %}
-```yaml
+```yaml{:copy}
steps:
# Downloads a copy of the code in your repository before running CI tests
- name: Check out repository code
@ -159,7 +159,7 @@ When you run a job directly on the runner machine, you'll need to map the ports
{% data reusables.github-actions.copy-workflow-file %}
{% raw %}
-```yaml
+```yaml{:copy}
name: PostgreSQL Service Example
on: push
@ -220,7 +220,7 @@ jobs:
The workflow maps port 5432 on the PostgreSQL service container to the Docker host. For more information about the `ports` keyword, see "[About service containers](/actions/automating-your-workflow-with-github-actions/about-service-containers#mapping-docker-host-and-service-container-ports)."
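The mapping described here is a one-line `ports` entry on the service container; a minimal sketch (image tag and health-check options elided):

```yaml
services:
  postgres:
    image: postgres
    ports:
      # Maps TCP port 5432 on the service container to the Docker host
      - 5432:5432
```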
-```yaml
+```yaml{:copy}
jobs:
# Label of the runner job
runner-job:
@ -251,7 +251,7 @@ jobs:
{% data reusables.github-actions.service-template-steps %}
-```yaml
+```yaml{:copy}
steps:
# Downloads a copy of the code in your repository before running CI tests
- name: Check out repository code
@ -287,7 +287,7 @@ You can modify *client.js* to include any PostgreSQL operations needed by your w
{% data reusables.github-actions.service-container-add-script %}
-```javascript
+```javascript{:copy}
const { Client } = require('pg');
const pgclient = new Client({


@ -37,7 +37,7 @@ You may also find it helpful to have a basic understanding of YAML, the syntax f
{% data reusables.github-actions.copy-workflow-file %}
{% raw %}
-```yaml
+```yaml{:copy}
name: Redis container example
on: push
@ -91,7 +91,7 @@ jobs:
{% data reusables.github-actions.redis-label-description %}
-```yaml
+```yaml{:copy}
jobs:
# Label of the container job
container-job:
@ -118,7 +118,7 @@ jobs:
{% data reusables.github-actions.service-template-steps %}
-```yaml
+```yaml{:copy}
steps:
# Downloads a copy of the code in your repository before running CI tests
- name: Check out repository code
@ -152,7 +152,7 @@ When you run a job directly on the runner machine, you'll need to map the ports
{% data reusables.github-actions.copy-workflow-file %}
{% raw %}
-```yaml
+```yaml{:copy}
name: Redis runner example
on: push
@ -210,7 +210,7 @@ jobs:
The workflow maps port 6379 on the Redis service container to the Docker host. For more information about the `ports` keyword, see "[About service containers](/actions/automating-your-workflow-with-github-actions/about-service-containers#mapping-docker-host-and-service-container-ports)."
-```yaml
+```yaml{:copy}
jobs:
# Label of the runner job
runner-job:
@ -238,7 +238,7 @@ jobs:
{% data reusables.github-actions.service-template-steps %}
-```yaml
+```yaml{:copy}
steps:
# Downloads a copy of the code in your repository before running CI tests
- name: Check out repository code
@ -274,7 +274,7 @@ You can modify *client.js* to include any Redis operations needed by your workfl
{% data reusables.github-actions.service-container-add-script %}
-```javascript
+```javascript{:copy}
const redis = require("redis");
// Creates a new Redis client


@ -55,7 +55,7 @@ The `build-push-action` options required for Docker Hub are:
* `repository`: Your Docker Hub repository in the format `DOCKER-HUB-NAMESPACE/DOCKER-HUB-REPOSITORY`.
{% raw %}
-```yaml
+```yaml{:copy}
name: Publish Docker image
on:
release:
@ -93,7 +93,7 @@ The `build-push-action` options required for {% data variables.product.prodname_
* `repository`: Must be set in the format `OWNER/REPOSITORY/IMAGE_NAME`. For example, for an image named `octo-image` stored on {% data variables.product.prodname_dotcom %} at `http://github.com/octo-org/octo-repo`, the `repository` option should be set to `octo-org/octo-repo/octo-image`.
{% raw %}
-```yaml
+```yaml{:copy}
name: Publish Docker image
on:
release:
@ -126,7 +126,7 @@ In a single workflow, you can publish your Docker image to multiple registries b
The following example workflow uses the `build-push-action` steps from the previous sections ("[Publishing images to Docker Hub](#publishing-images-to-docker-hub)" and "[Publishing images to {% data variables.product.prodname_registry %}](#publishing-images-to-github-packages)") to create a single workflow that pushes to both registries.
{% raw %}
-```yaml
+```yaml{:copy}
name: Publish Docker image
on:
release:


@ -43,7 +43,7 @@ Each time you create a new release, you can trigger a workflow to publish your p
You can define a new Maven repository in the publishing block of your _build.gradle_ file that points to your package repository. For example, if you were deploying to the Maven Central Repository through the OSSRH hosting project, your _build.gradle_ could specify a repository with the name `"OSSRH"`.
{% raw %}
-```groovy
+```groovy{:copy}
publishing {
...
@ -67,7 +67,7 @@ In the deploy step, you'll need to set environment variables for the username
{% raw %}
-```yaml
+```yaml{:copy}
name: Publish package to the Maven Central Repository
on:
release:
@ -105,7 +105,7 @@ The `GITHUB_TOKEN` exists in your repository by default and has read and write p
For example, if your organization is named "octocat" and your repository is named "hello-world", then the {% data variables.product.prodname_registry %} configuration in _build.gradle_ would look similar to the below example.
{% raw %}
-```groovy
+```groovy{:copy}
publishing {
...
@ -126,7 +126,7 @@ publishing {
With this configuration, you can create a workflow that publishes your package to the Maven Central Repository by running the `gradle publish` command.
{% raw %}
-```yaml
+```yaml{:copy}
name: Publish package to GitHub Packages
on:
release:
@ -162,7 +162,7 @@ For example, if you deploy to the Central Repository through the OSSRH hosting p
If your organization is named "octocat" and your repository is named "hello-world", then the {% data variables.product.prodname_registry %} configuration in _build.gradle_ would look similar to the below example.
{% raw %}
-```groovy
+```groovy{:copy}
publishing {
...
@ -191,7 +191,7 @@ publishing {
With this configuration, you can create a workflow that publishes your package to both the Maven Central Repository and {% data variables.product.prodname_registry %} by running the `gradle publish` command.
{% raw %}
-```yaml
+```yaml{:copy}
name: Publish package to the Maven Central Repository and GitHub Packages
on:
release:


@ -47,7 +47,7 @@ In this workflow, you can use the `setup-java` action. This action installs the
For example, if you were deploying to the Maven Central Repository through the OSSRH hosting project, your _pom.xml_ could specify a distribution management repository with the `id` of `ossrh`.
{% raw %}
-```xml
+```xml{:copy}
<project ...>
...
<distributionManagement>
@ -67,7 +67,7 @@ In the deploy step, you'll need to set the environment variables to the userna
{% raw %}
-```yaml
+```yaml{:copy}
name: Publish package to the Maven Central Repository
on:
release:
@ -113,7 +113,7 @@ For a Maven-based project, you can make use of these settings by creating a dist
For example, if your organization is named "octocat" and your repository is named "hello-world", then the {% data variables.product.prodname_registry %} configuration in _pom.xml_ would look similar to the below example.
{% raw %}
-```xml
+```xml{:copy}
<project ...>
...
<distributionManagement>
@ -130,7 +130,7 @@ For example, if your organization is named "octocat" and your repository is name
With this configuration, you can create a workflow that publishes your package to {% data variables.product.prodname_registry %} by making use of the automatically generated _settings.xml_.
{% raw %}
-```yaml
+```yaml{:copy}
name: Publish package to GitHub Packages
on:
release:
@ -165,7 +165,7 @@ You can publish your packages to both the Maven Central Repository and {% data v
Ensure your _pom.xml_ file includes a distribution management repository for both your {% data variables.product.prodname_dotcom %} repository and your Maven Central Repository provider. For example, if you deploy to the Central Repository through the OSSRH hosting project, you might want to specify it in a distribution management repository with the `id` set to `ossrh`, and you might want to specify {% data variables.product.prodname_registry %} in a distribution management repository with the `id` set to `github`.
{% raw %}
-```yaml
+```yaml{:copy}
name: Publish package to the Maven Central Repository and GitHub Packages
on:
release:


@ -54,7 +54,7 @@ If you're publishing a package that includes a scope prefix, include the scope i
This example stores the `NPM_TOKEN` secret in the `NODE_AUTH_TOKEN` environment variable. When the `setup-node` action creates an *.npmrc* file, it references the token from the `NODE_AUTH_TOKEN` environment variable.
{% raw %}
-```yaml
+```yaml{:copy}
name: Node.js Package
on:
release:
@ -114,7 +114,7 @@ If you want to publish your package to a different repository, you must use a pe
This example stores the `GITHUB_TOKEN` secret in the `NODE_AUTH_TOKEN` environment variable. When the `setup-node` action creates an *.npmrc* file, it references the token from the `NODE_AUTH_TOKEN` environment variable.
{% raw %}
-```yaml
+```yaml{:copy}
name: Node.js Package
on:
release:
@ -151,7 +151,7 @@ always-auth=true
If you use the Yarn package manager, you can install and publish packages using Yarn.
{% raw %}
-```yaml
+```yaml{:copy}
name: Node.js Package
on:
release:
@ -196,7 +196,7 @@ When you use the `scope` input to the `setup-node` action, the action creates an
This workflow calls the `setup-node` action two times. Each time the `setup-node` action runs, it overwrites the *.npmrc* file. The *.npmrc* file references the token that allows you to perform authenticated operations against the package registry from the `NODE_AUTH_TOKEN` environment variable. The workflow sets the `NODE_AUTH_TOKEN` environment variable each time the `npm publish` command is run, first with a token to publish to npm (`NPM_TOKEN`) and then with a token to publish to {% data variables.product.prodname_registry %} (`GITHUB_TOKEN`).
{% raw %}
```yaml
```yaml{:copy}
name: Node.js Package
on:
release:

View file

@ -79,7 +79,7 @@ This example shows you how to create a workflow for a Node.js project that build
The workflow uploads the production artifacts in the `dist` directory, but excludes any markdown files. It also uploads the `code-coverage.html` report as another artifact.
```yaml
```yaml{:copy}
name: Node CI
on: [push]
@ -114,7 +114,7 @@ jobs:
You can define a custom retention period for individual artifacts created by a workflow. When using a workflow to create a new artifact, you can use `retention-days` with the `upload-artifact` action. This example demonstrates how to set a custom retention period of 5 days for the artifact named `my-artifact`:
```yaml
```yaml{:copy}
- name: 'Upload Artifact'
uses: actions/upload-artifact@v2
with:
@ -183,7 +183,7 @@ Job 3 displays the result uploaded in the previous job:
The full math operation performed in this workflow example is `(3 + 7) x 9 = 90`.
```yaml
```yaml{:copy}
name: Share data between jobs
on: [push]

View file

@ -10,30 +10,27 @@ featuredLinks:
- /actions/learn-github-actions
- /actions/guides/about-continuous-integration
- /actions/guides/about-packaging-with-github-actions
gettingStarted:
- /actions/managing-workflow-runs
- /actions/hosting-your-own-runners
guideCards:
- /actions/guides/setting-up-continuous-integration-using-workflow-templates
- /actions/guides/publishing-nodejs-packages
- /actions/guides/building-and-testing-powershell
popular:
- /actions/reference/workflow-syntax-for-github-actions
- /actions/reference/events-that-trigger-workflows
- /actions/learn-github-actions
- /actions/reference/events-that-trigger-workflows
- /actions/reference/context-and-expression-syntax-for-github-actions
- /actions/reference/workflow-commands-for-github-actions
- /actions/reference/environment-variables
- /actions/reference/encrypted-secrets
changelog:
- title: Environments, environment protection rules and environment secrets (beta)
date: '2020-12-15'
href: https://github.blog/changelog/2020-12-15-github-actions-environments-environment-protection-rules-and-environment-secrets-beta/
- title: Workflow visualization
date: '2020-12-08'
href: https://github.blog/changelog/2020-12-08-github-actions-workflow-visualization/
- title: Removing set-env and add-path commands on November 16
date: '2020-11-09'
href: https://github.blog/changelog/2020-10-01-github-actions-deprecating-set-env-and-add-path-commands/
- title: Ubuntu-latest workflows will use Ubuntu-20.04
date: '2020-10-29'
href: https://github.blog/changelog/2020-10-29-github-actions-ubuntu-latest-workflows-will-use-ubuntu-20-04
product_video: https://www.youtube-nocookie.com/embed/cP0I9w2coGU

View file

@ -22,7 +22,7 @@ featuredLinks:
- /discussions/guides/finding-discussions-across-multiple-repositories
- /discussions/collaborating-with-your-community-using-discussions/collaborating-with-maintainers-using-discussions
- /discussions/managing-discussions-for-your-community/managing-categories-for-discussions-in-your-repository
product_video: https://www.youtube-nocookie.com/embed/DbTWBP3_RbM
product_video: https://www.youtube-nocookie.com/embed/IpBw2SJkFyk
layout: product-landing
versions:
free-pro-team: '*'

View file

@ -43,7 +43,7 @@ versions:
![Copy remote repository URL field](/assets/images/help/repository/copy-remote-repository-url-quick-setup.png)
8. In Terminal, [add the URL for the remote repository](/articles/adding-a-remote) where your local repository will be pushed.
```shell
$ git remote add origin <em>remote repository URL</em>
$ git remote add origin <em> &lt;REMOTE_URL> </em>
# Sets the new remote
$ git remote -v
# Verifies the new remote URL
@ -80,7 +80,7 @@ versions:
![Copy remote repository URL field](/assets/images/help/repository/copy-remote-repository-url-quick-setup.png)
8. In the Command prompt, [add the URL for the remote repository](/articles/adding-a-remote) where your local repository will be pushed.
```shell
$ git remote add origin <em>remote repository URL</em>
$ git remote add origin <em> &lt;REMOTE_URL> </em>
# Sets the new remote
$ git remote -v
# Verifies the new remote URL
@ -117,7 +117,7 @@ versions:
![Copy remote repository URL field](/assets/images/help/repository/copy-remote-repository-url-quick-setup.png)
8. In Terminal, [add the URL for the remote repository](/articles/adding-a-remote) where your local repository will be pushed.
```shell
$ git remote add origin <em>remote repository URL</em>
$ git remote add origin <em> &lt;REMOTE_URL> </em>
# Sets the new remote
$ git remote -v
# Verifies the new remote URL

Просмотреть файл

@ -16,6 +16,7 @@ versions:
- You must install dependencies on the application server.
- [Docker](https://docs.docker.com/install/) 1.13.0+
- [Docker Compose](https://docs.docker.com/compose/install/) v1.17.0+
- [netcat](http://netcat.sourceforge.net/), available via apt for [Debian](https://packages.debian.org/search?keywords=netcat) and [Ubuntu](https://packages.ubuntu.com/search?keywords=netcat&searchon=names)
{% note %}

View file

@ -12,22 +12,23 @@ featuredLinks:
- /packages/manage-packages/installing-a-package
popular:
- /packages/guides/configuring-npm-for-use-with-github-packages
- /packages/learn-github-packages/about-github-packages
- /packages/guides/configuring-docker-for-use-with-github-packages
- /packages/learn-github-packages
- /packages/guides/configuring-apache-maven-for-use-with-github-packages
guideCards:
- /packages/guides/configuring-npm-for-use-with-github-packages
- /packages/guides/enabling-improved-container-support
- /packages/guides/configuring-rubygems-for-use-with-github-packages
changelog:
- title: ghcr.io maintenance mode on 2021-01-09
date: '2021-01-08'
href: https://github.blog/changelog/2021-01-08-packages-ghcr-io-maintenance-mode-on-2021-01-09/
- title: ghcr.io container names redirect to the container page
date: '2020-12-14'
href: https://github.blog/changelog/2020-12-14-ghcr-io-container-names-redirect-to-the-container-page/
- title: Filter for tagged and untagged containers
date: '2020-12-14'
href: https://github.blog/changelog/2020-12-14-packages-can-filter-for-tagged-and-untagged-containers/
- title: Packages container support is an opt-in beta
date: '2020-11-17'
href: https://docs.github.com/packages/getting-started-with-github-container-registry/enabling-improved-container-support
redirect_from:
- /github/managing-packages-with-github-packages
- /categories/managing-packages-with-github-package-registry

View file

@ -90,10 +90,10 @@ Why do we need this? For our daily shipping needs, it's tolerable that search up
### Code files
- [javascripts/search.js](javascripts/search.js) - The browser-side code that enables search using Algolia's [InstantSearch.js](https://github.com/algolia/instantsearch.js/) library.
- [lib/algolia/client.js](lib/algolia/client.js) - A thin wrapper around the [algoliasearch](https://ghub.io/algoliasearch) Node.js module for interacting with the Algolia API.
- [lib/algolia/search-index.js](lib/algolia/search-index.js) - A class for generating structured search data from repository content and syncing it with the remote Algolia service. This class has built-in validation to ensure that all records are valid before they're uploaded. This class also takes care of removing deprecated records, and compares existing remote records with the latest local records to avoid uploading records that haven't changed.
- [script/sync-algolia-search-indices.js](script/sync-algolia-search-indices.js) - The script used by the Actions workflow to update search indices on our Algolia account. This can also be [run in the development environment](#development).
- [javascripts/search.js](javascripts/search.js) - The browser-side code that enables search.
- [lib/search/algolia-client.js](lib/search/algolia-client.js) - A thin wrapper around the [algoliasearch](https://ghub.io/algoliasearch) Node.js module for interacting with the Algolia API.
- [lib/search/algolia-search-index.js](lib/search/algolia-search-index.js) - A class for generating structured search data from repository content and syncing it with the remote Algolia service. This class has built-in validation to ensure that all records are valid before they're uploaded. This class also takes care of removing deprecated records, and compares existing remote records with the latest local records to avoid uploading records that haven't changed.
- [script/sync-search-indices.js](script/sync-search-indices.js) - The script used by the Actions workflow to update search indices on our Algolia account. This can also be [run in the development environment](#development).
- [tests/algolia-search.js](tests/algolia-search.js) - Tests!
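The "avoid uploading records that haven't changed" behavior described above amounts to a diff of local against remote records, keyed on stable `objectID`s. A minimal sketch of that strategy (the function name and record shape are illustrative, not the class's actual API):

```javascript
// Rough sketch of the index-sync strategy: upload only new or changed
// records, and delete remote records whose pages no longer exist locally.
// `localRecords` and `remoteRecords` are arrays of { objectID, ... } objects.
function diffRecords (localRecords, remoteRecords) {
  const remoteById = new Map(remoteRecords.map(r => [r.objectID, r]))
  const localIds = new Set(localRecords.map(r => r.objectID))

  // new or changed records (stable objectIDs make overwrites, not duplicates)
  const toUpload = localRecords.filter(record => {
    const existing = remoteById.get(record.objectID)
    return !existing || JSON.stringify(existing) !== JSON.stringify(record)
  })

  // deprecated records: present remotely but no longer generated locally
  const toDelete = remoteRecords
    .filter(record => !localIds.has(record.objectID))
    .map(record => record.objectID)

  return { toUpload, toDelete }
}
```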
## Indices
@ -136,4 +136,4 @@ Each record represents a section of a page. Sections are derived by splitting up
- It's not strictly necessary to set an `objectID` as Algolia will create one automatically, but by creating our own we have a guarantee that subsequent invocations of this upload script will overwrite existing records instead of creating numerous duplicate records with differing IDs.
- Algolia has typo tolerance. Try spelling something wrong and see what you get!
- Algolia has lots of controls for customizing each index, so we can add weights to certain attributes and create rules like "title is more important than body", etc. But it works pretty well as-is without any configuration.
- Algolia has support for "advanced query syntax" for exact matching of quoted expressions and exclusion of words preceded by a `-` sign. This is off by default, but we have it enabled in our browser client. This and many other settings can be configured in the Algolia.com web interface. The settings in the web interface can be overridden by the InstantSearch.js client. See [javascripts/search.js](javascripts/search.js).
- Algolia has support for "advanced query syntax" for exact matching of quoted expressions and exclusion of words preceded by a `-` sign. This is off by default, but we have it enabled in our browser client. This and many other settings can be configured in the Algolia.com web interface. The settings in the web interface can be overridden by the search endpoint. See [middleware/search.js](middleware/search.js).
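As a rough illustration of what those two operators do, here is a toy parser (for explanation only — this is not Algolia's implementation):

```javascript
// Toy illustration of "advanced query syntax":
// "quoted phrases" must match exactly, and -word excludes a word.
function parseAdvancedQuery (query) {
  // collect the contents of every quoted phrase
  const phrases = [...query.matchAll(/"([^"]+)"/g)].map(m => m[1])
  // tokenize whatever remains outside the quotes
  const tokens = query.replace(/"[^"]+"/g, ' ').split(/\s+/).filter(Boolean)
  return {
    phrases,                                            // exact-match phrases
    excluded: tokens.filter(t => t.startsWith('-')).map(t => t.slice(1)),
    words: tokens.filter(t => !t.startsWith('-'))       // ordinary terms
  }
}
```

So a query like `"pull request" -draft merge` exact-matches the phrase "pull request", excludes results containing "draft", and searches "merge" normally.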

View file

@ -134,7 +134,8 @@ sections:
- '`ghe-config-apply` occasionally fails with `ERROR: Failure waiting for nomad jobs to apply` until the Nomad job queue is cleared. This currently requires an admin to delete `/etc/nomad-jobs/queue`.'
- When configuring a multiple replica node, the status of the replica can be incorrectly synchronized.
- Customers attempting to restore a 3.0 backup to a new instance should not pre-configure the instance, as it may lead to a bad state for user logins. We recommend restoring to a fresh, unconfigured instance.
- GitHub Enterprise Server 3.0 release candidates are not yet available in the Azure marketplace. To test RC1 in staging environments, start a 2.21 or 2.22 instance, and then upgrade it with the Azure upgrade package on the download page.
- GitHub Enterprise Server 3.0 release candidates are not yet available in the Azure marketplace. To test release candidates in staging environments, start a 2.21 or 2.22 instance, and then upgrade it with the Azure upgrade package on the download page.
- The image and upgrade package download size has increased. Customers on slow internet connections may find the packages take longer to download.
backups:
- '{% data variables.product.prodname_ghe_server %} 3.0 requires at least [GitHub Enterprise Backup Utilities 3.0.0](https://github.com/github/backup-utils) for [Backups and Disaster Recovery](/enterprise-server@3.0/admin/configuration/configuring-backups-on-your-appliance).'

View file

@ -1,7 +1,7 @@
{% if currentVersion == "free-pro-team@latest" %}
{% note %}
**Note:** For private and internal repositories, {% data variables.product.prodname_code_scanning %} is available when {% data variables.product.prodname_GH_advanced_security %} features are enabled for the repository. If you see the error `Advanced Security must be enabled for this repository to use code scanning.` check that {% data variables.product.prodname_GH_advanced_security %} is enabled. For more information, see "[Managing security and analysis settings for your repository](/github/administering-a-repository/managing-security-and-analysis-settings-for-your-repository)."
**Note:** For private and internal repositories, {% data variables.product.prodname_code_scanning %} is available when {% data variables.product.prodname_GH_advanced_security %} features are enabled for the repository. If you see the error `Advanced Security must be enabled for this repository to use code scanning`, check that {% data variables.product.prodname_GH_advanced_security %} is enabled. For more information, see "[Managing security and analysis settings for your repository](/github/administering-a-repository/managing-security-and-analysis-settings-for-your-repository)."
{% endnote %}
{% endif %}

View file

@ -1 +1 @@
{% data variables.product.prodname_actions %} usage is free for public repositories and self-hosted runners. For private repositories, each {% data variables.product.prodname_dotcom %} account receives a certain amount of free minutes and storage, depending on the product used with the account. Any usage beyond the included amounts is controlled by spending limits.
{% data variables.product.prodname_actions %} usage is free for both public repositories and self-hosted runners. For private repositories, each {% data variables.product.prodname_dotcom %} account receives a certain amount of free minutes and storage, depending on the product used with the account. Any usage beyond the included amounts is controlled by spending limits.

View file

@ -5,7 +5,7 @@ The starter workflow template sets up the `PATH` to contain OpenJDK 8 for the x6
For example, to use version 9.0.4 of the JDK for the x64 platform, you can use the `setup-java` action and configure the `java-version` and `architecture` parameters to `'9.0.4'` and `x64`.
{% raw %}
```yaml
```yaml{:copy}
steps:
- uses: actions/checkout@v2
- name: Set up JDK 9.0.4 for x64

View file

@ -23,7 +23,7 @@
/>
{% endfor %}
<link rel="stylesheet" href="/dist/index.css">
<link rel="stylesheet" href="{{ builtAssets.main.css }}">
<link rel="alternate icon" type="image/png" href="/assets/images/site/favicon.png">
<link rel="icon" type="image/svg+xml" href="/assets/images/site/favicon.svg">
</head>

Просмотреть файл

@ -1 +1 @@
<script src="/dist/index.js"></script>
<script src="{{ builtAssets.main.js }}"></script>

View file

@ -5,8 +5,6 @@
- On all other pages, in the header
-->
<form class="mb-0" aria-hidden="true">
<div id="search-input-container">
<!-- Algolia instantsearch.js will add a search input here -->
</div>
</form>
<div id="search-input-container" aria-hidden="true">
<!-- will add a search input here -->
</div>

View file

@ -1,5 +1,6 @@
import murmur from 'imurmurhash'
import { getUserEventsId, sendEvent } from './events'
// import h from './hyperscript'
const TREATMENT = 'TREATMENT'
const CONTROL = 'CONTROL'
@ -19,23 +20,6 @@ export async function sendSuccess (test) {
})
}
const xmlns = 'http://www.w3.org/2000/svg'
export function h (tagName, attributes = {}, children = []) {
const el = ['svg', 'path'].includes(tagName)
? document.createElementNS(xmlns, tagName)
: document.createElement(tagName)
Object.entries(attributes).forEach(
([key, value]) => el.setAttribute(key, value)
)
children.forEach(child =>
typeof child === 'string'
? el.append(document.createTextNode(child))
: el.append(child)
)
return el
}
export default function () {
// const testName = '$test-name$'
// const xbucket = bucket(testName)

View file

@ -1,15 +0,0 @@
// This module overrides "Hogan" that instantsearch.js uses
// Hogan uses `new Function`,
// so we can't use it with our content security policy.
// Turns out, we use all our own templates anyway,
// so we just have to shim out Hogan so it doesn't error!
export default {
compile (template) {
return {
render (data) {
return ''
}
}
}
}

View file

@ -0,0 +1,44 @@
const xmlns = 'http://www.w3.org/2000/svg'
const plainObjectConstructor = {}.constructor
function exists (value) {
return value !== null && typeof value !== 'undefined'
}
function isPlainObject (value) {
return value.constructor === plainObjectConstructor
}
function isString (value) {
return typeof value === 'string'
}
function renderChildren (el, children) {
for (const child of children) {
if (isPlainObject(child)) {
Object.entries(child)
.filter(([key, value]) => exists(value))
.forEach(([key, value]) => el.setAttribute(key, value))
} else if (Array.isArray(child)) {
renderChildren(el, child)
} else if (isString(child)) {
el.append(document.createTextNode(child))
} else {
el.append(child)
}
}
}
export default function h (tagName, ...children) {
const el = ['svg', 'path'].includes(tagName)
? document.createElementNS(xmlns, tagName)
: document.createElement(tagName)
renderChildren(el, children)
return el
}
export const tags = Object.fromEntries(
['div', 'form', 'a', 'input', 'button', 'ol', 'li', 'em']
.map(tagName => [tagName, (...args) => h(tagName, ...args)])
)

View file

@ -1,9 +1,6 @@
import { tags } from './hyperscript'
import { sendEvent } from './events'
const instantsearch = require('instantsearch.js').default
const { searchBox, hits, configure, analytics } = require('instantsearch.js/es/widgets')
const algoliasearch = require('algoliasearch')
const searchWithYourKeyboard = require('search-with-your-keyboard')
const querystring = require('querystring')
const truncate = require('html-truncate')
const languages = require('../lib/languages')
const allVersions = require('../lib/all-versions')
@ -12,261 +9,96 @@ const nonEnterpriseDefaultVersion = require('../lib/non-enterprise-default-versi
const languageCodes = Object.keys(languages)
const maxContentLength = 300
const hasStandaloneSearch = () => document.getElementById('landing') || document.querySelector('body.error-404') !== null
let $searchInputContainer
let $searchResultsContainer
let $searchOverlay
let $searchInput
const resultTemplate = (item) => {
// Attach an `algolia-query` param to each result link so analytics
// can track the search query that led the user to this result
const input = document.querySelector('#search-input-container input')
if (input) {
const url = new URL(item.objectID, window.location.origin)
const queryParams = new URLSearchParams(url.search.slice(1))
queryParams.append('algolia-query', input.value)
url.search = queryParams.toString()
item.modifiedURL = url.toString()
}
let placeholder = 'Search topics, products...'
let version
let language
// Display page title and heading (if present)
const title = item._highlightResult.heading
? [item._highlightResult.title.value, item._highlightResult.heading.value].join(': ')
: item._highlightResult.title.value
export default function search () {
$searchInputContainer = document.getElementById('search-input-container')
$searchResultsContainer = document.getElementById('search-results-container')
// Remove redundant title from the end of breadcrumbs
if (item.breadcrumbs && item.breadcrumbs.endsWith(item.title)) {
item.modifiedBreadcrumbs = item.breadcrumbs.replace(' / ' + item.title, '')
} else {
item.modifiedBreadcrumbs = item.breadcrumbs
}
if (!$searchInputContainer || !$searchResultsContainer) return
// Truncate and ellipsize the content string without breaking any HTML
// within it, such as the <mark> tags added by Algolia for emphasis.
item.modifiedContent = truncate(item._highlightResult.content.value, maxContentLength)
$searchOverlay = document.querySelector('.search-overlay-desktop')
// Construct the template to return
const html = `
<div class="search-result border-top border-gray-light py-3 px-2">
<a href="#" class="no-underline">
<div class="search-result-breadcrumbs d-block text-gray-dark opacity-60 text-small pb-1">${item.modifiedBreadcrumbs}</div>
<div class="search-result-title d-block h4-mktg text-gray-dark">${title}</div>
<div class="search-result-content d-block text-gray">${item.modifiedContent}</div>
</a>
</div>
`
// Sanitize the link's href attribute using the DOM API to prevent XSS
const fragment = document.createRange().createContextualFragment(html)
fragment.querySelector('a').setAttribute('href', item.modifiedURL)
const div = document.createElement('div')
div.appendChild(fragment.cloneNode(true))
return div.innerHTML
}
export default function () {
if (!document.querySelector('#search-results-container')) return
window.initialPageLoad = true
const opts = {
// https://www.algolia.com/apps/ZI5KPY1HBE/dashboard
// This API key is public. There's also a private API key for writing to the Algolia API
searchClient: algoliasearch('ZI5KPY1HBE', '685df617246c3a10abba589b4599288f'),
// There's an index for every version/language combination
indexName: `github-docs-${deriveVersionFromPath()}-${deriveLanguageCodeFromPath()}`,
// allows "phrase queries" and "prohibit operator"
// https://www.algolia.com/doc/api-reference/api-parameters/advancedSyntax/
advancedSyntax: true,
// sync query params to search input
routing: true,
searchFunction: helper => {
// console.log('searchFunction', helper.state)
const query = helper.state.query
const queryPresent = query && query.length > 0
const results = document.querySelector('.ais-Hits')
// avoid conducting an empty search on page load;
if (window.initialPageLoad && !queryPresent) return
// after page load, search should be executed (even if the query is empty)
// so as not to upset the default instantsearch.js behaviors like clearing
// the input when [x] is clicked.
helper.search()
// If on homepage, toggle results container if query is present
if (hasStandaloneSearch()) {
const container = document.getElementById('search-results-container')
// Primer classNames for showing and hiding the results container
const activeClass = container.getAttribute('data-active-class')
const inactiveClass = container.getAttribute('data-inactive-class')
if (!activeClass) {
console.error('container is missing required `data-active-class` attribute', container)
return
}
if (!inactiveClass) {
console.error('container is missing required `data-inactive-class` attribute', container)
return
}
// hide the container when no query is present
container.classList.toggle(activeClass, queryPresent)
container.classList.toggle(inactiveClass, !queryPresent)
}
// Hack to work around a mysterious bug where the input is not cleared
// when the [x] is clicked. Note: this bug only occurs on pages
// loaded with a ?query=foo param already present
if (!queryPresent) {
setTimeout(() => {
document.querySelector('#search-input-container input').value = ''
}, 50)
results.style.display = 'none'
}
if (queryPresent && results) results.style.display = 'block'
window.initialPageLoad = false
toggleSearchDisplay()
}
}
const search = instantsearch(opts)
// There's an index for every version/language combination
version = deriveVersionFromPath()
language = deriveLanguageCodeFromPath()
// Find search placeholder text in a <meta> tag, falling back to a default
const placeholderMeta = document.querySelector('meta[name="site.data.ui.search.placeholder"]')
const placeholder = placeholderMeta ? placeholderMeta.content : 'Search topics, products...'
const $placeholderMeta = document.querySelector('meta[name="site.data.ui.search.placeholder"]')
if ($placeholderMeta) {
placeholder = $placeholderMeta.content
}
search.addWidgets(
[
hits({
container: '#search-results-container',
templates: {
empty: 'No results',
item: resultTemplate
},
// useful for debugging template context, if needed
transformItems: items => {
// console.log(`transformItems`, items)
return items
}
}),
configure({
analyticsTags: [
'site:docs.github.com',
`env:${process.env.NODE_ENV}`
]
}),
searchBox({
container: '#search-input-container',
placeholder,
// only autofocus on the homepage, and only if no #hash is present in the URL
autofocus: (hasStandaloneSearch()) && !window.location.hash.length,
showReset: false,
showSubmit: false
}),
analytics({
pushFunction (params, state, results) {
sendEvent({
type: 'search',
search_query: results.query
// search_context
})
}
})
]
)
$searchInputContainer.append(tmplSearchInput())
$searchInput = $searchInputContainer.querySelector('input')
// enable for debugging
search.on('render', (...args) => {
// console.log(`algolia render`, args)
})
search.on('error', (...args) => {
console.error('algolia error', args)
})
search.start()
searchWithYourKeyboard('#search-input-container input', '.ais-Hits-item')
toggleSearchDisplay()
// delay removal of the query param so analytics client code has a chance to track it
setTimeout(() => { removeAlgoliaQueryTrackingParam() }, 500)
$searchInput.addEventListener('keyup', debounce(onSearch))
}
// When a user performs an in-site search an `algolia-query` param is
// added to the URL so analytics can track the queries and the pages
// they lead to. This function strips the query from the URL after page load,
// so the bare article URL can be copied/bookmarked/shared, sans tracking param
function removeAlgoliaQueryTrackingParam () {
if (
history &&
history.replaceState &&
location &&
location.search &&
location.search.includes('algolia-query=')
) {
// parse the query string, remove the `algolia-query`, and put it all back together
let q = querystring.parse(location.search.replace(/^\?/, ''))
delete q['algolia-query']
q = Object.keys(q).length ? '?' + querystring.stringify(q) : ''
// update the URL in the address bar without modifying the history
history.replaceState(null, '', `${location.pathname}${q}${location.hash}`)
}
// The home page and 404 pages have a standalone search
function hasStandaloneSearch () {
return document.getElementById('landing') ||
document.querySelector('body.error-404') !== null
}
function toggleSearchDisplay (isReset) {
const input = document.querySelector('#search-input-container input')
const overlay = document.querySelector('.search-overlay-desktop')
// If not on homepage...
if (!hasStandaloneSearch()) {
// Open modal if input is clicked
input.addEventListener('focus', () => {
openSearch()
})
// Close modal if overlay is clicked
if (overlay) {
overlay.addEventListener('click', () => {
closeSearch()
})
}
// Open modal if page loads with query in the params/input
if (input.value) {
openSearch()
}
}
function toggleSearchDisplay () {
// Clear/close search, if ESC is clicked
document.addEventListener('keyup', (e) => {
if (e.key === 'Escape') {
closeSearch()
}
})
// If not on homepage...
if (hasStandaloneSearch()) return
const $input = $searchInput
// Open modal if input is clicked
$input.addEventListener('focus', () => {
openSearch()
})
// Close modal if overlay is clicked
if ($searchOverlay) {
$searchOverlay.addEventListener('click', () => {
closeSearch()
})
}
// Open modal if page loads with query in the params/input
if ($input.value) {
openSearch()
}
}
function openSearch () {
document.querySelector('#search-input-container input').classList.add('js-open')
document.querySelector('#search-results-container').classList.add('js-open')
document.querySelector('.search-overlay-desktop').classList.add('js-open')
$searchInput.classList.add('js-open')
$searchResultsContainer.classList.add('js-open')
$searchOverlay.classList.add('js-open')
}
function closeSearch () {
// Close modal if not on homepage
if (!hasStandaloneSearch()) {
document.querySelector('#search-input-container input').classList.remove('js-open')
document.querySelector('#search-results-container').classList.remove('js-open')
document.querySelector('.search-overlay-desktop').classList.remove('js-open')
$searchInput.classList.remove('js-open')
$searchResultsContainer.classList.remove('js-open')
$searchOverlay.classList.remove('js-open')
}
document.querySelector('.ais-Hits').style.display = 'none'
document.querySelector('#search-input-container input').value = ''
window.history.replaceState({}, 'clear search query', window.location.pathname)
const $hits = $searchResultsContainer.querySelector('.ais-Hits')
if ($hits) $hits.style.display = 'none'
$searchInput.value = ''
}
function deriveLanguageCodeFromPath () {
@ -277,8 +109,8 @@ function deriveLanguageCodeFromPath () {
function deriveVersionFromPath () {
// fall back to the non-enterprise default version (FPT currently) on the homepage, 404 page, etc.
const version = location.pathname.split('/')[2] || nonEnterpriseDefaultVersion
const versionObject = allVersions[version] || allVersions[nonEnterpriseDefaultVersion]
const versionStr = location.pathname.split('/')[2] || nonEnterpriseDefaultVersion
const versionObject = allVersions[versionStr] || allVersions[nonEnterpriseDefaultVersion]
// if GHES, returns the release number like 2.21, 2.22, etc.
// if FPT, returns 'dotcom'
@ -287,3 +119,148 @@ function deriveVersionFromPath () {
? versionObject.currentRelease
: versionObject.miscBaseName
}
function debounce (fn, delay = 300) {
let timer
return (...args) => {
clearTimeout(timer)
timer = setTimeout(() => fn.apply(null, args), delay)
}
}
async function onSearch (evt) {
const query = evt.target.value
const url = new URL(location.origin)
url.pathname = '/search'
url.search = new URLSearchParams({ query, version, language }).toString()
const response = await fetch(url, {
method: 'GET',
headers: {
'Content-Type': 'application/json'
}
})
const results = response.ok ? await response.json() : []
$searchResultsContainer.querySelectorAll('*').forEach(el => el.remove())
$searchResultsContainer.append(
tmplSearchResults(results)
)
toggleStandaloneSearch()
// Analytics tracking
sendEvent({
type: 'search',
search_query: query
// search_context
})
}
// If on homepage, toggle results container if query is present
function toggleStandaloneSearch () {
if (!hasStandaloneSearch()) return
const query = $searchInput.value
const queryPresent = query && query.length > 0
const $results = document.querySelector('.ais-Hits')
// Primer classNames for showing and hiding the results container
const activeClass = $searchResultsContainer.getAttribute('data-active-class')
const inactiveClass = $searchResultsContainer.getAttribute('data-inactive-class')
if (!activeClass) {
console.error('container is missing required `data-active-class` attribute', $searchResultsContainer)
return
}
if (!inactiveClass) {
console.error('container is missing required `data-inactive-class` attribute', $searchResultsContainer)
return
}
// hide the container when no query is present
$searchResultsContainer.classList.toggle(activeClass, queryPresent)
$searchResultsContainer.classList.toggle(inactiveClass, !queryPresent)
if (queryPresent && $results) $results.style.display = 'block'
}
/*** Template functions ***/
function tmplSearchInput () {
// only autofocus on the homepage, and only if no #hash is present in the URL
const autofocus = (hasStandaloneSearch() && !location.hash.length) || null
const { div, form, input, button } = tags
return div(
{ class: 'ais-SearchBox' },
form(
{ role: 'search', class: 'ais-SearchBox-form', novalidate: true },
input({
class: 'ais-SearchBox-input',
type: 'search',
placeholder,
autofocus,
autocomplete: 'off',
autocorrect: 'off',
autocapitalize: 'off',
spellcheck: 'false',
maxlength: '512'
}),
button({
class: 'ais-SearchBox-submit',
type: 'submit',
title: 'Submit the search query.',
hidden: true
})
)
)
}
function tmplSearchResults (items) {
const { div, ol, li } = tags
return div(
{ class: 'ais-Hits', style: 'display:block' },
ol(
{ class: 'ais-Hits-list' },
items.map(item => li(
{ class: 'ais-Hits-item' },
tmplSearchResult(item)
))
)
)
}
function tmplSearchResult ({ url, breadcrumbs, heading, title, content }) {
const { div, a } = tags
return div(
{ class: 'search-result border-top border-gray-light py-3 px-2' },
a(
{ href: url, class: 'no-underline' },
div(
{ class: 'search-result-breadcrumbs d-block text-gray-dark opacity-60 text-small pb-1' },
// Remove redundant title from the end of breadcrumbs
emify((breadcrumbs || '').replace(` / ${title}`, ''))
),
div(
{ class: 'search-result-title d-block h4-mktg text-gray-dark' },
      // Display page title and heading (if a heading is present)
emify(heading ? `${title}: ${heading}` : title)
),
div(
{ class: 'search-result-content d-block text-gray' },
// Truncate without breaking inner HTML tags
emify(truncate(content, maxContentLength))
)
)
)
}
// Allow em tags in search responses
function emify (text) {
const { em } = tags
return text
.split(/<\/?em>/g)
.map((el, i) => i % 2 ? em(el) : el)
}
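The `emify` helper above relies on a property of `String.prototype.split` with a global `<\/?em>` pattern: the fragments that were inside `<em>` tags always land at odd indices. A minimal sketch of that split logic, using a plain-string `wrap` stand-in for the hyperscript `em` tag builder (the `wrap` helper is illustrative, not part of the original code):

```javascript
// Sketch: split highlighted text on <em>/</em>; odd indices were inside <em> tags
function emifySketch (text, wrap = s => `[${s}]`) {
  return text
    .split(/<\/?em>/g)
    .map((el, i) => (i % 2 ? wrap(el) : el))
    .join('')
}

// 'search <em>hit</em> here' splits into ['search ', 'hit', ' here']
console.log(emifySketch('search <em>hit</em> here')) // → search [hit] here
```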


@ -3,7 +3,7 @@
<head>
<meta charset="utf-8" />
<title>Docs TOC</title>
<link rel="stylesheet" href="/dist/index.css">
<link rel="stylesheet" href="{{ builtAssets.main.css }}">
<link rel="alternate icon" type="image/png" href="/assets/images/site/favicon.png">
<link rel="icon" type="image/svg+xml" href="/assets/images/site/favicon.svg">
</head>

lib/built-asset-urls.js (new file, 23 lines)

@ -0,0 +1,23 @@
const fs = require('fs')
const path = require('path')
const crypto = require('crypto')
// Get an MD4 Digest Hex content hash, loosely based on Webpack `[contenthash]`
function getContentHash (absFilePath) {
const buffer = fs.readFileSync(absFilePath)
const hash = crypto.createHash('md4')
hash.update(buffer)
return hash.digest('hex')
}
function getUrl (relFilePath) {
const absFilePath = path.join(process.cwd(), relFilePath)
return `/${relFilePath}?hash=${getContentHash(absFilePath)}`
}
module.exports = {
main: {
js: getUrl('dist/index.js'),
css: getUrl('dist/index.css')
}
}

(Diffs for eight files are hidden because one or more lines are too long; eight more file diffs are not shown due to their size.)


@ -3,4 +3,6 @@ require('dotenv').config()
const algoliasearch = require('algoliasearch')
const { ALGOLIA_APPLICATION_ID, ALGOLIA_API_KEY } = process.env
module.exports = algoliasearch(ALGOLIA_APPLICATION_ID, ALGOLIA_API_KEY)
module.exports = function () {
return algoliasearch(ALGOLIA_APPLICATION_ID, ALGOLIA_API_KEY)
}
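The change above swaps a ready-made client export for a factory function, which defers reading `ALGOLIA_APPLICATION_ID` and `ALGOLIA_API_KEY` until a caller actually needs a client rather than at `require()` time. A minimal sketch of that pattern (the `makeGetClient` and fake `createClient` names are illustrative, not from the codebase):

```javascript
// Eager: env vars would be read once, at require() time
// module.exports = createClient(process.env.API_KEY)

// Lazy: env vars are read when the caller actually asks for a client
function makeGetClient (createClient) {
  return function getClient () {
    return createClient(process.env.API_KEY)
  }
}

// Demo with a fake createClient that just records the key it was given
const getClient = makeGetClient(key => ({ key }))
process.env.API_KEY = 'set-after-load'
console.log(getClient().key) // → set-after-load
```

This matters for scripts that load the module before `dotenv` has populated the environment, or for tests that never touch Algolia at all.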


@ -1,13 +1,14 @@
const algoliaClient = require('./client')
const AlgoliaIndex = require('./search-index')
const { namePrefix } = require('./config')
const getAlgoliaClient = require('./algolia-client')
module.exports = async function getRemoteIndexNames () {
const algoliaClient = getAlgoliaClient()
const indices = await algoliaClient.listIndexes()
// ignore other indices that may be present in the Algolia account, like `helphub-`, etc.
const indexNames = indices.items
.map(field => field.name)
.filter(name => name.startsWith(AlgoliaIndex.namePrefix))
.filter(name => name.startsWith(namePrefix))
return indexNames
}
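The map-then-filter above narrows the account's full index list down to this site's indices by prefix. A self-contained sketch with sample data shaped like Algolia's `listIndexes()` items:

```javascript
const namePrefix = 'github-docs'

// Sketch: keep only this site's indices, ignoring unrelated ones like `helphub-*`
function filterIndexNames (items) {
  return items
    .map(field => field.name)
    .filter(name => name.startsWith(namePrefix))
}

const names = filterIndexNames([
  { name: 'github-docs-dotcom-en' },
  { name: 'helphub-production' },
  { name: 'github-docs-2.22-ja' }
])
console.log(names) // → [ 'github-docs-dotcom-en', 'github-docs-2.22-ja' ]
```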


@ -1,17 +1,11 @@
const assert = require('assert')
const { chain, chunk, difference, isArray, isString, inRange } = require('lodash')
const { chain, chunk, difference } = require('lodash')
const eventToPromise = require('event-to-promise')
const objectHash = require('object-hash')
const countArrayValues = require('count-array-values')
const isURL = require('is-url')
const rank = require('./rank')
const validateRecords = require('./validate-records')
const getAlgoliaClient = require('./algolia-client')
class AlgoliaIndex {
// records must be truncated to avoid going over Algolia's 10K limit
static get maxRecordLength () { return 8000 }
static get maxContentLength () { return 5000 }
static get namePrefix () { return 'github-docs' }
constructor (name, records) {
this.name = name
this.records = records
@ -24,52 +18,14 @@ class AlgoliaIndex {
}
validate () {
assert(isString(this.name) && this.name.length, '`name` is required')
assert(isArray(this.records) && this.records.length, '`records` must be a non-empty array')
// each ID is unique
const objectIDs = this.records.map(record => record.objectID)
const dupes = countArrayValues(objectIDs)
.filter(({ value, count }) => count > 1)
.map(({ value }) => value)
assert(!dupes.length, `every objectID must be unique. dupes: ${dupes.join('; ')}`)
this.records.forEach(record => {
assert(
isString(record.objectID) && record.objectID.length,
`objectID must be a string. received: ${record.objectID}, ${JSON.stringify(record)}`
)
assert(
isString(record.title) && record.title.length,
`title must be a string. received: ${record.title}, ${JSON.stringify(record)}`
)
assert(
isURL(record.url),
`url must be a fully qualified URL. received: ${record.url}, ${JSON.stringify(record)}`
)
assert(
inRange(record.customRanking, 0, 4),
`customRanking must be an in-range number. received: ${record.customRanking}, (record: ${record.url})`
)
const recordLength = JSON.stringify(record).length
assert(
recordLength <= AlgoliaIndex.maxRecordLength,
`record ${record.url} is too long! ${recordLength} (max: ${AlgoliaIndex.maxRecordLength})`
)
})
return true
return validateRecords(this.name, this.records)
}
// This method consumes Algolia's `browseAll` event emitter,
// aggregating results into an array of all the records
// https://www.algolia.com/doc/api-reference/api-methods/browse/
async fetchExistingRecords () {
const client = require('./client')
const client = getAlgoliaClient()
// return an empty array if the index does not exist yet
const { items: indices } = await client.listIndexes()
@ -97,7 +53,7 @@ class AlgoliaIndex {
}
async syncWithRemote () {
const client = require('./client')
const client = getAlgoliaClient()
console.log('\n\nsyncing %s with remote', this.name)
this.validate()


lib/search/config.js (new file, 6 lines)

@ -0,0 +1,6 @@
module.exports = {
// records must be truncated to avoid going over Algolia's 10K limit
maxRecordLength: 8000,
maxContentLength: 5000,
namePrefix: 'github-docs'
}


@ -4,11 +4,11 @@
const { chain } = require('lodash')
const urlPrefix = 'https://docs.github.com'
const AlgoliaIndex = require('./search-index')
const ignoredHeadingSlugs = [
'in-this-article',
'further-reading'
]
const { maxContentLength } = require('./config')
module.exports = function parsePageSectionsIntoRecords (href, $) {
const title = $('h1').text().trim()
@ -46,7 +46,7 @@ module.exports = function parsePageSectionsIntoRecords (href, $) {
.get()
.join(' ')
.trim()
.slice(0, AlgoliaIndex.maxContentLength)
.slice(0, maxContentLength)
return {
objectID,
url,
@ -67,7 +67,7 @@ module.exports = function parsePageSectionsIntoRecords (href, $) {
.get()
.join(' ')
.trim()
.slice(0, AlgoliaIndex.maxContentLength)
.slice(0, maxContentLength)
records = [{
objectID,

Просмотреть файл

Просмотреть файл

@ -6,14 +6,17 @@ const chalk = require('chalk')
const languages = require('../languages')
const buildRecords = require('./build-records')
const findIndexablePages = require('./find-indexable-pages')
const getRemoteIndexNames = require('./get-remote-index-names')
const Index = require('./search-index')
const cacheDir = path.join(process.cwd(), './.algolia-cache')
const cacheDir = path.join(process.cwd(), './.search-cache')
const allVersions = require('../all-versions')
const { namePrefix } = require('./config')
// Algolia
const getRemoteIndexNames = require('./algolia-get-remote-index-names')
const AlgoliaIndex = require('./algolia-search-index')
// Build a search data file for every combination of product version and language
// e.g. `github-docs-dotcom-en.json` and `github-docs-2.14-ja.json`
module.exports = async function syncAlgoliaIndices (opts = {}) {
module.exports = async function syncSearchIndexes (opts = {}) {
if (opts.dryRun) {
console.log('This is a dry run! The script will build the indices locally but not upload anything.\n')
rimraf(cacheDir)
@ -60,11 +63,11 @@ module.exports = async function syncAlgoliaIndices (opts = {}) {
: allVersions[pageVersion].miscBaseName
// github-docs-dotcom-en, github-docs-2.22-en
const indexName = `${Index.namePrefix}-${indexVersion}-${languageCode}`
const indexName = `${namePrefix}-${indexVersion}-${languageCode}`
// The page version will be the new version, e.g., free-pro-team@latest, enterprise-server@2.22
const records = await buildRecords(indexName, indexablePages, pageVersion, languageCode)
const index = new Index(indexName, records)
const index = new AlgoliaIndex(indexName, records)
if (opts.dryRun) {
const cacheFile = path.join(cacheDir, `${indexName}.json`)
@ -87,7 +90,7 @@ module.exports = async function syncAlgoliaIndices (opts = {}) {
)
if (!process.env.CI) {
console.log(chalk.green(`\nCached remote index names in ${path.relative(process.cwd(), cachedIndexNamesFile)}`))
console.log(chalk.green(`\nCached index names in ${path.relative(process.cwd(), cachedIndexNamesFile)}`))
console.log(chalk.green('(If this file has any changes, please commit them)'))
}


@ -0,0 +1,47 @@
const assert = require('assert')
const { isArray, isString, inRange } = require('lodash')
const isURL = require('is-url')
const countArrayValues = require('count-array-values')
const { maxRecordLength } = require('./config')
module.exports = function validateRecords (name, records) {
assert(isString(name) && name.length, '`name` is required')
assert(isArray(records) && records.length, '`records` must be a non-empty array')
// each ID is unique
const objectIDs = records.map(record => record.objectID)
const dupes = countArrayValues(objectIDs)
.filter(({ value, count }) => count > 1)
.map(({ value }) => value)
assert(!dupes.length, `every objectID must be unique. dupes: ${dupes.join('; ')}`)
records.forEach(record => {
assert(
isString(record.objectID) && record.objectID.length,
`objectID must be a string. received: ${record.objectID}, ${JSON.stringify(record)}`
)
assert(
isString(record.title) && record.title.length,
`title must be a string. received: ${record.title}, ${JSON.stringify(record)}`
)
assert(
isURL(record.url),
`url must be a fully qualified URL. received: ${record.url}, ${JSON.stringify(record)}`
)
assert(
inRange(record.customRanking, 0, 4),
`customRanking must be an in-range number. received: ${record.customRanking}, (record: ${record.url})`
)
const recordLength = JSON.stringify(record).length
assert(
recordLength <= maxRecordLength,
`record ${record.url} is too long! ${recordLength} (max: ${maxRecordLength})`
)
})
return true
}

lib/search/versions.js (new file, 13 lines)

@ -0,0 +1,13 @@
const allVersions = require('../all-versions')
module.exports = new Set(
Object.values(allVersions)
.map(version =>
// if GHES, resolves to the release number like 2.21, 2.22, etc.
// if FPT, resolves to 'dotcom'
// if GHAE, resolves to 'ghae'
version.plan === 'enterprise-server'
? version.currentRelease
: version.miscBaseName
)
)
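The new `versions` module builds a `Set` of the short version names used in index names, mapping each plan to either its release number (GHES) or its `miscBaseName` (dotcom, ghae). A sketch with a hypothetical subset of `lib/all-versions` standing in for the real data:

```javascript
// Hypothetical subset of lib/all-versions, for illustration only
const allVersions = {
  'free-pro-team@latest': { plan: 'free-pro-team', miscBaseName: 'dotcom' },
  'enterprise-server@2.22': { plan: 'enterprise-server', currentRelease: '2.22' },
  'github-ae@latest': { plan: 'github-ae', miscBaseName: 'ghae' }
}

const versions = new Set(
  Object.values(allVersions).map(version =>
    version.plan === 'enterprise-server'
      ? version.currentRelease
      : version.miscBaseName
  )
)

console.log([...versions]) // → [ 'dotcom', '2.22', 'ghae' ]
```

The `/search` middleware later uses this `Set` to reject requests for versions it doesn't index.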


@ -7,6 +7,7 @@ const { getVersionStringFromPath, getProductStringFromPath, getPathWithoutLangua
const productNames = require('../lib/product-names')
const warmServer = require('../lib/warm-server')
const featureFlags = Object.keys(require('../feature-flags'))
const builtAssets = require('../lib/built-asset-urls')
// Supply all route handlers with a baseline `req.context` object
// Note that additional middleware in middleware/index.js adds to this context object
@ -42,5 +43,8 @@ module.exports = async function contextualize (req, res, next) {
req.context.siteTree = siteTree
req.context.pages = pageMap
// JS + CSS asset paths
req.context.builtAssets = builtAssets
return next()
}


@ -66,6 +66,7 @@ module.exports = function (app) {
app.use('/public', express.static('data/graphql'))
app.use('/events', require('./events'))
app.use('/csrf', require('./csrf-route'))
app.use('/search', require('./search'))
app.use(require('./archived-enterprise-versions'))
app.use(require('./robots'))
app.use(/(\/.*)?\/early-access$/, require('./contextualizers/early-access-links'))

middleware/search.js (new file, 58 lines)

@ -0,0 +1,58 @@
const express = require('express')
const algoliasearch = require('algoliasearch')
const { namePrefix } = require('../lib/search/config')
const languages = new Set(Object.keys(require('../lib/languages')))
const versions = require('../lib/search/versions')
const { get } = require('lodash')
const router = express.Router()
// https://www.algolia.com/apps/ZI5KPY1HBE/dashboard
// This API key is public. There's also a private API key for writing to the Algolia API
const searchClient = algoliasearch('ZI5KPY1HBE', '685df617246c3a10abba589b4599288f')
async function loadAlgoliaResults ({ version, language, query, limit }) {
const indexName = `${namePrefix}-${version}-${language}`
const index = searchClient.initIndex(indexName)
// allows "phrase queries" and "prohibit operator"
// https://www.algolia.com/doc/api-reference/api-parameters/advancedSyntax/
const { hits } = await index.search(query, {
hitsPerPage: limit,
advancedSyntax: true
})
return hits.map(hit => ({
url: hit.url,
breadcrumbs: get(hit, '_highlightResult.breadcrumbs.value'),
heading: get(hit, '_highlightResult.heading.value'),
title: get(hit, '_highlightResult.title.value'),
content: get(hit, '_highlightResult.content.value')
}))
}
router.get('/', async (req, res) => {
res.set({
'surrogate-control': 'private, no-store',
'cache-control': 'private, no-store'
})
const { query, version, language } = req.query
const limit = Math.min(parseInt(req.query.limit, 10) || 10, 100)
if (!versions.has(version) || !languages.has(language)) {
return res.status(400).json([])
}
if (!query || !limit) {
return res.status(200).json([])
}
try {
const results = await loadAlgoliaResults({ version, language, query, limit })
return res.status(200).json(results)
} catch (err) {
console.error(err)
return res.status(400).json([])
}
})
module.exports = router
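Before the route ever touches Algolia, it validates `version` and `language` against known sets and clamps `limit` to at most 100, defaulting to 10 when the parameter is missing or unparseable. The clamp on its own:

```javascript
// Sketch: parse ?limit= with a default of 10 and a ceiling of 100
function parseLimit (raw) {
  return Math.min(parseInt(raw, 10) || 10, 100)
}

console.log(parseLimit('25'))      // → 25
console.log(parseLimit('9999'))    // → 100
console.log(parseLimit(undefined)) // → 10
console.log(parseLimit('abc'))     // → 10
```

`parseInt` yields `NaN` for missing or non-numeric input, which is falsy, so `|| 10` supplies the default before `Math.min` applies the ceiling.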


@ -57,7 +57,6 @@
"html-truncate": "^1.2.2",
"hubdown": "^2.6.0",
"imurmurhash": "^0.1.4",
"instantsearch.js": "^4.8.2",
"ioredis": "^4.19.4",
"ioredis-mock": "^5.2.0",
"is-url": "^1.2.4",
@ -166,7 +165,7 @@
"sync-search": "start-server-and-test sync-search-server 4002 sync-search-indices",
"sync-search-dry-run": "DRY_RUN=1 npm run sync-search",
"sync-search-server": "cross-env NODE_ENV=production PORT=4002 node server.js",
"sync-search-indices": "script/sync-algolia-search-indices.js",
"sync-search-indices": "script/sync-search-indices.js",
"test-watch": "jest --watch --notify --notifyMode=change --coverage",
"check-deps": "node script/check-deps.js",
"prevent-pushes-to-main": "node script/prevent-pushes-to-main.js",


@ -32,7 +32,6 @@ const main = async () => {
'@babel/*',
'babel-preset-env',
'@primer/*',
'instantsearch.js',
'querystring',
'pa11y-ci',
'sass',


@ -67,6 +67,9 @@ async function main () {
// Update CLI output and append to logfile after each checked link.
checker.on('link', result => {
// We don't need to dump all of the HTTP and HTML details
delete result.failureDetails
fs.appendFileSync(logFile, JSON.stringify(result) + '\n')
})
@ -113,11 +116,7 @@ function displayBrokenLinks (brokenLinks) {
const allStatusCodes = uniq(brokenLinks
// Coerce undefined status codes into `Invalid` strings so we can display them.
// Without this, undefined codes get JSON.stringified as `0`, which is not useful output.
.map(link => {
if (!link.status) link.status = 'Invalid'
return link
})
.map(link => link.status)
.map(link => link.status || 'Invalid')
)
allStatusCodes.forEach(statusCode => {
@ -126,6 +125,9 @@ function displayBrokenLinks (brokenLinks) {
console.log(`## Status ${statusCode}: Found ${brokenLinksForStatus.length} broken links`)
console.log('```')
brokenLinksForStatus.forEach(brokenLinkObj => {
// We don't need to dump all of the HTTP and HTML details
delete brokenLinkObj.failureDetails
console.log(JSON.stringify(brokenLinkObj, null, 2))
})
console.log('```')
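The refactor in this hunk replaces a mutate-then-map pair with a single non-mutating map that coerces missing status codes to `'Invalid'` before deduplication. A sketch of the resulting behavior, with `uniq` approximated by a `Set` to stay dependency-free:

```javascript
// Sketch: collapse mutate-then-map into one non-mutating map
const brokenLinks = [
  { url: 'https://example.com/a', status: 404 },
  { url: 'https://example.com/b' },            // no status at all
  { url: 'https://example.com/c', status: 404 }
]

const allStatusCodes = [...new Set(brokenLinks.map(link => link.status || 'Invalid'))]
console.log(allStatusCodes) // → [ 404, 'Invalid' ]
```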


@ -2,8 +2,8 @@
// [start-readme]
//
// This script is run automatically via GitHub Actions on every push to `master` to generate searchable data
// and upload it to our Algolia account. It can also be run manually. For more info see [contributing/search.md](contributing/search.md)
// This script is run automatically via GitHub Actions on every push to `main` to generate searchable data.
// It can also be run manually. For more info see [contributing/search.md](contributing/search.md)
//
// [end-readme]
@ -12,7 +12,7 @@ require('make-promises-safe')
main()
async function main () {
const sync = require('../lib/algolia/sync')
const sync = require('../lib/search/sync')
const opts = {
dryRun: 'DRY_RUN' in process.env,
language: process.env.LANGUAGE,


@ -3,7 +3,7 @@
/* Global styles
Gets applied to both the search input on homepage and in the header nav
Form and inputs using .ais- prefix gets added by Algolia InstantSearch.js */
Form and inputs using .ais- prefix gets added by search.js */
.ais-SearchBox {
position: relative;
}


@ -42,84 +42,45 @@ describe('algolia browser search', () => {
})
it('sends the correct data to algolia for Enterprise Server', async () => {
expect.assertions(12) // 3 assertions x 4 letters ('test')
expect.assertions(2)
const newPage = await browser.newPage()
await newPage.goto('http://localhost:4001/ja/enterprise/2.22/admin/installation')
await newPage.setRequestInterception(true)
newPage.on('request', interceptedRequest => {
if (interceptedRequest.method() === 'POST' && /algolia/i.test(interceptedRequest.url())) {
const data = JSON.parse(interceptedRequest.postData())
const { indexName, params } = data.requests[0]
const parsedParams = querystring.parse(params)
const analyticsTags = JSON.parse(parsedParams.analyticsTags)
expect(indexName).toBe('github-docs-2.22-ja')
expect(analyticsTags).toHaveLength(2)
// browser tests are run against production build, so we are expecting env:production
expect(analyticsTags).toEqual(expect.arrayContaining(['site:docs.github.com', 'env:production']))
if (interceptedRequest.method() === 'GET' && /search/i.test(interceptedRequest.url())) {
const { version, language } = querystring.parse(interceptedRequest.url())
expect(version).toBe('2.22')
expect(language).toBe('ja')
}
interceptedRequest.continue()
})
await newPage.click('#search-input-container input[type="search"]')
await newPage.type('#search-input-container input[type="search"]', 'test')
await newPage.waitForSelector('.search-result')
})
it('sends the correct data to algolia for GHAE', async () => {
expect.assertions(12) // 3 assertions x 4 letters ('test')
expect.assertions(2)
const newPage = await browser.newPage()
await newPage.goto('http://localhost:4001/en/github-ae@latest/admin/overview')
await newPage.setRequestInterception(true)
newPage.on('request', interceptedRequest => {
if (interceptedRequest.method() === 'POST' && /algolia/i.test(interceptedRequest.url())) {
const data = JSON.parse(interceptedRequest.postData())
const { indexName, params } = data.requests[0]
const parsedParams = querystring.parse(params)
const analyticsTags = JSON.parse(parsedParams.analyticsTags)
expect(indexName).toBe('github-docs-ghae-en')
expect(analyticsTags).toHaveLength(2)
// browser tests are run against production build, so we are expecting env:production
expect(analyticsTags).toEqual(expect.arrayContaining(['site:docs.github.com', 'env:production']))
if (interceptedRequest.method() === 'GET' && /search/i.test(interceptedRequest.url())) {
const { version, language } = querystring.parse(interceptedRequest.url())
expect(version).toBe('ghae')
expect(language).toBe('en')
}
interceptedRequest.continue()
})
await newPage.click('#search-input-container input[type="search"]')
await newPage.type('#search-input-container input[type="search"]', 'test')
})
it('removes `algolia-query` query param after page load', async () => {
await page.goto('http://localhost:4001/en?algolia-query=helpme')
// check that the query is still present at page load
let location = await getLocationObject(page)
expect(location.search).toBe('?algolia-query=helpme')
// query removal is in a setInterval, so wait a bit
await sleep(1000)
// check that the query has been removed after a bit
location = await getLocationObject(page)
expect(location.search).toBe('')
})
it('does not remove hash when removing `algolia-query` query', async () => {
await page.goto('http://localhost:4001/en?algolia-query=helpme#some-header')
// check that the query is still present at page load
let location = await getLocationObject(page)
expect(location.search).toBe('?algolia-query=helpme')
// query removal is in a setInterval, so wait a bit
await sleep(1000)
// check that the query has been removed after a bit
location = await getLocationObject(page)
expect(location.search).toBe('')
expect(location.hash).toBe('#some-header')
await newPage.waitForSelector('.search-result')
})
})
@ -166,13 +127,6 @@ describe('csrf meta', () => {
})
})
async function getLocationObject (page) {
const location = await page.evaluate(() => {
return Promise.resolve(JSON.stringify(window.location, null, 2))
})
return JSON.parse(location)
}
describe('platform specific content', () => {
// from tests/javascripts/user-agent.js
const userAgents = [


@ -1,14 +1,14 @@
const { dates, supported } = require('../../lib/enterprise-server-releases')
const languageCodes = Object.keys(require('../../lib/languages'))
const AlgoliaIndex = require('../../lib/algolia/search-index')
const remoteIndexNames = require('../../lib/algolia/cached-index-names.json')
const { namePrefix } = require('../../lib/search/config')
const remoteIndexNames = require('../../lib/search/cached-index-names.json')
describe('algolia', () => {
test('has remote indexNames in every language for every supported GHE version', () => {
expect(supported.length).toBeGreaterThan(1)
supported.forEach(version => {
languageCodes.forEach(languageCode => {
const indexName = `${AlgoliaIndex.namePrefix}-${version}-${languageCode}`
const indexName = `${namePrefix}-${version}-${languageCode}`
// workaround for GHES release branches not in production yet
if (!remoteIndexNames.includes(indexName)) {
@ -28,7 +28,7 @@ describe('algolia', () => {
test('has remote indexNames in every language for dotcom', async () => {
expect(languageCodes.length).toBeGreaterThan(0)
languageCodes.forEach(languageCode => {
const indexName = `${AlgoliaIndex.namePrefix}-dotcom-${languageCode}`
const indexName = `${namePrefix}-dotcom-${languageCode}`
expect(remoteIndexNames.includes(indexName)).toBe(true)
})
})


@ -4,6 +4,7 @@ const { get, getDOM, head } = require('../helpers/supertest')
const { describeViaActionsOnly } = require('../helpers/conditional-runs')
const path = require('path')
const { loadPages } = require('../../lib/pages')
const builtAssets = require('../../lib/built-asset-urls')
describe('server', () => {
jest.setTimeout(60 * 1000)
@ -694,7 +695,8 @@ describe('?json query param for context debugging', () => {
describe('stylesheets', () => {
it('compiles and sets the right content-type header', async () => {
const res = await get('/dist/index.css')
const stylesheetUrl = builtAssets.main.css
const res = await get(stylesheetUrl)
expect(res.statusCode).toBe(200)
expect(res.headers['content-type']).toBe('text/css; charset=UTF-8')
})
@ -703,7 +705,8 @@ describe('stylesheets', () => {
describe('client-side JavaScript bundle', () => {
let res
beforeAll(async (done) => {
res = await get('/dist/index.js')
const scriptUrl = builtAssets.main.js
res = await get(scriptUrl)
done()
})


@ -1,13 +1,13 @@
const fs = require('fs')
const path = require('path')
const cheerio = require('cheerio')
const parsePageSectionsIntoRecords = require('../../../lib/algolia/parse-page-sections-into-records')
const parsePageSectionsIntoRecords = require('../../../lib/search/parse-page-sections-into-records')
const fixtures = {
pageWithSections: fs.readFileSync(path.join(__dirname, 'fixtures/page-with-sections.html'), 'utf8'),
pageWithoutSections: fs.readFileSync(path.join(__dirname, 'fixtures/page-without-sections.html'), 'utf8')
}
describe('algolia parsePageSectionsIntoRecords module', () => {
describe('search parsePageSectionsIntoRecords module', () => {
test('works for pages with sections', () => {
const html = fixtures.pageWithSections
const $ = cheerio.load(html)


@ -1,6 +1,6 @@
const rank = require('../../../lib/algolia/rank')
const rank = require('../../../lib/search/rank')
test('algolia custom rankings', () => {
test('search custom rankings', () => {
const expectedRankings = [
['https://docs.github.com/en/github/actions', 3],
['https://docs.github.com/en/rest/reference', 2],


@ -86,12 +86,5 @@ module.exports = {
]
}),
new EnvironmentPlugin(['NODE_ENV'])
],
resolve: {
alias: {
// Hogan uses `new Function` which breaks content security policy
// Turns out, we aren't even using it anyways!
'hogan.js': path.resolve(__dirname, 'javascripts/fake-hogan.js')
}
}
]
}