openapi-validation-tools [oav]


Tools for validating OpenAPI (Swagger) files.

Requirements

  • Node.js version >= 18.x

You can install the latest stable release of Node.js from https://nodejs.org. For a machine with a Linux-flavored OS, follow the Node.js installation instructions for your distribution's package manager.

How to install the tool

npm install -g oav@latest
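
Once installed, you can confirm the CLI is available and see the full command list:

oav --version
oav --help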

Command usage:

$ oav -h

Commands:
  analyze-dependency                        analyze swagger resource type
                                            dependency.
  analyze-report <newman-report-path>       analyze report. default format:
                                            newman json report
  example-quality <spec-path>               Performs example quality validation
                                            of x-ms-examples and examples
                                            present in the spec.
  extract-xmsexamples <spec-path>           Extracts the x-ms-examples for a
  <recordings>                              given swagger from the .NET session
                                            recordings and saves them in a file.
  generate-collection                       Generate postman collection file
                                            from API scenario.
  generate-examples [spec-path]             Generate swagger examples from real
                                            payload records.
  generate-report [raw-report-path]         Generate report from postman report.
  generate-api-scenario                     Generate swagger examples from real
                                            payload records.
  generate-static-api-scenario              Generate API-scenario from swagger.
  run-api-scenario <api-scenario>           newman runner run API scenario
                                            file.                 [aliases: run]
  validate-example <spec-path>              Performs validation of x-ms-examples
                                            and examples present in the spec.
  validate-spec <spec-path>                 Performs semantic validation of the
                                            spec.
  validate-traffic <traffic-path>           Validate traffic payload against the
  <spec-path>                               spec.
  traffic-convert <input-dir>               Showcase what it would look like to 
  <output-dir>                              transform a directory full of 
                                            [azure-sdk/test-proxy](https://github.com/Azure/azure-sdk-tools/tree/main/tools/test-proxy/Azure.Sdk.Tools.TestProxy) 
                                            recordings files into traffic payloads 
                                            consumable by traffic validation command in oav
  

Options:
  --version          Show version number                               [boolean]
  -l, --logLevel     Set the logging level for console.
  [choices: "off", "json", "error", "warn", "info", "verbose", "debug", "silly"]
                                                               [default: "info"]
  -f, --logFilepath  Set the log file path. It must be an absolute filepath. By
                     default the logs will be stored in a timestamp based log
                     file at "/home/ruowan/oav_output".
  -p, --pretty       Pretty print
  -h, --help         Show help                                         [boolean]
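
For example, to run semantic validation against a local spec (the file path below is only illustrative):

oav validate-spec ./specification/contoso/resource-manager/Microsoft.Contoso/stable/2021-01-01/contoso.json --pretty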

What does the tool do? What issues does the tool catch?

  • Semantic validation: enforces correctness of the Swagger-specific elements, such as paths and operations, and ensures that element definitions meet the OpenAPI 2.0 specification.

  • Model validation: enforces correctness between the examples and the swagger. It checks whether the definitions for request parameters and responses match the expected input/output payloads of the service.

    Examples of issues detected:

    • Required properties not sent in requests or responses
    • Defined types not matching the value provided in the payload
    • Constraints on properties not met
    • Enumeration values that don't match the values used by the service

    Model validation requires example payloads (request/response) of the service, so that the data can be matched against the defined models. See the x-ms-examples extension for how to specify the examples/payloads. The Swagger “examples” field is also supported, and data included there is validated as well. To get the most benefit from this tool, make sure to provide both the simplest and the most complex examples possible as part of x-ms-examples.

    • Please take a look at the redis-cache swagger spec in the azure-rest-api-specs repository as an example of providing "x-ms-examples".

    • The examples need to be provided in a separate file in the examples directory under the api-version directory azure-rest-api-specs/arm-<yourService>/<api-version>/examples/<exampleName>.json. Existing example files in that repository show the expected structure (a minimal skeleton is also sketched after this list).

    • We require you to provide a minimum example (just the required properties/parameters of the request/response) and a maximum (full-blown) example. Feel free to provide more examples as deemed necessary.

    • We have provided schemas for the example files that go in the examples directory. Referencing these schemas gives you IntelliSense and validation in your editor.

    • If you are using VS Code to edit your swaggers in the azure-rest-api-specs repo, everything should work out of the box, as the schemas have already been added to the repo's .vscode/settings.json file.

    • If you are using Visual Studio, you can take the schema URLs from that settings.json file and select them from the drop-down list at the top of a JSON file when the file is opened in VS.
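
    As a point of reference, an x-ms-examples file pairs a set of request parameters with the expected responses. The skeleton below is only an illustration; the parameter names and values are placeholders rather than content from a real spec:

    {
      "parameters": {
        "api-version": "2021-01-01",
        "subscriptionId": "00000000-0000-0000-0000-000000000000",
        "resourceGroupName": "myResourceGroup",
        "resourceName": "myResource"
      },
      "responses": {
        "200": {
          "body": {
            "name": "myResource",
            "properties": {}
          }
        }
      }
    }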

How does this tool fit with others?

Swagger spec validation can be split into the following:

  1. Schema validation
  2. Semantic validation
  3. Model definition validation
  4. Swagger operations execution (against mocked data or live tests)
  5. Human-eye review to complement the above

In the context of the “azure-rest-api-specs” repo:

  • #1 is being performed on every PR as part of CI.
  • #2 and #3 are performed by the tool currently in the openapi-validation-tools repo and by the AutoRest linter. We're working towards integrating them into CI for the “azure-rest-api-specs” repo.
  • #4 is not available yet, though we're starting to work on it.
  • #5 will be done by the approvers of PRs in “azure-rest-api-specs”, as this won't be automated.

Run API test

OAV supports running API tests against Azure and validating the requests and responses. You can define an API scenario file that composes several swagger example files and then use oav to execute it. For more details about API testing, please refer to the API scenario documentation.
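
A typical invocation looks like the following (the scenario file path is illustrative; see the API scenario documentation for the required Azure credentials and environment setup):

oav run-api-scenario ./scenarios/testResourceCrud.yaml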

Live Validation Mode

  • A Live Validation mode has been added to OAV with the purpose of enabling validation of live traffic.
  • Usage (an illustrative request-response pair is sketched after the snippet below):
// Imports needed to run this snippet (oav exposes LiveValidator from its main entry point).
import * as os from "os";
import * as path from "path";
import * as oav from "oav";

const liveValidatorOptions = {
  git: {
    url: "https://github.com/Azure/azure-rest-api-specs.git",
    shouldClone: true,
  },
  directory: path.resolve(os.homedir(), "cloneRepo"),
  swaggerPathsPattern: "/specification/**/resource-manager/**/*.json",
  isPathCaseSensitive: false,
  shouldModelImplicitDefaultResponse: true,
};

const apiValidator = new oav.LiveValidator(liveValidatorOptions);
await apiValidator.initialize(); // Note that for a large number of specs this can take some time.

// After `initialize()` finishes we are ready to validate.
const validationResult = apiValidator.validateLiveRequestResponse(requestResponsePair);
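
The requestResponsePair referenced above is a plain object describing one observed request and its response. The sketch below illustrates the general liveRequest/liveResponse shape; the field values are placeholders, so refer to the oav sources for the authoritative format:

const requestResponsePair = {
  liveRequest: {
    url: "https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.Contoso/resources/myResource?api-version=2021-01-01",
    method: "GET",
    headers: { "content-type": "application/json" },
    body: {},
  },
  liveResponse: {
    statusCode: "200",
    headers: { "content-type": "application/json" },
    body: { name: "myResource" },
  },
};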

Regression testing

Output of the OAV tool has been snapshotted and committed to the repo. The regression tests may be run on a sample of, or on all of, https://github.com/azure/azure-rest-api-specs. If there are changes to the snapshots, the build produces a git patch file as an artifact, which may be used to update the snapshots.

Fast Regression (~10 minutes) is used for merge validation.

Slow Regression (~1 hour) runs after merge and should be fixed if it fails.

Fixing regression builds

  1. Go to the failed build
  2. Download the artifact patch file
  3. In the OAV directory run git apply <path to patch file>
  4. Commit the patched changes and create a pull request
  5. Validate that the changes look ok and don't represent a breaking change in OAV
  6. Merge the PR
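
For example, applying a downloaded patch locally might look like this (the branch name and patch file name are illustrative):

git checkout -b update-regression-snapshots
git apply ~/Downloads/regression.patch
git commit -am "Update regression snapshots"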

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.