[doc] fixed typos and punctuation (#29225)

Fixed typos and added punctuation to improve the readability of README.md
files.

---------

Co-authored-by: Maor Leger <maorleger@users.noreply.github.com>
This commit is contained in:
Jana R 2024-06-25 19:32:57 +05:30 committed by GitHub
Parent 59c7f7d209
Commit e6db98ecb5
No key found matching this signature
GPG key ID: B5690EEEBB952194
20 changed files: 303 additions and 247 deletions


@ -9,7 +9,7 @@ a CLA and decorate the PR appropriately (e.g., status check, comment). Simply fo
provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or
For more information, see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or
contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
# How to contribute to the Azure SDK for JavaScript
@ -23,7 +23,7 @@ There are many ways that you can contribute to the Azure SDK for JavaScript proj
All code submissions will be reviewed and tested by the team, and those that meet a high bar for both quality and design/roadmap appropriateness will be merged into the source. Be sure to follow the existing file/folder structure when adding new boards or sensors.
If you encounter any bugs with the library please file an issue in the [Issues](https://github.com/Azure/azure-sdk-for-js/issues) section of the project.
If you encounter any bugs with the library, please file an issue in the [Issues](https://github.com/Azure/azure-sdk-for-js/issues) section of the project.
## Things to keep in mind when contributing
@ -35,7 +35,7 @@ Some guidance for when you make a contribution:
## Big contributions
If your contribution is significantly big it is better to first check with the project developers in order to make sure the change aligns with the long term plans. This can be done simply by submitting a question via the GitHub Issues section.
If your contribution is significantly big, it is better to first check with the project developers in order to make sure the change aligns with the long term plans. This can be done simply by submitting a question via the GitHub Issues section.
## Project orchestration
@ -181,7 +181,7 @@ If you modify the network calls (both the number of calls or their shape) either
Regenerating the recordings has the same requirements as running the live tests. You will be using the same `test` npm script with the environment variables pointing to previously created Azure resources. The only difference is that the `TEST_MODE` environment variable needs to be set to `record`. When this process finishes without errors, the recordings will be updated.
For more information the recorder, please visit the [test-recorder's readme](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/test-utils/recorder/README.md).
For more information about the recorder, please visit the [test-recorder's readme](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/test-utils/recorder/README.md).
Here are a few [Useful Commands](https://github.com/Azure/azure-sdk-for-js/wiki/Golden-Testing-Commands) that can be handy while testing your SDKs.
@ -314,7 +314,7 @@ nodeResolve({
### Package Versioning
For information about packages are versioned and tagged see [Javascript Releases](https://azure.github.io/azure-sdk/policies_releases.html#javascript)
For information about how packages are versioned and tagged, see [JavaScript Releases](https://azure.github.io/azure-sdk/policies_releases.html#javascript)
### Core Client libraries


@ -40,7 +40,7 @@ Newer versions of these libraries follow the [Azure SDK Design Guidelines for Ty
## Need help?
- For detailed documentation visit our [Azure SDK for JavaScript documentation](https://aka.ms/js-docs)
- For detailed documentation, visit our [Azure SDK for JavaScript documentation](https://aka.ms/js-docs)
- File an issue via [GitHub Issues](https://github.com/Azure/azure-sdk-for-js/issues)
- Check [previous questions](https://stackoverflow.com/questions/tagged/azure-sdk-js) or ask new ones on StackOverflow using the `azure-sdk-js` tag.
- Read our [Support documentation](https://github.com/Azure/azure-sdk-for-js/blob/main/SUPPORT.md).
@ -62,6 +62,6 @@ This project welcomes contributions and suggestions. Most contributions require
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repositories using our CLA.
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information, see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
![Impressions](https://azure-sdk-impressions.azurewebsites.net/api/impressions/azure-sdk-for-js%2FREADME.png)


@ -2,11 +2,11 @@
The Smoke Tests validate customer scenarios by creating an application which
uses package dependencies, loads all packages into a single process, and
executes code samples to ensure basic end to end scenarios work as expected.
executes code samples to ensure basic end-to-end scenarios work as expected.
Smoke Tests are meant to be run periodically in an Azure DevOps pipeline. See
[`smoke-tests.yml`](https://github.com/Azure/azure-sdk-for-js/blob/main/common/smoke-test/smoke-tests.yml) to configure Smoke Tests in an Azure
DevOps pipeline. When run in an Azure DevOps pipeline specify the `-CI` flag to
DevOps pipeline. When run in an Azure DevOps pipeline, specify the `-CI` flag to
ensure environment variables are properly set and error/warning messages are
properly surfaced during the execution.
@ -28,7 +28,7 @@ package.
## Configuring Samples
By default _all_ JavaScript samples are prepped, loaded, and executed. Samples
By default, _all_ JavaScript samples are prepped, loaded, and executed. Samples
are assumed to work with just the environment variables defined in
`test-resources.json` for the service. Samples which have additional resource
requirements should opt out of execution.


@ -18,7 +18,7 @@ There are cons to this approach as we consider bringing the JavaScript clients i
## Modular Design
We can create a more ala carte experience where the customer only pulls in the methods they need. With the Azure SDK for JavaScript, we can create a client previously using the `ServiceClient` pattern:
We can create a more a-la-carte experience where the customer only pulls in the methods they need. With the Azure SDK for JavaScript, we can create a client previously using the `ServiceClient` pattern:
```typescript
import { NotificationHubsClient } from "@azure/notification-hubs";
@ -40,7 +40,7 @@ const registrationId = await createRegistrationId(context);
The pros of this design are as follows:
- Fine grained exports to only give the customer what they need
- Fine-grained exports to only give the customer what they need
- Smaller bundle sizes for the customer.
- Tree shaking is easier for bundlers.
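To make the contrast concrete, here is a minimal, self-contained sketch of the function-per-operation style (the context shape and names are hypothetical stand-ins, not the real `@azure/notification-hubs` API):

```typescript
// Hypothetical stand-ins for a modular API surface; not the real package types.
interface ClientContext {
  endpoint: string;
  hubName: string;
}

// A factory function replaces the client class constructor.
function createClientContext(endpoint: string, hubName: string): ClientContext {
  return { endpoint, hubName };
}

// Each operation is a free function taking the context as its first argument,
// so a bundler can tree-shake any operation the application never imports.
async function createRegistrationId(context: ClientContext): Promise<string> {
  // A real implementation would call the service at context.endpoint.
  return `${context.hubName}-registration-1`;
}

const context = createClientContext("https://example.servicebus.windows.net", "myHub");
const registrationId = await createRegistrationId(context);
console.log(registrationId);
```

Because `createRegistrationId` is a plain export rather than a class method, a bundler can statically see whether it is ever imported.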
@ -54,7 +54,7 @@ The goals of enabling both the class based `ServiceClient` and modular developme
- **Do not introduce breaking changes**
- Keep the existing CommonJS compatibility.
- Do not pollute the top level index with all exports, keeping the top level index.ts as is
- Do not pollute the top-level index with all exports, keeping the top-level index.ts as is
- Expose exports at "namespaces" or export subpaths
## Subpath Exports
@ -68,7 +68,7 @@ Node added the following support for subpath exports in the following versions:
- `v12.7.0` - Introduce `"exports"` package.json field as a more powerful alternative to the classic `"main"` field
- `v12.0.0` - Add support for ES modules using `.js` file extension via `package.json` `"type"` field.
For subpaths, we can specify the path which is either a path or a wildcard path. Inside the path, we can specify following:
For subpaths, we can specify each entry as either a literal path or a wildcard path. Inside the path, we can specify the following:
- `types` - The TypeScript types for that path
- `require` - Common JS entry point
@ -94,7 +94,7 @@ For example, we could have our client and associated methods exposed as `@azure/
},
```
Then we could import those modules as the following:
Then, we could import those modules as the following:
```typescript
import { createClientContext, createOrUpdateInstallation } from "@azure/notification-hubs/api";
@ -115,7 +115,7 @@ Note the main exports of `.` can expose both ES-Modules and CommonJS such as the
This approach has a number of benefits:
- Does not pollute the top level index to export the world.
- Does not pollute the top-level index to export the world.
- Can support both class-based `ServiceClient` and modular development separately.
- Allows the SDK Team to ship experimental/beta features through its own subpath.
@ -133,7 +133,7 @@ First and foremost, we do not want to break the customers so they can continue t
The standard for Azure SDKs going forward is the following:
- `.` - Still expose all things at the top level exports for CommonJS and ES-Modules.
- `.` - Still expose all things at the top-level exports for CommonJS and ES-Modules.
- `/api` - The client context and single method exports
- `/models` - The models and factories
@ -171,7 +171,7 @@ This could then be imported such as the following:
import { createClient, createOrUpdateInstallation } from "@azure/notification-hubs/api";
```
For models and their associated factory functions should be in a `models` subpath. The models are then exported via the `models/index.js` file.
Models and their associated factory functions should be in a `models` subpath. The models are then exported via the `models/index.js` file.
```json
"./models": {
@ -188,7 +188,7 @@ import { Installation, createAppleInstallation } from "@azure/notification-hubs/
## Shipping Experimental Features
Another aspect of this design is to allow us to ship experimental features with modular development. We can ship preview features in either `experimental` or `preview` subpath exports which allow us to ship features that do not collide with our top level exports nor our standard APIs. Once these features have been approved for GA, they will be removed from the `experimental` or `preview` subpath.
Another aspect of this design is to allow us to ship experimental features with modular development. We can ship preview features in either `experimental` or `preview` subpath exports which allow us to ship features that do not collide with our top-level exports nor our standard APIs. Once these features have been approved for GA, they will be removed from the `experimental` or `preview` subpath.
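As a sketch, such an entry in `package.json` might look like the following (the build-output paths are assumptions, mirroring the `types`/`require`/`import` fields described earlier):

```json
"./experimental": {
  "types": "./types/src/experimental/index.d.ts",
  "require": "./dist/commonjs/experimental/index.js",
  "import": "./dist/esm/experimental/index.js"
}
```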
## Onboarding Azure SDK Packages to Subpath Exports
@ -214,7 +214,7 @@ To support subpath exports, the `package.json` requires the following changes.
],
```
- The `exports` must be specified for the top level export as well as subpaths. Wildcards or absolute paths can be specified.
- The `exports` must be specified for the top-level export as well as subpaths. Wildcards or absolute paths can be specified.
```json
"exports": {
@ -240,7 +240,7 @@ The TypeScript configuration `tsconfig.json` must also be updated with the follo
- Update the `module` setting to either `NodeNext` or `Node16`.
- Update the `moduleResolution` setting to either `NodeNext` or `Node16`.
- In some cases,the compiler needed `rootDir` set, so update `rootDir` to be `"."`.
- In some cases, the compiler needed `rootDir` set, so update `rootDir` to be `"."`.
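Taken together, the bullets above amount to a `tsconfig.json` fragment roughly like the following (other compiler options omitted):

```json
{
  "compilerOptions": {
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "rootDir": "."
  }
}
```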
### Source Code Changes
@ -252,7 +252,7 @@ Previously, we imported files such as the following:
import { isDefined } from "./utils";
```
And now we must specify the `.js` extension.
And now, we must specify the `.js` extension.
```typescript
import { isDefined } from "./utils.js";
@ -264,7 +264,7 @@ To support this in our packages, we should think about how we support a hybrid p
### Hybrid Design
For design considerations and code re-use, we could deploy a hybrid solution to this problem by using the single method exports and still expose the existing `ServiceClient` which simply acts as a proxy to the the underlying method.
For design considerations and code re-use, we could deploy a hybrid solution to this problem by using the single method exports and still expose the existing `ServiceClient` which simply acts as a proxy to the underlying method.
```typescript
import { NotificationHubsClientContext, createClientContext, createRegistrationId } from "@azure/notification-hubs/api";
@ -284,7 +284,7 @@ export class NotificationHubsServiceClient {
## Follow On Work
With this approach the following functions still work as expected:
With this approach, the following functions still work as expected:
- CI
- Linting


@ -4,9 +4,9 @@
In Azure [API design guideline](https://azure.github.io/azure-sdk/general_design.html#client-interface), we allow one client library to have multiple service clients, which can be defined with multiple @client decorators in TypeSpec. But our Rest Level Client (RLC) is meant to be lightweight and RESTful.
As you may know our JS next generation library [Modular](https://github.com/Azure/azure-sdk-for-js/blob/main/design/modular-development.md) is composited of classical client layer, api layer and rest layer. And one of our goals is to have customers like Azure Portal to use our libraries with the rest layer.
As you may know, our JS next generation library [Modular](https://github.com/Azure/azure-sdk-for-js/blob/main/design/modular-development.md) is composed of a classical client layer, an api layer and a rest layer. One of our goals is to have customers like Azure Portal use our libraries with the rest layer.
This document is going to talk about what the multi-client and multi-endpoint for our Modular would look like. we will introduce it from:
This document is going to talk about what the multi-client and multi-endpoint for our Modular would look like. We will introduce it from:
1. **_Definitions of Multi-Client and Multi-Endpoint_**
1. **_Design Principle_**
@ -50,7 +50,7 @@ ClientB
In the previous design, this applies to both multi-client and multi-endpoint.
In the multi-client case, the above design will split the RLC client into several parts and provide different sub path exports to it, each will contain a subset of operations related with it even if each of the sub path exports have the same endpoint. See the below examples of using the sub clients.
In the multi-client case, the above design will split the RLC client into several parts and provide different subpath exports for it; each will contain a subset of operations related to it, even if all of the subpath exports have the same endpoint. See the below examples of using the sub clients.
```typescript
import createMyMulticlient from "@azure/foo/ClientA/rest";
@ -60,18 +60,18 @@ import createMyMulticlient from "@azure/foo/ClientA/rest";
import createMyMulticlient from "@azure/foo/ClientB/rest";
```
But it will still cause problems because there's no guarantee that the api layer will never call operations across different rest level sub clients. If such case happens, we will be difficult to do the tree-shaking or even impossible to do that.
But it will still cause problems, because there's no guarantee that the api layer will never call operations across different rest level sub clients. If such a case happens, tree-shaking will be difficult or even impossible.
And as our goal is to have customers like Azure Portal use our RLC libraries, bundle size matters a lot to them. In cases like this, we must allow a subset of the api layer to call the complete set of the rest layer.
## Proposals
To resolve the above multi-client splitting RLC into several part issue, and keep RLC layer as a complete set, we will first need to put them in the top level `src/rest` folder and just provide the subpath export like `@azure/foo/rest`.
To resolve the above issue of multi-client splitting the RLC into several parts, and to keep the RLC layer as a complete set, we will first need to put it in the top-level `src/rest` folder and just provide the subpath export like `@azure/foo/rest`.
Now, if we keep the rest layer as the original design in the multi-endpoint case, we will found that.
If a customer works on multiple packages, which includes both multi-client case and multi-endpoint case. He or she will found that, sometimes the code is under `src/clientA/rest` folder and sometimes it's not. which could be confusing.
Now, if we keep the rest layer as in the original design for the multi-endpoint case, we will find a problem.
If a customer works on multiple packages, including both the multi-client case and the multi-endpoint case, he or she will find that sometimes the code is under the `src/clientA/rest` folder and sometimes it's not. That could be confusing.
Then in the case of multi-endpoint, we choose to have some folder structure like:
Then, in the case of multi-endpoint, we choose to have some folder structure like:
```text
src/rest/ClientA
@ -128,17 +128,17 @@ ClientB
First, let's consider some common questions about these two designs.
1. **_Default Export or Both Exports_**
As not all services will have a domain user scenario, sometimes they are equally important to their customers, it will be difficult to pick one between different sub clients as the default client, Also, it's possible that, sometimes one user scenario could be domain scenario, but as time goes by, their behavior could change, the other one may become a domain scenario. As such, we choose to use **named exports** for all sub clients.
As not all services will have a domain user scenario, and sub clients are sometimes equally important to their customers, it will be difficult to pick one among the different sub clients as the default client. Also, it's possible that one user scenario could be the domain scenario today, but as behavior changes over time, another one may become the domain scenario. As such, we choose to use **named exports** for all sub clients.
1. **_Models Subpath Export_**
We want to avoid exporting all our models to the top level, as this would obscure some key information about the API. Instead, we want a separate subpath for models, so that they don't clutter the API document and can still be imported by customers if needed.
1. **_Shared Models_**
In both multi-client and multi-endpoint cases, it's possible that we can have some models are shared by both api layer sub clients, As we will have the same models in both the classical client layer and api layer, we will put those models into `src/clientA/models` and `src/clientB/models` folder, And those shared models will be in `common/models`
In both multi-client and multi-endpoint cases, it's possible that some models are shared by both api layer sub clients. As we will have the same models in both the classical client layer and the api layer, we will put those models into the `src/clientA/models` and `src/clientB/models` folders, and the shared models will be in `common/models`.
1. **_Top Level `./models` and `./api` Subpath Export_**
If we export both sub clients to the top level, customers will have to choose to import between the top level and subpath export from sub client. And we only need one way to import these things without there being ambiguity about which way is correct. As such, we choose to remove top level subpaths as well as the index.ts files for both models and api.
If we export both sub clients to the top level, customers will have to choose to import between the top-level and subpath export from sub client. And, we only need one way to import these things without there being ambiguity about which way is correct. As such, we choose to remove top-level subpaths as well as the index.ts files for both models and api.
1. **_Rest Layer Export_**
We should also remove the `./rest` subpath export in Modular, and keep the rest layer as internal in Modular to avoid the following issues:
1. User experience inconsistent issues between cjs customer and esm customer as well as pure RLC customers.
1. To export RLC layer in Modular with RLC as a float dependency will somehow provide extra features for Modular customers without bumping any versions.
We should also remove the `./rest` subpath export in Modular, and keep the rest layer as internal in Modular to avoid the following issues:
1. User experience inconsistent issues between CJS customer and ESM customer as well as pure RLC customers.
1. Exporting the RLC layer in Modular, with RLC as a floating dependency, will somehow provide extra features for Modular customers without bumping any versions.
1. If there's a case where the RLC layer has a breaking change and we use renaming to avoid a break in the Modular layer, exporting the RLC layer will make it impossible to avoid that break in Modular.
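Under this proposal, the subpath exports of a hypothetical `@azure/foo` package might be sketched as follows (the package name, sub-client names, and build-output paths are all assumptions for illustration):

```json
"exports": {
  "./clientA": {
    "types": "./types/src/clientA/index.d.ts",
    "import": "./dist/esm/clientA/index.js",
    "require": "./dist/commonjs/clientA/index.js"
  },
  "./clientB": {
    "types": "./types/src/clientB/index.d.ts",
    "import": "./dist/esm/clientB/index.js",
    "require": "./dist/commonjs/clientB/index.js"
  }
}
```

Note the absence of a top-level `./models`, `./api`, or `./rest` entry, per the points above.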
## Finalized Proposals
@ -194,4 +194,4 @@ src/clientB/models
</table>
<!-- markdownlint-enable MD033 -->
The proposals should be the same in both the Multi-Client and Multi-Endpoint cases. But there's a difference in the rest layer between the Multi-Client and the Multi-Endpoint case. In the Multi-Client case, we will just have one `@azure-rest/foo` as the modular dependency that provides the rest api layer to the modular layer. In the Multi-Endpoint case, we will have both `@azure-rest/foo-clientA`, which provides the rest api layer of clientA to the modular layer sub-client clientA, and `@azure-rest/foo-clientB`, which provides the rest api layer of clientB to the modular layer sub-client clientB.


@ -16,7 +16,7 @@ We therefore require a system that enables us to extract the canonical TypeScrip
### Prior art: unit-testable snippets in the C# SDK
The Azure SDK for .NET authors README documentation snippets as unit tests in a samples directory, for example: [Text Analytics](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/textanalytics/Azure.AI.TextAnalytics/tests/samples/SampleSnippets.cs). Each unit test is decorated with a `#region` which is used to extract the snippet text and insert it into a README. The [Snippet Generator Tool](https://github.com/Azure/azure-sdk-tools/tree/main/tools/snippet-generator/Azure.Sdk.Tools.SnippetGenerator) performs the extraction & insertion.
The `README.md` has named snippet fences such as the following, and the names correspond to a given named region inside `SampleSnippets.cs`
@ -67,7 +67,7 @@ describe("snippets", function () {
});
```
The top level `describe` call defines a suite named `"snippets"`, and any nested `it` calls define unit tests where the name given is the name of the corresponding snippet. The above file defines a single snippet named `"GetConfigurationSetting"`. The snippet initializes the client using environment variables but coalesces those values to string literals if the environment variables are undefined. Because the resulting `setting` variable is unused, the `@ts-ignore` designation must be applied to prevent a compiler error.
The top-level `describe` call defines a suite named `"snippets"`, and any nested `it` calls define unit tests where the name given is the name of the corresponding snippet. The above file defines a single snippet named `"GetConfigurationSetting"`. The snippet initializes the client using environment variables but coalesces those values to string literals if the environment variables are undefined. Because the resulting `setting` variable is unused, the `@ts-ignore` designation must be applied to prevent a compiler error.
To create a code snippet in a README file or documentation comment, the snippet name is applied to the code fence (just as we saw in the Azure SDK for .NET example above):
@ -77,7 +77,7 @@ To create a code snippet in a README file or documentation comment, the snippet
Following the migration of a package to utilize this snippet extraction system, it is an error for any snippet to be unnamed. All JavaScript and TypeScript snippets **MUST** declare a name, and that name **MUST** match the name of a snippet in the `snippets.spec.ts` file.
The snippet extraction tool will then extract the text of the corresponding snippet within the `snippets.spec.ts` file, transpile and validate it, and then update the fence contents with the resulting JavaScript source. It automatically detects the relevant imports and replaces the `process.env` coalescent access expressions we saw above with _only_ the simple string example value from the right hand side.
The snippet extraction tool will then extract the text of the corresponding snippet within the `snippets.spec.ts` file, transpile and validate it, and then update the fence contents with the resulting JavaScript source. It automatically detects the relevant imports and replaces the `process.env` coalescent access expressions we saw above with _only_ the simple string example value from the right-hand side.
```typescript
import { DefaultAzureCredential } from "@azure/identity";
@ -92,7 +92,7 @@ const client = new ConfigurationClient(endpoint, credential);
const setting = await client.getConfigurationSetting(key);
```
This system works for JSDoc code snippets just as well as README code snippets. For example, for the `ConfigurationClient` class constructor, the following snippet is subject to automatic updating/replacement by the snippet extractor:
```typescript
/**
@ -136,7 +136,7 @@ The following set of files within a package directory is subject to snippet extr
- Any Markdown file (ending with `.md`) immediately within the package directory (not any subfolder).
- Any Markdown file (ending with `.md`) within the `samples-dev` folder of the package or any of its children with unlimited depth.
- Any TypeScript file (ending with `.ts` in the `src`, `test`, or `samples-dev` folders of the package or any of their children with unlimited depth.
- Any TypeScript file (ending with `.ts`) in the `src`, `test`, or `samples-dev` folders of the package or any of their children with unlimited depth.
A file may opt out of snippet extraction using a comment directive. Any files containing the following text are ignored:
@ -152,8 +152,8 @@ Snippets are extracted using the TypeScript compiler API. Strictly, a code snipp
1. The file `test/snippets.spec.ts` is read and parsed in the context of the package local to where `dev-tool` was invoked.
2. Each `CallExpression` where the called expression is the literal identifier `it` and the first argument is a `StringLiteral` and the second argument is a function expression (`ArrowFunction` or `FunctionExpression`) is treated as the definition of a snippet, where the snippet name is the extracted `StringLiteral` text.
3. If the second argument is an `ArrowFunction`, its body must be a `Block` (i.e. it is not allowed for the function's body to be an `Expression` as in `() => "test"`, it must be `() => { return "test"; }` instead).
4. Any `BinaryOperation` where the operator is `BarBarToken` or `QuestionMarkQuestionMarkToken` (i.e. a binary logical or operation or nullish coalescing operator) and where the left-hand-side expression is a `process.env` access expression will be replaced by the right hand side expression within the extracted `Block`.
5. The symbols within the `Block` are analyzed to determine whether or not they refer to the definitions within any imports (their type symbols resolve to an import source that is an `ImportSpecifier`), and corresponding `import` declarations are added to the beginning of the extracted `Block`
4. Any `BinaryOperation` where the operator is `BarBarToken` or `QuestionMarkQuestionMarkToken` (i.e. a binary logical or operation or nullish coalescing operator) and where the left-hand-side expression is a `process.env` access expression will be replaced by the right-hand-side expression within the extracted `Block`.
5. The symbols within the `Block` are analyzed to determine whether or not they refer to the definitions within any imports (their type symbols resolve to an import source that is an `ImportSpecifier`), and corresponding `import` declarations are added to the beginning of the extracted `Block`.
6. The contents of the extracted & modified `Block` are validated to ensure they do not contain any syntax that will not function on our minimum-supported Node.js target.
7. If the target language (as declared in the code fence) is `js` or `javascript`, the extracted & modified `Block` is transpiled to JavaScript using the same method we use for compiling samples.
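Step 4 above can be approximated with a much simpler text-based sketch. Note this regex version is only an illustration; as step 2 describes, the real tool performs the replacement on the parsed AST via the TypeScript compiler API:

```typescript
// Simplified, regex-based approximation of step 4: strip the
// `process.env.X || ` / `process.env.X ?? ` prefix so that only the
// right-hand-side fallback literal remains in the extracted snippet.
function replaceEnvCoalescing(snippet: string): string {
  return snippet.replace(
    /process\.env(?:\.\w+|\[["'][^\]]+["']\])\s*(?:\|\||\?\?)\s*/g,
    ""
  );
}

const input = 'const endpoint = process.env.ENDPOINT || "<endpoint>";';
console.log(replaceEnvCoalescing(input));
// const endpoint = "<endpoint>";
```

The AST-based implementation avoids false positives such as occurrences inside strings or comments, which this regex sketch would not handle.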


@ -14,7 +14,7 @@ First, download and install Node.js from the official website: https://nodejs.or
Once it is installed correctly, you will be able to use it with the `node` command on the command-line:
```
```sh
node --version
```
@ -22,32 +22,32 @@ node --version
The [Node Package Manager](https://npmjs.com) (npm) is included when you install Node. You can access it from the command-line, similar to Node:
```
```sh
npm --version
```
## Setting up your project
If you already have a project with a package.json file set up, skip to the next section. If not, first let's make a new directory for your project, and change into it.
If you already have a project with a `package.json` file set up, skip to the next section. If not, first let's make a new directory for your project, and change into it.
```
```sh
mkdir example
cd example
```
Now let's [set up a package.json file](https://docs.npmjs.com/creating-a-package-json-file) to configure npm:
Now, let's [set up a package.json file](https://docs.npmjs.com/creating-a-package-json-file) to configure npm:
```
```sh
npm init -y
```
Follow the prompts and npm will generate a starter [package.json](https://docs.npmjs.com/files/package.json) for you.
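The generated starter file typically resembles the following (exact fields vary with your npm version and answers):

```json
{
  "name": "example",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC"
}
```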
Now we can install Azure SDK packages. The Azure SDK is composed of many separate packages. You can pick and choose which you need based on the services you intend to use.
Now, we can install Azure SDK packages. The Azure SDK is composed of many separate packages. You can pick and choose which you need based on the services you intend to use.
For example, if you wish to use the Blob functionality provided by Azure's Storage service, you can install the `@azure/storage-blob` package:
```sh
npm install --save @azure/storage-blob
```
@ -59,7 +59,7 @@ Below we show examples of using three popular bundlers: [Webpack](https://webpac
First, you need to install [webpack](https://webpack.js.org/) globally:
```sh
npm install -g webpack webpack-cli
```
@ -77,13 +77,13 @@ const { BlobServiceClient } = require("@azure/storage-blob");
Now invoke webpack on the command-line:
```sh
webpack --mode=development
```
This will create a **bundled** version of your code along with the Azure SDK functionality your code depends on. It writes out the browser-compatible bundle to `dist/main.js` by default.
Now, you can use this bundle inside an html page via a script tag:
```html
<script src="./dist/main.js"></script>
@ -95,11 +95,11 @@ If you want to customize the name or location of your input file, the bundled fi
First, you need to install [TypeScript](https://typescriptlang.org) and a [Webpack loader](https://webpack.js.org/loaders/) for TypeScript:
```sh
npm install --save-dev typescript ts-loader
```
Now, let's create a very basic [tsconfig.json](https://www.typescriptlang.org/docs/handbook/tsconfig-json.html) file to configure TypeScript. If you've already configured TypeScript, you can skip this step. Save the following `tsconfig.json` file next to your `package.json` file you created earlier:
```json
{
@ -121,7 +121,7 @@ Similar to our JS example above, let's create an `index.ts` file that imports fr
```ts
// src/index.ts
import { BlobServiceClient } from "@azure/storage-blob";
// Now, do something interesting with BlobServiceClient :)
```
The last step we need to perform before we can run `webpack` and produce bundled output is to set up a basic `webpack.config.js` file:
@ -151,15 +151,15 @@ module.exports = {
};
```
Now, you can invoke webpack on the command-line:
```sh
webpack --mode=development
```
This will create a **bundled** version of your code plus the Azure SDK functionality that your code depends on and write it out to a `dist` subfolder in a file named `bundle.js` (as configured in `webpack.config.js`).
Now, you can use this bundled output file inside an html page via a script tag:
```html
<script src="./dist/bundle.js"></script>
@ -169,7 +169,7 @@ Now you can use this bundled output file inside an html page via a script tag:
First, you need to install [rollup](https://rollupjs.org/) globally:
```sh
npm install -g rollup
```
@ -185,7 +185,7 @@ const { BlobServiceClient } = require("@azure/storage-blob");
// Now do something interesting with BlobServiceClient :)
```
Next we need to configure Rollup to take the above code and turn it into a bundle. Save the following `rollup.config.js` file next to your `package.json` file you created earlier:
```js
// rollup.config.js
@ -234,19 +234,19 @@ The above configuration may need to change based on which SDK packages your code
We also need to install the plugins we referenced in the above file:
```sh
npm install --save-dev @rollup/plugin-node-resolve @rollup/plugin-commonjs @rollup/plugin-json rollup-plugin-shim
```
Now that we have our config file and necessary plugins installed, we can run rollup:
```sh
rollup --config
```
This will create a **bundled** version of your code along with the Azure SDK functionality your code depends on. It writes out the browser-compatible bundle to `dist/bundle.js` as configured above.
Now, you can use this bundle inside an html page via a script tag:
```html
<script src="./dist/bundle.js"></script>
@ -256,11 +256,11 @@ Now you can use this bundle inside an html page via a script tag:
First, you need to install [TypeScript](https://typescriptlang.org):
```sh
npm install --save-dev typescript
```
Next, let's create a very basic [tsconfig.json](https://www.typescriptlang.org/docs/handbook/tsconfig-json.html) file to configure TypeScript. If you've already configured TypeScript, you can skip this step. Save the following `tsconfig.json` file next to your `package.json` file you created earlier:
```json
{
@ -283,7 +283,7 @@ import { BlobServiceClient } from "@azure/storage-blob";
// Now do something interesting with BlobServiceClient :)
```
Next we need to configure Rollup to take the above code and turn it into a bundle. Save the following `rollup.config.js` file next to your `package.json` file you created earlier:
```js
// rollup.config.js
@ -334,13 +334,13 @@ The above configuration may need to change based on which SDK packages your code
We also need to install the plugins we referenced in the above file:
```sh
npm install --save-dev @rollup/plugin-node-resolve @rollup/plugin-commonjs @rollup/plugin-json rollup-plugin-shim rollup-plugin-typescript2
```
Now that we have our config file and necessary plugins installed, we can run rollup:
```sh
rollup --config
```
@ -356,7 +356,7 @@ Now you can use this bundled output file inside an html page via a script tag:
First, you need to install [parcel](https://parceljs.org/) globally:
```sh
npm install -g parcel-bundler
```
@ -396,7 +396,7 @@ const { BlobServiceClient } = require("@azure/storage-blob");
Now you can invoke parcel on the command-line:
```sh
parcel index.html
```
@ -404,7 +404,7 @@ This will bundle your code and create a local development server for your page a
If you wish to bundle your page without using the local development server, you can do this by passing the `build` command:
```sh
parcel build index.html
```
@ -424,11 +424,11 @@ Parcel uses [browserslist](https://github.com/browserslist/browserslist) to conf
Next, you need to install [TypeScript](https://typescriptlang.org):
```sh
npm install --save-dev typescript
```
Next, let's create a very basic [tsconfig.json](https://www.typescriptlang.org/docs/handbook/tsconfig-json.html) file to configure TypeScript:
```json
{
@ -467,7 +467,7 @@ and also an `index.html` that references it:
Now you can invoke parcel on the command-line:
```sh
parcel index.html
```
@ -475,7 +475,7 @@ This will bundle your code and create a local development server for your page a
If you wish to bundle your page without using the local development server, you can do this by passing the `build` command:
```sh
parcel build index.html
```


@ -10,4 +10,4 @@ The next-generation Azure JavaScript libraries introduce a few important changes
#### Tips:
1. **For more details on how to migrate the next-generation libraries, please visit the [migration guide](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/MIGRATION-guide-for-next-generation-management-libraries.md).**
1. **To get started, please visit the [quickstart guide](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/next-generation-quickstart.md).**
1. **For more sample code, please visit our [samples repo](https://github.com/Azure-Samples/azure-sdk-for-js-samples).**


@ -21,11 +21,11 @@ Join the [JavaScript - Reviews](https://teams.microsoft.com/l/channel/19%3a408c5
# Set up your development environment
Follow the [setup guide](https://github.com/Azure/azure-sdk-for-js/blob/main/CONTRIBUTING.md#prerequisites) for environment prerequisites in the `azure-sdk-for-js` repository.
# Identify your project's service and package name
The `service name` is a concise identifier for the Azure service and should be consistent across all SDK languages. It's typically the name of the directory in the `azure-rest-api-specs` repository containing your service's REST API definition.
The `package name` is used when publishing to [npmjs](https://www.npmjs.com/). It usually follows the format `@azure/{service-name}` or `@azure/{service-name}-{module}` for services with multiple modules.
@ -73,7 +73,7 @@ The `package name` is used when publishing to [npmjs](https://www.npmjs.com/). I
If you are generating the DPG library for Azure Cognitive Services Content Safety, and your TypeSpec configuration file is located at `https://github.com/Azure/azure-rest-api-specs/blob/46ca83821edd120552403d4d11cf1dd22360c0b5/specification/cognitiveservices/ContentSafety/tspconfig.yaml`, you would initialize the library like this:
```sh
tsp-client init -c https://github.com/Azure/azure-rest-api-specs/blob/46ca83821edd120552403d4d11cf1dd22360c0b5/specification/cognitiveservices/ContentSafety/tspconfig.yaml
```
@ -91,7 +91,7 @@ The `package name` is used when publishing to [npmjs](https://www.npmjs.com/). I
Run the `update` command from the SDK directory (e.g., `sdk/agrifood/agrifood-farming`) to regenerate the code:
```sh
tsp-client update
```
@ -103,9 +103,9 @@ The `package name` is used when publishing to [npmjs](https://www.npmjs.com/). I
3. **Edit rush.json**
As the libraries in the `azure-sdk-for-js` repository are managed by Rush, you need to add an entry in `rush.json` under the `projects` section the first time to make sure it works. For example:
```json
{
"packageName": "@azure/agrifood-farming",
"projectFolder": "sdk/agrifood/agrifood-farming",
@ -113,8 +113,8 @@ The `package name` is used when publishing to [npmjs](https://www.npmjs.com/). I
},
```
You also need to replace the `packageName` and `projectFolder` entries with your own service's values.
# After SDK generation
The generated code is not ready to release as-is; you need to update it for a better usage experience. Please follow the [steps after generation guide](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/steps-after-generations.md) to check the code.


@ -2,11 +2,11 @@
This document shows customers of the JavaScript/TypeScript management libraries how to migrate their code to the next-generation libraries.
**For new customers of the JavaScript/TypeScript SDK ([azure-sdk-for-js](https://github.com/Azure/azure-sdk-for-js)), please see [quick start for next generation](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/next-generation-quickstart.md).**
## Current status
Currently, we have released GA versions of selected services, including `@azure/arm-resources`, `@azure/arm-storage`, `@azure/arm-compute`, and `@azure/arm-network`. We are actively working on releasing more packages to eventually cover all Azure services. Please find the latest versions of those libraries on [npm](https://www.npmjs.com) and give them a try.
## Why switch to the next-generation
@ -15,7 +15,7 @@ Compared to the current management libraries, the next-generation libraries have
1. Authentication: The packages `@azure/ms-rest-nodeauth` or `@azure/ms-rest-browserauth` are no longer supported. Use package [@azure/identity](https://www.npmjs.com/package/@azure/identity) instead. Select a credential from Azure Identity examples based on the authentication method of your choice.
1. Callbacks: Method overloads that use callbacks have been replaced with Promise-based equivalents.
1. You can iterate the result of list operations by using the `PagedAsyncIterableIterator` interface; in the previous model, you had to make a new request using the link to the next page.
1. Interface and API change for long-running operations: To check the final result of the Poller object returned by long-running operations like `beginCreateOrUpdate`, please use `pollUntilDone` instead of `pollUntilFinished`. To get the final result directly, use the method with the suffix `AndWait`, e.g. `beginCreateOrUpdateAndWait`.
1. The SDK only supports ECMAScript 2015 (ES6) and beyond; all projects that reference this SDK should be upgraded to use ES6.
If you have an existing application that uses the JavaScript/TypeScript Azure SDK packages and you're interested in updating your application to use the next-generation SDKs, here are the things that you need to do for the migration:
@ -44,11 +44,11 @@ import { ClientSecretCredential } from "@azure/identity";
const credentials = new ClientSecretCredential(tenantId, clientId, clientSecret);
```
Please refer to [@azure/identity](https://www.npmjs.com/package/@azure/identity) for more details about `@azure/identity` and [migration guide from @azure/ms-rest-nodeauth to @azure/identity](https://github.com/Azure/ms-rest-nodeauth/blob/master/migrate-to-identity-v2.md) on how to migrate from `@azure/ms-rest-nodeauth`.
## Callbacks
In the current libraries, we have some operations that allow customers to use callbacks, such as
<!-- markdownlint-disable MD033 -->
<table>
@ -130,7 +130,7 @@ The below example shows how you could handle the list result in previous version
await client.availabilitySets.list(this.resourceName).then((response) => handle(response));
```
Now, you will get an iterator, and you need to iterate it to get the result.
```typescript
const result = client.availabilitySets.list(this.resourceName);


@ -65,7 +65,7 @@ generate-test: true
They only contain the basics for testing; you need to update them with your own utilities and test cases. The overall structure will be similar to the below:
_Note: the structure of the `test` folder has slight differences between high-level and rest-level clients. In HLC, we only have one file under the `test` folder which contains all contents. But in RLC, we separate the sample test and utils._
```
sdk/
@ -85,14 +85,14 @@ sdk/
## Run tests in record mode
Before running tests, it's advised to update the dependencies and build the project by running `rush update && rush build -t <package-name>`. Please note that this command is time-consuming (around 10 minutes); you can refer [here](https://github.com/Azure/azure-sdk-for-js/blob/main/CONTRIBUTING.md#resolving-dependency-version-conflicts) for more details.
```Shell
> rush update
> rush build -t @azure-rest/purview-catalog
```
Then, we could go to the project folder to run the tests. By default, if you don't specify `TEST_MODE`, it will run previously recorded tests.
```Shell
> cd sdk/purview/purview-catalog-rest
@ -111,7 +111,7 @@ If you are the first time to run tests you may fail with below message because t
[node-tests] RecorderError: Start request failed.
```
To record or update our recordings, we need to set the environment variable `TEST_MODE` to `record`. Then, run `rushx test`.
```Shell
# Windows
@ -136,7 +136,7 @@ This time we could get following similar logs. Go to the folder `purview-catalog
## Run tests in playback mode
If we have existing recordings generated by a previous test run, we can run the tests in `playback` mode.
```Shell
# Windows
@ -150,7 +150,7 @@ If we have existing recordings then the tests have been run against generated th
## How to push test recordings to assets repo
We need to push test recording files to the [assets repo](https://github.com/Azure/azure-sdk-assets) after testing your test cases.
`Notice`: Before pushing your recording files, you must confirm that you have write access to the `azure-sdk-assets` repo. See [Permissions to `Azure/azure-sdk-assets`](https://dev.azure.com/azure-sdk/internal/_wiki/wikis/internal.wiki/785/Externalizing-Recordings-(Asset-Sync)?anchor=permissions-to-%60azure/azure-sdk-assets%60).
### Push test recording
@ -164,11 +164,11 @@ npx dev-tool test-proxy init
```
Note: If you [install `dev-tool` globally](https://github.com/Azure/azure-sdk-for-js/tree/main/common/tools/dev-tool#installation), you don't need the `npx` prefix in the above command.
This command would generate an `assets.json` file with an empty tag.
Example `assets.json` with an empty tag:
```json
{
"AssetsRepo": "Azure/azure-sdk-assets",
"AssetsRepoPrefixPath": "js",
@ -177,20 +177,20 @@ Example `assets.json` with an empty tag:
}
```
After initializing the `assets.json` file, [run your tests in record mode](#run-tests-in-record-mode).
`Notice`: If you have already run tests in record mode before, you need to re-run the tests to make sure that your recordings can be pushed later.
Then, go to the next step to [Existing package - Tests have been pushed before](#existing-package---tests-have-been-pushed-before).
#### Existing package - Tests have been pushed before
At this point, you should have an `assets.json` file under your SDK. `sdk/<service-folder>/<package-name>/assets.json`.
With asset sync enabled, there is one extra step that must be taken before you create a PR with changes to recorded tests: you must push the new recordings to the assets repo. This is done with the following command:
`Notice`: the tests have to be recorded using the `TEST_MODE=record` environment variable in order for the recording files to be generated; then you can push them to the assets repo.
```bash
npx dev-tool test-proxy push
```
@ -206,15 +206,15 @@ You should stage and commit the `assets.json` update as part of your PR. If you
### How to find recording files
#### Find local recording files
You can find your local recording files in `./azure-sdk-for-js/.assets`.
If you want to locate your recordings quickly, you can open the `.breadcrumb` file and search for your package to find which folder it is in.
#### Find recording files in assets repo
You can get the tag in `assets.json` in your package root, which is a tag `pointing` to your recordings in the `Azure/azure-sdk-assets` repo.
Example `assets.json` from the `arm-network` SDK:
```json
{
"AssetsRepo": "Azure/azure-sdk-assets",
"AssetsRepoPrefixPath": "js",
@ -223,7 +223,7 @@ Example `assets.json` from "arm-network" SDK:
}
```
The recordings are located at https://github.com/Azure/azure-sdk-assets/tree/js/network/arm-network_bec01aa795.
# How to add tests
@ -233,11 +233,11 @@ Adding runnable tests requires both a good understanding of the service, and the
### Client authentication
There are several ways to authenticate to Azure; the most common are AzureAD OAuth2 authentication and API key authentication. Before adding tests, you are expected to know what your service supports and ensure that you or your service principal have the rights to perform the actions in the test.
#### AzureAD OAuth2 Authentication
If your service uses an AzureAD OAuth2 token for authentication, a common solution is to provide [an application and its service principal](https://docs.microsoft.com/azure/active-directory/develop/app-objects-and-service-principals) and to grant the service principal RBAC access to the Azure resource of your service.
The client requires the following three variables for a service principal using client ID/secret authentication:
@ -247,7 +247,7 @@ AZURE_CLIENT_ID
AZURE_CLIENT_SECRET
```
The recommended practice is to store these three values in environment variables called `AZURE_TENANT_ID`, `AZURE_CLIENT_ID`, and `AZURE_CLIENT_SECRET`. To set an environment variable, use the following commands:
```Shell
# Windows
@ -257,7 +257,7 @@ The recommended practice is to store these three values in environment variables
> export AZURE_TENANT_ID=<value>
```
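As a quick sanity check before recording live tests, you can verify that the variables are actually set; the helper below is our own illustration, not part of the SDK tooling:

```javascript
// Return the names of any service-principal variables that are not set.
function missingCredentialVars(env = process.env) {
  const required = ["AZURE_TENANT_ID", "AZURE_CLIENT_ID", "AZURE_CLIENT_SECRET"];
  return required.filter((name) => !env[name]);
}

// Report anything that still needs to be exported before recording.
const missing = missingCredentialVars();
if (missing.length > 0) {
  console.warn(`Set these before recording: ${missing.join(", ")}`);
}
```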
To ensure our recorder can record OAuth traffic, we have to leverage the `createTestCredential` helper to prepare a test credential. So, please follow the code snippet below to create your client.
```typescript
import { createTestCredential } from "@azure-tools/test-credential";
@ -268,7 +268,7 @@ const credential = createTestCredential();
new MyServiceClient(<endpoint>, credential);
```
To avoid storing sensitive info in the recordings (such as your Azure endpoints, keys, and secrets), we use sanitizers to mask the values with fake ones or remove them; `RecorderStartOptions` helps us here. In our generated sample file, we have the below sanitizers' code:
```typescript
const envSetupForPlayback: Record<string, string> = {
@ -289,22 +289,22 @@ await recorder.start(recorderEnvSetup);
#### API Key Authentication
API key authentication hits the service's endpoint directly, so this traffic will be recorded. It doesn't require any customization in tests. However, we must secure the sensitive data so it does not leak into our recordings, so add a sanitizer to replace your API keys. You can read more about adding sanitizers [here](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/test-utils/recorder/README.md).
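Conceptually, a sanitizer is just a substitution pass over the recording before it is saved. The helper below is a simplified illustration of that idea, not the recorder's actual implementation (which is configured through `RecorderStartOptions`):

```javascript
// Replace each secret value in a recording body with a stable fake value,
// so the secret never lands in the committed recording.
function sanitize(recordingBody, replacements) {
  let sanitized = recordingBody;
  for (const [realValue, fakeValue] of Object.entries(replacements)) {
    sanitized = sanitized.split(realValue).join(fakeValue);
  }
  return sanitized;
}

const body = "Ocp-Apim-Subscription-Key: abc123";
console.log(sanitize(body, { abc123: "fake-api-key" }));
// → Ocp-Apim-Subscription-Key: fake-api-key
```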
## Example 1: Basic RLC test interaction and recording for Azure data-plane service
In the code structure [section](#code-structure), we described the sample files we generate for you. If this is your first time writing test cases, you can build your own based on them.
This simple test creates a resource and checks that the service handles it correctly in the project `purview-catalog-rest`. Below are the steps:
- Step 1: Create your test file and add one test case with resource creation. Here, we have the purview catalog glossary test file `glossary.spec.ts` and one case named `Should create a glossary`. Alternatively, rename the `sampleTest.spec.ts` file and its case `sample test`.
- Step 2: Add the utility method `createClient` in `public/utils/recordedClient.ts` to share the `PurviewCatalogClient` creation.
- Call `createTestCredential` to init your credential and refer [here](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/test-utils/recorder/MIGRATION.md#aad-and-the-new-noopcredential) for more details.
- Wrap the `option` with test options by calling `recorder.configureClientOptions(options)`.
- Step 3: In `glossary.spec.ts` file, call `createClient` to prepare the client and call `client.path("/atlas/v2/glossary").post()` to create our glossary resource under our case `Should create a glossary`.
- Step 4 [Optional]: Specify environment variables that should be faked in the recordings in the map `envSetupForPlayback` under the file `public/utils/recordedClient.ts`.
- Step 5: In `glossary.spec.ts` file, add necessary assertions in your test case.
- Step 6: Run and record your test cases.
### `glossary.spec.ts`
@ -397,19 +397,19 @@ export function createClient(recorder: Recorder, options?: ClientOptions): Purvi
## Example 2: Basic HLC test interaction and recording for Azure management service
In the code structure [section](#code-structure), we described that if your SDK is generated based on HLC, we'll generate a sample test named `sampleTest.ts` for you.
Next we'll take the package `@azure/arm-monitor` as an example to guide you how to add your own test case. Below are the steps:
Next, we'll take the package `@azure/arm-monitor` as an example to guide you how to add your own test case. Below are the steps:
- Step 1: Create your test file and add one test case with resource creation. Here, we have the monitor test file `monitor.spec.ts` and one case named `Should create diagnosticSettings`. Alternatively, rename the `sampleTest.spec.ts` file and its case `sample test`.
- Step 2: Add declarations for common variables, e.g., the monitor client, its diagnostic name, and the subscription ID.
- Step 3: Create the monitor client in `beforeEach` and call `client.diagnosticSettings.createOrUpdate` in test case.
- Read the `subscriptionId` from `env`.
- Call `createTestCredential` to init your credential and refer [here](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/test-utils/recorder/MIGRATION.md#aad-and-the-new-noopcredential) for more details.
- Wrap the `option` with test options by calling `recorder.configureClientOptions(options)`.
- Step 4 [Optional]: Specify environment variables that would be faked in the recordings in the `envSetupForPlayback` map.
- Step 5: Add necessary assertions in your test case.
- Step 6: Run and record your test cases.
### `monitor.spec.ts`


@ -1,9 +1,9 @@
Getting Started - Generate the RLC REST-level client libraries with Swagger
===========================================================================
# Before you start
Please refer to this [link](https://github.com/Azure/azure-sdk-for-js/blob/main/CONTRIBUTING.md#prerequisites) for the environment setup prerequisites in the `azure-sdk-for-js` repository. We highly recommend reading [this blog](https://devblogs.microsoft.com/azure-sdk/azure-rest-libraries-for-javascript/) to get familiar with REST libraries for JavaScript.
:warning: Note: if you're generating from Cadl with RLC, please read [this doc](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/RLC-quickstart.md) for Cadl-specific details.
@ -18,11 +18,11 @@ If you are the first time to prepare the SDK, please follow the Azure SDK guidan
# How to generate RLC
We are working on automating everything right now, but currently, we still need some manual work to get a releasable package. Here are the steps to get the package.
1. **Create a swagger/README.md file under the ${PROJECT_ROOT} folder**
We are using autorest to generate the code, but there are a lot of command options. In order to make the regeneration process easier when refreshing the REST API input or changing the code generator version, you need to document the generation command parameters.
Here's an example of the `swagger/README.md`:
~~~
@ -88,18 +88,18 @@ We are working on to automatically generate everything right now, but currently
It's always recommended to use the latest version of the code generator `@autorest/typescript`, which you can find in [npmjs.com](https://www.npmjs.com/package/@autorest/typescript) under the `latest` tag.
If the `input-file` is followed by an `.md` file, you need to replace the `input-file` with `require`. If it is a `JSON` file, do not change it.
We enable sample generation by default; this may fail the generation due to example quality or codegen issues. You can turn this option off with `generate-sample: false` to unblock your process.
**After the first generation, you need to switch to `generate-metadata: false`, as we have some manual changes in this file and don't want them to get overwritten by generated ones.**
---
1. **Edit rush.json**
As the libraries in this `azure-sdk-for-js` repository are managed by rush, you need to add an entry in `rush.json` under the `projects` section for the first time to make sure it works. For example:
```json
{
"packageName": "@azure-rest/agrifood-farming",
"projectFolder": "sdk/agrifood/agrifood-farming-rest",
@ -107,7 +107,7 @@ We are working on to automatically generate everything right now, but currently
},
```
Here, you also need to replace `packageName` and `projectFolder` with your own service's values.
---
**NOTE**
@ -118,7 +118,7 @@ We are working on to automatically generate everything right now, but currently
1. **Run autorest to generate the SDK**
Now, you can run this command in the `swagger` folder you just created.
```shell
autorest --typescript ./README.md
@ -133,8 +133,8 @@ We are working on to automatically generate everything right now, but currently
cd <your-sdk-folder>
rushx pack
```
But we still need to add some tests for it.
# Improve README.md document
@ -151,12 +151,13 @@ See the [JavaScript Codegen Quick Start for Test](https://github.com/Azure/azure
To be able to leverage the asset-sync workflow:
- Install [PowerShell](https://github.com/PowerShell/PowerShell)
- Make sure the `pwsh` command works at this step (if you follow the above link, `pwsh` is typically added to the system environment variables by default)
- Add `dev-tool` to the `devDependencies` in the `package.json`.
The package you are migrating needs to be using the new version of the recorder that uses the test proxy (`@azure-tools/test-recorder@^3.0.0`).
Then, we need to generate an `assets.json` file. If your package is new or has never been pushed before, you can use the commands below:
```shell
npx dev-tool test-proxy init # this will generate assets.json file, you will get some info in this file.
```
@ -166,49 +167,62 @@ See the [JavaScript Codegen Quick Start for Test](https://github.com/Azure/azure
You could follow the [basic RLC test interaction and recording example](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/Quickstart-on-how-to-write-tests.md#example-1-basic-rlc-test-interaction-and-recording-for-azure-data-plane-service) to write your test step by step. Also, you could refer to [the test of MapsRouteClient](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/maps/maps-route-rest/test/public) for more cases.
2. **Run the test**
Now, you can run the test like this. If this is your first time running the test, you need to set the environment variable `TEST_MODE` to `record`. This will generate recordings for your test that can be used in `playback` mode.
On Linux, you could use `export` to set the env variable:
```shell
rush build -t ${PACKAGE_NAME}
export TEST_MODE=record && rushx test # this will run live test and generate a recordings folder, you will need to submit it in the PR.
```
On Windows, you could use `SET`:
```shell
rush build -t ${PACKAGE_NAME}
SET TEST_MODE=record&& rushx test # this will run live test and generate a recordings folder, you will need to submit it in the PR.
```
You can also run the test in `playback` mode if your APIs don't have breaking changes and you've already done the recording before.
On Linux, you could use below commands:
```shell
rush build -t ${PACKAGE_NAME}
export TEST_MODE=playback && rushx test # this will run the tests against your existing recordings
```
On Windows, you can use:
```shell
rush build -t ${PACKAGE_NAME}
SET TEST_MODE=playback&& rushx test # this will run the tests against your existing recordings
```
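The `TEST_MODE` environment variable above is what drives the recorder's behavior. A simplified model of the switch (the fallback to `live` is an assumption for illustration, not the recorder's documented default):

```typescript
type TestMode = "record" | "playback" | "live";

// Pick the recorder behavior from the TEST_MODE environment variable.
// Unrecognized or missing values fall back to "live" in this sketch.
function currentTestMode(env: Record<string, string | undefined>): TestMode {
  const mode = (env.TEST_MODE ?? "live").toLowerCase();
  return mode === "record" || mode === "playback" ? mode : "live";
}

console.log(currentTestMode({ TEST_MODE: "record" }));   // record: hit the live service and save recordings
console.log(currentTestMode({ TEST_MODE: "playback" })); // playback: replay saved recordings
```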
3. **Push recording to assets repo**
**Notice**:
- The tests have to be recorded using `TEST_MODE=record`; then the recording files will be generated.
- Before pushing your recording files, you must confirm that you are able to push recordings to the `azure-sdk-assets` repo; you need write access to the assets repo.
[Permissions to `Azure/azure-sdk-assets`](https://dev.azure.com/azure-sdk/internal/_wiki/wikis/internal.wiki/785/Externalizing-Recordings-(Asset-Sync)?anchor=permissions-to-%60azure/azure-sdk-assets%60)
Here is the command to push:
```shell
npx dev-tool test-proxy push
```
After the above command finishes, you can find your local recording files in `./azure-sdk-for-js/.assets`.
And if you want to find your recording on `assets repo`, you can get the tag in `assets.json` in your package root, which is a tag pointing to your recordings in the [Azure/azure-sdk-assets](https://github.com/Azure/azure-sdk-assets) repo.
Example `assets.json` from `keyvault-certificates` SDK.
```json
{
"AssetsRepo": "Azure/azure-sdk-assets",
"AssetsRepoPrefixPath": "js",
@ -216,31 +230,35 @@ See the [JavaScript Codegen Quick Start for Test](https://github.com/Azure/azure
"Tag": "js/keyvault/keyvault-certificates_43821e21b3"
}
```
And the recordings are located [here](https://github.com/Azure/azure-sdk-assets/tree/js/keyvault/keyvault-certificates_43821e21b3).
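That URL is derived mechanically from the `assets.json` fields: the recordings live in the assets repo at the path given by `Tag`. A small helper of our own (not part of `dev-tool`) showing the mapping:

```typescript
// Build the browse URL for recordings from the AssetsRepo and Tag fields of assets.json.
function recordingsUrl(assetsRepo: string, tag: string): string {
  return `https://github.com/${assetsRepo}/tree/${tag}`;
}

console.log(recordingsUrl("Azure/azure-sdk-assets", "js/keyvault/keyvault-certificates_43821e21b3"));
// → https://github.com/Azure/azure-sdk-assets/tree/js/keyvault/keyvault-certificates_43821e21b3
```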
# How to write samples
If you enable the `generate-sample: true` option, the codegen will do two things for you:
- Add samples metadata in `tsconfig.json` and `package.json`.
- Generate a collection of TypeScript sample files (based on `x-ms-examples` in OpenAPI specs) under `samples-dev` folder.
Please notice that the generated samples might not be directly usable as runnable code; however, we can get the basic idea of how the code works and update them to be more valuable samples.
The errors may come from two sources: codegen issues or swagger example issues. For the former, we need to report them to the codegen owner, while for the latter, we need to fix our swagger examples.
Now, you can generate both JavaScript and TypeScript workable samples with the following commands.
```shell
npm install -g common/tools/dev-tool # make sure you are in the azure-sdk-for-js repo root directory
cd ${PROJECT_ROOT}
npx dev-tool samples publish -f
```
You will see the workable samples in the `${PROJECT_ROOT}/samples` folder.
Besides the generated samples, we also recommend you add your HERO sample scenarios for your services to guide customers on how to use your library. You could refer to [the samples of MapsRouteClient here](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/maps/maps-route-rest/samples-dev) as an example.
# Format both the generated code and manual code
After you have finished the generation and added your own tests or samples, you can use the following command to format the code.
```shell
cd ${PROJECT_ROOT} && rushx format
```
@ -256,13 +274,14 @@ And we could use `lint:fix` if there are any errors.
```shell
cd ${PROJECT_ROOT} && rushx lint:fix
```
# How to do customizations
You may want to do your customizations based on the generated code. We have collected some common customization cases, and you can read [Customization on the RLC rest-level client libraries](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/RLC-customization.md) for more details.
# How to create package
Now, we can use the exact same steps to build a releasable artifact.
```shell
rush update
@ -272,11 +291,12 @@ export TEST_MODE=record && rushx test
rushx pack
```
You may send this artifact to your customer if your services are still in private preview and some customers want to try it out.
# Create/Update the ci.yaml
Now, if everything looks good to you, you can submit a PR in the `azure-sdk-for-js` repo with all the changes you made above. Before you do that, you need to add or update the `ci.yml` file, depending on whether there's already one in your package folder.
If there's no such file, then you can add the following template.
``` yaml
# NOTE: Please refer to https://aka.ms/azsdk/engsys/ci-yaml before editing this file.
@ -310,12 +330,11 @@ extends:
safeName: azurerestagrifoodfarming
```
Please change the `paths.include` value to your own project path, and change the Artifacts `name` and `safeName` to yours.
If there's already a `ci.yml` file in your project path, then the only thing you need to do is to add the Artifacts `name` and `safeName` of yours into that `ci.yml`.
Please notice that the Artifacts name should align with your package name. Here, the package name is `@azure-rest/agrifood-farming`, so the relevant Artifacts name is `azure-rest-agrifood-farming`.
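The alignment rule above can be sketched as a tiny helper (our own illustration; the CI tooling doesn't expose such a function): drop the leading `@` from the npm package name and replace `/` with `-`.

```typescript
// Derive the CI Artifacts name from an npm package name,
// e.g. "@azure-rest/agrifood-farming" -> "azure-rest-agrifood-farming".
function artifactName(packageName: string): string {
  return packageName.replace(/^@/, "").replace(/\//g, "-");
}

console.log(artifactName("@azure-rest/agrifood-farming")); // azure-rest-agrifood-farming
```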
# Prepare PR
@ -323,7 +342,7 @@ The codegen can only help you generate SDK code, there is something you need to
## CHANGELOG.md
CHANGELOG can help customers know the changes in the new version quickly, so you need to update it according to the changes in this new version. It is also necessary to update the release date, like `1.0.0-beta.1 (2022-11-11)` (a rough time is fine; it doesn't need to be very accurate).
## Version Number
@ -334,18 +353,20 @@ You shall update the version number according to [semantic versioning rule](http
Please ensure that your test recordings are committed together with your code.
## Fix CI for PR
You may meet the CI failures after submitting the PR, so please refer to [Troubleshoot CI Failure](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/Troubleshoot-ci-failure.md) to fix it.
## CC dpg-devs for review
Please add the comment below in your PR so that `dpg-devs` can review your PR in a timely manner.
```
cc @Azure/dpg-devs for awareness
```
# Create API View
When submitting a PR, our pipeline will automatically prepare the API view on the [API View Website](https://apiview.dev/). You can see an [example link](https://github.com/Azure/azure-sdk-for-js/pull/23866#issuecomment-1316259448) here. You can click the API view link in that comment for more details.
# Release


@ -12,9 +12,9 @@ source-code-folder-path: ./src/generated
## Custom authentication
Some services require a custom authentication flow. For example, Metrics Advisor uses Key Authentication; however, MA requires two headers for key authentication, `Ocp-Apim-Subscription-Key` and `x-api-key`, which is different from the usual key authentication that only requires a single key.
In this case, we customize as follows:
1. Hand author a `PipelinePolicy` that takes values for both keys and signs the request
2. Hand author a wrapping client factory function
@ -52,7 +52,7 @@ export default function createClient(
}
```
And in `metricsAdvisorKeyCredentialPolicy.ts` file, we have the customized policy and `createMetricsAdvisorKeyCredentialPolicy` function to create that policy.
```typescript
import {
@ -113,7 +113,7 @@ Eventhough the code generator provides a pagination helper for RLCs, there are s
One example is the Metrics Advisor service, which implements a pagination pattern in which getting the next page can be called with `GET` or `POST` depending on the resource.
The standard pagination pattern assumes `GET` for getting the next pages. In this case, we implemented a custom paginate helper that has the same public interface as the generated helper but under the hood has an additional pagination implementation to use `POST`. Also, this custom helper has an internal map that indicates which operations need `POST` and which need `GET`.
Here is the implementation in Metrics Advisor; remember to replace the `paginationMapping` with yours. The generated paging helper is hidden, and the custom paginate helper is exposed.
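Before looking at the real implementation, here is a self-contained sketch of the idea (operation paths, page shapes, and the fake transport are all invented for illustration): an internal `paginationMapping` decides whether the next page is fetched with `GET` or `POST`.

```typescript
interface Page { values: unknown[]; nextLink?: string }

// Hypothetical map: which operations need POST to fetch the next page.
const paginationMapping: Record<string, "GET" | "POST"> = {
  "/dataFeeds": "GET",
  "/alerts/query": "POST",
};

// A fake transport standing in for the client; returns two pages then stops.
async function send(method: "GET" | "POST", url: string): Promise<Page> {
  if (url.endsWith("?page=2")) return { values: [3, 4] };
  return { values: [1, 2], nextLink: url + "?page=2" };
}

// Iterate all items, choosing GET or POST for next pages based on the mapping.
async function* paginate(operationPath: string): AsyncGenerator<unknown> {
  const method = paginationMapping[operationPath] ?? "GET";
  let page: Page | undefined = await send(method, operationPath);
  while (page) {
    yield* page.values;
    page = page.nextLink ? await send(method, page.nextLink) : undefined;
  }
}

(async () => {
  const items: unknown[] = [];
  for await (const item of paginate("/alerts/query")) items.push(item);
  console.log(items); // all four items across both pages
})();
```

The real helper in Metrics Advisor wraps the generated client instead of a fake transport, but the control flow is the same.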
@ -204,7 +204,7 @@ for await (const dataFeed of dataFeeds) {
There may be times in which transforming the data from the service would be beneficial. When a transformation is common for our customers, we may decide to expose helper transformation functions. These helper transformations are optional, and customers can decide whether to use them; the calls maintain the original data form from the service.
If we export `toDataFeedDetailResponse`, which may convert the REST model to a common one so that the customers could call this way:
```typescript
import MetricsAdvisor, { toDataFeedDetailResponse } from "@azure-rest/ai-metricsadvisor";
@ -222,7 +222,7 @@ const formattedDatafeed = toDataFeedDetailResponse(listResponse);
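A toy version of such a helper (field names are invented; the real `toDataFeedDetailResponse` shape differs): it reshapes the raw REST payload into a friendlier model while leaving the original response untouched, so customers can opt in or keep the raw form.

```typescript
// Raw REST shape as returned by the service (hypothetical fields).
interface RawDataFeed { dataFeedId: string; dataFeedName: string; createdTime: string }

// Friendlier convenience shape for customers (also hypothetical).
interface DataFeedDetail { id: string; name: string; createdOn: Date }

// Optional helper: customers can call it, or keep working with the raw response.
function toDataFeedDetail(raw: RawDataFeed): DataFeedDetail {
  return { id: raw.dataFeedId, name: raw.dataFeedName, createdOn: new Date(raw.createdTime) };
}

const detail = toDataFeedDetail({
  dataFeedId: "df-1",
  dataFeedName: "sales",
  createdTime: "2022-11-11T00:00:00Z",
});
console.log(detail.name); // sales
```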
## Multi-client packages
There are cases where two services are so closely related that most users will need to use both in the same application. In this case, we may opt for multi-client packages. Each client can be imported individually without a top-level client; this is to work nicely with bundler tree-shaking.
We could leverage the autorest batch option and enable the multi-client flag in our `README.md` to generate two or more service clients.
@ -240,7 +240,7 @@ batch:
### Specify configurations for each individual clients
For each individual client, specify your client name and swagger file. Make sure that you don't have one Swagger with operations that are designed to be in two different clients, so that clients correspond to a clear set of Swagger files.
Normally, the folder structure would be something like `sdk/{servicename}/{servicename}-{modulename}-rest`. For example, we have `sdk/agrifood/agrifood-farming-rest` folder for Farmbeats account modules. That folder will be your **${PROJECT_ROOT} folder**.
@ -262,7 +262,7 @@ input-file: /your/swagger/folder/metricsadvisor-admin.json
### Generate code with `--multi-client`
When generating the code, specify that we want multi-client by appending the `--multi-client` flag on the command line. After generation, the folder structure will look like below:
```
${PROJECT_ROOT}/
@ -283,6 +283,7 @@ import {
MetricsAdvisorAdministrationClient,
MetricsAdvisorClient,
} from "@azure-rest/ai-metrics-advisor";
const adminClient = MetricsAdvisorAdministrationClient.createClient(endpoint, credential);
// call any admin operation
const createdResponse = await adminClient.createDataFeed(`<parameter>`);
@ -297,4 +298,4 @@ Our customization strategy has the following principles:
- Expose custom functionality as helper functions that users can opt in to
- Never force customers to use a customized function or operation
- The only exception is if we need to add custom policies to the client. It is okay to wrap the generated client factory and expose the wrapped factory instead of the generated one.


@ -23,13 +23,13 @@ Follow the [setup guide](https://github.com/Azure/azure-sdk-for-js/blob/main/CON
# Identify your project's service and package name
The `service name` is a concise identifier for the Azure service and should be consistent across all SDK languages. It's typically the name of the directory in the `azure-rest-api-specs` repository containing your service's REST API definition.
The `package name` is used when publishing to [npmjs](https://www.npmjs.com/). It usually follows the format `@azure/{service-name}-rest` or `@azure/{service-name}-{module}-rest` for services with multiple modules.
# Structure your project
1. **SDK Repo Root**: the generated libraries should be in the [azure-sdk-for-js](https://github.com/Azure/azure-sdk-for-js) repo, so fork and clone it locally. Then, the absolute path is called the **${SDK_REPO_ROOT} folder**.
1. **Project Folder Structure**: the typical structure is `sdk/{servicename}/{servicename}-{modulename}-rest`, e.g., `sdk/agrifood/agrifood-farming-rest`. That folder is under {SDK_REPO_ROOT} and will be your **${PROJECT_ROOT} folder**.
@ -99,9 +99,9 @@ The `package name` is used when publishing to [npmjs](https://www.npmjs.com/). I
3. **Edit rush.json**
As the libraries in the `azure-sdk-for-js` repository are managed by rush, you need to add an entry in `rush.json` under the `projects` section for the first time to make sure it works. For example:
```json
{
"packageName": "@azure-rest/agrifood-farming",
"projectFolder": "sdk/agrifood/agrifood-farming-rest",
@ -109,7 +109,7 @@ The `package name` is used when publishing to [npmjs](https://www.npmjs.com/). I
},
```
Here, you also need to replace `packageName` and `projectFolder` with your own service's values.
---
**NOTE**


@ -1,14 +1,17 @@
# Overview
This doc shows some common problems and their resolutions in CI.
# Broken links
![image](./images/broken-links.png)
Add the broken links into the [eng/ignore-links.txt](https://github.com/Azure/azure-sdk-for-js/blob/main/eng/ignore-links.txt) file to bypass this verification, or update the broken links to valid ones; see an [example PR here](https://github.com/Azure/azure-sdk-for-js/pull/23429/commits/1a7b74c4bdad27e423a355a4c7f3dde4ac3c83bc).
# Check spelling (cspell)
For new services, this error usually happens. Fix the spelling in code or in markdown in the [.vscode/cspell.json](https://github.com/Azure/azure-sdk-for-js/blob/main/.vscode/cspell.json) file. See an example in this [devcenter PR](https://github.com/chrissmiller/azure-sdk-for-js/commit/ef18dccae59e98185e3854f8b087230b65735744).
# Push failure
@ -16,14 +19,14 @@ For new service the error usually happens, Fix spelling in code or in markdown a
![image](./images/Push-failure.png)
Install [PowerShell](https://github.com/PowerShell/PowerShell). Make sure the `pwsh` command works at this step (if you follow the above link, `pwsh` is typically added to the system environment variables by default).
## Authorization issue
![image](./images/Authorization-issue.png)
If you are from a service team, external to `azure-sdk`, you can follow these steps:
1. To request write access, join an appropriate team from [this list](https://github.com/orgs/Azure/teams?query=azure-sdk-write-) ([same list](https://repos.opensource.microsoft.com/teams?q=azure-sdk-write-) on the MS OpenSource portal) that corresponds with the product or org that you work in. **Be sure to join only one team.**
2. If you don't see your team in the list, contact **Scott Kurtzeborn**: <scotk@microsoft.com> to create a new one for your service team.


@ -1,38 +1,43 @@
In this document, we will give a brief introduction on how to use the JavaScript SDK for new users.
1. Prepare your environment.
Node.js: can be installed from https://nodejs.org/en/download/.
TypeScript: install it with `npm install -g typescript`.
1. Create an empty folder and `cd` into it.
```
mkdir jstest
cd jstest
```
1. Initialize a new node project.
```
npm init
```
This step will create a `package.json` file in the current folder.
1. Install dependencies.
```
// install @azure/identity, you can use @azure/identity to do the authentication work.
npm install @azure/identity
// Then install the target package you want to try out; you can install the latest published version with
npm install @azure/arm-XXX
// or install it from your local JS SDK artifact file.
npm install D:\\dev\\test\\test-compute\\azure-arm-XXX-1.0.0.tgz
```
In the case of verifying unpublished packages, you may download the artifact either from the REST API specs CI pipeline or from the release request issue that we provided.
1. Create a TypeScript file (any name; copy the following code into it), e.g., `test_1.ts`:
```ts
import { DefaultAzureCredential } from "@azure/identity";
import { TargetManagementClient } from "@azure/arm-target";
@ -48,26 +53,35 @@ In this document, we will give a brief introduction on how to use the JavaScript
}
test();
```
In the example, we only call `client.operations.list()`; you may change it to other resources' CRUD functions per your need.
For example:
```ts
const client = new ComputeManagementClient(credentials, subscriptionID);
const result = await client.galleries.beginCreateOrUpdateAndWait(resourceGroupName, galleryName, gallery);
const imageResult = await client.galleryImages.beginCreateOrUpdateAndWait(resourceGroupName, galleryName, galleryImageName, galleryImage);
```
1. Install all the dependencies.
```shell
npm install # make sure package.json exists and contains the dependencies added at step 4
```
1. Compile the ts file.
```shell
tsc test_1.ts
```
It will generate a `test_1.js` file in the current folder.
1. Run the code.
```shell
node test_1.js
```
Now, you can see the expected response.
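If `tsc` reports module-resolution errors when compiling the file directly, you can instead drop a minimal `tsconfig.json` into the folder and run `tsc` with no arguments; a sketch (the exact compiler options depend on your setup):

```json
{
  "compilerOptions": {
    "target": "ES2019",
    "module": "commonjs",
    "moduleResolution": "node",
    "esModuleInterop": true,
    "strict": true
  }
}
```

With this file present, running `tsc` compiles every `.ts` file in the folder.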

Getting Started - Using the next-generation management libraries of Azure SDK for JavaScript/TypeScript
=============================================================
We are excited to announce the GA of a new set of management plane libraries for JavaScript/TypeScript. Those libraries contain a number of new features, including Azure Identity support, HTTP pipeline, error handling, etc., and follow the new Azure SDK guidelines, which create easy-to-use
APIs that are idiomatic, compatible, and dependable. See [TypeScript Design Guidelines](https://azure.github.io/azure-sdk/typescript_design.html) for more information.
Currently, we have released GA versions of several packages, such as `@azure/arm-resources`, `@azure/arm-storage`,
`@azure/arm-compute`, and `@azure/arm-network`, for next-generation. Please find the latest version of those libraries on [npm](https://www.npmjs.com) and have a try.
In this basic quickstart guide, we will walk you through how to
authenticate to Azure and start interacting with Azure resources. There are several possible approaches to
As an example, to install the Azure Compute module, you would run:
```sh
npm i @azure/arm-compute@latest
```
You can always find the latest preview version of our next-generation management libraries on [npm](https://www.npmjs.com) under the `next` tag of each package.
We also recommend installing other packages for authentication and core functionalities:
Interacting with Azure Resources
--------------------------------
Now that we are authenticated and have created our clients, we can use our client to make API calls. For resource management scenarios, most of our cases are centered around creating, updating, reading, and deleting Azure resources. Those scenarios correspond to what we call "operations" in Azure. Once you are sure which operations you want to call, you can then implement the operation call using the management client we just created in the previous section.
In the following samples, we are going to show:
- **Step 1**: How to create a simple resource, a Resource Group.
- **Step 2**: How to manage a Resource Group with the Azure SDK for JavaScript/TypeScript.
- **Step 3**: How to create a complex resource, a Virtual Machine.
Let's show what our final code looks like.
Example: Creating a Resource Group
---------------------------------


We will go into each step in the following sections.
## 1. Initialize the Client
First, import the client:
```typescript
import ExampleClient from "@azure-rest/example-client";

const client = ExampleClient("https://example.org/", new DefaultAzureCredential());
```
## 2. Send a request
Once the client has been initialized, we need to set a path to work with. For this, the REST client exposes 2 functions: `path` and `pathUnchecked`.
### Path
The `path` function takes a string as the first parameter and accepts any path documented by the service; this function will help with autocomplete to discover all available paths. It also detects whether the path needs parameters and makes them required positional parameters to `path`. Once the path is set, users can access functions for all the supported verbs on that path.
```typescript
import ExampleClient from "@azure-rest/example-client";

const client = ExampleClient("https://example.org/", new DefaultAzureCredential());
// Send a GET request to the path and inspect the typed response body.
const response = await client.path("/hello").get();
console.log(response.body);
```
### PathUnchecked
The `pathUnchecked` function is similar to `path`; it takes a path as the first parameter, which can be any arbitrary path. It also detects whether the path needs path parameters and requires them as positional parameters to `pathUnchecked`. Once the path is set, users can access functions for any verb on that path.
The main difference from `path` is that `pathUnchecked` doesn't have strongly typed payloads, headers, or query parameters, and has `any` as the response type.
```typescript
const response = await client.path("/hello").post({body: {content: "Brian"}});
console.log(response.status);
// 200
```
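Under the hood, the positional arguments to `path` and `pathUnchecked` are substituted into the path template. A hypothetical sketch of that substitution (`buildPath` is illustrative, not part of the client API):

```typescript
// Hypothetical sketch: fill a path template such as "/resources/{id}"
// with positional arguments, URL-encoding each value.
function buildPath(template: string, ...args: string[]): string {
  let i = 0;
  return template.replace(/\{[^}]+\}/g, () => encodeURIComponent(args[i++]));
}

console.log(buildPath("/resources/{id}/tags/{tag}", "vm 1", "env"));
// → /resources/vm%201/tags/env
```

A template without placeholders is returned unchanged, which mirrors paths that take no parameters.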
### Headers and Query Parameters
```typescript
// Query parameters and headers are passed in the request options
// (the exact options below are illustrative).
const hello = await client
  .path("/hello")
  .get({ queryParameters: { format: "text" } });
console.log(hello.body);
// {content: "Hello"}
```
## 3. Handle the Response


You could follow the [basic RLC test interaction and recording example](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/Quickstart-on-how-to-write-tests.md#example-1-basic-rlc-test-interaction-and-recording-for-azure-data-plane-service) to write your test step by step.
Also, you could refer to the examples below for more cases:
- RLC example: [OpenAI Testing](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/openai/openai/test/public)
- DPG example: [Maps Route Testing](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/maps/maps-route-rest/test/public)
2. **Run the test**
Now, you can run the test. If this is your first time running the test, you need to set the environment variable `TEST_MODE` to `record`. This will generate recordings for your test that can be used in `playback` mode.
On Linux, you could use `export` to set the environment variable:
```shell
rush build -t ${PACKAGE_NAME}
export TEST_MODE=record && rushx test # this will run live test and generate a recordings folder, you will need to submit it in the PR.
```
On Windows, you could use `SET`:
```shell
rush build -t ${PACKAGE_NAME}
SET TEST_MODE=record&& rushx test # this will run live test and generate a recordings folder, you will need to submit it in the PR.
```
You can also run the test in `playback` mode if your APIs don't have breaking changes and you've already done the recording before.
On Linux, you could use the below commands:
```shell
rush build -t ${PACKAGE_NAME}
export TEST_MODE=playback && rushx test # this will run the tests against the existing recordings
```
On Windows, you can use:
```shell
rush build -t ${PACKAGE_NAME}
SET TEST_MODE=playback&& rushx test # this will run the tests against the existing recordings
```
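Conceptually, the test recorder picks its behavior from the `TEST_MODE` environment variable. A hypothetical sketch of that decision (the real `@azure-tools/test-recorder` logic is more involved):

```typescript
type TestMode = "live" | "record" | "playback";

// Hypothetical sketch: normalize TEST_MODE, defaulting to playback
// when the variable is unset or unrecognized.
function resolveTestMode(raw: string | undefined): TestMode {
  const mode = (raw ?? "playback").toLowerCase();
  if (mode === "record" || mode === "live") {
    return mode;
  }
  return "playback";
}

console.log(resolveTestMode("record")); // → record
console.log(resolveTestMode(undefined)); // → playback
```

Defaulting to `playback` matches the usual CI setup, where recordings already exist and live calls are avoided.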
# How to write samples
We highly encourage you to write some valid samples for your customers to get started with your service libraries. You may author TypeScript samples under the `samples-dev` folder. For a quick start, you can use the [sample-dev template](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/template/template/samples-dev) as a reference and update the relevant information for your service, such as the package name, sample code, description, etc.
To learn more, you could refer to the below samples:
- DPG sample: [the samples of OpenAIClient](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/openai/openai/samples-dev)
- RLC sample: [the samples of MapsRouteClient](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/maps/maps-route-rest/samples-dev)
After the `samples-dev` folder change is finished, you will need to change the `tsconfig.json` to make sure the dev samples can be compiled and built correctly.
You will need to add this part to the `compilerOptions` of your `tsconfig.json` file so that the Samples engine can resolve the `sample-dev` package against the source code of the SDK.
```json
"paths": { "@azure/agrifood-farming": ["./src/index"] }
```
And change the *"include"* part to:
```json
"include": ["./src/**/*.ts", "./test/**/*.ts", "samples-dev/**/*.ts"],
```
Then, we provide tools to automatically change it into workable samples in both TypeScript and JavaScript. You just need to add a `sampleConfiguration` in your `package.json`.
You will need to add a sample configuration section in your `package.json` file and put the following content into it.
```json
"//sampleConfiguration": {
  "productName": "A description of your services"
}
```
Now, you can generate both JavaScript and TypeScript workable samples with the following commands.
```shell
npm install -g common/tools/dev-tool # make sure you are in the azure-sdk-for-js repo root directory
cd ${PROJECT_ROOT}
npx dev-tool samples publish -f
```
You will see the workable samples in the `${PROJECT_ROOT}/samples` folder.
# Format both the generated code and manual code
After you have finished the generation and added your own tests or samples, you can use the following command to format the code.
```shell
cd ${PROJECT_ROOT} && rushx format
```
Also, we recommend you run the `lint` command to analyze your code and quickly find any problems.
```shell
cd ${PROJECT_ROOT} && rushx lint
```

You can also automatically fix some of the problems it finds:

```shell
cd ${PROJECT_ROOT} && rushx lint:fix
```
# How to create a package
Now, we can use the exact same steps to build a releasable artifact.
```shell
rush update
cd <your-sdk-folder>
export TEST_MODE=record && rushx test
rushx pack
```
You may send this artifact to your customer if your services are still in private preview and some customers want to try it out.
# Create/Update the ci.yaml
Now, if everything looks good to you, you can submit a PR in the `azure-sdk-for-js` repo with all the changes you made above. Before you do that, you need to add or update the `ci.yml` file, depending on whether there's already one in your package folder.
If there's no such file, then you can add the following template.
```yaml
# NOTE: Please refer to https://aka.ms/azsdk/engsys/ci-yaml before editing this file.
# (abbreviated; see the link above for the full schema)
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - sdk/agrifood/agrifood-farming

extends:
  template: /eng/pipelines/templates/stages/archetype-sdk-client.yml
  parameters:
    ServiceDirectory: agrifood
    Artifacts:
      - name: azure-agrifood-farming
        safeName: azureagrifoodfarming # azureagrifoodfarming for DPG; azurerestagrifoodfarming for RLC
```
Please change the `paths.include` value to your own project path, and change the Artifacts `name` and `safeName` to yours.
If there's already a `ci.yml` file in your project path, then the only thing you need to do is add your own Artifacts `name` and `safeName` to that `ci.yml`.
Please note that the Artifacts `name` should align with your package name. Here, the package name is `@azure/agrifood-farming`, so the relevant Artifacts `name` is `azure-agrifood-farming`.
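The mapping from package name to Artifacts `name` and `safeName` is mechanical; a hypothetical helper illustrating it (`artifactName`/`safeName` are illustrations, not part of the tooling):

```typescript
// Hypothetical sketch: derive the CI Artifacts name from an npm package name,
// e.g. "@azure/agrifood-farming" -> "azure-agrifood-farming".
function artifactName(packageName: string): string {
  return packageName.replace(/^@/, "").replace(/\//g, "-");
}

// The safeName is the artifact name with the hyphens removed,
// e.g. "@azure-rest/agrifood-farming" -> "azurerestagrifoodfarming".
function safeName(packageName: string): string {
  return artifactName(packageName).replace(/-/g, "");
}

console.log(artifactName("@azure/agrifood-farming")); // → azure-agrifood-farming
console.log(safeName("@azure/agrifood-farming")); // → azureagrifoodfarming
```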
# Prepare PR
The TypeScript emitter can only help you generate SDK code; there are some things you need to update manually:
## CHANGELOG.md
The CHANGELOG helps customers quickly understand what changed in a new version, so you need to update it according to the changes in this version. It is also necessary to update the release date, like `1.0.0-beta.1 (2022-11-11)` (a rough date is fine; it doesn't need to be very accurate).
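For reference, a typical beta entry might look like this (the section names follow the common Azure SDK changelog layout; the content is illustrative):

```markdown
## 1.0.0-beta.2 (2024-06-01)

### Features Added

- Added support for listing operations.

### Bugs Fixed

- Fixed request serialization for long-running create operations.
```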
## Version Number
You shall update the version number according to semantic versioning rules.
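For example, a follow-up beta release moves `1.0.0-beta.1` to `1.0.0-beta.2`, bumping only the prerelease counter. A tiny helper illustrating that rule (hypothetical, not part of the release tooling):

```typescript
// Hypothetical sketch: bump the beta suffix of a prerelease version string.
// "1.0.0-beta.1" -> "1.0.0-beta.2"; non-beta versions are returned unchanged.
function bumpBeta(version: string): string {
  const match = version.match(/^(\d+\.\d+\.\d+-beta\.)(\d+)$/);
  if (!match) {
    return version;
  }
  return `${match[1]}${Number(match[2]) + 1}`;
}

console.log(bumpBeta("1.0.0-beta.1")); // → 1.0.0-beta.2
```

A stable release would instead bump the major, minor, or patch segment depending on the nature of the changes.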
After [writing and running test cases](#how-to-write-test-for-dpgrlc), you need to push the recordings to [assets repo](https://github.com/Azure/azure-sdk-assets). Please refer to [push recording guide](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/Quickstart-on-how-to-write-tests.md#how-to-push-test-recordings-to-assets-repo) to push recordings.
## Fix CI for PR
You may meet the CI failures after submitting the PR, so please refer to [Troubleshoot CI Failure](https://github.com/Azure/azure-sdk-for-js/blob/main/documentation/Troubleshoot-ci-failure.md) to fix it.
## CC dpg-devs for review
Please add the below comment in your PR to include `dpg-devs` to review your PR in a timely manner.
```
cc @Azure/dpg-devs for awareness
```
# Create API View
When submitting a PR, our pipeline will automatically prepare the API view on the [API View website](https://apiview.dev/). You could see an [example link](https://github.com/Azure/azure-sdk-for-js/pull/23866#issuecomment-1316259448) here. Then, you could click the API view link in that comment to learn more details.
# Release


behalf of a specific user. The user may grant permission to your application
unless the permission requires administrator consent.
If you are only using _confidential credentials_, you should only need to be
concerned with application permissions. If you will be authenticating users
with a _public credential_, you must configure API permissions for the Azure
service you need to access (Key Vault, Storage, etc.) so that user accounts can
To use this credential, you will need to create a client secret using the
"Certificates & secrets" page for your app registration.
The `ClientCertificateCredential` implements the same client credentials flow,
but instead uses a certificate as the means to authenticate the client. You must
generate your own PEM-formatted certificate for use in this flow and then
[register
it](https://learn.microsoft.com/azure/active-directory/develop/active-directory-certificate-credentials#register-your-certificate-with-azure-ad)
in the "Certificates & secrets" page for your app registration. Using a
certificate to authenticate is recommended as it is generally more secure than
using a client secret.
> NOTE: At this time, `@azure/identity` only supports PEM certificates that are
> **not** password protected.
For both of these credentials, `tenantId` and `clientId` are required parameters.