# Writing Performance Tests

## Index

- Template perf test project
- Setting up the project
- Writing perf tests
- Executing the perf tests
- Using Proxy Tool
## Template perf test project

A template project is available under the `sdk/template/perf-tests` directory, which demonstrates how perf test projects should be structured. It includes a basic perf test against the `@azure/template` package.
## Setting up the project

To add perf tests for the `sdk/<service>/<service-sdk>` package, follow the steps below.

1. Create a new folder for the perf tests at `sdk/<service>/perf-tests/<service-sdk>` (create the `perf-tests` folder if it doesn't exist).

2. This new perf test project will be managed by the rush infrastructure in the repository, with the package name `@azure-tests/perf-<service-sdk>`. To allow rush to manage the project, add the following entry in `rush.json`:

   ```json
   {
     "packageName": "@azure-tests/perf-<service-sdk>",
     "projectFolder": "sdk/<service>/perf-tests/<service-sdk>",
     "versionPolicyName": "test"
   }
   ```
3. Tests will live under `sdk/<service>/perf-tests/<service-sdk>/test`.

4. Add a `package.json` (such as example-perf-package.json) in the `sdk/<service>/perf-tests/<service-sdk>` folder. Make sure to depend on your `<service-sdk>` and the `test-utils-perf` project:

   ```json
   "dependencies": {
     "@azure/<service-sdk>": "^<version-in-master-branch>",
     "@azure/test-utils-perf": "^1.0.0"
   }
   ```

   Note: `@azure/test-utils-perf` is not a published npm package.

   Set the name of the package and mark it as private:

   ```json
   "name": "@azure-tests/perf-<service-sdk>",
   "sdk-type": "perf-test",
   "private": true,
   ```
5. Run `rush update` and commit the changes to the `pnpm-lock` file.

6. Copy the `tsconfig.json`, `sample.env` (and `.env`) files that are present at `sdk/<service>/<service-sdk>` to `sdk/<service>/perf-tests/<service-sdk>`.

   **TSCONFIG**

   - Modify the "extends" string in the copied tsconfig by adding an extra `../`, since the perf tests project is located a level below the actual SDK.
   - Set `compilerOptions.module` to `commonjs` in the tsconfig to allow running the tests with `ts-node`.
   In the end, your tsconfig may look something like this:

   ```json
   {
     "extends": "../../../../tsconfig.package",
     "compilerOptions": {
       "module": "commonjs",
       "declarationDir": "./types/latest",
       "outDir": "./dist-esm"
     },
     "compileOnSave": true,
     "exclude": ["node_modules"],
     "include": ["./test/**/*.ts"]
   }
   ```
### For perf-testing the track 1 version of the same package

(Skip this section if your service does not have or does not care about a track-1 version.)

1. If there is an old major version of your package that needs to be compared against, create the folder as `sdk/<service>/perf-tests/<service-sdk>-track-1`.

2. The track-1 perf tests are expected to be counterparts of the track-2 tests, so they need to have the same names as the track-2 tests for convenience.
3. Add a `package.json` (such as example-track-1-perf-package.json) in the `sdk/<service>/perf-tests/<service-sdk>-track-1` folder. Make sure to depend on your `<service-sdk>` and the `test-utils-perf` project:

   ```json
   "dependencies": {
     "@azure/<service-sdk>": "^<latest-track-1-version>",
     "@azure/test-utils-perf": "file:../../../test-utils/perf/azure-test-utils-perf-1.0.0.tgz"
   }
   ```

   Set the name of the package and mark it as private:

   ```json
   "name": "@azure-tests/perf-<service-sdk>-track-1",
   "sdk-type": "perf-test",
   "private": true,
   ```

   Note: Track-1 packages will not be managed by `rush`; `npm` will be used to manage/run the track-1 tests instead. You can copy a readme such as the storage-blob-perf-tests-track-1-readme for instructions. Make sure to add the "setup" step in `package.json`:

   ```json
   "setup": "node ../../../../common/tools/perf-tests-track-1-setup.js",
   ```
4. Run `rush update` followed by `npm run setup` to be able to use the perf framework for the track-1 perf tests. `npm run setup` installs the dependencies specified in `package.json`.

5. Repeat step 6 from the previous section for track-1 too, to get the `tsconfig.json`, `sample.env` (and `.env`) files.
## Writing perf tests

### Entry Point

Add an `index.spec.ts` at `sdk/<service>/perf-tests/<service-sdk>/test/`.

```ts
import { createPerfProgram } from "@azure/test-utils-perf";
import { ServiceNameAPI1NameTest } from "./api1-name.spec";
import { ServiceNameAPI2NameTest } from "./api2-name.spec";

// Expects the .env file at the same level
import * as dotenv from "dotenv";
dotenv.config();

console.log("=== Starting the perf test ===");

const perfProgram = createPerfProgram(ServiceNameAPI1NameTest, ServiceNameAPI2NameTest);
perfProgram.run();
```
### Base Class

The base class holds the common code that would otherwise be repeated in each of the tests, such as creating the client, creating a base resource, etc.

Create a new file such as `serviceNameTest.spec.ts` at `sdk/<service>/perf-tests/<service-sdk>/test/`.

```ts
import { PerfTest, getEnvVar } from "@azure/test-utils-perf";
import { ServiceNameClient } from "@azure/<service-sdk>";

export abstract class ServiceNameTest<TOptions = {}> extends PerfTest<TOptions> {
  serviceNameClient: ServiceNameClient;

  constructor() {
    super();
    // Set up the serviceNameClient
  }

  public async globalSetup() {
    // .createResources() using serviceNameClient
  }

  public async globalCleanup() {
    // .deleteResources() using serviceNameClient
  }
}
```
### Test File

The following code shows what an individual perf test file looks like.

```ts
import { ServiceNameClient } from "@azure/<service-sdk>";
import { PerfOptionDictionary } from "@azure/test-utils-perf";
import { ServiceNameTest } from "./serviceNameTest.spec";

export class ServiceNameAPINameTest extends ServiceNameTest {
  // The next section talks about the custom options that you can provide for a test
  public options: PerfOptionDictionary = {};
  serviceNameClient: ServiceNameClient;

  constructor() {
    super();
    // Set up the client
  }

  public async globalSetup() {
    await super.globalSetup(); // Calling the base class's setup
    // Add any additional setup
  }

  async run(): Promise<void> {
    // Call the method on `serviceNameClient` that you're interested in testing
  }
}
```
It is not mandatory to have separate base and test classes. If there is nothing in common among the testing scenarios of your service, feel free to merge the base class into the test class and have a single test class instead.
### Custom Options

As seen in the previous section, you can specify custom options along with the default options from the performance framework. You can access the options in the class using `this.parsedOptions`.

Parsed options include the default options (such as duration, iterations, parallel, etc.) offered by the perf framework, as well as the custom options provided in the test class.

```ts
interface ServiceNameAPINameTestOptions {
  newOption: number;
}

export class ServiceNameAPINameTest extends ServiceNameTest<ServiceNameAPINameTestOptions> {
  public options: PerfOptionDictionary<ServiceNameAPINameTestOptions> = {
    newOption: {
      required: true,
      description: "A new option",
      shortName: "sz",
      longName: "newOption",
      defaultValue: 10240
    }
  };

  async run(): Promise<void> {
    // You can leverage the parsedOptions in the setup, globalSetup, or run methods as shown below.
    // this.parsedOptions.duration.value!
    // this.parsedOptions.newOption.value!
  }
}
```
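To make the shape of `parsedOptions` concrete, here is a rough, self-contained sketch of how `defaultValue`s and command-line overrides could be merged into `{ value }` entries. The parser below is illustrative only and is not the framework's actual implementation:

```typescript
// Illustrative only: mimics how option specs plus CLI args could produce parsedOptions.
interface OptionSpec<T> {
  longName: string;
  description?: string;
  defaultValue?: T;
}
type OptionDict<TOpts> = { [K in keyof TOpts]: OptionSpec<TOpts[K]> };
type ParsedOptions<TOpts> = { [K in keyof TOpts]: { value: TOpts[K] | undefined } };

function parseOptions<TOpts>(specs: OptionDict<TOpts>, argv: string[]): ParsedOptions<TOpts> {
  const parsed: any = {};
  for (const key of Object.keys(specs) as (keyof TOpts)[]) {
    const spec = specs[key];
    const idx = argv.indexOf(`--${spec.longName}`);
    const raw = idx >= 0 ? argv[idx + 1] : undefined;
    // Use the CLI value when present (coercing numbers), otherwise the default.
    const value =
      raw !== undefined
        ? typeof spec.defaultValue === "number" ? Number(raw) : raw
        : spec.defaultValue;
    parsed[key] = { value };
  }
  return parsed as ParsedOptions<TOpts>;
}

// With an override on the command line, the default of 10240 is replaced:
const withOverride = parseOptions(
  { newOption: { longName: "newOption", defaultValue: 10240 } },
  ["--newOption", "4096"]
);
console.log(withOverride.newOption.value); // 4096

// Without the flag, the default value is used:
const withDefault = parseOptions(
  { newOption: { longName: "newOption", defaultValue: 10240 } },
  []
);
console.log(withDefault.newOption.value); // 10240
```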
## Executing the perf tests

### Command to run

To run a particular test, use `npm run perf-test:node` - it takes the test class name as an argument, along with any command-line arguments you may provide.

- Run `npm run perf-test:node -- TestClassName --warmup 2 --duration 7 --iterations 2 --parallel 50`
### Adding Readme/Instructions

Refer to the storage-blob-perf-tests-readme and the storage-blob-perf-tests-readme-track-1, and add a similar set of instructions for your perf project.
### Testing an older track 2 version

Example: Currently `@azure/<service-sdk>` is at 12.4.0 on master and you want to test version 12.2.0.

- In the track-2 perf tests project, update the `@azure/<service-sdk>` dependency version in `package.json` to `12.2.0`.
- Add a new exception in `common\config\rush\common-versions.json` under `allowedAlternativeVersions`: `"@azure/<service-sdk>": [..., "12.2.0"]`
- Run `rush update` (generates a new pnpm-lock file).
- Navigate to `sdk\<service>\perf-tests\<service-sdk>`.
- Run `rush build -t perf-<service-sdk>`.
- Run the tests as suggested before, for example: `npm run perf-test:node -- TestClassName --warmup 2 --duration 7 --iterations 2 --parallel 50`
## Using Proxy Tool

### Using the testProxy option

To leverage the test proxy's ability to record and play back requests, add the following to your code.
```ts
/// Core V1 SDKs - For services depending on core-http
/// Call this.configureClientOptionsCoreV1 method on your client options
this.blobServiceClient = BlobServiceClient.fromConnectionString(connectionString, this.configureClientOptionsCoreV1({}));
/// Core V2 SDKs - For services depending on core-rest-pipeline
/// this.configureClient call to modify your client
this.client = this.configureClient(TableClient.fromConnectionString(connectionString, tableName));
// Note: not all core-v1 SDKs allow passing the httpClient option,
// and not all core-v2 SDKs allow adding policies via the pipeline option.
// Please reach out to the team if your service doesn't support these.
```
### Running the proxy server

Run this command:

```
docker run -p 5000:5000 azsdkengsys.azurecr.io/engsys/ubuntu_testproxy_server:latest
```

To use the proxy tool in your test, pass the `--test-proxy http://localhost:5000` option on the command line (make sure the port is the same as the one you used in the `docker run` command).
Sample commands (using the storage-blob perf tests as a core-v1 example):

```
npm run perf-test:node -- StorageBlobDownloadTest --warmup 2 --duration 7 --iterations 2 --test-proxy http://localhost:5000
npm run perf-test:node -- StorageBlobDownloadTest --warmup 2 --duration 7 --iterations 2 --parallel 2 --test-proxy http://localhost:5000
```

Sample commands (using the data-tables perf tests as a core-v2 example):

```
npm run perf-test:node -- ListComplexEntitiesTest --duration 7 --iterations 2 --parallel 2 --test-proxy http://localhost:5000
npm run perf-test:node -- ListComplexEntitiesTest --duration 7 --iterations 2 --parallel 2
```
The proxy-tool support is still under construction. Please reach out to the owners/team if you face issues.