- General improvements to python scripting in repo
This commit is contained in:
Cameron Aavik 2023-08-11 19:08:09 +10:00 committed by GitHub
Parent 2312601d45
Commit d85a49791e
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
30 changed files: 1703 additions and 215 deletions

.gitignore (vendored)

@ -1,3 +1,4 @@
!.git
# Other folders to ignore
**/logs/
/tools/

.vscode/settings.json (vendored)

@ -9,5 +9,15 @@
"**/obj/": true,
"**/packages/": true,
"tools/": true,
}
},
"python.analysis.typeCheckingMode": "strict",
"python.analysis.diagnosticSeverityOverrides": {
"reportMissingParameterType": "none",
"reportUnnecessaryComparison": "information",
"reportUnnecessaryIsInstance": "information",
"reportUnusedVariable": "information",
"reportMissingTypeStubs": "information"
},
"python.analysis.extraPaths": ["scripts"],
"python.analysis.diagnosticMode": "workspace"
}


@ -0,0 +1,147 @@
# Using Crank to schedule performance tests on Helix
## Table of Contents
- [Introduction](#introduction)
- [Current Limitations](#current-limitations)
- [Prerequisites](#prerequisites)
- [Workflow](#workflow)
- [Building the runtime repository](#building-the-runtime-repository)
- [Building for Windows on Windows](#building-for-windows-on-windows)
- [Building for Linux on Windows](#building-for-linux-on-windows)
- [Building for Windows arm64 on Windows](#building-for-windows-arm64-on-windows)
- [Using the Crank CLI](#using-the-crank-cli)
- [Example: Run microbenchmarks on Windows x64](#example-run-microbenchmarks-on-windows-x64)
- [Profiles](#profiles)
- [Other useful arguments](#other-useful-arguments)
- [Accessing results](#accessing-results)
- [Crank CLI output](#crank-cli-output)
- [Azure Data Explorer](#azure-data-explorer)
- [Crank JSON output](#crank-json-output)
## Introduction
We already have documentation explaining how to run the performance tests in this repository locally on your machine; however, those steps can be difficult to follow, and you may not have the hardware you wish to run the performance tests on. This document describes how internal Microsoft employees can schedule performance tests to run on Helix machines using Crank.
- [Helix](https://github.com/dotnet/arcade/blob/main/Documentation/Helix.md) is the work scheduler that we use in our CI pipelines to run performance tests. We have many Helix queues available to us which provide different hardware capabilities so that we are able to test a wide array of situations.
- [Crank](https://github.com/dotnet/crank) is a tool that provides infrastructure for software performance measurement and is mainly used today to support our [TechEmpower Web Framework Benchmarks](https://github.com/aspnet/benchmarks).
## Current Limitations
- This workflow is only available to Microsoft employees. If you are not a Microsoft employee you can continue to run benchmarks using our other [benchmarking workflow documentation](./benchmarking-workflow-dotnet-runtime.md).
- Currently, only running the BenchmarkDotNet benchmarks has been thoroughly tested. In the future, we will work towards adding support for scenarios such as Startup and Size on Disk.
- The developer is required to build the runtime repository themselves before sending the job to Helix. Building on your local machine means you can take full advantage of incremental compilation and won't have to wait for crank to build from scratch.
- There is currently no support for using an existing version of .NET installed via the .NET installer; only a local build of the .NET runtime is supported.
## Prerequisites
- The [dotnet/runtime](https://github.com/dotnet/runtime) and [dotnet/performance](https://github.com/dotnet/performance) repositories must be cloned to your machine.
  - It is not required, but running crank is simpler if the two repositories are cloned to the same parent directory, so that `cd ../runtime` from the performance repository navigates to the runtime repository.
- Crank must be installed to your machine.
- Only the crank controller is required. We host a crank agent, accessible to Microsoft employees, which has all the required environment variables set up to schedule Helix jobs and upload performance results.
- Crank can be installed with `dotnet tool install -g Microsoft.Crank.Controller --version "0.2.0-*"`
- Please see the [crank](https://github.com/dotnet/crank) GitHub repository for further information and documentation.
- Microsoft's corporate VPN must be active in order to connect to the crank agent.
- Corpnet access
- If you are working from home, you are likely not on corpnet, as corpnet usually requires the machine to be physically connected to a Microsoft building.
- If your machine is not connected to corpnet, then [DevBox](https://devbox.microsoft.com) is our strongly recommended alternative.
- If you are unable to use DevBox and can't get corpnet access, then please email [dotnetperf@microsoft.com](mailto:dotnetperf@microsoft.com) so that we can give you an alternative.
- Additional configurations such as Mono, WASM, iOS, and Android are not currently supported, but will be in the future.
## Workflow
### Building the runtime repository
The crank configuration only supports builds that have been generated to the Core_Root folder. Please see [these docs](https://github.com/dotnet/runtime/blob/main/docs/workflow/testing/coreclr/testing.md#building-the-core_root) in the runtime repo for more information about how to build this folder. If you wish to compile for a different OS or architecture, please read the [Cross-Building](https://github.com/dotnet/runtime/blob/main/docs/workflow/building/coreclr/cross-building.md#cross-building-for-different-architectures-and-operating-systems) documentation. Below are some examples of what to run for some common scenarios.
#### Building for Windows on Windows
Run the following in the cloned runtime repository
```cmd
.\build.cmd clr+libs -c Release
.\src\tests\build.cmd release generatelayoutonly
```
#### Building for Linux on Windows
Docker can be used to build for Linux ([see documentation](https://github.com/dotnet/runtime/blob/main/docs/workflow/building/coreclr/linux-instructions.md#build-using-docker)).
Ensure that you have Docker installed on your machine with WSL enabled. In the script below, `<RUNTIME_REPO_PATH>` should be the full path to the runtime repo from the root.
```cmd
docker run --rm `
-v <RUNTIME_REPO_PATH>:/runtime `
-w /runtime `
mcr.microsoft.com/dotnet-buildtools/prereqs:ubuntu-22.04 `
./build.sh clr+libs -c Release && ./src/tests/build.sh release generatelayoutonly
```
#### Building for Windows arm64 on Windows
Run the following in the cloned runtime repository
```cmd
.\build.cmd clr+libs -c Release -arch arm64
.\src\tests\build.cmd arm64 release generatelayoutonly
```
### Using the Crank CLI
After installing crank as mentioned in the prerequisites, you will be able to invoke crank using `crank` in the command line.
#### Example: Run microbenchmarks on Windows x64
Below is an example of a crank command that runs any benchmarks with "Linq" in the name on a Windows x64 queue. This command must be run from the performance repository, and the runtime repository must be located next to it so that you could navigate to it with `cd ../runtime`.
```cmd
crank --config .\helix.yml --scenario micro --profile win-x64 --variable bdnArgs="--filter *Linq*" --profile msft-internal --variable buildNumber="myalias-20230811.1"
```
An explanation for each argument:
- `--config .\helix.yml`: This tells crank what yaml file defines all the scenarios and jobs
- `--scenario micro`: Runs the microbenchmarks scenario
- `--profile win-x64`: Configures crank to use a local Windows x64 build of the runtime, and sets the Helix queue to a Windows x64 queue.
- `--variable bdnArgs="--filter *Linq*"`: Sets arguments to pass to BenchmarkDotNet that filter the run to only the Linq benchmarks.
- `--profile msft-internal`: Sets the crank agent endpoint to the internally hosted crank agent.
- `--variable buildNumber="myalias-20230811.1"`: Sets the build number that will be associated with the results when they are uploaded to our storage accounts. You can use this build number to find the run results in Azure Data Explorer. The build number does not have to follow any convention; the only recommendation is to include something unique to yourself so that it doesn't conflict with other build numbers.
#### Profiles
Profiles are named sets of predefined variables that make them easy to reuse. Profiles are additive, meaning that if you specify multiple profiles, their variables are merged together. A list of profiles can be found at the bottom of [helix.yml](../helix.yml). Some profiles configure the crank agent endpoint; others configure the target OS, architecture, and queue. If you wish to run the microbenchmarks on Ubuntu x64, just use `--profile ubuntu-x64`.
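For example, the `ubuntu-x64` profile defined in helix.yml just selects the OS group and an internal Ubuntu queue:

```yaml
profiles:
  ubuntu-x64:
    jobs:
      benchmark:
        variables:
          osGroup: linux
          queue: Ubuntu.2204.Amd64.Tiger.Perf
```

Combining it with `msft-internal` then supplies the crank agent endpoint as well.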
#### Other useful arguments
- `--variable runtimeRepoDir="../path/to/runtime"`: Set a custom path to the runtime repository, relative to the working directory
- `--variable performanceRepoDir="C:/path/to/performance"`: Set a custom path to the performance repository, relative to the working directory
- `--variable partitionCount=10`: Set the number of Helix jobs to split the microbenchmarks across. The default is 5, but it may need to be increased or decreased depending on the number of benchmarks being run. If running all the microbenchmarks, it is recommended to set this to 30; if running just a few, set it to 1.
- `--variable queue="Windows.11.Amd64.Tiger.Perf"`: Set a specific Helix queue to run on. When doing this you may need to set `osGroup`, `architecture`, and `internal` as well.
- `--variable osGroup="windows"`: Set the OS type of the Helix queue; valid values include `windows`, `osx`, `linux`, `ios`, `freebsd`, etc. Set to `windows` by default.
- `--variable architecture="x64"`: Sets the architecture to use, e.g. `x64`, `x86`, `arm64`. Set to `x64` by default.
- `--variable internal="true"`: Sets whether the Helix queue is public or internal. If the queue is public, the results will not be uploaded. Defaults to `true`.
- `--json results.json`: Will export all the raw benchmark results as a JSON with the given file name.
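Putting several of these together, a hypothetical invocation (the runtime path and build number below are placeholders) might look like:

```cmd
crank --config .\helix.yml --scenario micro --profile ubuntu-x64 --profile msft-internal --variable partitionCount=1 --variable bdnArgs="--filter *Linq*" --variable runtimeRepoDir="../my-runtime" --variable buildNumber="myalias-20230811.2" --json results.json
```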
### Accessing results
#### Crank CLI output
Once the Helix jobs have completed, crank will output simplified benchmark results with a list of all the benchmarks and their average runtimes.
#### Azure Data Explorer
If you used a non-public queue, the results will be uploaded to our [Azure Data Explorer](https://dataexplorer.azure.com/clusters/dotnetperf.westus/databases/PerformanceData) database and will be queryable almost immediately. If you don't have access to the Azure Data Explorer database, please join the ".NET Perf Data Readers" security group.
Using the `buildNumber` you set on the command line, you can search for that build number in the "BuildName" column of the Measurements table. Using the build number from the earlier example, `myalias-20230811.1`, you could query for your data with the following:
```kql
Measurements
| where BuildName == "myalias-20230811.1"
| where TestCounterDefaultCounter // filter to only the default counter
```
This will contain much more information about each benchmark, including the standard deviation and all the individual measurements.
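If you want aggregates rather than raw rows, a `summarize` sketch along these lines can be used; note that the `TestName` and `MeasurementValue` column names here are assumptions for illustration and may differ from the actual Measurements schema:

```kql
Measurements
| where BuildName == "myalias-20230811.1"
| where TestCounterDefaultCounter
| summarize AvgResult = avg(MeasurementValue) by TestName
```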
#### Crank JSON output
If you want access to the raw data but don't wish to use Azure Data Explorer, you can pass the `--json results.json` command line argument to crank to get the raw measurement data as a file you can inspect.
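As a sketch of consuming that file, the snippet below flattens benchmark results out of a crank-style JSON document. The field names (`jobResults`, `jobs`, `results`) and the sample values are assumptions for illustration; inspect your own `results.json` for the real schema.

```python
import json

# Hypothetical shape of crank's --json output; the real schema may differ.
sample = json.loads("""
{
  "jobResults": {
    "jobs": {
      "benchmark": {
        "results": {
          "System.Linq.Tests.Perf_Enumerable.Sum": 123.4,
          "System.Linq.Tests.Perf_Enumerable.Where": 98.7
        }
      }
    }
  }
}
""")

def extract_results(doc):
    """Flatten per-job results into a single benchmark-name -> value dict."""
    merged = {}
    for job in doc.get("jobResults", {}).get("jobs", {}).values():
        merged.update(job.get("results", {}))
    return merged

for name, value in sorted(extract_results(sample).items()):
    print(f"{name}: {value}")
```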


@ -13,6 +13,7 @@
<PropertyGroup>
<FrameworkVersion>$(PERFLAB_Framework.Substring($([MSBuild]::Subtract($(PERFLAB_Framework.Length), 3))))</FrameworkVersion>
<HelixResultsDestinationDir>$(CorrelationPayloadDirectory)performance/artifacts/helix-results</HelixResultsDestinationDir>
</PropertyGroup>
<PropertyGroup Condition="'$(TargetsWindows)' == 'true'">


@ -53,6 +53,8 @@ jobs:
value: '%HELIX_CORRELATION_PAYLOAD%\artifacts\BenchmarkDotNet.Artifacts'
- name: InstallPrerequisites
value: ''
- name: TargetsWindowsParam
value: '--target-windows'
- ${{ if ne(parameters.osName, 'windows') }}:
- name: CliArguments
value: '--dotnet-versions $DOTNET_VERSION --cli-source-info args --cli-branch $PERFLAB_BRANCH --cli-commit-sha $PERFLAB_HASH --cli-repository https://github.com/$PERFLAB_REPO --cli-source-timestamp $PERFLAB_BUILDTIMESTAMP'
@ -123,12 +125,16 @@ jobs:
${{ channel }}:
_Channel: ${{ channel }}
_Configs: CompilationMode=Tiered RunKind="${{ parameters.kind }}"
_PerfLabArguments: $(PerfLabArguments)
_BuildConfig: ${{ parameters.architecture }}_$(_Channel)_${{ parameters.kind }} # needs to be unique to avoid logs overwriting in mc.dot.net
steps:
- checkout: self
clean: true
- script: $(Python) scripts/ci_setup.py --channel $(_Channel) --architecture ${{parameters.architecture}} --perf-hash $(Build.SourceVersion) --queue ${{parameters.queue}} --build-number $(Build.BuildNumber) --build-configs $(_Configs) $(AffinityParam) $(runEnvVarsParam)
- ${{ if ne(parameters.osName, 'windows') }}:
- script: wget https://bootstrap.pypa.io/pip/3.6/get-pip.py && $(Python) get-pip.py --user
displayName: Ensure pip is installed
- script: $(Python) -m pip install --user dataclasses
displayName: Install dataclasses library used in ci_setup.py
- script: $(Python) scripts/ci_setup.py --channel $(_Channel) --architecture ${{parameters.architecture}} --perf-hash $(Build.SourceVersion) --queue ${{parameters.queue}} --build-number $(Build.BuildNumber) --build-configs $(_Configs) $(AffinityParam) $(runEnvVarsParam) $(TargetsWindowsParam)
displayName: Run ci_setup.py
- ${{ if eq(parameters.osName, 'windows') }}:
- script: (robocopy $(Build.SourcesDirectory) $(Build.SourcesDirectory)\notLocked /E /XD $(Build.SourcesDirectory)\notLocked $(Build.SourcesDirectory)\artifacts $(Build.SourcesDirectory)\.git) ^& IF %ERRORLEVEL% LEQ 1 exit 0


@ -1,63 +1,145 @@
<Project Sdk="Microsoft.DotNet.Helix.Sdk" DefaultTargets="Test">
<PropertyGroup Condition="'$(TargetsWindows)' == 'true'">
<PropertyGroup Condition="'$(TargetsWindows)' == 'true' AND '$(UseCoreRun)' == 'true'">
<PerformanceDirectory>%HELIX_WORKITEM_ROOT%\performance</PerformanceDirectory>
<HelixPreCommands>$(HelixPreCommands) &amp;&amp; robocopy /np /nfl /e %HELIX_CORRELATION_PAYLOAD%\performance $(PerformanceDirectory) /XD %HELIX_CORRELATION_PAYLOAD%\performance\.git</HelixPreCommands>
<WorkItemCommand>$(PerformanceDirectory)\scripts\benchmarks_ci.py --csproj $(PerformanceDirectory)\$(TargetCsproj)</WorkItemCommand>
<Python>py -3</Python>
<CoreRun>%HELIX_CORRELATION_PAYLOAD%\Core_Root\CoreRun.exe</CoreRun>
<BaselineCoreRun>%HELIX_CORRELATION_PAYLOAD%\Baseline_Core_Root\CoreRun.exe</BaselineCoreRun>
<HelixPreCommands>$(HelixPreCommands);call $(PerformanceDirectory)\tools\machine-setup.cmd;set PYTHONPATH=%HELIX_WORKITEM_PAYLOAD%\scripts%3B%HELIX_WORKITEM_PAYLOAD%</HelixPreCommands>
<ArtifactsDirectory>%HELIX_WORKITEM_ROOT%\artifacts\BenchmarkDotNet.Artifacts</ArtifactsDirectory>
<BaselineArtifactsDirectory>%HELIX_WORKITEM_ROOT%\artifacts\BenchmarkDotNet.Artifacts_Baseline</BaselineArtifactsDirectory>
</PropertyGroup>
<PropertyGroup Condition="'$(TargetsWindows)' != 'true' AND '$(UseCoreRun)' == 'true'">
<PerformanceDirectory>$HELIX_WORKITEM_ROOT/performance</PerformanceDirectory>
<HelixPreCommands>$(HelixPreCommands);cp -R $(BaseDirectory)/performance $(PerformanceDirectory)</HelixPreCommands>
<WorkItemCommand>$(PerformanceDirectory)/scripts/benchmarks_ci.py --csproj $(PerformanceDirectory)/$(TargetCsproj)</WorkItemCommand>
<Python>python3</Python>
<CoreRun>$HELIX_CORRELATION_PAYLOAD/Core_Root/corerun</CoreRun>
<BaselineCoreRun>$HELIX_CORRELATION_PAYLOAD/Baseline_Core_Root/corerun</BaselineCoreRun>
<HelixPreCommands>$(HelixPreCommands);cp -R $HELIX_CORRELATION_PAYLOAD/performance $(PerformanceDirectory);chmod +x $(PerformanceDirectory)/tools/machine-setup.sh;. $(PerformanceDirectory)/tools/machine-setup.sh</HelixPreCommands>
<ArtifactsDirectory>$HELIX_WORKITEM_ROOT/artifacts/BenchmarkDotNet.Artifacts</ArtifactsDirectory>
<BaselineArtifactsDirectory>$HELIX_WORKITEM_ROOT/artifacts/BenchmarkDotNet.Artifacts_Baseline</BaselineArtifactsDirectory>
</PropertyGroup>
<PropertyGroup Condition="'$(TargetsWindows)' == 'true' AND '$(UseCoreRun)' != 'true'">
<WorkItemCommand>%HELIX_CORRELATION_PAYLOAD%\scripts\benchmarks_ci.py --csproj %HELIX_CORRELATION_PAYLOAD%\$(TargetCsproj)</WorkItemCommand>
</PropertyGroup>
<PropertyGroup Condition="'$(TargetsWindows)' != 'true'">
<PropertyGroup Condition="'$(TargetsWindows)' != 'true' AND '$(UseCoreRun)' != 'true'">
<WorkItemCommand>scripts/benchmarks_ci.py --csproj $(TargetCsproj)</WorkItemCommand>
</PropertyGroup>
<PropertyGroup Condition="'$(WorkItemCommand)' != ''">
<WorkItemCommand>$(Python) $(WorkItemCommand) --incremental no --architecture $(Architecture) -f $(PERFLAB_Framework) $(_PerfLabArguments)</WorkItemCommand>
<PropertyGroup Condition="'$(WasmDotnet)' == 'true'">
<CliArguments>$(CliArguments) --run-isolated --wasm --dotnet-path %24HELIX_CORRELATION_PAYLOAD/dotnet/</CliArguments>
</PropertyGroup>
<PropertyGroup Condition="'$(MonoDotnet)' == 'true' and '$(AGENT_OS)' == 'Windows_NT'">
<CoreRunArgument>--corerun %HELIX_CORRELATION_PAYLOAD%\dotnet-mono\shared\Microsoft.NETCore.App\$(ProductVersion)\corerun.exe</CoreRunArgument>
</PropertyGroup>
<PropertyGroup Condition="'$(MonoDotnet)' == 'true' and '$(AGENT_OS)' != 'Windows_NT'">
<CoreRunArgument>--corerun $(BaseDirectory)/dotnet-mono/shared/Microsoft.NETCore.App/$(ProductVersion)/corerun</CoreRunArgument>
</PropertyGroup>
<PropertyGroup Condition="'$(UseCoreRun)' == 'true'">
<CoreRunArgument>--corerun $(CoreRun)</CoreRunArgument>
</PropertyGroup>
<PropertyGroup Condition="'$(UseBaselineCoreRun)' == 'true'">
<BaselineCoreRunArgument>--corerun $(BaselineCoreRun)</BaselineCoreRunArgument>
</PropertyGroup>
<PropertyGroup Condition="'$(WorkItemCommand)' != ''">
<WorkItemCommand>$(Python) $(WorkItemCommand) --incremental no --architecture $(Architecture) -f $(PERFLAB_Framework) $(PerfLabArguments) --bdn-artifacts $(ArtifactsDirectory) </WorkItemCommand>
</PropertyGroup>
<!-- TODO: Check for net462 and net461 -->
<PropertyGroup Condition="'$(PERFLAB_Framework)' != 'net462'">
<WorkItemCommand>$(WorkItemCommand) $(CliArguments)</WorkItemCommand>
</PropertyGroup>
<!-- TODO: Take this property in using python-->
<PropertyGroup>
<WorkItemTimeout>6:00</WorkItemTimeout>
<WorkItemTimeout Condition="'$(OnlySanityCheck)' == 'true'">1:30</WorkItemTimeout>
<HelixResultsDestinationDir>$(CorrelationPayloadDirectory)/performance/artifacts/helix-results</HelixResultsDestinationDir>
</PropertyGroup>
<ItemGroup>
<HelixCorrelationPayload Include="$(CorrelationPayloadDirectory)">
<HelixCorrelationPayload Include="$(CorrelationPayloadDirectory)/Core_Root" Condition="'$(UseCoreRun)' == 'true'">
<PayloadDirectory>%(Identity)</PayloadDirectory>
<Destination>Core_Root</Destination>
</HelixCorrelationPayload>
<HelixCorrelationPayload Include="$(CorrelationPayloadDirectory)/performance" Condition="'$(UseCoreRun)' == 'true'">
<PayloadDirectory>%(Identity)</PayloadDirectory>
<Destination>performance</Destination>
</HelixCorrelationPayload>
<HelixCorrelationPayload Include="$(CorrelationPayloadDirectory)" Condition="'$(UseCoreRun)' != 'true'">
<PayloadDirectory>%(Identity)</PayloadDirectory>
</HelixCorrelationPayload>
</ItemGroup>
<PropertyGroup>
<PartitionCount>15</PartitionCount>
<PartitionCount Condition="'$(PartitionCount)' == ''">15</PartitionCount>
</PropertyGroup>
<ItemGroup>
<Partition Include="Partition0" Index="0" />
<Partition Include="Partition1" Index="1" />
<Partition Include="Partition2" Index="2" />
<Partition Include="Partition3" Index="3" />
<Partition Include="Partition4" Index="4" />
<Partition Include="Partition5" Index="5" />
<Partition Include="Partition6" Index="6" />
<Partition Include="Partition7" Index="7" />
<Partition Include="Partition8" Index="8" />
<Partition Include="Partition9" Index="9" />
<Partition Include="Partition10" Index="10" />
<Partition Include="Partition11" Index="11" />
<Partition Include="Partition12" Index="12" />
<Partition Include="Partition13" Index="13" />
<Partition Include="Partition14" Index="14" />
<Partition Include="Partition0" Index="0" Condition="$(PartitionCount) &gt; 0" />
<Partition Include="Partition1" Index="1" Condition="$(PartitionCount) &gt; 1" />
<Partition Include="Partition2" Index="2" Condition="$(PartitionCount) &gt; 2" />
<Partition Include="Partition3" Index="3" Condition="$(PartitionCount) &gt; 3" />
<Partition Include="Partition4" Index="4" Condition="$(PartitionCount) &gt; 4" />
<Partition Include="Partition5" Index="5" Condition="$(PartitionCount) &gt; 5" />
<Partition Include="Partition6" Index="6" Condition="$(PartitionCount) &gt; 6" />
<Partition Include="Partition7" Index="7" Condition="$(PartitionCount) &gt; 7" />
<Partition Include="Partition8" Index="8" Condition="$(PartitionCount) &gt; 8" />
<Partition Include="Partition9" Index="9" Condition="$(PartitionCount) &gt; 9" />
<Partition Include="Partition10" Index="10" Condition="$(PartitionCount) &gt; 10" />
<Partition Include="Partition11" Index="11" Condition="$(PartitionCount) &gt; 11" />
<Partition Include="Partition12" Index="12" Condition="$(PartitionCount) &gt; 12" />
<Partition Include="Partition13" Index="13" Condition="$(PartitionCount) &gt; 13" />
<Partition Include="Partition14" Index="14" Condition="$(PartitionCount) &gt; 14" />
<Partition Include="Partition15" Index="15" Condition="$(PartitionCount) &gt; 15" />
<Partition Include="Partition16" Index="16" Condition="$(PartitionCount) &gt; 16" />
<Partition Include="Partition17" Index="17" Condition="$(PartitionCount) &gt; 17" />
<Partition Include="Partition18" Index="18" Condition="$(PartitionCount) &gt; 18" />
<Partition Include="Partition19" Index="19" Condition="$(PartitionCount) &gt; 19" />
<Partition Include="Partition20" Index="20" Condition="$(PartitionCount) &gt; 20" />
<Partition Include="Partition21" Index="21" Condition="$(PartitionCount) &gt; 21" />
<Partition Include="Partition22" Index="22" Condition="$(PartitionCount) &gt; 22" />
<Partition Include="Partition23" Index="23" Condition="$(PartitionCount) &gt; 23" />
<Partition Include="Partition24" Index="24" Condition="$(PartitionCount) &gt; 24" />
<Partition Include="Partition25" Index="25" Condition="$(PartitionCount) &gt; 25" />
<Partition Include="Partition26" Index="26" Condition="$(PartitionCount) &gt; 26" />
<Partition Include="Partition27" Index="27" Condition="$(PartitionCount) &gt; 27" />
<Partition Include="Partition28" Index="28" Condition="$(PartitionCount) &gt; 28" />
<Partition Include="Partition29" Index="29" Condition="$(PartitionCount) &gt; 29" />
</ItemGroup>
<PropertyGroup>
<BenchmarkDotNetArguments>$(BenchmarkDotNetArguments) $(ExtraBenchmarkDotNetArguments) $(CoreRunArgument)</BenchmarkDotNetArguments>
</PropertyGroup>
<!-- TODO: Support Compare from runtime repo -->
<!--
Partition the MicroBenchmarks project, but nothing else
-->
<ItemGroup Condition="$(TargetCsproj.Contains('MicroBenchmarks.csproj'))">
<HelixWorkItem Include="@(Partition)">
<PayloadDirectory>$(WorkItemDirectory)</PayloadDirectory>
<Command>$(WorkItemCommand) --bdn-arguments="$(BenchmarkDotNetArguments) --partition-count $(PartitionCount) --partition-index %(HelixWorkItem.Index)"</Command>
<Timeout>4:00</Timeout>
<Command>$(WorkItemCommand) --partition=%(HelixWorkItem.Index) --bdn-arguments="$(BenchmarkDotNetArguments) --partition-count $(PartitionCount) --partition-index %(HelixWorkItem.Index)"</Command>
<Timeout>$(WorkItemTimeout)</Timeout>
<DownloadFilesFromResults Condition="'$(DownloadFilesFromHelix)' == 'true'">Partition%(HelixWorkItem.Index)-combined-perf-lab-report.json</DownloadFilesFromResults>
</HelixWorkItem>
</ItemGroup>
<ItemGroup Condition="!$(TargetCsproj.Contains('MicroBenchmarks.csproj'))">
<HelixWorkItem Include="WorkItem">
<PayloadDirectory>$(WorkItemDirectory)</PayloadDirectory>
<Command>$(WorkItemCommand) --bdn-arguments="$(BenchmarkDotNetArguments)"</Command>
<Timeout>4:00</Timeout>
<Timeout>$(WorkItemTimeout)</Timeout>
<DownloadFilesFromResults Condition="'$(DownloadFilesFromHelix)' == 'true'">combined-perf-lab-report.json</DownloadFilesFromResults>
</HelixWorkItem>
</ItemGroup>
</Project>


@ -0,0 +1,6 @@
FROM python:3.10-bookworm
WORKDIR /performance
# copy files in
COPY . .


@ -35,15 +35,15 @@
<ScenarioDirectoryName>emptyvbconsoletemplate</ScenarioDirectoryName>
<PayloadDirectory>$(ScenariosDir)%(ScenarioDirectoryName)</PayloadDirectory>
</Scenario>
<UIScenario Include="WinForms Large" Condition="'$(TargetsWindows)' == 'true'">
<UIScenario Include="WinForms Large" Condition="'$(TargetsWindows)' == 'true' AND '$(OS)' == 'Windows_NT'">
<ScenarioDirectoryName>windowsformslarge</ScenarioDirectoryName>
<PayloadDirectory>$(ScenariosDir)%(ScenarioDirectoryName)</PayloadDirectory>
</UIScenario>
<UIScenario Include="WPF Template" Condition="'$(TargetsWindows)' == 'true'">
<UIScenario Include="WPF Template" Condition="'$(TargetsWindows)' == 'true' AND '$(OS)' == 'Windows_NT'">
<ScenarioDirectoryName>wpf</ScenarioDirectoryName>
<PayloadDirectory>$(ScenariosDir)%(ScenarioDirectoryName)</PayloadDirectory>
</UIScenario>
<UIScenario Include="WPF SFC" Condition="'$(TargetsWindows)' == 'true'">
<UIScenario Include="WPF SFC" Condition="'$(TargetsWindows)' == 'true' AND '$(OS)' == 'Windows_NT'">
<ScenarioDirectoryName>wpfsfc</ScenarioDirectoryName>
<PayloadDirectory>$(ScenariosDir)%(ScenarioDirectoryName)</PayloadDirectory>
</UIScenario>


@ -53,6 +53,8 @@ jobs:
value: 'py -3'
- name: ArtifactsDirectory
value: '%HELIX_WORKITEM_UPLOAD_ROOT%\Scenarios'
- name: TargetsWindowsParam
value: '--target-windows'
- ${{ if ne(parameters.osName, 'windows') }}:
- name: ScriptExtension
value: .sh
@ -114,8 +116,13 @@ jobs:
steps:
- checkout: self
clean: true
- ${{ if ne(parameters.osName, 'windows') }}:
- script: wget https://bootstrap.pypa.io/pip/3.6/get-pip.py && $(Python) get-pip.py --user
displayName: Ensure pip is installed
- script: $(Python) -m pip install --user dataclasses
displayName: Install dataclasses library used in ci_setup.py
- ${{ if ne(length(parameters.channels), 0) }}:
- script: $(Python) scripts/ci_setup.py --channel $(_Channel) --architecture ${{parameters.architecture}} --perf-hash $(Build.SourceVersion) --queue ${{parameters.queue}} --build-number $(Build.BuildNumber) --build-configs $(_Configs) --output-file $(CorrelationStaging)machine-setup$(ScriptExtension) --install-dir $(CorrelationStaging)dotnet $(runEnvVarsParam) $(AffinityParam)
- script: $(Python) scripts/ci_setup.py --channel $(_Channel) --architecture ${{parameters.architecture}} --perf-hash $(Build.SourceVersion) --queue ${{parameters.queue}} --build-number $(Build.BuildNumber) --build-configs $(_Configs) --output-file $(CorrelationStaging)machine-setup --install-dir $(CorrelationStaging)dotnet $(runEnvVarsParam) $(AffinityParam) $(TargetsWindowsParam)
displayName: Run ci_setup.py
- ${{ elseif ne(length(parameters.dotnetVersionsLinks), 0) }}:
- powershell: |
@ -127,7 +134,7 @@ jobs:
Write-Host "##vso[task.setvariable variable=DotnetVersion;]$dotnet_version"
Write-Host "Found dotnet version $dotnet_version"
displayName: Get dotnetVersion to use
- script: $(Python) scripts/ci_setup.py --channel $(_Channel) --dotnet-versions $(DotnetVersion) --architecture ${{parameters.architecture}} --perf-hash $(Build.SourceVersion) --queue ${{parameters.queue}} --build-number $(Build.BuildNumber) --build-configs $(_Configs) --output-file $(CorrelationStaging)machine-setup$(ScriptExtension) --install-dir $(CorrelationStaging)dotnet $(runEnvVarsParam) $(AffinityParam)
- script: $(Python) scripts/ci_setup.py --channel $(_Channel) --dotnet-versions $(DotnetVersion) --architecture ${{parameters.architecture}} --perf-hash $(Build.SourceVersion) --queue ${{parameters.queue}} --build-number $(Build.BuildNumber) --build-configs $(_Configs) --output-file $(CorrelationStaging)machine-setup --install-dir $(CorrelationStaging)dotnet $(runEnvVarsParam) $(AffinityParam) $(TargetsWindowsParam)
displayName: Run ci_setup.py with dotnetVersionsLinks
- powershell: |
Write-Host "##vso[task.setvariable variable=DOTNET_ROOT;]$(CorrelationStaging)dotnet"

helix.yml (new file)

@ -0,0 +1,160 @@
jobs:
run_performance_job:
runId: 'dotnetperf'
isConsoleApp: true
waitForExit: true
variables:
architecture: 'x64'
queue: ''
framework: 'net8.0'
runCategories: ''
kind: ''
runtimeRepoDir: '../runtime' # A guess that the performance repo is next to the runtime repo
osGroup: 'windows'
osSubGroup: ''
coreRootDir: '{{ runtimeRepoDir }}/artifacts/tests/coreclr/{{ osGroup }}.{{ architecture }}.Release/Tests/Core_Root'
performanceRepoDir: '.'
runName: '{{ buildNumber }}-testing'
buildNumber: 'local'
buildSourceVersion: 'abc123'
buildSourceBranch: 'main'
internal: true
bdnArgs: ''
partitionCount: 5
projectFile: 'performance/eng/performance/helix.proj'
isScenario: false
isLocalBuild: true
sources:
coreRoot:
localFolder: '{{ coreRootDir }}'
performance:
localFolder: '{{ performanceRepoDir }}'
dockerfile: performance/eng/performance/helix_sender.Dockerfile
dockerImageName: helix_sender
dockerCommand: python ./performance/scripts/run_performance_job.py --queue {{ queue }} --framework {{ framework }} --run-kind {{ kind }} --core-root-dir /performance/coreRoot --performance-repo-dir /performance/performance --architecture {{ architecture }} --os-group {{ osGroup }} --os-sub-group {{ osSubGroup }} {% if internal %}--internal{% endif %} --run-categories "{{ runCategories }}" {% if bdnArgs != blank and bdnArgs != empty %} --extra-bdn-args "{{ bdnArgs }}" {% endif %} --partition-count {{ partitionCount }} --project-file {{ projectFile }} {% if isScenario %} --is-scenario {% endif %} {% if isLocalBuild %} --local-build {% endif %}
dockerContextDirectory: .
arguments: '--env HelixAccessToken --env SYSTEM_ACCESSTOKEN --env PerfCommandUploadToken --env PerfCommandUploadTokenLinux'
environmentVariables:
  BUILD_BUILDNUMBER: '{{ buildNumber }}'
  BUILD_SOURCEVERSION: '{{ buildSourceVersion }}'
  BUILD_SOURCEBRANCH: '{{ buildSourceBranch }}'
  BUILD_REPOSITORY_NAME: 'dotnet/performance'
  SYSTEM_TEAMPROJECT: 'internal'
  BUILD_REASON: 'manual'
  PERFLAB_RUNNAME: '{{ runName }}'

scenarios:
  micro:
    benchmark:
      job: run_performance_job
      variables:
        kind: 'micro'
        runCategories: 'runtime libraries'
  scenarios:
    benchmark:
      job: run_performance_job
      variables:
        kind: scenarios
        isScenario: true
        projectFile: 'performance/eng/performance/scenarios.proj'

profiles:
  local:
    jobs:
      benchmark:
        endpoints:
          - http://localhost:5010
  msft-internal:
    jobs:
      benchmark:
        endpoints:
          - http://dotnetperf-crankagent:5001
  relay:
    jobs:
      benchmark:
        endpoints:
          - https://dotnetperf-crankagent-relay.servicebus.windows.net/dotnetperf-crankagent
  win-x64-public:
    jobs:
      benchmark:
        variables:
          osGroup: windows
          queue: Windows.10.Amd64.Client.Open
          internal: false
  win-x64:
    jobs:
      benchmark:
        variables:
          osGroup: windows
          queue: Windows.11.Amd64.Tiger.Perf
  win-x64-android-arm64-pixel:
    jobs:
      benchmark:
        variables:
          osGroup: windows
          queue: Windows.10.Amd64.Pixel.Perf
  win-x64-android-arm64-galaxy:
    jobs:
      benchmark:
        variables:
          osGroup: windows
          queue: Windows.10.Amd64.Galaxy.Perf
  win-arm64:
    jobs:
      benchmark:
        variables:
          osGroup: windows
          architecture: arm64
          queue: Windows.11.Arm64.Perf.Surf
  win-arm64-ampere:
    jobs:
      benchmark:
        variables:
          osGroup: windows
          architecture: arm64
          queue: Windows.Server.Arm64.Perf
  ubuntu-x64-public:
    jobs:
      benchmark:
        variables:
          osGroup: linux
          queue: Ubuntu.2204.Amd64.Open
          internal: false
  ubuntu-x64:
    jobs:
      benchmark:
        variables:
          osGroup: linux
          queue: Ubuntu.2204.Amd64.Tiger.Perf
  ubuntu-arm64-ampere:
    jobs:
      benchmark:
        variables:
          osGroup: linux
          architecture: arm64
          queue: Ubuntu.2004.Arm64.Perf
  osx-x64-ios-arm64:
    jobs:
      benchmark:
        variables:
          osGroup: osx
          queue: OSX.1015.Amd64.Iphone.Perf
  alpine:
    jobs:
      benchmark:
        variables:
          osGroup: linux
          queue: alpine.amd64.tiger.perf
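Each profile above only supplies queue/OS variables or an agent endpoint, so a run is composed by stacking `--profile` flags on the crank command line together with one scenario. A hypothetical invocation is sketched below; crank is not actually executed, the command line is only composed and printed, and the config file name is an assumption rather than a path from this repo.

```shell
# Compose (but do not run) a crank invocation: one scenario plus two stacked profiles.
config=./run-performance-job.yml   # assumed name for the YAML config above
cmd="crank --config $config --scenario micro --profile msft-internal --profile win-x64"
echo "$cmd"
```

Profiles compose left to right, so an endpoint profile (`msft-internal`, `relay`, or `local`) is typically paired with exactly one queue profile.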

requirements.txt Normal file

@ -0,0 +1,4 @@
azure.storage.blob==12.0.0
azure.storage.queue==12.0.0
urllib3==1.26.15
dataclasses==0.8


@ -22,17 +22,19 @@ https://github.com/dotnet/performance/blob/main/docs/benchmarking-workflow.md
from argparse import ArgumentParser, ArgumentTypeError
from datetime import datetime
import json
from logging import getLogger
import os
import shutil
import sys
from typing import Any, List, Optional
from performance.common import extension, helixpayload, runninginlab, validate_supported_runtime, get_artifacts_directory, helixuploadroot, RunCommand
from performance.common import validate_supported_runtime, get_artifacts_directory, helixuploadroot
from performance.logger import setup_loggers
from performance.constants import UPLOAD_CONTAINER, UPLOAD_STORAGE_URI, UPLOAD_TOKEN_VAR, UPLOAD_QUEUE
from channel_map import ChannelMap
from subprocess import Popen, CalledProcessError
from shutil import copy
from subprocess import CalledProcessError
from glob import glob
import dotnet
@ -40,11 +42,11 @@ import micro_benchmarks
def init_tools(
architecture: str,
dotnet_versions: str,
target_framework_monikers: list,
dotnet_versions: List[str],
target_framework_monikers: List[str],
verbose: bool,
azure_feed_url: str = None,
internal_build_key: str = None) -> None:
azure_feed_url: Optional[str] = None,
internal_build_key: Optional[str] = None) -> None:
'''
Install tools used by this repository into the tools folder.
This function writes a semaphore file when tools have been successfully
@ -218,10 +220,18 @@ def add_arguments(parser: ArgumentParser) -> ArgumentParser:
help='Key used to fetch the build from an internal azure feed',
)
parser.add_argument(
'--partition',
dest='partition',
required=False,
type=int,
help='Partition Index of the run',
)
return parser
def __process_arguments(args: list):
def __process_arguments(args: List[str]):
parser = ArgumentParser(
description='Tool to run .NET micro benchmarks',
allow_abbrev=False,
@ -231,9 +241,9 @@ def __process_arguments(args: list):
add_arguments(parser)
return parser.parse_args(args)
def __main(args: list) -> int:
def __main(argv: List[str]):
validate_supported_runtime()
args = __process_arguments(args)
args = __process_arguments(argv)
verbose = not args.quiet
if not args.skip_logger_setup:
@ -308,13 +318,25 @@ def __main(args: list) -> int:
getLogger().warning(f"Benchmark run for framework '{framework}' contains errors")
run_contains_errors = True
globpath = os.path.join(
get_artifacts_directory() if not args.bdn_artifacts else args.bdn_artifacts,
'**',
'*perf-lab-report.json')
artifacts_dir = get_artifacts_directory() if not args.bdn_artifacts else args.bdn_artifacts
combined_file_prefix = "" if args.partition is None else f"Partition{args.partition}-"
globpath = os.path.join(artifacts_dir, '**', '*perf-lab-report.json')
all_reports: List[Any] = []
for file in glob(globpath, recursive=True):
copy(file, os.path.join(helixuploadroot(), file.split(os.sep)[-1]))
with open(file, 'r', encoding="utf8") as report_file:
all_reports.append(json.load(report_file))
with open(os.path.join(artifacts_dir, f"{combined_file_prefix}combined-perf-lab-report.json"), "w", encoding="utf8") as all_reports_file:
json.dump(all_reports, all_reports_file)
helix_upload_root = helixuploadroot()
if helix_upload_root is not None:
for file in glob(globpath, recursive=True):
shutil.copy(file, os.path.join(helix_upload_root, file.split(os.sep)[-1]))
else:
getLogger().info("Skipping upload of artifacts to Helix as HELIX_WORKITEM_UPLOAD_ROOT environment variable is not set.")
except CalledProcessError:
getLogger().info("Run failure registered")
# rethrow the caught CalledProcessError exception so that it is bubbled up correctly.
@ -323,6 +345,7 @@ def __main(args: list) -> int:
dotnet.shutdown_server(verbose)
if args.upload_to_perflab_container:
globpath = os.path.join(artifacts_dir, '**', '*perf-lab-report.json')
import upload
upload_code = upload.upload(globpath, upload_container, UPLOAD_QUEUE, UPLOAD_TOKEN_VAR, UPLOAD_STORAGE_URI)
getLogger().info("Benchmarks Upload Code: " + str(upload_code))
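The report-combining logic added in this hunk can be sketched standalone: glob for every `*perf-lab-report.json` under the artifacts directory and merge them into a single `Partition{N}-combined-perf-lab-report.json`. The directory layout below is illustrative, not the repo's actual artifact paths.

```python
import json
import os
import tempfile
from glob import glob
from typing import Any, List, Optional

def combine_reports(artifacts_dir: str, partition: Optional[int]) -> str:
    """Merge every *perf-lab-report.json under artifacts_dir into one combined file."""
    prefix = "" if partition is None else f"Partition{partition}-"
    globpath = os.path.join(artifacts_dir, "**", "*perf-lab-report.json")
    all_reports: List[Any] = []
    # recursive=True lets '**' match any depth, mirroring the script's glob.
    for file in sorted(glob(globpath, recursive=True)):
        with open(file, "r", encoding="utf8") as report_file:
            all_reports.append(json.load(report_file))
    combined_path = os.path.join(artifacts_dir, f"{prefix}combined-perf-lab-report.json")
    with open(combined_path, "w", encoding="utf8") as out:
        json.dump(all_reports, out)
    return combined_path

# Demo with two fake per-benchmark reports in a temp directory.
root = tempfile.mkdtemp()
for i in range(2):
    bench_dir = os.path.join(root, f"bench{i}")
    os.makedirs(bench_dir)
    with open(os.path.join(bench_dir, "perf-lab-report.json"), "w", encoding="utf8") as f:
        json.dump({"benchmark": i}, f)

combined = combine_reports(root, partition=3)
with open(combined, encoding="utf8") as f:
    merged = json.load(f)
```

Prefixing the combined file with the partition index keeps partitioned Helix work items from clobbering each other's merged reports.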


@ -8,7 +8,7 @@ monthly manual performance runs.
from performance.common import get_machine_architecture
from performance.logger import setup_loggers
from argparse import ArgumentParser, ArgumentTypeError
from argparse import ArgumentParser
from datetime import datetime
from logging import getLogger
from subprocess import CalledProcessError
@ -35,10 +35,9 @@ VERSIONS = {
'net6.0': { 'tfm': 'net6.0' }
}
def get_version_from_name(name: str) -> str:
for version in VERSIONS:
if version == name:
return VERSIONS[version]
def get_version_from_name(name: str) -> dict[str, str]:
if name in VERSIONS:
return VERSIONS[name]
raise Exception('The version specified is not supported', name)
@ -114,7 +113,7 @@ def add_arguments(parser: ArgumentParser) -> ArgumentParser:
return parser
def __process_arguments(args: list):
def __process_arguments(args: list[str]):
parser = ArgumentParser(
description='Tool to execute the monthly manual micro benchmark performance runs',
allow_abbrev=False
@ -123,10 +122,10 @@ def __process_arguments(args: list):
add_arguments(parser)
return parser.parse_args(args)
def __main(args: list) -> int:
def __main(argv: list[str]):
setup_loggers(verbose=True)
args = __process_arguments(args)
args = __process_arguments(argv)
rootPath = os.path.normpath(os.path.join(os.path.dirname(__file__), '..'))
sdkPath = os.path.join(rootPath, 'tools', 'dotnet')
@ -146,7 +145,7 @@ def __main(args: list) -> int:
else:
args.bdn_arguments = '--iterationCount 1 --warmupCount 0 --invocationCount 1 --unrollFactor 1 --strategy ColdStart'
versionTarFiles = []
versionTarFiles: list[str] = []
for versionName in args.versions:
version = get_version_from_name(versionName)
@ -162,7 +161,7 @@ def __main(args: list) -> int:
if not args.dry_run:
shutil.rmtree(sdkPath)
benchmarkArgs = ['--skip-logger-setup', '--filter', args.filter, '--architecture', args.architecture, '-f', version['tfm']]
benchmarkArgs: list[str] = ['--skip-logger-setup', '--filter', args.filter, '--architecture', args.architecture, '-f', version['tfm']]
if 'build' in version:
benchmarkArgs += ['--dotnet-versions', version['build']]
@ -190,7 +189,7 @@ def __main(args: list) -> int:
benchmarkArgs += ['--internal-build-key', args.internal_build_key]
log('Executing: benchmarks_ci.py ')
else:
raise("Must include both a --azure-feed-url and a --internal-build-key")
raise Exception("Must include both a --azure-feed-url and a --internal-build-key")
else:
log('Executing: benchmarks_ci.py ' + str.join(' ', benchmarkArgs))
@ -214,7 +213,7 @@ def __main(args: list) -> int:
resultsName = timestamp + '-' + versionName
resultsName = args.architecture + '-' + resultsName
resultsTarPath = os.path.join(rootPath, 'artifacts', resultsName + '.tar.gz')
resultsTarPath: str = os.path.join(rootPath, 'artifacts', resultsName + '.tar.gz')
versionTarFiles += [resultsTarPath]
if not args.dry_run:
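The `raise("...")` → `raise Exception("...")` fix above is worth calling out: in Python 3 only `BaseException` subclasses can be raised, so raising a bare string fails at runtime with a `TypeError` instead of surfacing the intended message. A reduced illustration (the message text is taken from the hunk; the function names are mine):

```python
def raise_string_fails() -> str:
    try:
        raise "Must include both a --azure-feed-url and a --internal-build-key"  # type: ignore
    except TypeError:
        # Python 3 rejects non-exception objects:
        # "exceptions must derive from BaseException"
        return "TypeError"
    return "unreachable"

def raise_exception_works() -> str:
    try:
        raise Exception("Must include both a --azure-feed-url and a --internal-build-key")
    except Exception as exc:
        return str(exc)
```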


@ -1,4 +1,4 @@
from argparse import ArgumentParser
from typing import List, Optional, Set
class ChannelMap():
channel_map = {
@ -143,12 +143,12 @@ class ChannelMap():
}
}
@staticmethod
def get_supported_channels() -> list:
def get_supported_channels() -> List[str]:
'''List of supported channels.'''
return list(ChannelMap.channel_map.keys())
@staticmethod
def get_supported_frameworks() -> list:
def get_supported_frameworks() -> Set[str]:
'''List of supported frameworks'''
frameworks = [ChannelMap.channel_map[channel]['tfm'] for channel in ChannelMap.channel_map]
return set(frameworks)
@ -161,7 +161,7 @@ class ChannelMap():
raise Exception('Channel %s is not supported. Supported channels %s' % (channel, ChannelMap.get_supported_channels()))
@staticmethod
def get_target_framework_monikers(channels: list) -> list:
def get_target_framework_monikers(channels: List[str]) -> List[str]:
'''
Translates channel names to Target Framework Monikers (TFMs).
'''
@ -182,7 +182,7 @@ class ChannelMap():
raise Exception('Channel %s is not supported. Supported channels %s' % (channel, ChannelMap.get_supported_channels()))
@staticmethod
def get_quality_from_channel(channel: str) -> str:
def get_quality_from_channel(channel: str) -> Optional[str]:
'''Translate Target Framework Moniker (TFM) to channel name'''
if 'quality' in ChannelMap.channel_map[channel]:
return ChannelMap.channel_map[channel]['quality']
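Changing `get_quality_from_channel` to return `Optional[str]` forces callers to bind the result and null-check it before use, which is exactly what the later `dotnet.install` hunk does. A minimal sketch of that caller pattern, using a hypothetical two-entry channel map rather than the real one in `channel_map.py`:

```python
from typing import Dict, List, Optional

# Hypothetical subset of a channel map; real entries live in channel_map.py.
CHANNEL_MAP: Dict[str, Dict[str, str]] = {
    'nightly': {'tfm': 'net8.0', 'quality': 'daily'},
    'stable': {'tfm': 'net7.0'},  # no 'quality' key
}

def get_quality_from_channel(channel: str) -> Optional[str]:
    '''Returns the channel quality, or None when the channel has none.'''
    return CHANNEL_MAP[channel].get('quality')

def build_install_args(channel: str) -> List[str]:
    args = ['-Channel', channel]
    # Bind once, then narrow: a strict type checker accepts this, and the
    # value cannot change between the None check and the use.
    quality = get_quality_from_channel(channel)
    if quality is not None:
        args += ['-Quality', quality]
    return args
```

Calling the function twice (once in the `if`, once in the append), as the old code did, defeats the narrowing and is flagged under strict type checking.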

scripts/ci_setup.py Executable file → Normal file

@ -1,6 +1,7 @@
#!/usr/bin/env python3
from argparse import ArgumentParser, ArgumentTypeError
from dataclasses import dataclass, field
from logging import getLogger
import os
@ -8,8 +9,9 @@ import sys
import datetime
from subprocess import check_output
from typing import Optional, List
from performance.common import get_repo_root_path
from performance.common import get_machine_architecture, get_repo_root_path, set_environment_variable
from performance.common import get_tools_directory
from performance.common import push_dir
from performance.common import validate_supported_runtime
@ -17,16 +19,13 @@ from performance.logger import setup_loggers
from channel_map import ChannelMap
import dotnet
import micro_benchmarks
global_extension = ".cmd" if sys.platform == 'win32' else '.sh'
def init_tools(
architecture: str,
dotnet_versions: str,
dotnet_versions: List[str],
channel: str,
verbose: bool,
install_dir: str=None) -> None:
install_dir: Optional[str]=None) -> None:
'''
Install tools used by this repository into the tools folder.
This function writes a semaphore file when tools have been successfully
@ -50,7 +49,6 @@ def add_arguments(parser: ArgumentParser) -> ArgumentParser:
# Download DotNet Cli
dotnet.add_arguments(parser)
micro_benchmarks.add_arguments(parser)
parser.add_argument(
'--channel',
@ -95,6 +93,14 @@ def add_arguments(parser: ArgumentParser) -> ArgumentParser:
type=str,
help='Product commit time. Format: %%Y-%%m-%%d %%H:%%M:%%S %%z'
)
parser.add_argument(
'--local-build',
dest="local_build",
required=False,
action='store_true',
default=False,
help='Whether the test is being run against a local build'
)
parser.add_argument(
'--repository',
dest='repository',
@ -145,11 +151,26 @@ def add_arguments(parser: ArgumentParser) -> ArgumentParser:
help='Discover the hash of the performance repository'
)
def __valid_file_path(file_path: str) -> str:
'''Verifies that specified file path exists.'''
file_path = os.path.abspath(file_path)
if not os.path.isfile(file_path):
raise ArgumentTypeError('{} does not exist.'.format(file_path))
return file_path
parser.add_argument(
'--cli',
dest='cli',
required=False,
type=__valid_file_path,
help='Full path to dotnet.exe',
)
parser.add_argument(
'--output-file',
dest='output_file',
required=False,
default=os.path.join(get_tools_directory(),'machine-setup' + global_extension),
default=os.path.join(get_tools_directory(),'machine-setup'),
type=str,
help='Filename to write the setup script to'
)
@ -224,9 +245,19 @@ def add_arguments(parser: ArgumentParser) -> ArgumentParser:
nargs='*',
help='Environment variables to set on the machine in the form of key=value key2=value2. Will also be saved to additional data'
)
parser.add_argument(
'--target-windows',
dest='target_windows',
required=False,
action='store_true',
default=False,
help='Will it run on a Windows Helix Queue?'
)
return parser
def __process_arguments(args: list):
def __process_arguments(args: List[str]):
parser = ArgumentParser(
description='Tool to generate a machine setup script',
allow_abbrev=False,
@ -236,14 +267,37 @@ def __process_arguments(args: list):
add_arguments(parser)
return parser.parse_args(args)
def __write_pipeline_variable(name: str, value: str):
# Create a variable in the build pipeline
getLogger().info("Writing pipeline variable %s with value %s" % (name, value))
print('##vso[task.setvariable variable=%s]%s' % (name, value))
def __main(args: list) -> int:
validate_supported_runtime()
args = __process_arguments(args)
@dataclass
class CiSetupArgs:
channel: str
quiet: bool = False
commit_sha: Optional[str] = None
repository: Optional[str] = None
architecture: str = get_machine_architecture()
dotnet_path: Optional[str] = None
dotnet_versions: List[str] = field(default_factory=list)
install_dir: Optional[str] = None
build_configs: List[str] = field(default_factory=list)
pgo_status: Optional[str] = None
get_perf_hash: bool = False
perf_hash: str = 'testSha'
cli: Optional[str] = None
commit_time: Optional[str] = None
local_build: bool = False
branch: Optional[str] = None
output_file: str = os.path.join(get_tools_directory(), 'machine-setup')
not_in_lab: bool = False
queue: str = 'testQueue'
build_number: str = '1234.1'
locale: str = 'en-US'
maui_version: str = ''
affinity: Optional[str] = None
run_env_vars: Optional[List[str]] = None
target_windows: bool = True
physical_promotion: Optional[str] = None
def main(args: CiSetupArgs):
verbose = not args.quiet
setup_loggers(verbose=verbose)
@ -251,6 +305,10 @@ def __main(args: list) -> int:
# if repository is set, user needs to supply the commit_sha
if not ((args.commit_sha is None) == (args.repository is None)):
raise ValueError('Either both commit_sha and repository should be set or neither')
# for CI pipelines, use the agent OS
if not args.local_build:
args.target_windows = sys.platform == 'win32'
# Acquire necessary tools (dotnet)
# For arm64 runs, download the x64 version so we can get the information we need, but set all variables
@ -284,15 +342,15 @@ def __main(args: list) -> int:
# (ie https://github.com/dotnet-coreclr). Replace dashes with slashes in that case.
repo_url = None if args.repository is None else args.repository.replace('-','/')
variable_format = 'set %s=%s\n' if sys.platform == 'win32' else 'export %s=%s\n'
path_variable = 'set PATH=%s;%%PATH%%\n' if sys.platform == 'win32' else 'export PATH=%s:$PATH\n'
which = 'where dotnet\n' if sys.platform == 'win32' else 'which dotnet\n'
dotnet_path = '%HELIX_CORRELATION_PAYLOAD%\dotnet' if sys.platform == 'win32' else '$HELIX_CORRELATION_PAYLOAD/dotnet'
owner, repo = ('dotnet', 'core-sdk') if args.repository is None else (dotnet.get_repository(repo_url))
config_string = ';'.join(args.build_configs) if sys.platform == 'win32' else '"%s"' % ';'.join(args.build_configs)
variable_format = 'set "%s=%s"\n' if args.target_windows else 'export %s="%s"\n'
path_variable = 'set PATH=%s;%%PATH%%\n' if args.target_windows else 'export PATH=%s:$PATH\n'
which = 'where dotnet\n' if args.target_windows else 'which dotnet\n'
dotnet_path = '%HELIX_CORRELATION_PAYLOAD%\\dotnet' if args.target_windows else '$HELIX_CORRELATION_PAYLOAD/dotnet'
owner, repo = ('dotnet', 'core-sdk') if repo_url is None else (dotnet.get_repository(repo_url))
config_string = ';'.join(args.build_configs) if args.target_windows else '"%s"' % ';'.join(args.build_configs)
pgo_config = ''
physical_promotion_config = ''
showenv = 'set' if sys.platform == 'win32' else 'printenv'
showenv = 'set' if args.target_windows else 'printenv'
if args.pgo_status == 'nodynamicpgo':
pgo_config = variable_format % ('COMPlus_TieredPGO', '0')
@ -305,7 +363,7 @@ def __main(args: list) -> int:
with push_dir(get_repo_root_path()):
output = check_output(['git', 'rev-parse', 'HEAD'])
decoded_lines = []
decoded_lines: List[str] = []
for line in output.splitlines():
decoded_lines = decoded_lines + [line.decode('utf-8')]
@ -315,11 +373,21 @@ def __main(args: list) -> int:
perfHash = decoded_output if args.get_perf_hash else args.perf_hash
framework = ChannelMap.get_target_framework_moniker(args.channel)
# if the extension is already present, don't add it
output_file = args.output_file
if not output_file.endswith("cmd") and not output_file.endswith(".sh"):
extension = ".cmd" if args.target_windows else ".sh"
output_file += extension
if not framework.startswith('net4'):
target_framework_moniker = dotnet.FrameworkAction.get_target_framework_moniker(framework)
dotnet_version = dotnet.get_dotnet_version(target_framework_moniker, args.cli) if args.dotnet_versions == [] else args.dotnet_versions[0]
commit_sha = dotnet.get_dotnet_sdk(target_framework_moniker, args.cli) if args.commit_sha is None else args.commit_sha
if(args.commit_time is not None):
if args.local_build:
source_timestamp = datetime.datetime.utcnow().strftime('%Y-%m-%dT%H:%M:%SZ')
elif(args.commit_time is not None):
try:
parsed_timestamp = datetime.datetime.strptime(args.commit_time, '%Y-%m-%d %H:%M:%S %z').astimezone(datetime.timezone.utc)
source_timestamp = parsed_timestamp.strftime('%Y-%m-%dT%H:%M:%SZ')
@ -331,12 +399,15 @@ def __main(args: list) -> int:
branch = ChannelMap.get_branch(args.channel) if not args.branch else args.branch
getLogger().info("Writing script to %s" % args.output_file)
dir_path = os.path.dirname(args.output_file)
getLogger().info("Writing script to %s" % output_file)
dir_path = os.path.dirname(output_file)
if not os.path.isdir(dir_path):
os.mkdir(dir_path)
with open(args.output_file, 'w') as out_file:
perflab_upload_token = os.environ.get('PerfCommandUploadToken' if args.target_windows else 'PerfCommandUploadTokenLinux')
run_name = os.environ.get("PERFLAB_RUNNAME")
with open(output_file, 'w') as out_file:
out_file.write(which)
out_file.write(pgo_config)
out_file.write(physical_promotion_config)
@ -358,6 +429,10 @@ def __main(args: list) -> int:
out_file.write(variable_format % ('UseSharedCompilation', 'false'))
out_file.write(variable_format % ('DOTNET_ROOT', dotnet_path))
out_file.write(variable_format % ('MAUI_VERSION', args.maui_version))
if perflab_upload_token is not None:
out_file.write(variable_format % ('PERFLAB_UPLOAD_TOKEN', perflab_upload_token))
if run_name is not None:
out_file.write(variable_format % ('PERFLAB_RUNNAME', run_name))
out_file.write(path_variable % dotnet_path)
if args.affinity:
out_file.write(variable_format % ('PERFLAB_DATA_AFFINITY', args.affinity))
@ -368,15 +443,18 @@ def __main(args: list) -> int:
out_file.write(variable_format % ("PERFLAB_DATA_" + key, value))
out_file.write(showenv)
else:
with open(args.output_file, 'w') as out_file:
with open(output_file, 'w') as out_file:
out_file.write(variable_format % ('PERFLAB_INLAB', '0'))
out_file.write(variable_format % ('PERFLAB_TARGET_FRAMEWORKS', framework))
out_file.write(path_variable % dotnet_path)
# The '_Framework' is needed for specifying frameworks in proj files and for building tools later in the pipeline
__write_pipeline_variable('PERFLAB_Framework', framework)
set_environment_variable('PERFLAB_Framework', framework)
def __main(argv: List[str]):
validate_supported_runtime()
args = __process_arguments(argv)
main(CiSetupArgs(**vars(args)))
if __name__ == "__main__":
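The `CiSetupArgs` refactor above follows a common pattern: keep argparse for the CLI entry point, but funnel the parsed namespace into a typed dataclass so other scripts can call `main()` programmatically with checked, defaulted arguments. A minimal sketch of that bridge (field names here are illustrative, not the full `CiSetupArgs` set):

```python
from argparse import ArgumentParser
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SetupArgs:
    channel: str
    quiet: bool = False
    commit_sha: Optional[str] = None
    # Mutable defaults need default_factory, not a shared [] instance.
    dotnet_versions: List[str] = field(default_factory=list)

def main(args: SetupArgs) -> str:
    verbose = not args.quiet
    return f"channel={args.channel} verbose={verbose} versions={args.dotnet_versions}"

def cli(argv: List[str]) -> str:
    parser = ArgumentParser(allow_abbrev=False)
    parser.add_argument('--channel', required=True)
    parser.add_argument('--quiet', action='store_true')
    parser.add_argument('--commit-sha', dest='commit_sha')
    parser.add_argument('--dotnet-versions', dest='dotnet_versions', nargs='*', default=[])
    ns = parser.parse_args(argv)
    # Same trick as ci_setup.py: the namespace's attribute names line up
    # with the dataclass fields, so **vars() bridges the two.
    return main(SetupArgs(**vars(ns)))
```

The `**vars(ns)` bridge only works while every argparse `dest` matches a dataclass field, so the two declarations have to be kept in sync.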


@ -6,18 +6,16 @@ Contains the functionality around DotNet Cli.
import ssl
import datetime
from argparse import Action, ArgumentParser, ArgumentTypeError, ArgumentError
from collections import namedtuple
from argparse import Action, ArgumentParser, ArgumentTypeError
from glob import iglob
from json import loads
from logging import getLogger
from os import chmod, environ, listdir, makedirs, path, pathsep, system
from re import search, match, MULTILINE
from re import search, MULTILINE
from shutil import rmtree
from stat import S_IRWXU
from subprocess import CalledProcessError, check_output
from sys import argv, platform
from typing import Tuple
from typing import Any, List, NamedTuple, Optional, Tuple
from urllib.error import URLError
from urllib.parse import urlparse
from urllib.request import urlopen
@ -41,14 +39,14 @@ def info(verbose: bool) -> None:
cmdline = ['dotnet', '--info']
RunCommand(cmdline, verbose=verbose).run()
def exec(asm_path: str, success_exit_codes: list, verbose: bool, *args) -> int:
def exec(asm_path: str, success_exit_codes: List[int], verbose: bool, *args: str) -> int:
"""
Executes `dotnet exec` which can be used to execute assemblies
"""
asm_path=path.abspath(asm_path)
working_dir=path.dirname(asm_path)
if not path.exists(asm_path):
raise ArgumentError('Cannot find assembly {} to exec'.format(asm_path))
raise ArgumentTypeError('Cannot find assembly {} to exec'.format(asm_path))
cmdline = ['dotnet', 'exec', path.basename(asm_path)]
cmdline += list(args)
@ -61,10 +59,7 @@ def __log_script_header(message: str):
getLogger().info('-' * message_length)
CSharpProjFile = namedtuple('CSharpProjFile', [
'file_name',
'working_directory'
])
CSharpProjFile = NamedTuple('CSharpProjFile', file_name=str, working_directory=str)
class FrameworkAction(Action):
'''
@ -93,7 +88,7 @@ class FrameworkAction(Action):
return framework
@staticmethod
def get_target_framework_monikers(frameworks: list) -> list:
def get_target_framework_monikers(frameworks: List[str]) -> List[str]:
'''
Translates framework names to target framework monikers (TFM)
Required to run AOT benchmarks where the host process must be .NET
@ -202,7 +197,7 @@ class CompilationAction(Action):
return requested_mode
@staticmethod
def modes() -> list:
def modes() -> List[str]:
'''Available .NET Performance modes.'''
return [
CompilationAction.DEFAULT,
@ -294,8 +289,8 @@ class CSharpProject:
def restore(self,
packages_path: str,
verbose: bool,
runtime_identifier: str = None,
args: list = None) -> None:
runtime_identifier: Optional[str] = None,
args: Optional[List[str]] = None) -> None:
'''
Calls dotnet to restore the dependencies and tools of the specified
project.
@ -329,10 +324,10 @@ class CSharpProject:
configuration: str,
verbose: bool,
packages_path: str,
target_framework_monikers: list = None,
target_framework_monikers: Optional[List[str]] = None,
output_to_bindir: bool = False,
runtime_identifier: str = None,
args: list = None) -> None:
runtime_identifier: Optional[str] = None,
args: Optional[List[str]] = None) -> None:
'''Calls dotnet to build the specified project.'''
if not target_framework_monikers: # Build all supported frameworks.
cmdline = [
@ -388,8 +383,8 @@ class CSharpProject:
verbose: bool,
working_directory: str,
force: bool = False,
exename: str = None,
language: str = None,
exename: Optional[str] = None,
language: Optional[str] = None,
no_https: bool = False,
no_restore: bool = True
):
@ -432,11 +427,11 @@ class CSharpProject:
configuration: str,
output_dir: str,
verbose: bool,
packages_path,
target_framework_moniker: str = None,
runtime_identifier: str = None,
msbuildprops: list = None,
*args
packages_path: str,
target_framework_moniker: Optional[str] = None,
runtime_identifier: Optional[str] = None,
msbuildprops: Optional[List[str]] = None,
*args: str
) -> None:
'''
Invokes publish on the specified project
@ -466,7 +461,7 @@ class CSharpProject:
self.working_directory
)
def __get_output_build_arg(self, outdir) -> list:
def __get_output_build_arg(self, outdir: str) -> List[str]:
# dotnet build/publish does not support `--output` with sln files
if path.splitext(self.csproj_file)[1] == '.sln':
outdir = outdir if path.isabs(outdir) else path.abspath(outdir)
@ -490,9 +485,9 @@ class CSharpProject:
def run(self,
configuration: str,
target_framework_moniker: str,
success_exit_codes: list,
success_exit_codes: List[int],
verbose: bool,
*args) -> int:
*args: str) -> int:
'''
Calls dotnet to run a .NET project output.
'''
@ -511,24 +506,25 @@ class CSharpProject:
self.working_directory)
def get_framework_version(framework: str) -> str:
FrameworkVersion = NamedTuple('FrameworkVersion', major=int, minor=int)
def get_framework_version(framework: str) -> FrameworkVersion:
groups = search(r".*(\d)\.(\d)$", framework)
if not groups:
raise ValueError("Unknown target framework: {}".format(framework))
FrameworkVersion = namedtuple('FrameworkVersion', ['major', 'minor'])
version = FrameworkVersion(int(groups.group(1)), int(groups.group(2)))
return version
def get_base_path(dotnet_path: str = None) -> str:
def get_base_path(dotnet_path: Optional[str] = None) -> str:
"""Gets the dotnet Host version from the `dotnet --info` command."""
if not dotnet_path:
dotnet_path = 'dotnet'
output = check_output([dotnet_path, '--info'])
groups = None
for line in output.splitlines():
decoded_line = line.decode('utf-8')
@ -546,7 +542,7 @@ def get_base_path(dotnet_path: str = None) -> str:
return groups.group(1)
def get_sdk_path(dotnet_path: str = None) -> str:
def get_sdk_path(dotnet_path: Optional[str] = None) -> str:
base_path = get_base_path(dotnet_path)
sdk_path = path.abspath(path.join(base_path, '..'))
return sdk_path
@ -559,8 +555,8 @@ def get_dotnet_path() -> str:
def get_dotnet_version(
framework: str,
dotnet_path: str = None,
sdk_path: str = None) -> str:
dotnet_path: Optional[str] = None,
sdk_path: Optional[str] = None) -> str:
version = get_framework_version(framework)
sdk_path = get_sdk_path(dotnet_path) if sdk_path is None else sdk_path
@ -591,8 +587,8 @@ def get_dotnet_version(
def get_dotnet_sdk(
framework: str,
dotnet_path: str = None,
sdk: str = None) -> str:
dotnet_path: Optional[str] = None,
sdk: Optional[str] = None) -> str:
"""Gets the dotnet Host commit sha from the `dotnet --info` command."""
sdk_path = get_sdk_path(dotnet_path)
@ -618,7 +614,7 @@ def get_repository(repository: str) -> Tuple[str, str]:
def get_commit_date(
framework: str,
commit_sha: str,
repository: str = None
repository: Optional[str] = None
) -> str:
'''
Gets the .NET Core committer date using the GitHub Web API from the
@ -690,7 +686,7 @@ def get_build_directory(
bin_directory: str,
project_name: str,
configuration: str,
target_framework_moniker: str) -> None:
target_framework_moniker: str) -> str:
'''
Gets the output directory where the built artifacts are in with
respect to the specified bin_directory.
@ -733,7 +729,7 @@ def __get_directory(architecture: str) -> str:
return path.join(get_tools_directory(), 'dotnet', architecture)
def remove_dotnet(architecture: str) -> str:
def remove_dotnet(architecture: str) -> None:
'''
Removes the dotnet installed in the tools directory associated with the
specified architecture.
@ -763,12 +759,12 @@ def shutdown_server(verbose:bool) -> None:
def install(
architecture: str,
channels: list,
versions: str,
channels: List[str],
versions: List[str],
verbose: bool,
install_dir: str = None,
azure_feed_url: str = None,
internal_build_key: str = None) -> None:
install_dir: Optional[str] = None,
azure_feed_url: Optional[str] = None,
internal_build_key: Optional[str] = None) -> None:
'''
Downloads dotnet cli into the tools folder.
'''
@ -856,8 +852,9 @@ def install(
if (not versions) and channels:
for channel in channels:
cmdline_args = common_cmdline_args + ['-Channel', ChannelMap.get_branch(channel)]
if ChannelMap.get_quality_from_channel(channel) is not None:
cmdline_args += ['-Quality', ChannelMap.get_quality_from_channel(channel)]
quality = ChannelMap.get_quality_from_channel(channel)
if quality is not None:
cmdline_args += ['-Quality', quality]
RunCommand(cmdline_args, verbose=verbose, retry=1).run(
get_repo_root_path()
)
@ -919,25 +916,11 @@ def add_arguments(parser: ArgumentParser) -> ArgumentParser:
'''
Adds new arguments to the specified ArgumentParser object.
'''
parser = __add_arguments(parser)
# .NET Compilation modes.
parser.add_argument(
'--dotnet-compilation-mode',
dest='dotnet_compilation_mode',
required=False,
action=CompilationAction,
choices=CompilationAction.modes(),
default=CompilationAction.noenv(),
type=CompilationAction.validate,
help='{}'.format(CompilationAction.help_text())
)
return parser
def __process_arguments(args: list):
def __process_arguments(args: List[str]) -> Any:
parser = ArgumentParser(
description='DotNet Cli wrapper.',
allow_abbrev=False
@ -983,12 +966,13 @@ def __process_arguments(args: list):
action='store_true',
help='Turns on verbosity (default "False")',
)
return parser.parse_args(args)
def __main(args: list) -> int:
def __main(argv: List[str]) -> None:
validate_supported_runtime()
args = __process_arguments(args)
args = __process_arguments(argv)
setup_loggers(verbose=args.verbose)
install(
architecture=args.architecture,
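The `collections.namedtuple` → `typing.NamedTuple` changes in this file trade an untyped tuple factory for a declaration type checkers can verify, and `get_framework_version` can now declare `-> FrameworkVersion` instead of `-> str`. A self-contained sketch using the class syntax (equivalent to the functional `NamedTuple('FrameworkVersion', major=int, minor=int)` form the diff uses), with the regex mirroring the one in `dotnet.py`:

```python
from re import search
from typing import NamedTuple

# typing.NamedTuple gives each field a declared type,
# unlike collections.namedtuple.
class FrameworkVersion(NamedTuple):
    major: int
    minor: int

def get_framework_version(framework: str) -> FrameworkVersion:
    groups = search(r".*(\d)\.(\d)$", framework)
    if not groups:
        raise ValueError("Unknown target framework: {}".format(framework))
    return FrameworkVersion(int(groups.group(1)), int(groups.group(2)))
```

Because `NamedTuple` is still a tuple, existing unpacking call sites keep working while new code gets attribute access and type checking.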


@ -10,7 +10,7 @@ from performance.common import RunCommand
def _get_gcinfra_path() -> str:
return os.path.join(get_repo_root_path(), "src", "benchmarks", "gc")
def __main(args: list) -> int:
def __main(args: list[str]) -> int:
infra_base_path = _get_gcinfra_path()
with push_dir(infra_base_path):
gcperfsim_path = os.path.join(infra_base_path, "src", "exec", "GCPerfSim")


@ -12,7 +12,7 @@ from logging import getLogger
from os import path
from subprocess import CalledProcessError
from traceback import format_exc
from typing import Tuple
from typing import Any, List
import csv
import sys
@ -28,7 +28,7 @@ from channel_map import ChannelMap
import dotnet
def get_supported_configurations() -> list:
def get_supported_configurations() -> List[str]:
'''
The configuration to use for building the project. The default for most
projects is 'Release'
@ -120,7 +120,7 @@ def add_arguments(parser: ArgumentParser) -> ArgumentParser:
help='Full path to dotnet.exe',
)
def __get_bdn_arguments(user_input: str) -> list:
def __get_bdn_arguments(user_input: str) -> List[str]:
file = StringIO(user_input)
reader = csv.reader(file, delimiter=' ')
for args in reader:
@ -213,7 +213,7 @@ def add_arguments(parser: ArgumentParser) -> ArgumentParser:
return parser
def __process_arguments(args: list) -> Tuple[list, bool]:
def __process_arguments(args: List[str]):
parser = ArgumentParser(
description="Builds the benchmarks.",
allow_abbrev=False)
@ -230,8 +230,8 @@ def __process_arguments(args: list) -> Tuple[list, bool]:
return parser.parse_args(args)
def __get_benchmarkdotnet_arguments(framework: str, args: tuple) -> list:
run_args = []
def __get_benchmarkdotnet_arguments(framework: str, args: Any) -> List[str]:
run_args: List[str] = []
if args.corerun:
run_args += ['--coreRun'] + args.corerun
if args.cli:
@ -287,7 +287,7 @@ def get_bin_dir_to_use(csprojfile: dotnet.CSharpProjFile, bin_directory: str, ru
def build(
BENCHMARKS_CSPROJ: dotnet.CSharpProject,
configuration: str,
target_framework_monikers: list,
target_framework_monikers: List[str],
incremental: str,
run_isolated: bool,
for_wasm: bool,
@ -309,7 +309,7 @@ def build(
__log_script_header("Restoring .NET micro benchmarks")
BENCHMARKS_CSPROJ.restore(packages_path=packages, verbose=verbose)
build_args = []
build_args: List[str] = []
if for_wasm:
build_args += ['/p:BuildingForWasm=true']
@ -338,14 +338,14 @@ def run(
framework: str,
run_isolated: bool,
verbose: bool,
*args) -> bool:
args: Any) -> bool:
'''Runs the benchmarks, returns True for a zero status code and False otherwise.'''
__log_script_header("Running .NET micro benchmarks for '{}'".format(
framework
))
# dotnet exec
run_args = __get_benchmarkdotnet_arguments(framework, *args)
run_args = __get_benchmarkdotnet_arguments(framework, args)
target_framework_moniker = dotnet.FrameworkAction.get_target_framework_moniker(
framework
)
@ -376,10 +376,10 @@ def __log_script_header(message: str):
getLogger().info('-' * len(message))
def __main(args: list) -> int:
def __main(argv: List[str]) -> int:
try:
validate_supported_runtime()
args = __process_arguments(args)
args = __process_arguments(argv)
configuration = args.configuration
frameworks = args.frameworks
@ -393,7 +393,7 @@ def __main(args: list) -> int:
# dotnet --info
dotnet.info(verbose)
bin_dir_to_use=micro_benchmarks.get_bin_dir_to_use(args.csprojfile, args.bin_directory, args.run_isolated)
bin_dir_to_use=get_bin_dir_to_use(args.csprojfile, args.bin_directory, args.run_isolated)
BENCHMARKS_CSPROJ = dotnet.CSharpProject(
project=args.csprojfile,
bin_directory=bin_dir_to_use
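The `run(..., *args)` → `run(..., args: Any)` change above removes an indirection where the single parsed-arguments object was smuggled through `*args` as a one-element tuple and immediately re-splatted into `__get_benchmarkdotnet_arguments`. It worked, but it obscured the arity and defeated type checking. A reduced illustration (function and attribute names here are simplified stand-ins, and the corerun path is made up):

```python
from argparse import Namespace
from typing import Any, List

def get_bdn_arguments(framework: str, args: Any) -> List[str]:
    run_args: List[str] = []
    if getattr(args, 'corerun', None):
        run_args += ['--coreRun'] + list(args.corerun)
    run_args += ['-f', framework]
    return run_args

def run_old(framework: str, *args: Any) -> List[str]:
    # Old style: args is a one-element tuple, re-splatted back out.
    return get_bdn_arguments(framework, *args)

def run_new(framework: str, args: Any) -> List[str]:
    # New style: take the parsed-arguments object directly.
    return get_bdn_arguments(framework, args)

ns = Namespace(corerun=['corerun.exe'])
```

Both produce the same result, but the new signature states what is actually passed, which is what lets the strict type-checking setting added in this commit reason about the call.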


@ -18,6 +18,7 @@ from platform import machine
import os
import sys
import time
from typing import Callable, List, Optional, Tuple, Type, TypeVar
def get_machine_architecture():
@ -77,7 +78,7 @@ def remove_directory(path: str) -> None:
raise TypeError('Invalid type.')
if os.path.isdir(path):
def handle_rmtree_errors(func, path, excinfo):
def handle_rmtree_errors(func: Callable[[str], None], path: str, excinfo: Exception):
"""
Helper function to handle long path errors on Windows.
"""
@ -140,7 +141,7 @@ def get_packages_directory() -> str:
return os.path.join(get_artifacts_directory(), 'packages')
@contextmanager
def push_dir(path: str = None) -> None:
def push_dir(path: Optional[str] = None):
'''
Adds the specified location to the top of a location stack, then changes to
the specified directory.
@ -158,7 +159,14 @@ def push_dir(path: str = None) -> None:
else:
yield
def retry_on_exception(function, retry_count=3, retry_delay=5, retry_delay_multiplier=1, retry_exceptions=[Exception], raise_exceptions=[]):
TRet = TypeVar('TRet')
def retry_on_exception(
function: Callable[[], TRet],
retry_count: int = 3,
retry_delay: float = 5,
retry_delay_multiplier: float = 1,
retry_exceptions: List[Type[Exception]] = [Exception],
raise_exceptions: List[Type[Exception]] = []) -> Optional[TRet]:
'''
Retries the specified function if it throws an exception.
@ -195,6 +203,20 @@ def retry_on_exception(function, retry_count=3, retry_delay=5, retry_delay_multi
time.sleep(retry_delay)
retry_delay *= retry_delay_multiplier
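A minimal sketch of how the typed `retry_on_exception` helper behaves, assuming the signature above (this is a simplified re-implementation for illustration; the `flaky` function and zero delay are made up for the example):

```python
import time
from typing import Callable, List, Optional, Type, TypeVar

TRet = TypeVar('TRet')

def retry_on_exception_sketch(
        function: Callable[[], TRet],
        retry_count: int = 3,
        retry_delay: float = 0,
        retry_delay_multiplier: float = 1,
        retry_exceptions: Optional[List[Type[Exception]]] = None,
        raise_exceptions: Optional[List[Type[Exception]]] = None) -> Optional[TRet]:
    # Simplified re-implementation for illustration only
    # (delay defaults to 0 here so the example runs instantly).
    retry_exceptions = retry_exceptions or [Exception]
    raise_exceptions = raise_exceptions or []
    for attempt in range(retry_count + 1):
        try:
            return function()
        except tuple(retry_exceptions) as e:
            # Re-raise immediately for listed exceptions, or when retries are exhausted.
            if any(isinstance(e, exc) for exc in raise_exceptions) or attempt == retry_count:
                raise
            time.sleep(retry_delay)
            retry_delay *= retry_delay_multiplier
    return None

attempts = {'count': 0}

def flaky() -> int:
    # Fails twice, then succeeds on the third call.
    attempts['count'] += 1
    if attempts['count'] < 3:
        raise ValueError('transient failure')
    return 42

result = retry_on_exception_sketch(flaky)
```

The `TypeVar` return type means callers get back the function's own return type (or `None`), which is what the strict type-checking mode enabled in this commit can then verify.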
def __write_pipeline_variable(name: str, value: str):
# Create a variable in the build pipeline
getLogger().info("Writing pipeline variable %s with value %s" % (name, value))
print('##vso[task.setvariable variable=%s]%s' % (name, value))
def set_environment_variable(name: str, value: str, save_to_pipeline: bool = True):
"""
Sets an environment variable, both in the python process and to the CI pipeline.
Saving it to the CI pipeline can be disabled using the save_to_pipeline parameter.
"""
if save_to_pipeline:
__write_pipeline_variable(name, value)
os.environ[name] = value
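A quick usage sketch of the two helpers above, assuming the Azure Pipelines `task.setvariable` logging-command format shown in the code (the variable name and value here are illustrative):

```python
import io
import os
from contextlib import redirect_stdout

def write_pipeline_variable(name: str, value: str):
    # Azure Pipelines scans stdout for this logging command and
    # creates a pipeline variable from it.
    print('##vso[task.setvariable variable=%s]%s' % (name, value))

def set_environment_variable(name: str, value: str, save_to_pipeline: bool = True):
    # Mirrors the helper above: set the variable both for this process
    # and (optionally) for later pipeline steps.
    if save_to_pipeline:
        write_pipeline_variable(name, value)
    os.environ[name] = value

buffer = io.StringIO()
with redirect_stdout(buffer):
    set_environment_variable('Queue', 'Windows.11.Amd64')

logging_command = buffer.getvalue().strip()
```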
class RunCommand:
'''
This is a class wrapper around `subprocess.Popen` with an additional set
@ -203,8 +225,8 @@ class RunCommand:
def __init__(
self,
cmdline: list,
success_exit_codes: list = None,
cmdline: List[str],
success_exit_codes: Optional[List[int]] = None,
verbose: bool = False,
retry: int = 0):
if cmdline is None:
@ -222,12 +244,12 @@ class RunCommand:
self.__success_exit_codes = success_exit_codes
@property
def cmdline(self) -> str:
def cmdline(self) -> List[str]:
'''Command-line to use when starting the application.'''
return self.__cmdline
@property
def success_exit_codes(self) -> list:
def success_exit_codes(self) -> List[int]:
'''
The successful exit codes that the associated process specifies when it
terminated.
@ -243,7 +265,7 @@ class RunCommand:
def stdout(self) -> str:
return self.__stdout.getvalue()
def __runinternal(self, working_directory: str = None) -> tuple:
def __runinternal(self, working_directory: Optional[str] = None) -> Tuple[int, str]:
should_pipe = self.verbose
with push_dir(working_directory):
quoted_cmdline = '$ '
@ -274,7 +296,7 @@ class RunCommand:
return (proc.returncode, quoted_cmdline)
def run(self, working_directory: str = None) -> int:
def run(self, working_directory: Optional[str] = None) -> int:
'''Executes specified shell command.'''
retrycount = 0


@ -60,7 +60,7 @@ def setup_loggers(verbose: bool):
timestamp, script_name, getpid())
return path.join(log_dir, log_file_name)
def __get_console_handler(verbose: bool) -> StreamHandler:
def __get_console_handler(verbose: bool):
console_handler = StreamHandler()
level = INFO if verbose else WARNING
console_handler.setLevel(level)


@ -0,0 +1,362 @@
from dataclasses import dataclass
from glob import glob
import os
import shutil
from ci_setup import CiSetupArgs
from performance.common import RunCommand, set_environment_variable
@dataclass
class PerformanceSetupArgs:
performance_directory: str # Path to local copy of performance repository
working_directory: str # Path to directory where the payload and work item directories will be created
queue: str # The helix queue to run on
csproj: str # Path to benchmark project file
runtime_directory: str | None = None # Path to local copy of runtime repository
core_root_directory: str | None = None # Path to the core root directory so that pre-built versions of the runtime repo can be used
baseline_core_root_directory: str | None = None
architecture: str = "x64"
framework: str | None = None
compilation_mode: str = "Tiered"
repository: str | None = os.environ.get("BUILD_REPOSITORY_NAME")
branch: str | None = os.environ.get("BUILD_SOURCEBRANCH")
commit_sha: str | None = os.environ.get("BUILD_SOURCEVERSION")
build_number: str | None = os.environ.get("BUILD_BUILDNUMBER")
build_definition_name: str | None = os.environ.get("BUILD_DEFINITIONNAME")
run_categories: str = "Libraries Runtime"
kind: str = "micro"
alpine: bool = False
llvm: bool = False
mono_interpreter: bool = False
mono_aot: bool = False
mono_aot_path: str | None = None
internal: bool = False
compare: bool = False
mono_dotnet: str | None = None
wasm_bundle_directory: str | None = None
wasm_aot: bool = False
javascript_engine: str = "v8"
configurations: dict[str, str] | None = None
android_mono: bool = False
ios_mono: bool = False
ios_nativeaot: bool = False
no_dynamic_pgo: bool = False
physical_promotion: bool = False
ios_llvm_build: bool = False
ios_strip_symbols: bool = False
maui_version: str | None = None
use_local_commit_time: bool = False
only_sanity_check: bool = False
extra_bdn_args: list[str] | None = None
affinity: str | None = None
python: str = "python3"
@dataclass
class PerformanceSetupData:
payload_directory: str
performance_directory: str
work_item_directory: str
python: str
bdn_arguments: list[str]
extra_bdn_arguments: list[str]
setup_arguments: CiSetupArgs
perf_lab_arguments: list[str]
bdn_categories: str
target_csproj: str
kind: str
architecture: str
use_core_run: bool
use_baseline_core_run: bool
run_from_perf_repo: bool
compare: bool
mono_dotnet: bool
wasm_dotnet: bool
ios_llvm_build: bool
ios_strip_symbols: bool
creator: str
queue: str
helix_source_prefix: str
build_config: str
runtime_type: str
only_sanity_check: bool
def set_environment_variables(self, save_to_pipeline: bool = True):
def set_env_var(name: str, value: str | bool | list[str], sep = " "):
if isinstance(value, str):
value_str = value
elif isinstance(value, bool):
value_str = "true" if value else "false"
else:
value_str = sep.join(value)
set_environment_variable(name, value_str, save_to_pipeline=save_to_pipeline)
set_env_var("PayloadDirectory", self.payload_directory)
set_env_var("PerformanceDirectory", self.performance_directory)
set_env_var("WorkItemDirectory", self.work_item_directory)
set_env_var("Python", self.python)
set_env_var("BenchmarkDotNetArguments", self.bdn_arguments)
set_env_var("ExtraBenchmarkDotNetArguments", self.extra_bdn_arguments)
# set_env_var("SetupArguments", self.setup_arguments) # Skipping as this is not currently being used as an env var
set_env_var("PerfLabArguments", self.perf_lab_arguments)
set_env_var("BDNCategories", self.bdn_categories)
set_env_var("TargetCsproj", self.target_csproj)
set_env_var("Kind", self.kind)
set_env_var("Architecture", self.architecture)
set_env_var("UseCoreRun", self.use_core_run)
set_env_var("UseBaselineCoreRun", self.use_baseline_core_run)
set_env_var("RunFromPerfRepo", self.run_from_perf_repo)
set_env_var("Compare", self.compare)
set_env_var("MonoDotnet", self.mono_dotnet)
set_env_var("WasmDotnet", self.wasm_dotnet)
set_env_var("iOSLlvmBuild", self.ios_llvm_build)
set_env_var("iOSStripSymbols", self.ios_strip_symbols)
set_env_var("Creator", self.creator)
set_env_var("Queue", self.queue)
set_env_var("HelixSourcePrefix", self.helix_source_prefix)
set_env_var("_BuildConfig", self.build_config)
set_env_var("RuntimeType", self.runtime_type)
set_env_var("OnlySanityCheck", self.only_sanity_check)
def run(args: PerformanceSetupArgs):
payload_directory = os.path.join(args.working_directory, "Payload")
performance_directory = os.path.join(payload_directory, "performance")
work_item_directory = os.path.join(args.working_directory, "workitem")
bdn_arguments = ["--anyCategories", args.run_categories]
if args.affinity is not None and args.affinity != "0":
bdn_arguments += ["--affinity", args.affinity]
extra_bdn_arguments = [] if args.extra_bdn_args is None else args.extra_bdn_args[:]
if args.internal:
creator = ""
perf_lab_arguments = ["--upload-to-perflab-container"]
helix_source_prefix = "official"
else:
extra_bdn_arguments += [
"--iterationCount", "1",
"--warmupCount", "0",
"--invocationCount", "1",
"--unrollFactor", "1",
"--strategy", "ColdStart",
"--stopOnFirstError", "true"
]
creator = args.build_definition_name or ""
perf_lab_arguments = []
helix_source_prefix = "pr"
build_config = f"{args.architecture}.{args.kind}.{args.framework}"
category_exclusions: list[str] = []
if args.configurations is None:
args.configurations = { "CompilationMode": args.compilation_mode, "RunKind": args.kind }
using_mono = False
if args.mono_dotnet is not None:
using_mono = True
args.configurations["LLVM"] = str(args.llvm)
args.configurations["MonoInterpreter"] = str(args.mono_interpreter)
args.configurations["MonoAOT"] = str(args.mono_aot)
# TODO: Validate if this exclusion filter is still needed
extra_bdn_arguments += ["--exclusion-filter", "*Perf_Image*", "*Perf_NamedPipeStream*"]
category_exclusions += ["NoMono"]
if args.mono_interpreter:
category_exclusions += ["NoInterpreter"]
if args.mono_aot:
category_exclusions += ["NoAOT"]
using_wasm = False
if args.wasm_bundle_directory is not None:
using_wasm = True
args.configurations["CompilationMode"] = "wasm"
if args.wasm_aot:
args.configurations["AOT"] = "true"
build_config = f"wasmaot.{build_config}"
else:
build_config = f"wasm.{build_config}"
if args.javascript_engine == "javascriptcore":
args.configurations["JSEngine"] = "javascriptcore"
category_exclusions += ["NoInterpreter", "NoWASM", "NoMono"]
if args.no_dynamic_pgo:
args.configurations["PGOType"] = "nodynamicpgo"
if args.physical_promotion:
args.configurations["PhysicalPromotionType"] = "physicalpromotion"
runtime_type = ""
if args.ios_mono:
runtime_type = "Mono"
args.configurations["iOSLlvmBuild"] = str(args.ios_llvm_build)
args.configurations["iOSStripSymbols"] = str(args.ios_strip_symbols)
args.configurations["RuntimeType"] = str(runtime_type)
if args.ios_nativeaot:
runtime_type = "NativeAOT"
args.configurations["iOSStripSymbols"] = str(args.ios_strip_symbols)
args.configurations["RuntimeType"] = str(runtime_type)
if category_exclusions:
extra_bdn_arguments += ["--category-exclusion-filter", *set(category_exclusions)]
cleaned_branch_name = "main"
if args.branch is not None and args.branch.startswith("refs/heads/release"):
cleaned_branch_name = args.branch.replace("refs/heads/", "")
setup_arguments = CiSetupArgs(
channel=cleaned_branch_name,
queue=args.queue,
build_configs=[f"{k}={v}" for k, v in args.configurations.items()],
architecture=args.architecture,
get_perf_hash=True
)
if args.build_number is not None:
setup_arguments.build_number = args.build_number
if args.repository is not None:
setup_arguments.repository = f"https://github.com/{args.repository}"
if args.branch is not None:
setup_arguments.branch = args.branch
if args.commit_sha is not None:
setup_arguments.commit_sha = args.commit_sha
if not args.internal:
setup_arguments.not_in_lab = True
# TODO: Figure out if this should be the runtime or performance commit time, or if we need to capture both
if args.use_local_commit_time and args.commit_sha is not None:
get_commit_time_command = RunCommand(["git", "show", "-s", "--format=%ci", args.commit_sha])
get_commit_time_command.run()
setup_arguments.commit_time = f"\"{get_commit_time_command.stdout}\""
ignored_paths = [
payload_directory,
".git",
"artifacts",
]
shutil.copytree(args.performance_directory, performance_directory, ignore=shutil.ignore_patterns(*ignored_paths))
if args.mono_dotnet is not None:
mono_dotnet_path = os.path.join(payload_directory, "dotnet-mono")
shutil.copytree(args.mono_dotnet, mono_dotnet_path)
if args.wasm_bundle_directory is not None:
wasm_bundle_directory_path = payload_directory
shutil.copytree(args.wasm_bundle_directory, wasm_bundle_directory_path)
wasm_args = "--experimental-wasm-eh --expose_wasm"
if args.javascript_engine == "v8":
wasm_args += " --module"
extra_bdn_arguments += [
"--wasmEngine", f"/home/helixbot/.jsvu/bin/{args.javascript_engine}",
"--wasmArgs", f"\"{wasm_args}\""
"--cli", "$HELIX_CORRELATION_PAYLOAD/dotnet/dotnet",
"--wasmDataDir", "$HELIX_CORRELATION_PAYLOAD/wasm-data"
]
if args.wasm_aot:
extra_bdn_arguments += [
"--aotcompilermode", "wasm",
"--buildTimeout", "3600"
]
setup_arguments.dotnet_path = f"{wasm_bundle_directory_path}/dotnet"
if args.no_dynamic_pgo:
setup_arguments.pgo_status = "nodynamicpgo"
if args.physical_promotion:
setup_arguments.physical_promotion = "physicalpromotion"
if args.mono_aot:
if args.mono_aot_path is None:
raise Exception("Mono AOT Path must be provided for MonoAOT runs")
monoaot_dotnet_path = os.path.join(payload_directory, "monoaot")
shutil.copytree(args.mono_aot_path, monoaot_dotnet_path)
extra_bdn_arguments += [
"--runtimes", "monoaotllvm",
"--aotcompilerpath", "$HELIX_CORRELATION_PAYLOAD/monoaot/sgen/mini/mono-sgen",
"--customruntimepack", "$HELIX_CORRELATION_PAYLOAD/monoaot/pack --aotcompilermode llvm"
]
extra_bdn_arguments += ["--logBuildOutput", "--generateBinLog"]
use_core_run = False
if args.core_root_directory is not None:
use_core_run = True
new_core_root = os.path.join(payload_directory, "Core_Root")
shutil.copytree(args.core_root_directory, new_core_root, ignore=shutil.ignore_patterns("*.pdb"))
use_baseline_core_run = False
if args.baseline_core_root_directory is not None:
use_baseline_core_run = True
new_baseline_core_root = os.path.join(payload_directory, "Baseline_Core_Root")
shutil.copytree(args.baseline_core_root_directory, new_baseline_core_root)
if args.maui_version is not None:
setup_arguments.maui_version = args.maui_version
if args.android_mono:
if args.runtime_directory is None:
raise Exception("Runtime directory must be present for Android Mono benchmarks")
os.makedirs(work_item_directory, exist_ok=True)
shutil.copy(os.path.join(args.runtime_directory, "MonoBenchmarksDroid.apk"), payload_directory)
shutil.copy(os.path.join(args.runtime_directory, "androidHelloWorld", "HelloAndroid.apk"), payload_directory)
setup_arguments.architecture = "arm64"
if args.ios_mono or args.ios_nativeaot:
if args.runtime_directory is None:
raise Exception("Runtime directory must be present for IOS Mono or IOS Native AOT benchmarks")
dest_zip_folder = os.path.join(payload_directory, "iosHelloWorldZip")
shutil.copytree(os.path.join(args.runtime_directory, "iosHelloWorld"), os.path.join(payload_directory, "iosHelloWorld"))
shutil.copytree(os.path.join(args.runtime_directory, "iosHelloWorldZip"), dest_zip_folder)
# rename all zips in the 2nd folder to iOSSampleApp.zip
for file in glob(os.path.join(dest_zip_folder, "*.zip")):
os.rename(file, os.path.join(dest_zip_folder, "iOSSampleApp.zip"))
shutil.copytree(os.path.join(performance_directory, "docs"), work_item_directory)
return PerformanceSetupData(
payload_directory=payload_directory,
performance_directory=performance_directory,
work_item_directory=work_item_directory,
python=args.python,
bdn_arguments=bdn_arguments,
extra_bdn_arguments=extra_bdn_arguments,
setup_arguments=setup_arguments,
perf_lab_arguments=perf_lab_arguments,
bdn_categories=args.run_categories,
target_csproj=args.csproj,
kind=args.kind,
architecture=args.architecture,
use_core_run=use_core_run,
use_baseline_core_run=use_baseline_core_run,
run_from_perf_repo=False,
compare=args.compare,
mono_dotnet=using_mono,
wasm_dotnet=using_wasm,
ios_llvm_build=args.ios_llvm_build,
ios_strip_symbols=args.ios_strip_symbols,
creator=creator,
queue=args.queue,
helix_source_prefix=helix_source_prefix,
build_config=build_config,
runtime_type=runtime_type,
only_sanity_check=args.only_sanity_check)
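The `build_config` string assembled in `run` follows a simple pattern; a standalone sketch of that composition (function name and sample values are illustrative):

```python
def make_build_config(architecture: str, kind: str, framework: str,
                      wasm: bool = False, wasm_aot: bool = False) -> str:
    # Mirrors the composition in run(): "<arch>.<kind>.<framework>",
    # optionally prefixed with "wasm." or "wasmaot." for WASM runs.
    config = f"{architecture}.{kind}.{framework}"
    if wasm:
        config = ("wasmaot." if wasm_aot else "wasm.") + config
    return config
```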


@ -0,0 +1,519 @@
from dataclasses import dataclass, field
import dataclasses
from datetime import timedelta
from glob import glob
import json
import os
import shutil
import sys
import urllib.request
from typing import Any
import ci_setup
from performance.common import RunCommand, iswin
import performance_setup
from send_to_helix import PerfSendToHelixArgs, perf_send_to_helix
def output_counters_for_crank(reports: list[Any]):
print("#StartJobStatistics")
statistics: dict[str, list[Any]] = {
"metadata": [],
"measurements": []
}
for report in reports:
for test in report["tests"]:
for counter in test["counters"]:
measurement_name = f"benchmarkdotnet/{test['name']}/{counter['name']}"
for result in counter["results"]:
statistics["measurements"].append({
"name": measurement_name,
"value": result
})
if counter["topCounter"] == True:
statistics["metadata"].append({
"source": "BenchmarkDotNet",
"name": measurement_name,
"aggregate": "avg",
"reduce": "avg",
"format": "n0",
"shortDescription": f"{test['name']} ({counter['metricName']})"
})
statistics["metadata"] = sorted(statistics["metadata"], key=lambda m: m["name"])
print(json.dumps(statistics))
print("#EndJobStatistics")
@dataclass
class RunPerformanceJobArgs:
queue: str
framework: str
run_kind: str
core_root_dir: str
performance_repo_dir: str
architecture: str
os_group: str
extra_bdn_args: str | None = None
run_categories: str = 'Libraries Runtime'
perflab_upload_token: str | None = None
helix_access_token: str | None = os.environ.get("HelixAccessToken")
system_access_token: str = os.environ.get("SYSTEM_ACCESSTOKEN", "")
os_sub_group: str | None = None
project_file: str | None = None
partition_count: int = 1
additional_performance_setup_parameters: dict[str, Any] = field(default_factory=dict[str, Any])
additional_ci_setup_parameters: dict[str, Any] = field(default_factory=dict[str, Any])
helix_type_suffix: str = ""
build_repository_name: str = os.environ.get("BUILD_REPOSITORY_NAME", "dotnet/performance")
build_source_branch: str = os.environ.get("BUILD_SOURCEBRANCH", "main")
build_number: str = os.environ.get("BUILD_BUILDNUMBER", "local")
internal: bool = True
pgo_run_type: str | None = None
physical_promotion_run_type: bool = False
codegen_type: str = "JIT"
runtime_type: str = "coreclr"
affinity: str | None = "0"
run_env_vars: dict[str, str] = field(default_factory=dict[str, str])
is_scenario: bool = False
runtime_flavor: str | None = None
local_build: bool = False
def run_performance_job(args: RunPerformanceJobArgs):
if args.project_file is None:
args.project_file = os.path.join(args.performance_repo_dir, "eng", "performance", "helix.proj")
if args.helix_access_token is None:
raise Exception("HelixAccessToken environment variable is not configured")
if args.perflab_upload_token is None:
if args.os_group == "windows":
args.perflab_upload_token = os.environ.get("PerfCommandUploadToken")
else:
args.perflab_upload_token = os.environ.get("PerfCommandUploadTokenLinux")
helix_post_commands: list[str] = []
if args.is_scenario:
if args.os_group == "windows":
script_extension = ".cmd"
additional_helix_pre_commands = [
f"call %HELIX_CORRELATION_PAYLOAD%\\machine-setup{script_extension}",
"xcopy %HELIX_CORRELATION_PAYLOAD%\\NuGet.config %HELIX_WORKITEM_ROOT% /Y"
]
preserve_python_path = "set ORIGPYPATH=%PYTHONPATH%"
python = "py -3"
else:
script_extension = ".sh"
additional_helix_pre_commands = [
"chmod +x $HELIX_CORRELATION_PAYLOAD/machine-setup.sh",
f". $HELIX_CORRELATION_PAYLOAD/machine-setup{script_extension}",
"cp $HELIX_CORRELATION_PAYLOAD/NuGet.config $HELIX_WORKITEM_ROOT"
]
preserve_python_path = "export ORIGPYPATH=$PYTHONPATH"
python = "python3"
if not args.internal:
helix_pre_commands = [
preserve_python_path,
*additional_helix_pre_commands
]
elif args.os_group == "windows":
helix_pre_commands = [
preserve_python_path,
"py -3 -m venv %HELIX_WORKITEM_PAYLOAD%\\.venv",
"call %HELIX_WORKITEM_PAYLOAD%\\.venv\\Scripts\\activate.bat",
"set PYTHONPATH=",
"py -3 -m pip install azure.storage.blob==12.0.0",
"py -3 -m pip install azure.storage.queue==12.0.0",
"py -3 -m pip install urllib3==1.26.15 --force-reinstall",
f"set \"PERFLAB_UPLOAD_TOKEN={args.perflab_upload_token}\"",
*additional_helix_pre_commands
]
else:
helix_pre_commands = [
preserve_python_path,
"export CRYPTOGRAPHY_ALLOW_OPENSSL_102=true"
"sudo apt-get -y install python3-venv",
"python3 -m venv $HELIX_WORKITEM_PAYLOAD/.venv",
". $HELIX_WORKITEM_PAYLOAD/.venv/bin/activate",
"export PYTHONPATH=",
"python3 -m pip install -U pip",
"pip3 install azure.storage.blob==12.0.0",
"pip3 install azure.storage.queue==12.0.0",
"pip3 install urllib3==1.26.15 --force-reinstall",
f"export PERFLAB_UPLOAD_TOKEN=\"{args.perflab_upload_token}\"",
*additional_helix_pre_commands
]
else:
if args.os_group == "windows":
helix_pre_commands = [
"set ORIGPYPATH=%PYTHONPATH%",
"py -m pip install -U pip",
"py -3 -m venv %HELIX_WORKITEM_PAYLOAD%\\.venv",
"call %HELIX_WORKITEM_PAYLOAD%\\.venv\\Scripts\\activate.bat",
"set PYTHONPATH=",
"py -3 -m pip install -U pip",
"py -3 -m pip install --user urllib3==1.26.15 --force-reinstall",
"py -3 -m pip install --user azure.storage.blob==12.0.0 --force-reinstall",
"py -3 -m pip install --user azure.storage.queue==12.0.0 --force-reinstall",
f'set "PERFLAB_UPLOAD_TOKEN={args.perflab_upload_token}"'
]
elif args.os_sub_group == "_musl":
helix_pre_commands = [
"export ORIGPYPATH=$PYTHONPATH",
"sudo apk add icu-libs krb5-libs libgcc libintl libssl1.1 libstdc++ zlib cargo",
"sudo apk add libgdiplus --repository http://dl-cdn.alpinelinux.org/alpine/edge/testing",
"python3 -m venv $HELIX_WORKITEM_PAYLOAD/.venv",
"source $HELIX_WORKITEM_PAYLOAD/.venv/bin/activate",
"export PYTHONPATH=",
"python3 -m pip install -U pip",
"pip3 install --user urllib3==1.26.15 --force-reinstall",
"pip3 install --user azure.storage.blob==12.7.1 --force-reinstall",
"pip3 install --user azure.storage.queue==12.1.5 --force-reinstall",
f'export PERFLAB_UPLOAD_TOKEN="{args.perflab_upload_token}"'
]
else:
if args.runtime_type == "wasm":
wasm_precommand = (
"sudo apt-get -y remove nodejs && "
"curl -fsSL https://deb.nodesource.com/setup_16.x | sudo -E bash - && "
"sudo apt-get -y install nodejs && "
"npm install --prefix $HELIX_WORKITEM_PAYLOAD jsvu -g && "
"$HELIX_WORKITEM_PAYLOAD/bin/jsvu --os=linux64 --engines=v8 && "
"find ~/.jsvu -ls && "
"~/.jsvu/bin/v8 -e 'console.log(`V8 version: ${this.version()}`)'")
else:
wasm_precommand = "echo"
helix_pre_commands = [
"export ORIGPYPATH=$PYTHONPATH",
"export CRYPTOGRAPHY_ALLOW_OPENSSL_102=true",
'echo "** Installing prerequisites **"',
("python3 -m pip install --user -U pip && "
"sudo apt-get -y install python3-venv && "
"python3 -m venv $HELIX_WORKITEM_PAYLOAD/.venv && "
"ls -l $HELIX_WORKITEM_PAYLOAD/.venv/bin/activate && "
"export PYTHONPATH= && "
"python3 -m pip install --user -U pip && "
"pip3 install --user urllib3==1.26.15 && "
"pip3 install --user azure.storage.blob==12.0.0 && "
"pip3 install --user azure.storage.queue==12.0.0 && "
"sudo apt-get update && "
"sudo apt -y install curl dirmngr apt-transport-https lsb-release ca-certificates && "
f"{wasm_precommand} && "
f"export PERFLAB_UPLOAD_TOKEN=\"{args.perflab_upload_token}\" "
"|| export PERF_PREREQS_INSTALL_FAILED=1"),
'test "x$PERF_PREREQS_INSTALL_FAILED" = "x1" && echo "** Error: Failed to install prerequisites **"'
]
mono_interpreter = False
if args.codegen_type == "Interpreter" and args.runtime_type == "mono":
mono_interpreter = True
if args.os_group == "windows":
helix_pre_commands += ['set MONO_ENV_OPTIONS="--interpreter"']
else:
helix_pre_commands += ['export MONO_ENV_OPTIONS="--interpreter"']
if args.os_group == "windows":
helix_pre_commands += ["set MSBUILDDEBUGCOMM=1", 'set "MSBUILDDEBUGPATH=%HELIX_WORKITEM_UPLOAD_ROOT%"']
helix_post_commands = ["set PYTHONPATH=%ORIGPYPATH%"]
else:
helix_pre_commands += ["export MSBUILDDEBUGCOMM=1", 'export "MSBUILDDEBUGPATH=$HELIX_WORKITEM_UPLOAD_ROOT"']
helix_post_commands = ["export PYTHONPATH=$ORIGPYPATH"]
if args.is_scenario:
# these commands are added inside the proj file for non-scenario runs
if args.os_group == "windows":
helix_pre_commands += [
"call %HELIX_WORKITEM_PAYLOAD%\\machine-setup.cmd",
"set PYTHONPATH=%HELIX_WORKITEM_PAYLOAD%\\scripts%3B%HELIX_WORKITEM_PAYLOAD%"
]
else:
helix_pre_commands += [
"chmod +x $HELIX_WORKITEM_PAYLOAD/machine-setup.sh",
". $HELIX_WORKITEM_PAYLOAD/machine-setup.sh",
"export PYTHONPATH=$HELIX_WORKITEM_PAYLOAD/scripts:$HELIX_WORKITEM_PAYLOAD"
]
# TODO: Support custom helix log collection in post command
working_directory = args.performance_repo_dir
performance_setup_args = performance_setup.PerformanceSetupArgs(
performance_directory=args.performance_repo_dir,
core_root_directory=args.core_root_dir,
working_directory=working_directory,
queue=args.queue,
kind=args.run_kind,
no_dynamic_pgo=args.pgo_run_type == "nodynamicpgo",
physical_promotion=args.physical_promotion_run_type,
internal=args.internal,
mono_interpreter=mono_interpreter,
framework=args.framework,
use_local_commit_time=False,
run_categories=args.run_categories,
extra_bdn_args=[] if args.extra_bdn_args is None else args.extra_bdn_args.split(" "),
python="py -3" if args.os_group == "windows" else "python3",
csproj="src\\benchmarks\\micro\\MicroBenchmarks.csproj" if args.os_group == "windows" else "src/benchmarks/micro/MicroBenchmarks.csproj",
**args.additional_performance_setup_parameters
)
performance_setup_data = performance_setup.run(performance_setup_args)
performance_setup_data.set_environment_variables(save_to_pipeline=False)
setup_arguments = dataclasses.replace(performance_setup_data.setup_arguments, **args.additional_ci_setup_parameters)
setup_arguments.local_build = args.local_build
if args.affinity != "0":
setup_arguments.affinity = args.affinity
if args.run_env_vars:
setup_arguments.run_env_vars = [f"{k}={v}" for k, v in args.run_env_vars.items()]
setup_arguments.target_windows = args.os_group == "windows"
if args.os_group == "windows":
os.environ["TargetsWindows"] = "true"
if args.is_scenario:
setup_arguments.output_file = os.path.join(performance_setup_data.payload_directory, "machine-setup")
setup_arguments.install_dir = os.path.join(performance_setup_data.payload_directory, "dotnet")
else:
tools_dir = os.path.join(performance_setup_data.performance_directory, "tools")
setup_arguments.output_file = os.path.join(tools_dir, "machine-setup")
setup_arguments.install_dir = os.path.join(tools_dir, "dotnet", performance_setup_data.architecture)
ci_setup.main(setup_arguments)
if args.is_scenario:
performance_setup_data.payload_directory += os.path.sep
dotnet_path = os.path.join(setup_arguments.install_dir, "dotnet")
shutil.copy(os.path.join(args.performance_repo_dir, "NuGet.config"), performance_setup_data.payload_directory)
shutil.copytree(os.path.join(args.performance_repo_dir, "scripts"), os.path.join(performance_setup_data.payload_directory, "scripts"))
shutil.copytree(os.path.join(args.performance_repo_dir, "src", "scenarios", "shared"), os.path.join(performance_setup_data.payload_directory, "shared"))
shutil.copytree(os.path.join(args.performance_repo_dir, "src", "scenarios", "staticdeps"), os.path.join(performance_setup_data.payload_directory, "staticdeps"))
framework = os.environ["PERFLAB_Framework"]
os.environ["PERFLAB_TARGET_FRAMEWORKS"] = framework
if args.os_group == "windows":
runtime_id = f"win-{args.architecture}"
elif args.os_group == "osx":
runtime_id = f"osx-{args.architecture}"
else:
runtime_id = f"linux-{args.architecture}"
# build Startup
RunCommand([
dotnet_path, "publish",
"-c", "Release",
"-o", os.path.join(performance_setup_data.payload_directory, "startup"),
"-f", framework,
"-r", runtime_id,
"--self-contained",
os.path.join(args.performance_repo_dir, "src", "tools", "ScenarioMeasurement", "Startup", "Startup.csproj"),
"-p:DisableTransitiveFrameworkReferenceDownloads=true"]).run()
# build SizeOnDisk
RunCommand([
dotnet_path, "publish",
"-c", "Release",
"-o", os.path.join(performance_setup_data.payload_directory, "SOD"),
"-f", framework,
"-r", runtime_id,
"--self-contained",
os.path.join(args.performance_repo_dir, "src", "tools", "ScenarioMeasurement", "SizeOnDisk", "SizeOnDisk.csproj"),
"-p:DisableTransitiveFrameworkReferenceDownloads=true"]).run()
# build MemoryConsumption
RunCommand([
dotnet_path, "publish",
"-c", "Release",
"-o", os.path.join(performance_setup_data.payload_directory, "MemoryConsumption"),
"-f", framework,
"-r", runtime_id,
"--self-contained",
os.path.join(args.performance_repo_dir, "src", "tools", "ScenarioMeasurement", "MemoryConsumption", "MemoryConsumption.csproj"),
"-p:DisableTransitiveFrameworkReferenceDownloads=true"]).run()
# download PDN
escaped_upload_token = str(os.environ.get("PerfCommandUploadTokenLinux")).replace("%25", "%")
pdn_url = f"https://pvscmdupload.blob.core.windows.net/assets/paint.net.5.0.3.portable.{args.architecture}.zip{escaped_upload_token}"
pdn_dest = os.path.join(performance_setup_data.payload_directory, "PDN")
os.makedirs(pdn_dest)
with urllib.request.urlopen(pdn_url) as response, open(os.path.join(pdn_dest, "PDN.zip"), "wb") as f:
data = response.read()
f.write(data)
environ_copy = os.environ.copy()
python = "py -3" if iswin() else "python3"
os.environ["CorrelationPayloadDirectory"] = performance_setup_data.payload_directory
os.environ["Architecture"] = args.architecture
os.environ["TargetsWindows"] = "true" if args.os_group == "windows" else "false"
os.environ["WorkItemDirectory"] = args.performance_repo_dir
os.environ["HelixTargetQueues"] = args.queue
os.environ["Python"] = python
RunCommand([*(python.split(" ")), "-m", "pip", "install", "--upgrade", "pip"]).run()
RunCommand([*(python.split(" ")), "-m", "pip", "install", "urllib3==1.26.15"]).run()
RunCommand([*(python.split(" ")), "-m", "pip", "install", "requests"]).run()
scenarios_path = os.path.join(args.performance_repo_dir, "src", "scenarios")
script_path = os.path.join(args.performance_repo_dir, "scripts")
os.environ["PYTHONPATH"] = f"{os.environ.get('PYTHONPATH', '')}{os.pathsep}{script_path}{os.pathsep}{scenarios_path}"
print(f"PYTHONPATH={os.environ['PYTHONPATH']}")
os.environ["DOTNET_ROOT"] = setup_arguments.install_dir
os.environ["PATH"] = f"{setup_arguments.install_dir}{os.pathsep}{os.environ['PATH']}"
os.environ["DOTNET_CLI_TELEMETRY_OPTOUT"] = "1"
os.environ["DOTNET_MULTILEVEL_LOOKUP"] = "0"
os.environ["UseSharedCompilation"] = "false"
print("Current dotnet directory:", setup_arguments.install_dir)
print("If more than one version exist in this directory, usually the latest runtime and sdk will be used.")
RunCommand([
"dotnet", "msbuild", args.project_file,
"/restore",
"/t:PreparePayloadWorkItems",
f"/p:RuntimeFlavor={args.runtime_flavor or ''}"
f"/bl:{os.path.join(args.performance_repo_dir, 'artifacts', 'log', performance_setup_data.build_config, 'PrepareWorkItemPayloads.binlog')}"],
verbose=True).run()
if args.os_group == "windows" and args.architecture == "arm64":
RunCommand(["taskkill", "/im", "dotnet.exe", "/f"]).run()
RunCommand(["del", os.path.join(setup_arguments.install_dir, "*"), "/F", "/S", "/Q"]).run()
RunCommand(["xcopy", os.path.join(args.performance_repo_dir, "tools", "dotnet", "arm64", "*"), "/E", "/I", "/Y"]).run()
# restore env vars
os.environ.update(environ_copy)
performance_setup_data.work_item_directory = args.performance_repo_dir
# TODO: Support WASM from runtime repository
if args.os_group == "windows":
cli_arguments = [
"--dotnet-versions", "%DOTNET_VERSION%",
"--cli-source-info", "args",
"--cli-branch", "%PERFLAB_BRANCH%",
"--cli-commit-sha", "%PERFLAB_HASH%",
"--cli-repository", "https://github.com/%PERFLAB_REPO%",
"--cli-source-timestamp", "%PERFLAB_BUILDTIMESTAMP%"
]
else:
cli_arguments = [
"--dotnet-versions", "$DOTNET_VERSION",
"--cli-source-info", "args",
"--cli-branch", "$PERFLAB_BRANCH",
"--cli-commit-sha", "$PERFLAB_HASH",
"--cli-repository", "https://github.com/$PERFLAB_REPO",
"--cli-source-timestamp", "$PERFLAB_BUILDTIMESTAMP"
]
os.environ["DownloadFilesFromHelix"] = "true"
perf_send_to_helix_args = PerfSendToHelixArgs(
helix_source=f"{performance_setup_data.helix_source_prefix}/{args.build_repository_name}/{args.build_source_branch}",
helix_type=f"test/performance/{performance_setup_data.kind}/{args.framework}/{performance_setup_data.architecture}/{args.helix_type_suffix}",
helix_access_token=args.helix_access_token,
helix_target_queues=[performance_setup_data.queue],
helix_pre_commands=helix_pre_commands,
helix_post_commands=helix_post_commands,
creator=performance_setup_data.creator,
architecture=args.architecture,
work_item_timeout=timedelta(hours=4),
work_item_dir=performance_setup_data.work_item_directory,
correlation_payload_dir=performance_setup_data.payload_directory,
project_file=args.project_file,
build_config=performance_setup_data.build_config,
performance_repo_dir=args.performance_repo_dir,
system_access_token=args.system_access_token,
helix_build=args.build_number,
dotnet_cli_package_type="",
dotnet_cli_version="",
enable_xunit_reporter=False,
helix_prereq_commands=[],
include_dotnet_cli=False,
wait_for_work_item_completion=True,
partition_count=args.partition_count,
cli_arguments=cli_arguments,
runtime_flavor=args.runtime_flavor or ""
)
perf_send_to_helix(perf_send_to_helix_args)
results_glob = os.path.join(performance_setup_data.payload_directory, "performance", "artifacts", "helix-results", '**', '*perf-lab-report.json')
all_results: list[Any] = []
for result_file in glob(results_glob, recursive=True):
with open(result_file, 'r', encoding="utf8") as report_file:
all_results.extend(json.load(report_file))
output_counters_for_crank(all_results)
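The helix-results aggregation above reduces to a glob-and-extend loop over per-work-item JSON reports. A self-contained sketch of the same pattern (the directory layout and report contents here are hypothetical, stdlib only):

```python
import json
import os
import tempfile
from glob import glob
from typing import Any

def merge_perf_reports(results_root: str) -> list[Any]:
    """Merge every *perf-lab-report.json found under results_root,
    mirroring the helix-results aggregation step above."""
    results_glob = os.path.join(results_root, "**", "*perf-lab-report.json")
    all_results: list[Any] = []
    for result_file in glob(results_glob, recursive=True):
        with open(result_file, "r", encoding="utf8") as report_file:
            # each report file is itself a JSON list of counter objects
            all_results.extend(json.load(report_file))
    return all_results

# usage with a throwaway directory standing in for the helix-results payload
with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "workitem-1"))
    with open(os.path.join(root, "workitem-1", "a-perf-lab-report.json"), "w", encoding="utf8") as f:
        json.dump([{"name": "Startup", "value": 120}], f)
    print(len(merge_perf_reports(root)))  # 1
```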
def main(argv: list[str]):
args: dict[str, Any] = {}
i = 1
while i < len(argv):
key = argv[i]
bool_args = {
"--internal": "internal",
"--is-scenario": "is_scenario",
"--local-build": "local_build",
}
if key in bool_args:
args[bool_args[key]] = True
i += 1
continue
simple_arg_map = {
"--queue": "queue",
"--framework": "framework",
"--run-kind": "run_kind",
"--architecture": "architecture",
"--core-root-dir": "core_root_dir",
"--performance-repo-dir": "performance_repo_dir",
"--perflab-upload-token": "perflab_upload_token",
"--helix-access-token": "helix_access_token",
"--system-access-token": "system_access_token",
"--project-file": "project_file",
"--helix-type-suffix": "helix_type_suffix",
"--build-repository-name": "build_repository_name",
"--build-source-branch": "build_source_branch",
"--build-number": "build_number",
"--pgo-run-type": "pgo_run_type",
"--codegen-type": "codegen_type",
"--runtime-type": "runtime_type",
"--run-categories": "run_categories",
"--extra-bdn-args": "extra_bdn_args",
"--affinity": "affinity",
"--os-group": "os_group",
"--os-sub-group": "os_sub_group",
"--runtime-flavor": "runtime_flavor"
}
if key in simple_arg_map:
arg_name = simple_arg_map[key]
val = argv[i + 1]
elif key == "--partition-count":
arg_name = "partition_count"
val = int(argv[i + 1])
else:
raise Exception(f"Invalid argument: {key}")
args[arg_name] = val
i += 2
# TODO: support additional_performance_setup_parameters and additional_ci_setup_parameters
run_performance_job(RunPerformanceJobArgs(**args))
if __name__ == "__main__":
main(sys.argv)
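`main()` parses argv by hand rather than with `argparse`: boolean switches consume one token, everything else consumes a key/value pair, and `--partition-count` gets an int conversion. The pattern condenses to this sketch (flag tables trimmed to a few illustrative entries):

```python
from typing import Any

def parse_flags(argv: "list[str]") -> "dict[str, Any]":
    """Tiny parser in the style of main() above: boolean switches
    advance by one token, key/value flags advance by two."""
    bool_args = {"--internal": "internal"}
    simple_arg_map = {"--queue": "queue", "--framework": "framework"}
    args: "dict[str, Any]" = {}
    i = 0
    while i < len(argv):
        key = argv[i]
        if key in bool_args:
            args[bool_args[key]] = True
            i += 1
        elif key in simple_arg_map:
            args[simple_arg_map[key]] = argv[i + 1]
            i += 2
        elif key == "--partition-count":
            # the only flag that needs a type conversion
            args["partition_count"] = int(argv[i + 1])
            i += 2
        else:
            raise Exception(f"Invalid argument: {key}")
    return args

print(parse_flags(["--internal", "--queue", "Windows.10.Amd64", "--partition-count", "5"]))
# {'internal': True, 'queue': 'Windows.10.Amd64', 'partition_count': 5}
```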

scripts/send_to_helix.py Normal file

@ -0,0 +1,74 @@
from dataclasses import dataclass
from datetime import timedelta
import os
from performance.common import RunCommand, iswin
@dataclass
class PerfSendToHelixArgs:
project_file: str
architecture: str
helix_build: str
helix_source: str
helix_type: str
helix_target_queues: list[str]
helix_access_token: str
helix_pre_commands: list[str]
helix_post_commands: list[str]
work_item_dir: str
work_item_timeout: timedelta
correlation_payload_dir: str
include_dotnet_cli: bool
dotnet_cli_package_type: str
dotnet_cli_version: str
enable_xunit_reporter: bool
wait_for_work_item_completion: bool
creator: str
helix_prereq_commands: list[str]
partition_count: int
cli_arguments: list[str]
runtime_flavor: str
performance_repo_dir: str
build_config: str
system_access_token: str
def run_shell(script: str, args: list[str]):
RunCommand(["chmod", "+x", script]).run()
RunCommand([script, *args], verbose=True).run()
def run_powershell(script: str, args: list[str]):
RunCommand(["powershell.exe", script, *args], verbose=True).run()
def perf_send_to_helix(args: PerfSendToHelixArgs):
os.environ["Architecture"] = args.architecture
os.environ["BuildConfig"] = args.build_config
os.environ["HelixSource"] = args.helix_source
os.environ["HelixType"] = args.helix_type
os.environ["HelixBuild"] = args.helix_build
os.environ["HelixTargetQueues"] = ";".join(args.helix_target_queues)
os.environ["HelixAccessToken"] = args.helix_access_token
os.environ["HelixPreCommands"] = ";".join(args.helix_pre_commands)
os.environ["HelixPostCommands"] = ";".join(args.helix_post_commands)
os.environ["HelixPrereqCommands"] = ";".join(args.helix_prereq_commands)
os.environ["WorkItemDirectory"] = args.work_item_dir
os.environ["WorkItemTimeout"] = str(args.work_item_timeout)
os.environ["CorrelationPayloadDirectory"] = args.correlation_payload_dir
os.environ["IncludeDotNetCli"] = str(args.include_dotnet_cli)
os.environ["DotNetCliPackageType"] = args.dotnet_cli_package_type
os.environ["DotNetCliVersion"] = args.dotnet_cli_version
os.environ["EnableXUnitReporter"] = str(args.enable_xunit_reporter)
os.environ["WaitForWorkItemCompletion"] = str(args.wait_for_work_item_completion)
os.environ["Creator"] = str(args.creator)
os.environ["SYSTEM_ACCESSTOKEN"] = args.system_access_token
os.environ["PartitionCount"] = str(args.partition_count)
os.environ["CliArguments"] = " ".join(args.cli_arguments)
binlog_dest = os.path.join(args.performance_repo_dir, "artifacts", "log", args.build_config, "SendToHelix.binlog")
send_params = [args.project_file, "/restore", "/t:Test", f"/p:RuntimeFlavor={args.runtime_flavor}", f"/bl:{binlog_dest}"]
common_dir = os.path.join(args.performance_repo_dir, "eng", "common")
if iswin():
run_powershell(os.path.join(common_dir, "msbuild.ps1"), ["-warnaserror", "0", *send_params])
else:
run_shell(os.path.join(common_dir, "msbuild.sh"), send_params)
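Everything `perf_send_to_helix` hands to the MSBuild invocation travels through environment variables, so each value must be flattened to a string: lists such as `HelixTargetQueues` are `';'`-joined, and `WorkItemTimeout` is simply the `str()` form of a `timedelta`. A minimal illustration (queue names here are made up):

```python
from datetime import timedelta

# lists such as HelixTargetQueues are serialized as ';'-separated strings
queues = ["Windows.10.Amd64", "Ubuntu.2204.Amd64"]
print(";".join(queues))  # Windows.10.Amd64;Ubuntu.2204.Amd64

# WorkItemTimeout is the literal str() form of the timedelta
print(str(timedelta(hours=4)))  # 4:00:00
```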


@ -2,10 +2,9 @@
Because of how pytest finds things, all import modules must start with scripts.
'''
from scripts.dotnet import CSharpProject, CSharpProjFile
from scripts.dotnet import CSharpProject
import os
import sys
def test_new():
CSharpProject.new('console', 'test_new', False, '.')
CSharpProject.new('console', 'test_new', 'test_bin', False, '.')
assert os.path.isdir('test_new')
assert os.path.isfile(os.path.combine('test_new', 'test_new.csproj'))
assert os.path.isfile(os.path.join('test_new', 'test_new.csproj'))


@ -1,3 +1,5 @@
from random import randint
import uuid
from azure.storage.blob import BlobClient, ContentSettings
from azure.storage.queue import QueueClient, TextBase64EncodePolicy
from azure.core.exceptions import ResourceExistsError
@ -5,20 +7,17 @@ from traceback import format_exc
from glob import glob
from performance.common import retry_on_exception
import os
import time
from logging import getLogger
def get_unique_name(filename, unique_id) -> str:
newname = "{0}-{1}".format(unique_id,
os.path.basename(filename))
def get_unique_name(filename: str, unique_id: str) -> str:
newname = "{0}-{1}".format(unique_id, os.path.basename(filename))
if len(newname) > 1024:
newname = "{0}-perf-lab-report.json".format(randint(1000, 9999))
return newname
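For reference, the renaming helper above behaves like this self-contained copy; the random fallback keeps blob names under Azure's 1024-character blob-name limit:

```python
import os
from random import randint

def get_unique_name(filename: str, unique_id: str) -> str:
    # prefix the basename with the work-item id to avoid collisions
    newname = "{0}-{1}".format(unique_id, os.path.basename(filename))
    if len(newname) > 1024:
        # a pathologically long id falls back to a short randomized name
        newname = "{0}-perf-lab-report.json".format(randint(1000, 9999))
    return newname

print(get_unique_name("/tmp/results/perf-lab-report.json", "workitem-42"))
# workitem-42-perf-lab-report.json
```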
def upload(globpath, container, queue, sas_token_env, storage_account_uri):
def upload(globpath: str, container: str, queue: str, sas_token_env: str, storage_account_uri: str):
try:
sas_token_env = sas_token_env
sas_token = os.getenv(sas_token_env)
if sas_token is None:
getLogger().error("Sas token environment variable {} was not defined.".format(sas_token_env))
@ -27,7 +26,7 @@ def upload(globpath, container, queue, sas_token_env, storage_account_uri):
files = glob(globpath, recursive=True)
any_upload_or_queue_failed = False
for infile in files:
blob_name = get_unique_name(infile, os.getenv('HELIX_WORKITEM_ID'))
blob_name = get_unique_name(infile, os.getenv('HELIX_WORKITEM_ID') or str(uuid.uuid4()))
getLogger().info("uploading {}".format(infile))
@ -61,4 +60,4 @@ def upload(globpath, container, queue, sas_token_env, storage_account_uri):
except Exception as ex:
getLogger().error('{0}: {1}'.format(type(ex), str(ex)))
getLogger().error(format_exc())
return 1
return 1


@ -134,8 +134,9 @@ class AndroidInstrumentationHelper(object):
# rethrow the original exception
raise
if runninginlab():
copytree(TRACEDIR, os.path.join(helixuploaddir(), 'traces'))
helix_upload_dir = helixuploaddir()
if runninginlab() and helix_upload_dir is not None:
copytree(TRACEDIR, os.path.join(helix_upload_dir, 'traces'))
if uploadtokenpresent():
import upload
globpath = os.path.join(


@ -7,15 +7,16 @@ routines for fixing up code in templates
'''
from re import sub
from typing import List
def readfile(file: str) -> []:
ret = []
def readfile(file: str) -> List[str]:
ret: List[str] = []
with open(file, "r") as opened:
for line in opened:
ret.append(line)
return ret
def writefile(file: str, lines: []):
def writefile(file: str, lines: List[str]):
with open(file, "w") as opened:
opened.writelines(lines)
@ -29,7 +30,7 @@ def insert_after(file: str, search: str, insert: str):
writefile(file, lines)
def replace_line(file: str, search: str, replace: str):
lines = []
lines: List[str] = []
for line in readfile(file):
lines.append(sub(search, replace, line))
writefile(file, lines)
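As a usage sketch, `replace_line` rewrites a file by applying `re.sub` to each line in turn, which is how scenario precommands later retarget a project file. The csproj content below is hypothetical:

```python
import os
import tempfile
from re import sub
from typing import List

def readfile(file: str) -> List[str]:
    with open(file, "r") as opened:
        return list(opened)

def replace_line(file: str, search: str, replace: str) -> None:
    # apply the regex substitution to every line, then write the file back
    lines: List[str] = [sub(search, replace, line) for line in readfile(file)]
    with open(file, "w") as opened:
        opened.writelines(lines)

with tempfile.NamedTemporaryFile("w", suffix=".csproj", delete=False) as f:
    f.write("<TargetFramework>net7.0</TargetFramework>\n")
    path = f.name
replace_line(path, r"<TargetFramework>.*?</TargetFramework>",
             "<TargetFramework>net8.0</TargetFramework>")
print(readfile(path)[0].strip())  # <TargetFramework>net8.0</TargetFramework>
os.remove(path)
```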


@ -7,6 +7,7 @@ import os
from logging import getLogger
from argparse import ArgumentParser
from typing import Any, Optional
from shared import const
class CrossgenArguments:
@ -17,10 +18,10 @@ class CrossgenArguments:
'''
def __init__(self):
self.coreroot = None
self.singlefile = None
self.compositefile = None
self.singlethreaded = None
self.coreroot: Optional[str] = None
self.singlefile: Optional[str] = None
self.compositefile: Optional[str] = None
self.singlethreaded: Optional[bool] = None
def add_crossgen_arguments(self, parser: ArgumentParser):
"Arguments to generate AOT code with Crossgen"
@ -98,7 +99,7 @@ Suppress internal Crossgen2 parallelism
'''
)
def parse_crossgen_args(self, args):
def parse_crossgen_args(self, args: Any):
self.singlefile = args.single
self.coreroot = args.coreroot
@ -109,7 +110,7 @@ Suppress internal Crossgen2 parallelism
getLogger().error('Specify an assembly to crossgen with --single <assembly name>')
sys.exit(1)
def parse_crossgen2_args(self, args):
def parse_crossgen2_args(self, args: Any):
self.coreroot = args.coreroot
self.singlefile = args.single
self.compositefile = args.composite


@ -10,7 +10,7 @@ def remove_aab_files(output_dir="."):
if file.endswith(".aab"):
os.remove(os.path.join(output_dir, file))
def install_versioned_maui(precommands):
def install_versioned_maui(precommands: PreCommands):
target_framework_wo_platform = precommands.framework.split('-')[0]
# Download what we need


@ -8,6 +8,7 @@ import shutil
import subprocess
from logging import getLogger
from argparse import ArgumentParser
from typing import List, Optional
from dotnet import CSharpProject, CSharpProjFile
from shared import const
from shared.crossgen import CrossgenArguments
@ -43,7 +44,7 @@ class PreCommands:
parser = ArgumentParser()
subparsers = parser.add_subparsers(title='Operations',
description='Common preparation steps for perf tests. Should run under src\scenarios\<test asset folder>',
description='Common preparation steps for perf tests. Should run under src\\scenarios\\<test asset folder>',
dest='operation')
default_parser = subparsers.add_parser(DEFAULT, help='Default operation (placeholder command and no specific operation will be executed)' )
@ -118,7 +119,7 @@ class PreCommands:
bin_dir: str,
exename: str,
working_directory: str,
language: str = None,
language: Optional[str] = None,
no_https: bool = False,
no_restore: bool = True):
'makes a new app with the given template'
@ -194,7 +195,7 @@ class PreCommands:
self.project = CSharpProject(csproj, const.BINDIR)
self._updateframework(csproj.file_name)
def execute(self, build_args: list = []):
def execute(self, build_args: List[str] = []):
'Parses args and runs precommands'
if self.operation == DEFAULT:
pass
@ -210,6 +211,7 @@ class PreCommands:
if self.nativeaot:
build_args.append('/p:PublishAot=true')
build_args.append('/p:PublishAotUsingRuntimePack=true')
build_args.append("/p:EnableWindowsTargeting=true")
self._publish(configuration=self.configuration, runtime_identifier=self.runtime_identifier, framework=self.framework, output=self.output, build_args=build_args)
if self.operation == CROSSGEN:
startup_args = [
@ -267,7 +269,7 @@ class PreCommands:
staticpath = os.path.join(helixpayload(), "staticdeps")
shutil.copyfile(os.path.join(staticpath, f"PerfLab.{language_file_extension}"), os.path.join(projpath, f"PerfLab.{language_file_extension}"))
def install_workload(self, workloadid: str, install_args: list = ["--skip-manifest-update"]):
def install_workload(self, workloadid: str, install_args: List[str] = ["--skip-manifest-update"]):
'Installs the workload, if needed'
if not self.has_workload:
if self.readonly_dotnet:
@ -303,7 +305,7 @@ class PreCommands:
else:
replace_line(projectfile, r'<TargetFramework>.*?</TargetFramework>', f'<TargetFramework>{self.framework}</TargetFramework>')
def _publish(self, configuration: str, framework: str = None, runtime_identifier: str = None, output: str = None, build_args: list = []):
def _publish(self, configuration: str, framework: str, runtime_identifier: Optional[str] = None, output: Optional[str] = None, build_args: List[str] = []):
self.project.publish(configuration,
output or const.PUBDIR,
True,
@ -314,12 +316,12 @@ class PreCommands:
*['-bl:%s' % self.binlog] if self.binlog else [],
*build_args)
def _restore(self):
def _restore(self, restore_args: List[str] = ["/p:EnableWindowsTargeting=true"]):
self.project.restore(packages_path=get_packages_directory(),
verbose=True,
args=['-bl:%s-restore.binlog' % self.binlog] if self.binlog else [])
args=(['-bl:%s-restore.binlog' % self.binlog] if self.binlog else []) + restore_args)
def _build(self, configuration: str, framework: str = None, output: str = None, build_args: list = []):
def _build(self, configuration: str, framework: str, output: Optional[str] = None, build_args: List[str] = []):
self.project.build(configuration,
True,
get_packages_directory(),