* Update markdowns to net8.0.

* Removed BDN and ToC headers from benchmarkdotnet.md table of contents.
This commit is contained in:
Parker Bibus 2022-11-09 12:13:39 -08:00 committed by GitHub
Parent b29ee95f57
Commit 794a3ffe3c
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
10 changed files: 95 additions and 88 deletions


@@ -34,10 +34,10 @@ For **Self-Contained Empty Console App Size On Disk** scenario, run precommand t
```cmd
cd emptyconsoletemplate
python3 pre.py publish -f net7.0 -c Release -r win-x64
python3 pre.py publish -f net8.0 -c Release -r win-x64
```
`-f net7.0` sets the new template project to target the `net7.0` framework; `-c Release` publishes in the Release configuration; `-r win-x64` takes an [RID](https://docs.microsoft.com/en-us/dotnet/core/rid-catalog) (Runtime Identifier) and specifies which runtime it supports.
`-f net8.0` sets the new template project to target the `net8.0` framework; `-c Release` publishes in the Release configuration; `-r win-x64` takes an [RID](https://docs.microsoft.com/en-us/dotnet/core/rid-catalog) (Runtime Identifier) and specifies which runtime it supports.
**Note that by specifying the RID option `-r <RID>`, the app is published as an [SCD](https://docs.microsoft.com/en-us/dotnet/core/deploying/#publish-self-contained) (Self-Contained Deployment) app; without it, an [FDD](https://docs.microsoft.com/en-us/dotnet/core/deploying/#publish-framework-dependent) (Framework-Dependent Deployment) app is published.**
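For example, the presence of the RID is the only difference between the two deployment models (a sketch, reusing the asset directory from above):

```cmd
:: FDD: no RID, the app relies on the installed shared runtime
python3 pre.py publish -f net8.0 -c Release

:: SCD: RID given, the runtime is bundled into the published output
python3 pre.py publish -f net8.0 -c Release -r win-x64
```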
@@ -84,27 +84,28 @@ Same instruction of [Scenario Tests Guide - Step 4](./scenarios-workflow.md#step
- netcoreapp3.1
- net6.0
- net7.0
- net8.0
- \<-r RID> values:
- ""(WITHOUT `-r <RID>` --> FDD app)
- `"-r <RID>"` (WITH `-r` --> SCD app, [list of RID](https://docs.microsoft.com/en-us/dotnet/core/rid-catalog))
| Scenario | Asset Directory | Precommand | Testcommand | Postcommand | Supported Framework | Supported Platform |
|-----------------------------------------------|-------------------------|-----------------------------------------------|-----------------|-------------|--------------------------------------------------|--------------------|
| Static Console Template Publish Startup | staticconsoletemplate | pre.py publish -f TFM -c Release | test.py startup | post.py | netcoreapp3.1;net6.0;net7.0 | Windows |
| Static Console Template Publish SizeOnDisk | staticconsoletemplate | pre.py publish -f TFM -c Release /<-r RID> | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0 | Windows;Linux |
| Static Console Template Build SizeOnDisk | staticconsoletemplate | pre.py build -f TFM -c Release | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0 | Windows;Linux |
| Static VB Console Template Publish Startup | staticvbconsoletemplate | pre.py publish -f TFM -c Release | test.py startup | post.py | netcoreapp3.1;net6.0;net7.0 | Windows |
| Static VB Console Template Publish SizeOnDisk | staticvbconsoletemplate | pre.py publish -f TFM -c Release /<-r RID> | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0 | Windows;Linux |
| Static VB Console Template Build SizeOnDisk | staticvbconsoletemplate | pre.py build -f TFM -c Release | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0 | Windows;Linux |
| Static Console Template Publish Startup | staticconsoletemplate | pre.py publish -f TFM -c Release | test.py startup | post.py | netcoreapp3.1;net6.0;net7.0;net8.0 | Windows |
| Static Console Template Publish SizeOnDisk | staticconsoletemplate | pre.py publish -f TFM -c Release /<-r RID> | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0;net8.0 | Windows;Linux |
| Static Console Template Build SizeOnDisk | staticconsoletemplate | pre.py build -f TFM -c Release | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0;net8.0 | Windows;Linux |
| Static VB Console Template Publish Startup | staticvbconsoletemplate | pre.py publish -f TFM -c Release | test.py startup | post.py | netcoreapp3.1;net6.0;net7.0;net8.0 | Windows |
| Static VB Console Template Publish SizeOnDisk | staticvbconsoletemplate | pre.py publish -f TFM -c Release /<-r RID> | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0;net8.0 | Windows;Linux |
| Static VB Console Template Build SizeOnDisk | staticvbconsoletemplate | pre.py build -f TFM -c Release | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0;net8.0 | Windows;Linux |
| Static Winforms Template Publish Startup | staticwinformstemplate | pre.py publish -f TFM -c Release | test.py startup | post.py | netcoreapp3.1 | Windows |
| Static Winforms Template Publish SizeOnDisk | staticwinformstemplate | pre.py publish -f TFM -c Release /<-r RID> | test.py sod | post.py | netcoreapp3.1 | Windows;Linux |
| Static Winforms Template Build SizeOnDisk | staticwinformstemplate | pre.py build -f TFM -c Release | test.py sod | post.py | netcoreapp3.1 | Windows;Linux |
| New Console Template Publish Startup | emptyconsoletemplate | pre.py publish -f TFM -c Release | test.py startup | post.py | netcoreapp3.1;net6.0;net7.0 | Windows |
| New Console Template Publish SizeOnDisk | emptyconsoletemplate | pre.py publish -f TFM -c Release /<-r RID> | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0 | Windows;Linux |
| New Console Template Build SizeOnDisk | emptyconsoletemplate | pre.py build -f TFM -c Release | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0 | Windows;Linux |
| New VB Console Template Publish Startup | emptyvbconsoletemplate | pre.py publish -f TFM -c Release | test.py startup | post.py | netcoreapp3.1;net6.0;net7.0 | Windows |
| New VB Console Template Publish SizeOnDisk | emptyvbconsoletemplate | pre.py publish -f TFM -c Release /<-r RID> | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0 | Windows;Linux |
| New VB Console Template Build SizeOnDisk | emptyvbconsoletemplate | pre.py build -f TFM -c Release | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0 | Windows;Linux |
| New Console Template Publish Startup | emptyconsoletemplate | pre.py publish -f TFM -c Release | test.py startup | post.py | netcoreapp3.1;net6.0;net7.0;net8.0 | Windows |
| New Console Template Publish SizeOnDisk | emptyconsoletemplate | pre.py publish -f TFM -c Release /<-r RID> | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0;net8.0 | Windows;Linux |
| New Console Template Build SizeOnDisk | emptyconsoletemplate | pre.py build -f TFM -c Release | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0;net8.0 | Windows;Linux |
| New VB Console Template Publish Startup | emptyvbconsoletemplate | pre.py publish -f TFM -c Release | test.py startup | post.py | netcoreapp3.1;net6.0;net7.0;net8.0 | Windows |
| New VB Console Template Publish SizeOnDisk | emptyvbconsoletemplate | pre.py publish -f TFM -c Release /<-r RID> | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0;net8.0 | Windows;Linux |
| New VB Console Template Build SizeOnDisk | emptyvbconsoletemplate | pre.py build -f TFM -c Release | test.py sod | post.py | netcoreapp3.1;net6.0;net7.0;net8.0 | Windows;Linux |
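To illustrate, one row of the matrix above (New Console Template Publish SizeOnDisk, with assumed values TFM = `net8.0` and RID = `linux-x64`) expands into:

```cmd
cd emptyconsoletemplate
python3 pre.py publish -f net8.0 -c Release -r linux-x64
python3 test.py sod
python3 post.py
```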
## Relevant Links


@@ -17,10 +17,10 @@ BenchmarkDotNet is the benchmarking tool that allows you to run benchmarks for .NET,
- [Reading the Results](#reading-the-results)
- [Reading the Histogram](#reading-the-histogram)
- [Reading Memory Statistics](#reading-memory-statistics)
- [Multiple Runtimes](#multiple-runtimes)
- [Regressions](#regressions)
- [Profiling](#profiling)
- [Disassembly](#disassembly)
- [Multiple Runtimes](#multiple-runtimes)
- [Regressions](#regressions)
- [Private Runtime Builds](#private-runtime-builds)
- [Running In Process](#running-in-process)
- [CoreRun](#corerun)
@@ -59,7 +59,7 @@ In order to build or run the benchmarks you will need the **.NET Core command-li
### Using .NET Cli
To build the benchmarks you need to have the right `dotnet cli`. This repository allows you to benchmark .NET Core 3.1, .NET 6.0, and .NET 7.0, so you need to install all of them.
To build the benchmarks you need to have the right `dotnet cli`. This repository allows you to benchmark .NET Core 3.1, .NET 6.0, .NET 7.0, and .NET 8.0, so you need to install all of them.
All you need to do is run the following command:
@@ -70,8 +70,8 @@ dotnet build -c Release
If you don't want to install all of them and just run the benchmarks for selected runtime(s), you need to manually edit the [MicroBenchmarks.csproj](../src/benchmarks/micro/MicroBenchmarks.csproj) file.
```diff
-<TargetFrameworks>netcoreapp3.1;net6.0;net7.0</TargetFrameworks>
+<TargetFrameworks>net7.0</TargetFrameworks>
-<TargetFrameworks>netcoreapp3.1;net6.0;net7.0;net8.0</TargetFrameworks>
+<TargetFrameworks>net8.0</TargetFrameworks>
```
The alternative is to set the `PERFLAB_TARGET_FRAMEWORKS` environment variable to the selected Target Framework Moniker.
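For instance, a sketch for a Windows shell (the variable name is taken from the text above):

```cmd
:: Build only the net8.0 target without editing the csproj
set PERFLAB_TARGET_FRAMEWORKS=net8.0
dotnet build -c Release
```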
@@ -81,7 +81,7 @@ The alternative is to set `PERFLAB_TARGET_FRAMEWORKS` environment variable to se
If you don't want to install `dotnet cli` manually, we have a Python 3 script which can do that for you. All you need to do is provide the frameworks:
```cmd
py .\scripts\benchmarks_ci.py --frameworks net7.0
py .\scripts\benchmarks_ci.py --frameworks net8.0
```
## Running the Benchmarks
@@ -91,7 +91,7 @@ py .\scripts\benchmarks_ci.py --frameworks net7.0
To run the benchmarks in interactive mode you have to execute `dotnet run -c Release -f $targetFrameworkMoniker` in the folder with benchmarks project.
```cmd
C:\Projects\performance\src\benchmarks\micro> dotnet run -c Release -f net7.0
C:\Projects\performance\src\benchmarks\micro> dotnet run -c Release -f net8.0
Available Benchmarks:
#0 Burgers
#1 ByteMark
@@ -122,37 +122,37 @@ The glob patterns are applied to full benchmark name: namespace.typeName.methodN
- Run all the benchmarks from BenchmarksGame namespace:
```cmd
dotnet run -c Release -f net7.0 --filter BenchmarksGame*
dotnet run -c Release -f net8.0 --filter BenchmarksGame*
```
- Run all the benchmarks with type name Richards:
```cmd
dotnet run -c Release -f net7.0 --filter *.Richards.*
dotnet run -c Release -f net8.0 --filter *.Richards.*
```
- Run all the benchmarks with method name ToStream:
```cmd
dotnet run -c Release -f net7.0 --filter *.ToStream
dotnet run -c Release -f net8.0 --filter *.ToStream
```
- Run ALL benchmarks:
```cmd
dotnet run -c Release -f net7.0 --filter *
dotnet run -c Release -f net8.0 --filter *
```
- You can provide many filters (logical disjunction):
```cmd
dotnet run -c Release -f net7.0 --filter System.Collections*.Dictionary* *.Perf_Dictionary.*
dotnet run -c Release -f net8.0 --filter System.Collections*.Dictionary* *.Perf_Dictionary.*
```
- To print a **joined summary** for all of the benchmarks (by default printed per type), use `--join`:
```cmd
dotnet run -c Release -f net7.0 --filter BenchmarksGame* --join
dotnet run -c Release -f net8.0 --filter BenchmarksGame* --join
```
Please remember that on **Unix** systems `*` is resolved to all files in the current directory, so you need to escape it: `'*'`.
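A minimal sketch of the same filter on both kinds of shells:

```cmd
:: Windows: the glob is passed through to the benchmark runner as-is
dotnet run -c Release -f net8.0 --filter *

:: Unix shells: quote the glob so the shell does not expand it
dotnet run -c Release -f net8.0 --filter '*'
```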
@@ -164,7 +164,7 @@ To print the list of all available benchmarks you need to pass `--list [tree/fla
Example: Show the tree of all the benchmarks from the System.Threading namespace that can be run for .NET 8.0:
```cmd
dotnet run -c Release -f net7.0 --list tree --filter System.Threading*
dotnet run -c Release -f net8.0 --list tree --filter System.Threading*
```
```log
@@ -259,7 +259,7 @@ If you want to disassemble the benchmarked code, you need to use the [Disassembl
You can do that by passing `--disasm` to the app, by using the `[DisassemblyDiagnoser(printAsm: true, printSource: true)]` attribute, or by adding it to your config with `config.With(DisassemblyDiagnoser.Create(new DisassemblyDiagnoserConfig(printAsm: true, recursiveDepth: 1)))`.
Example: `dotnet run -c Release -f net7.0 -- --filter System.Memory.Span<Int32>.Reverse -d`
Example: `dotnet run -c Release -f net8.0 -- --filter System.Memory.Span<Int32>.Reverse -d`
```assembly
; System.Runtime.InteropServices.MemoryMarshal.GetReference[[System.Byte, System.Private.CoreLib]](System.Span`1<Byte>)
@@ -285,30 +285,30 @@ M00_L00:
The `--runtimes` or just `-r` allows you to run the benchmarks for **multiple Runtimes**.
Available options are: Mono, wasmnet70, CoreRT, net462, net47, net471, net472, netcoreapp3.1, net6.0 and net7.0.
Available options are: Mono, wasmnet70, CoreRT, net462, net47, net471, net472, netcoreapp3.1, net6.0, net7.0, and net8.0.
Example: run the benchmarks for .NET 6.0 and 7.0:
Example: run the benchmarks for .NET 7.0 and 8.0:
```cmd
dotnet run -c Release -f net6.0 --runtimes net6.0 net7.0
dotnet run -c Release -f net7.0 --runtimes net7.0 net8.0
```
**Important: The host process needs to be the lowest common API denominator of the runtimes you want to compare!** In this case, it was `net6.0`.
**Important: The host process needs to be the lowest common API denominator of the runtimes you want to compare!** In this case, it was `net7.0`.
## Regressions
To perform a Mann–Whitney U Test and display the results in a dedicated column, you need to provide the threshold for the Statistical Test via the `--statisticalTest` argument. The value can be relative (5%) or absolute (10ms, 100ns, 1s).
Example: run Mann–Whitney U test with relative ratio of 5% for `BinaryTrees_2` for .NET 6.0 (base) vs .NET 7.0 (diff). .NET 6.0 will be baseline because it was first.
Example: run Mann–Whitney U test with relative ratio of 5% for `BinaryTrees_2` for .NET 7.0 (base) vs .NET 8.0 (diff). .NET 7.0 will be baseline because it was first.
```cmd
dotnet run -c Release -f net7.0 --filter *BinaryTrees_2* --runtimes net6.0 net7.0 --statisticalTest 5%
dotnet run -c Release -f net8.0 --filter *BinaryTrees_2* --runtimes net7.0 net8.0 --statisticalTest 5%
```
| Method | Toolchain | Mean | MannWhitney(5%) |
|-------------- |-------------- |---------:|---------------- |
| BinaryTrees_2 | net6.0 | 124.4 ms | Base |
| BinaryTrees_2 | net7.0 | 153.7 ms | Slower |
| BinaryTrees_2 | net7.0 | 124.4 ms | Base |
| BinaryTrees_2 | net8.0 | 153.7 ms | Slower |
**Note:** to compare historical results you need to use the [Results Comparer](../src/tools/ResultsComparer/README.md) tool.
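A sketch of such a comparison (the `--base`/`--diff`/`--threshold` argument names are assumed here; see the linked README for the authoritative usage):

```cmd
dotnet run --project src\tools\ResultsComparer -- --base C:\results\before --diff C:\results\after --threshold 2%
```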
@@ -329,7 +329,7 @@ Please use this option only when you are sure that the benchmarks you want to ru
It's possible to benchmark private builds of [dotnet/runtime](https://github.com/dotnet/runtime) using CoreRun.
```cmd
dotnet run -c Release -f net7.0 --coreRun $thePath
dotnet run -c Release -f net8.0 --coreRun $thePath
```
**Note:** You can provide more than one path to CoreRun. In that case, the first path is the baseline and all the benchmarks are executed for every CoreRun you have specified.
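A sketch with two CoreRuns, where the first path becomes the baseline (both paths are placeholders):

```cmd
dotnet run -c Release -f net8.0 --filter $YourFilter \
  --coreRun C:\corerun_base\CoreRun.exe C:\corerun_diff\CoreRun.exe
```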
@@ -352,7 +352,7 @@ public void PrintInfo()
You can also use any dotnet cli to build and run the benchmarks.
```cmd
dotnet run -c Release -f net7.0 --cli "C:\Projects\performance\.dotnet\dotnet.exe"
dotnet run -c Release -f net8.0 --cli "C:\Projects\performance\.dotnet\dotnet.exe"
```
This is very useful when you want to compare different builds of .NET.
@@ -374,5 +374,5 @@ More info can be found [here](https://github.com/dotnet/BenchmarkDotNet/issues/7
To run benchmarks with a private CoreRT build, you need to provide the `IlcPath`. Example:
```cmd
dotnet run -c Release -f net7.0 -- --ilcPath C:\Projects\corert\bin\Windows_NT.x64.Release
dotnet run -c Release -f net8.0 -- --ilcPath C:\Projects\corert\bin\Windows_NT.x64.Release
```


@@ -8,6 +8,8 @@
- [Code Organization](#code-organization)
- [dotnet runtime Prerequisites for CLR](#dotnet-runtime-prerequisites-for-clr)
- [dotnet runtime Prerequisites for wasm](#dotnet-runtime-prerequisites-for-wasm)
- [Run the benchmarks with the interpreter](#run-the-benchmarks-with-the-interpreter)
- [Run the benchmarks with AOT](#run-the-benchmarks-with-aot)
- [Preventing Regressions](#preventing-regressions)
- [Running against the latest .NET Core SDK](#running-against-the-latest-net-core-sdk)
- [Solving Regressions](#solving-regressions)
@@ -100,7 +102,7 @@ During the port from xunit-performance to BenchmarkDotNet, the namespaces, type
Please remember that you can filter the benchmarks using a glob pattern applied to namespace.typeName.methodName ([read more](./benchmarkdotnet.md#Filtering-the-Benchmarks)):
```cmd
dotnet run -c Release -f net7.0 --filter System.Memory*
dotnet run -c Release -f net8.0 --filter System.Memory*
```
(Run the above command on `src/benchmarks/micro/MicroBenchmarks.csproj`.)
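If you prefer not to change directories first, the same command can be run from the repository root by pointing `dotnet run` at the project file (a sketch, assuming the standard repo layout):

```cmd
dotnet run --project src\benchmarks\micro\MicroBenchmarks.csproj -c Release -f net8.0 --filter System.Memory*
```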
@@ -120,8 +122,8 @@ C:\Projects\runtime> build -c Release
Every time you want to run the benchmarks against a local build of [dotnet/runtime](https://github.com/dotnet/runtime) you need to provide the path to CoreRun:
```cmd
dotnet run -c Release -f net7.0 --filter $someFilter \
--coreRun C:\Projects\runtime\artifacts\bin\testhost\net7.0-windows-Release-x64\shared\Microsoft.NETCore.App\7.0.0\CoreRun.exe
dotnet run -c Release -f net8.0 --filter $someFilter \
--coreRun C:\Projects\runtime\artifacts\bin\testhost\net8.0-windows-Release-x64\shared\Microsoft.NETCore.App\8.0.0\CoreRun.exe
```
**Note:** BenchmarkDotNet expects a path to `CoreRun.exe` file (`corerun` on Unix), not to `Core_Root` folder.
@@ -135,7 +137,7 @@ C:\Projects\runtime\src\libraries\System.Text.RegularExpressions\src> dotnet msb
**Note:** the exceptions to this rule are libraries that **are not part of the shared SDK**. The `build` script of the runtime repo does not copy them to the CoreRun folder, so you need to do it on your own:
```cmd
cp artifacts\bin\runtime\net7.0-Windows_NT-Release-x64\Microsoft.Extensions.Caching.Memory.dll artifacts\bin\testhost\net7.0-windows-Release-x64\shared\Microsoft.NETCore.App\7.0.0\
cp artifacts\bin\runtime\net8.0-Windows_NT-Release-x64\Microsoft.Extensions.Caching.Memory.dll artifacts\bin\testhost\net8.0-windows-Release-x64\shared\Microsoft.NETCore.App\8.0.0\
```
This is needed only if you want to benchmark these specific libraries. If you don't, the default versions defined in the [MicroBenchmarks.csproj](../src/benchmarks/micro/MicroBenchmarks.csproj) project file are used.
@@ -156,14 +158,14 @@ In order to run the benchmarks against local [dotnet/runtime](https://github.com
/path/to/dotnet/runtime$ ./dotnet.sh build -p:TargetOS=Browser -p:TargetArchitecture=wasm -c Release src/mono/wasm/Wasm.Build.Tests /t:InstallWorkloadUsingArtifacts
```
This would produce `/path/to/dotnet/runtime/artifacts/bin/dotnet-net7+latest`, which should be used to run the benchmarks.
This would produce `/path/to/dotnet/runtime/artifacts/bin/dotnet-net8+latest`, which should be used to run the benchmarks.
3. And you need `/path/to/dotnet/runtime/src/mono/wasm/test-main.js`
#### Run the benchmarks with the interpreter
```cmd
/path/to/dotnet/performance$ python3 ./scripts/benchmarks_ci.py -f net7.0 --dotnet-path </path/to/dotnet/runtime/>artifacts/bin/dotnet-net7+latest --wasm --bdn-artifacts artifacts/BenchmarkDotNet.Artifacts
/path/to/dotnet/performance$ python3 ./scripts/benchmarks_ci.py -f net8.0 --dotnet-path </path/to/dotnet/runtime/>artifacts/bin/dotnet-net8+latest --wasm --bdn-artifacts artifacts/BenchmarkDotNet.Artifacts
--bdn-arguments="--anyCategories Libraries Runtime --category-exclusion-filter NoInterpreter NoWASM NoMono --logBuildOutput --wasmDataDir </path/to/dotnet/runtime>/src/mono/wasm --filter <filter>"
```
@@ -172,7 +174,7 @@ This would produce `/path/to/dotnet/runtime/artifacts/bin/dotnet-net7+latest`, w
Essentially, add `--aotcompilermode wasm` to the `--bdn-arguments=".."`:
```cmd
/path/to/dotnet/performance$ python3 ./scripts/benchmarks_ci.py --csproj src/benchmarks/micro/MicroBenchmarks.csproj -f net7.0 --dotnet-path </path/to/dotnet/runtime/>artifacts/bin/dotnet-net7+latest --wasm --bdn-artifacts artifacts/BenchmarkDotNet.Artifacts
/path/to/dotnet/performance$ python3 ./scripts/benchmarks_ci.py --csproj src/benchmarks/micro/MicroBenchmarks.csproj -f net8.0 --dotnet-path </path/to/dotnet/runtime/>artifacts/bin/dotnet-net8+latest --wasm --bdn-artifacts artifacts/BenchmarkDotNet.Artifacts
--bdn-arguments="--category-exclusion-filter NoInterpreter NoWASM NoMono --aotcompilermode wasm --logBuildOutput --buildTimeout 3600 --wasmDataDir </path/to/dotnet/runtime>/src/mono/wasm --filter <filter>"
```
@@ -183,9 +185,9 @@ Preventing regressions is a fundamental part of our performance culture. The che
**Before introducing any changes that may impact performance**, you should run the benchmarks that test the performance of the feature that you are going to work on and store the results in a **dedicated** folder.
```cmd
C:\Projects\performance\src\benchmarks\micro> dotnet run -c Release -f net7.0 \
C:\Projects\performance\src\benchmarks\micro> dotnet run -c Release -f net8.0 \
--artifacts "C:\results\before" \
--coreRun "C:\Projects\runtime\artifacts\bin\testhost\net7.0-windows-Release-x64\shared\Microsoft.NETCore.App\7.0.0\CoreRun.exe" \
--coreRun "C:\Projects\runtime\artifacts\bin\testhost\net8.0-windows-Release-x64\shared\Microsoft.NETCore.App\8.0.0\CoreRun.exe" \
--filter System.IO.Pipes*
```
@@ -198,9 +200,9 @@ After you introduce the changes and rebuild the part of [dotnet/runtime](https:/
```cmd
C:\Projects\runtime\src\libraries\System.IO.Pipes\src> dotnet msbuild /p:Configuration=Release
C:\Projects\performance\src\benchmarks\micro> dotnet run -c Release -f net7.0 \
C:\Projects\performance\src\benchmarks\micro> dotnet run -c Release -f net8.0 \
--artifacts "C:\results\after" \
--coreRun "C:\Projects\runtime\artifacts\bin\testhost\net7.0-windows-Release-x64\shared\Microsoft.NETCore.App\7.0.0\CoreRun.exe" \
--coreRun "C:\Projects\runtime\artifacts\bin\testhost\net8.0-windows-Release-x64\shared\Microsoft.NETCore.App\8.0.0\CoreRun.exe" \
--filter System.IO.Pipes*
```
@@ -225,7 +227,7 @@ No Slower results for the provided threshold = 2% and noise filter = 0.3ns.
To run the benchmarks against the latest .NET Core SDK you can use the [benchmarks_ci.py](../scripts/benchmarks_ci.py) script. It's going to download the latest .NET Core SDK(s) for the provided framework(s) and run the benchmarks for you. Please see [Prerequisites](./prerequisites.md#python) for more.
```cmd
C:\Projects\performance> py scripts\benchmarks_ci.py -f net7.0 \
C:\Projects\performance> py scripts\benchmarks_ci.py -f net8.0 \
--bdn-arguments="--artifacts "C:\results\latest_sdk"" \
--filter System.IO.Pipes*
```
@@ -245,7 +247,7 @@ The real performance investigation starts with profiling. We have a comprehensiv
To profile the benchmarked code and produce an ETW Trace file ([read more](./benchmarkdotnet.md#Profiling)):
```cmd
dotnet run -c Release -f net7.0 --profiler ETW --filter $YourFilter
dotnet run -c Release -f net8.0 --profiler ETW --filter $YourFilter
```
The benchmarking tool is going to print the path to the `.etl` trace file. You should open it with PerfView or Windows Performance Analyzer and start the analysis from there. If you are not familiar with PerfView, you should watch [PerfView Tutorial](https://channel9.msdn.com/Series/PerfView-Tutorial) by @vancem first. It's an investment that is going to pay off very quickly.
@@ -262,7 +264,7 @@ If profiling using the `--profiler ETW` is not enough, you should use a differen
BenchmarkDotNet has some extra features that might be useful when doing performance investigation:
- You can run the benchmarks against [multiple Runtimes](./benchmarkdotnet.md#Multiple-Runtimes). It can be very useful when the regression has been introduced between .NET Core releases, for example: between net6.0 and net7.0.
- You can run the benchmarks against [multiple Runtimes](./benchmarkdotnet.md#Multiple-Runtimes). It can be very useful when the regression has been introduced between .NET Core releases, for example: between net7.0 and net8.0.
- You can run the benchmarks using a provided [dotnet cli](./benchmarkdotnet.md#dotnet-cli). You can download a few dotnet SDKs, unzip them, and just run the benchmarks to spot the version that introduced the regression and narrow down your investigation.
- You can run the benchmarks using a few [CoreRuns](./benchmarkdotnet.md#CoreRun). You can build the latest [dotnet/runtime](https://github.com/dotnet/runtime) in Release, create a copy of the folder with CoreRun, and use git to checkout an older commit. Then rebuild [dotnet/runtime](https://github.com/dotnet/runtime) and run the benchmarks against the old and new builds. This can narrow down your investigation to the commit that introduced the bug.
@@ -313,7 +315,7 @@ Because the benchmarks are not in the [dotnet/runtime](https://github.com/dotnet
The first thing you need to do is send a PR with the new API to the [dotnet/runtime](https://github.com/dotnet/runtime) repository. Once your PR gets merged and a new NuGet package is published to the [dotnet/runtime](https://github.com/dotnet/runtime) NuGet feed, you should remove the Reference to a `.dll` and install/update the package consumed by [MicroBenchmarks](../src/benchmarks/micro/MicroBenchmarks.csproj). You can do this by running the following script locally:
```cmd
/home/adsitnik/projects/performance>python3 ./scripts/benchmarks_ci.py --filter $YourFilter -f net7.0
/home/adsitnik/projects/performance>python3 ./scripts/benchmarks_ci.py --filter $YourFilter -f net8.0
```
This script will try to pull the latest .NET Core SDK from the [dotnet/runtime](https://github.com/dotnet/runtime) nightly build, which should contain the new API that you just merged in your first PR, and use that to build the MicroBenchmarks project and then run the benchmarks that satisfy the filter you provided.


@@ -84,7 +84,7 @@ Same instruction of [Step 4 in Scenario Tests Guide](scenarios-workflow.md#step-
For the purpose of quick reference, the commands can be summarized into the following matrix:
| Scenario | Asset Directory | Precommand | Testcommand | Postcommand | Supported Framework | Supported Platform |
|-------------------------------------|-----------------|-------------------------------------------------------------|-------------------------------------------------------------------|-------------|---------------------|--------------------|
| SOD - New Blazor Template - Publish | blazor | pre.py publish --msbuild "/p:_TrimmerDumpDependencies=true" | test.py sod --scenario-name "SOD - New Blazor Template - Publish" | post.py | net7.0 | Windows;Linux |
| SOD - New Blazor Template - Publish | blazor | pre.py publish --msbuild "/p:_TrimmerDumpDependencies=true" | test.py sod --scenario-name "SOD - New Blazor Template - Publish" | post.py | net7.0;net8.0 | Windows;Linux |
## Relevant Links


@@ -65,16 +65,16 @@ The build produces two things that we care about:
* `dotnet` and all `System.XYZ.dlls` used internally to run Libraries unit tests. It can be used by Visual Studio Profiler to run the code that you want to profile. Example:
```log
C:\Projects\runtime\artifacts\bin\testhost\net6.0-windows-Release-x64\dotnet.exe
C:\Projects\runtime\artifacts\bin\testhost\net8.0-windows-Release-x64\dotnet.exe
```
* `CoreRun` and all `System.XYZ.dlls` that can be used to run the code that you want to profile. Example:
```log
C:\Projects\runtime\artifacts\bin\testhost\net7.0-windows-Release-x64\shared\Microsoft.NETCore.App\7.0.0\CoreRun.exe
C:\Projects\runtime\artifacts\bin\testhost\net8.0-windows-Release-x64\shared\Microsoft.NETCore.App\8.0.0\CoreRun.exe
```
* But the dotnet/runtime build only produces the artifacts necessary for a _runtime_, not for an _sdk_. Visual Studio will require a full SDK to be able to compile your console app from the next step. One way to convert your generated _runtime_ into a full _sdk_ is to navigate to the `runtime\.dotnet\` folder, copy the `packs` and `sdk` folders located inside, and then paste them inside `runtime\artifacts\bin\testhost\net6.0-windows-Release-x64\`.
* But the dotnet/runtime build only produces the artifacts necessary for a _runtime_, not for an _sdk_. Visual Studio will require a full SDK to be able to compile your console app from the next step. One way to convert your generated _runtime_ into a full _sdk_ is to navigate to the `runtime\.dotnet\` folder, copy the `packs` and `sdk` folders located inside, and then paste them inside `runtime\artifacts\bin\testhost\net8.0-windows-Release-x64\`.
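A sketch of that copy step on Windows (folder names come from the paragraph above; adjust the testhost TFM folder to match your build):

```cmd
xcopy /E /I runtime\.dotnet\packs runtime\artifacts\bin\testhost\net8.0-windows-Release-x64\packs
xcopy /E /I runtime\.dotnet\sdk runtime\artifacts\bin\testhost\net8.0-windows-Release-x64\sdk
```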
Once you rebuild the part of [dotnet/runtime](https://github.com/dotnet/runtime) you are working on, the appropriate `.dll` gets updated, and the next time you run the profiler, dotnet|CoreRun uses the updated library.
@@ -134,7 +134,7 @@ It's recommended to disable Tiered JIT (to avoid the need of warmup) and emit fu
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net6.0</TargetFramework>
<TargetFramework>net8.0</TargetFramework>
<DebugType>pdbonly</DebugType>
<DebugSymbols>true</DebugSymbols>
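If you'd rather not edit the project file, tiered JIT can also be switched off per shell session via the runtime's `DOTNET_TieredCompilation` knob (a sketch):

```cmd
:: 0 disables tiered compilation for processes started from this shell
set DOTNET_TieredCompilation=0
```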
@@ -178,7 +178,7 @@ start %sln%
You can just save it as a `startvs.cmd` file and run it, providing the path to the `testhost` folder produced by the [dotnet/runtime](https://github.com/dotnet/runtime) build and a VS solution with the repro project:
```cmd
startvs.cmd "C:\Projects\runtime\artifacts\bin\testhost\net6.0-windows-Release-x64\" "C:\Projects\repro\ProfilingDocs.sln"
startvs.cmd "C:\Projects\runtime\artifacts\bin\testhost\net8.0-windows-Release-x64\" "C:\Projects\repro\ProfilingDocs.sln"
```
### CPU Usage


@@ -77,10 +77,10 @@ In our **startup time of an empty console template** example, we can run
```cmd
cd emptyconsoletemplate
python3 pre.py publish -f net6.0 -c Release
python3 pre.py publish -f net8.0 -c Release
```
The above command creates a new dotnet console template in the `emptyconsoletemplate\app\` folder, builds the project targeting net6.0 in Release, and publishes it to the `emptyconsoletemplate\pub\` folder.
The above command creates a new dotnet console template in the `emptyconsoletemplate\app\` folder, builds the project targeting net8.0 in Release, and publishes it to the `emptyconsoletemplate\pub\` folder.
Run `python3 pre.py --help` for more command options and their meanings.
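Putting the whole scenario together, the full measurement flow looks like this (a sketch; `test.py startup` and `post.py` follow the generic scenario workflow referenced above):

```cmd
cd emptyconsoletemplate
python3 pre.py publish -f net8.0 -c Release
python3 test.py startup
python3 post.py
```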


@@ -45,19 +45,21 @@ Run precommand to create a new console template.
```cmd
cd emptyconsoletemplate
python3 pre.py default -f net6.0
python3 pre.py default -f net8.0
```
The `default` command prepares the asset (creating a new project if the asset is a template and copying it to `app\`). The source code of a console template project should now be under `app\`.
Note that the SDK takes different paths for different TFMs, and you can configure which TFM your SDK tests against. However, your SDK version should be >= the TFM version, because an SDK cannot build a project that targets a newer runtime. Here's a matrix of valid SDK vs. TFM combinations:
| | netcoreapp2.1 | netcoreapp3.1 | net5.0 | net6.0 |
|--------------|---------------|---------------|--------|--------|
| .NET 2.1 SDK | x | | | |
| .NET 3.1 SDK | x | x | | |
| .NET 5 SDK | x | x | x | |
| .NET 6 SDK | x | x | x | x |
| | netcoreapp2.1 | netcoreapp3.1 | net5.0 | net6.0 | net7.0 | net8.0 |
|--------------|---------------|---------------|--------|--------|--------|--------|
| .NET 2.1 SDK | x | | | | | |
| .NET 3.1 SDK | x | x | | | | |
| .NET 5 SDK | x | x | x | | | |
| .NET 6 SDK | x | x | x | x | | |
| .NET 7 SDK | x | x | x | x | x | |
| .NET 8 SDK | x | x | x | x | x | x |
You can change the TFM of the project by specifying `-f <tfm>`, which replaces the `<TargetFramework></TargetFramework>` property in the project file with the custom TFM value you specified (make sure it's a valid TFM).
@@ -101,18 +103,20 @@ Same instruction of [Step 4 in Scenario Tests Guide](scenarios-workflow.md#step-
- netcoreapp3.1
- net5.0
- net6.0
- net7.0
- net8.0
- \<build option> values:
- clean_build
- build_no_change
| Scenario | Asset Directory | Precommand | Testcommand | Postcommand | Supported Framework | Supported Platform |
|:------------------------------|:---------------------|:-------------------------|:----------------------------|:------------|:------------------------------------------|:-------------------|
| SDK Console Template | emptyconsoletemplate | pre.py default -f \<tfm> | test.py sdk \<build option> | post.py | netcoreapp2.1;netcoreapp3.1;net5.0;net6.0 | Windows;Linux |
| SDK .NET 2.0 Library Template | netstandard2.0 | pre.py default -f \<tfm> | test.py sdk \<build option> | post.py | netcoreapp2.1;netcoreapp3.1;net5.0;net6.0 | Windows;Linux |
| SDK ASP.NET MVC App Template | mvcapptemplate | pre.py default -f \<tfm> | test.py sdk \<build option> | post.py | netcoreapp3.1;net5.0;net6.0 | Windows;Linux |
| SDK Console Template | emptyconsoletemplate | pre.py default -f \<tfm> | test.py sdk \<build option> | post.py | netcoreapp2.1;netcoreapp3.1;net5.0;net6.0;net7.0;net8.0 | Windows;Linux |
| SDK .NET 2.0 Library Template | netstandard2.0 | pre.py default -f \<tfm> | test.py sdk \<build option> | post.py | netcoreapp2.1;netcoreapp3.1;net5.0;net6.0;net7.0;net8.0 | Windows;Linux |
| SDK ASP.NET MVC App Template | mvcapptemplate | pre.py default -f \<tfm> | test.py sdk \<build option> | post.py | netcoreapp3.1;net5.0;net6.0;net7.0;net8.0 | Windows;Linux |
| SDK Web Large 3.0 | weblarge3.0 | pre.py default -f \<tfm> | test.py sdk \<build option> | post.py | netcoreapp3.1 | Windows;Linux |
| SDK Windows Forms Large | windowsformslarge | pre.py default -f \<tfm> | test.py sdk \<build option> | post.py | netcoreapp3.1 | Windows |
| SDK WPF Large | wpflarge | pre.py default -f \<tfm> | test.py sdk \<build option> | post.py | netcoreapp3.1 | Windows |
| SDK Windows Forms Template | windowsforms | pre.py default -f \<tfm> | test.py sdk \<build option> | post.py | netcoreapp3.1 | Windows |
| SDK WPF Template | wpf | pre.py default -f \<tfm> | test.py sdk \<build option> | post.py | netcoreapp3.1 | Windows |
| SDK New Console | emptyconsoletemplate | N/A | test.py sdk new_console | post.py | netcoreapp2.1;netcoreapp3.1;net5.0;net6.0 | Windows;Linux |
| SDK New Console | emptyconsoletemplate | N/A | test.py sdk new_console | post.py | netcoreapp2.1;netcoreapp3.1;net5.0;net6.0;net7.0;net8.0 | Windows;Linux |
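For example, the SDK Console Template row with assumed values `<tfm>` = `net8.0` and `<build option>` = `clean_build` becomes:

```cmd
cd emptyconsoletemplate
python3 pre.py default -f net8.0
python3 test.py sdk clean_build
python3 post.py
```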


@@ -9,7 +9,7 @@ We're going to see how changing gen0size affects performance. We'll start by cre
```yaml
vary: config
test_executables:
defgcperfsim: /performance/artifacts/bin/GCPerfSim/release/net7.0/GCPerfSim.dll
defgcperfsim: /performance/artifacts/bin/GCPerfSim/release/net8.0/GCPerfSim.dll
coreclrs:
a:
core_root: ./coreclr


@@ -42,8 +42,8 @@ Adding the new _GCPerfSim_ build, the `yaml` file would look like this:
```yml
vary: executable
test_executables:
orig_gcperfsim: C:\repos\gcperfsim-backup\GCPerfSim\release\net7.0\GCPerfSim.dll
mod_gcperfsim: C:\repos\performance\artifacts\bin\GCPerfSim\release\net7.0\GCPerfSim.dll
orig_gcperfsim: C:\repos\gcperfsim-backup\GCPerfSim\release\net8.0\GCPerfSim.dll
mod_gcperfsim: C:\repos\performance\artifacts\bin\GCPerfSim\release\net8.0\GCPerfSim.dll
coreclrs:
a:
core_root: C:\repos\core_root


@@ -12,38 +12,38 @@ To learn more about designing benchmarks, please read [Microbenchmark Design Gui
## Quick Start
The first thing that you need to choose is the Target Framework. Available options are: `netcoreapp3.1|net6.0|net7.0|net462`. You can specify the target framework using the `-f|--framework` argument. For the sake of simplicity, all examples below use `net7.0` as the target framework.
The first thing that you need to choose is the Target Framework. Available options are: `netcoreapp3.1|net6.0|net7.0|net8.0|net462`. You can specify the target framework using the `-f|--framework` argument. For the sake of simplicity, all examples below use `net8.0` as the target framework.
The following commands are run from the `src/benchmarks/micro` directory.
To run the benchmarks in Interactive Mode, where you will be asked which benchmark(s) to run:
```cmd
dotnet run -c Release -f net7.0
dotnet run -c Release -f net8.0
```
To list all available benchmarks ([read more](../../../docs/benchmarkdotnet.md#Listing-the-Benchmarks)):
```cmd
dotnet run -c Release -f net7.0 --list flat|tree
dotnet run -c Release -f net8.0 --list flat|tree
```
To filter the benchmarks using a glob pattern applied to namespace.typeName.methodName ([read more](../../../docs/benchmarkdotnet.md#Filtering-the-Benchmarks)):
```cmd
dotnet run -c Release -f net7.0 --filter *Span*
dotnet run -c Release -f net8.0 --filter *Span*
```
To profile the benchmarked code and produce an ETW Trace file ([read more](../../../docs/benchmarkdotnet.md#Profiling)):
```cmd
dotnet run -c Release -f net7.0 --filter $YourFilter --profiler ETW
dotnet run -c Release -f net8.0 --filter $YourFilter --profiler ETW
```
To run the benchmarks for multiple runtimes ([read more](../../../docs/benchmarkdotnet.md#Multiple-Runtimes)):
```cmd
dotnet run -c Release -f net6.0 --filter * --runtimes net6.0 net7.0
dotnet run -c Release -f net7.0 --filter * --runtimes net7.0 net8.0
```
## Private Runtime Builds
@@ -51,19 +51,19 @@ dotnet run -c Release -f net6.0 --filter * --runtimes net6.0 net7.0
If you contribute to [dotnet/runtime](https://github.com/dotnet/runtime) and want to benchmark **local builds of .NET Core** you need to build [dotnet/runtime](https://github.com/dotnet/runtime) in Release (including tests - so a command similar to `build clr+libs+libs.tests -rc release -lc release`) and then provide the path(s) to CoreRun(s). Provided CoreRun(s) will be used to execute every benchmark in a dedicated process:
```cmd
dotnet run -c Release -f net7.0 --filter $YourFilter \
--corerun C:\git\runtime\artifacts\bin\testhost\net7.0-windows-Release-x64\shared\Microsoft.NETCore.App\7.0.0\CoreRun.exe
dotnet run -c Release -f net8.0 --filter $YourFilter \
--corerun C:\git\runtime\artifacts\bin\testhost\net8.0-windows-Release-x64\shared\Microsoft.NETCore.App\8.0.0\CoreRun.exe
```
To make sure that your changes don't introduce any regressions, you can provide paths to CoreRuns with and without your changes and use the Statistical Test feature to detect regressions/improvements ([read more](../../../docs/benchmarkdotnet.md#Regressions)):
```cmd
dotnet run -c Release -f net7.0 \
dotnet run -c Release -f net8.0 \
--filter BenchmarksGame* \
--statisticalTest 3ms \
--coreRun \
"C:\git\runtime_upstream\artifacts\bin\testhost\net7.0-windows-Release-x64\shared\Microsoft.NETCore.App\7.0.0\CoreRun.exe" \
"C:\git\runtime_fork\artifacts\bin\testhost\net7.0-windows-Release-x64\shared\Microsoft.NETCore.App\7.0.0\CoreRun.exe"
"C:\git\runtime_upstream\artifacts\bin\testhost\net8.0-windows-Release-x64\shared\Microsoft.NETCore.App\8.0.0\CoreRun.exe" \
"C:\git\runtime_fork\artifacts\bin\testhost\net8.0-windows-Release-x64\shared\Microsoft.NETCore.App\8.0.0\CoreRun.exe"
```
If you **prefer to use dotnet cli** instead of CoreRun, you need to pass the path to the cli via the `--cli` argument.
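A sketch (the cli path is a placeholder):

```cmd
dotnet run -c Release -f net8.0 --filter $YourFilter --cli "C:\Projects\performance\.dotnet\dotnet.exe"
```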