nanoFramework.Benchmark
Welcome to the .NET nanoFramework Benchmark repository
What is the .NET nanoFramework Benchmark
The nanoFramework.Benchmark tool helps you measure and track the performance of .NET nanoFramework code. You can easily turn a normal method into a benchmark by adding a single attribute!
Heavily inspired by BenchmarkDotNet.
The example below will:
- Run the Setup method once, before any benchmark method runs.
- Run each benchmark method 10 times.
- Prepare the data to be passed to each parser.
- Invoke ConsoleParser (prints the data to the console as a table).
```csharp
public class CompareObjectTypesBenchmark
{
    object[] array;

    [Setup]
    public void Setup()
    {
        // Runs once, before any benchmark method; not included in the timings.
        array = new object[] {
            (int)42,
            (byte)42,
            "This is a super string",
            (ulong)42,
            new Version(4, 2)
        };
    }

    [Benchmark]
    public void CompareByString()
    {
        for (int i = 0; i < array.Length; i++)
        {
            object obja = array.GetValue(i);
            var typea = obja.GetType();
            CompareUsingString(typea);
        }
    }

    [Benchmark]
    public void CompareUsingTypeofIf()
    {
        for (int i = 0; i < array.Length; i++)
        {
            object obja = array.GetValue(i);
            var typea = obja.GetType();
            CompareUsingTypeofIf(typea);
        }
    }

    [Benchmark]
    public void CompareUsingTypeofIfReturn()
    {
        for (int i = 0; i < array.Length; i++)
        {
            object obja = array.GetValue(i);
            var typea = obja.GetType();
            CompareUsingTypeofIfReturn(typea);
        }
    }

    // The CompareUsingString(Type), CompareUsingTypeofIf(Type) and
    // CompareUsingTypeofIfReturn(Type) comparison helpers are omitted for brevity.
}
```
Output:
```
Console export: CompareObjectTypesBenchmark benchmark class.
| MethodName                 | ItterationCount | Mean  | Min   | Max   |
| -------------------------------------------------------------------- |
| CompareByString            | 10              | 10 ms | 10 ms | 10 ms |
| CompareUsingTypeofIf       | 10              | 3 ms  | 0 ms  | 10 ms |
| CompareUsingTypeofIfReturn | 10              | 5 ms  | 0 ms  | 10 ms |
```
How to run .NET nanoFramework Benchmark
Benchmark methods must be run on real hardware. (Note: you may receive an OutOfMemory exception if you run many iterations and your hardware doesn't have enough memory.)
- Create a benchmark project using the "Blank Application (nanoFramework)" template
- Update the Program.cs file to the following:
```csharp
using System.Threading;
using nanoFramework.Benchmark;

public class Program
{
    public static void Main()
    {
        // Runs every benchmark class found in this assembly.
        BenchmarkRunner.Run(typeof(IAssemblyHandler).Assembly);
        Thread.Sleep(Timeout.Infinite);
    }
}

// Empty marker interface, used only to obtain a reference to this assembly.
public interface IAssemblyHandler { }
```
- Add the nanoFramework.Benchmark NuGet package to the project
- Lastly, attach a .NET nanoFramework device and run the benchmark project
Attributes
Class
IterationCount
Specifies how many times each benchmark method is invoked. The default is 10.
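As a minimal sketch, the attribute is applied at class level to override the default:

```csharp
using nanoFramework.Benchmark;

// Each [Benchmark] method in this class will be invoked 100 times.
[IterationCount(100)]
public class CompareObjectTypesBenchmark
{
    // ... benchmark methods ...
}
```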
Logger
Sometimes you may try something the benchmark library does not support. In such a situation, instead of debugging the code, you can inject a logger object.
DebugLogger
Wrapper around nanoFramework.Logging.Debug. Prints logs to console.
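As an illustrative sketch, assuming the logger is attached as a class-level attribute like the other attributes in this library:

```csharp
using nanoFramework.Benchmark;

// Hypothetical usage: attach the debug logger so the runner's
// internal messages are printed to the console.
[DebugLogger]
public class CompareObjectTypesBenchmark
{
    // ... benchmark methods ...
}
```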
Methods
Setup
SetupAttribute specifies a method that is invoked exactly once, before any benchmark method runs. It can be used to initialize collections, objects, etc. The execution time of the setup method is not included in the benchmark results.
Note that the setup method must be public and parameterless.
Benchmark
BenchmarkAttribute marks a method to be invoked as a benchmark.
Note that a benchmark method must be public and parameterless.
Baseline
BaselineAttribute marks the method that should be treated as the baseline for calculations. It adds a new "Ratio" column to the output.
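As a sketch, marking one benchmark as the baseline (method bodies elided):

```csharp
using nanoFramework.Benchmark;

public class CompareObjectTypesBenchmark
{
    // All other benchmarks are reported relative to this method,
    // which gets a Ratio of 1.0.
    [Benchmark]
    [Baseline]
    public void CompareByString() { /* ... */ }

    [Benchmark]
    public void CompareUsingTypeofIf() { /* ... */ }
}
```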
```
Console export: CompareObjectTypesBenchmark benchmark class.
| MethodName                 | ItterationCount | Mean   | Ratio  | Min   | Max   |
| ------------------------------------------------------------------------------ |
| CompareByString            | 100             | 10 ms  | 1.0    | 10 ms | 10 ms |
| CompareUsingTypeofIf       | 100             | 5.9 ms | 0.5900 | 0 ms  | 10 ms |
| CompareUsingTypeofIfReturn | 100             | 5.5 ms | 0.5500 | 0 ms  | 10 ms |
```
Parsers
You can specify parsers as attributes on the benchmark class. Every parser is invoked after the benchmark run, so you can get the results in multiple formats.
By default only the ConsoleParser is applied.
New parsers can easily be implemented by creating a new class that implements the IResultParser interface. A new attribute also needs to be created to apply the parser to a benchmark class.
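An illustrative sketch only; the exact members of IResultParser are not shown in this README, so the method name and parameters below are hypothetical — check the library source for the actual interface:

```csharp
using nanoFramework.Benchmark;

// Hypothetical custom parser: a class implementing IResultParser that
// receives the collected benchmark results and renders them in a
// custom format (here, XML).
public class XmlParser : IResultParser
{
    // Hypothetical signature; the real interface may differ.
    public void Parse(/* benchmark results */)
    {
        // Render the results as XML here.
    }
}
```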
ConsoleParser
You can use ConsoleParserAttribute to add a parser that prints the data to the console in table format.
Output example
```
Console export: CompareObjectTypesBenchmark benchmark class.
| MethodName                 | ItterationCount | Mean   | Min  | Max   |
| -------------------------------------------------------------------- |
| CompareByString            | 100             | 8.9 ms | 0 ms | 10 ms |
| CompareUsingTypeofIf       | 100             | 4.1 ms | 0 ms | 10 ms |
| CompareUsingTypeofIfReturn | 100             | 4.2 ms | 0 ms | 10 ms |
```
CsvParser
You can use CsvParserAttribute to add a parser that prints the data to the console in CSV format.
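For example, applying the attribute at class level (it runs in addition to the default ConsoleParser):

```csharp
using nanoFramework.Benchmark;

// Prints the results to the console in CSV format, alongside
// the default table output.
[CsvParser]
public class CompareObjectTypesBenchmark
{
    // ... benchmark methods ...
}
```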
Output example
```
CSV export: CompareObjectTypesBenchmark benchmark class.
MethodName;ItterationCount;Mean;Min;Max
CompareByString;100;8.9 ms;0 ms;10 ms
CompareUsingTypeofIf;100;4.1 ms;0 ms;10 ms
CompareUsingTypeofIfReturn;100;4.2 ms;0 ms;10 ms
```
Feedback and documentation
For documentation, feedback, issues, and details on how to contribute, please refer to the Home repo.
Join our Discord community here.
Credits
The list of contributors to this project can be found at CONTRIBUTORS.
License
The nanoFramework Class Libraries are licensed under the MIT license.
Code of Conduct
This project has adopted the code of conduct defined by the Contributor Covenant to clarify expected behaviour in our community. For more information see the .NET Foundation Code of Conduct.
.NET Foundation
This project is supported by the .NET Foundation.