πŸ“¦ A benchmark library for .NET nanoFramework inspired by BenchmarkDotNet

nanoFramework.Benchmark




Welcome to the .NET nanoFramework Benchmark repository

Build status

| Component | Build Status | NuGet Package |
| --- | --- | --- |
| nanoFramework.Benchmark | Build Status | NuGet |

What is the .NET nanoFramework Benchmark

The nanoFramework.Benchmark tool helps you measure and track the performance of nanoFramework code. You can easily turn a normal method into a benchmark by adding a single attribute!

Heavily inspired by BenchmarkDotNet.

The example below will:

  1. Run the Setup method once, before any benchmark methods.
  2. Run each benchmark method 10 times.
  3. Prepare the collected data to be passed to each parser.
  4. Invoke ConsoleParser (which prints the data to the console as a table).
public class CompareObjectTypesBenchmark
{
    object[] array;

    [Setup]
    public void Setup()
    {
        array = new object[] {
            (int)42,
            (byte)42,
            "This is a super string",
            (ulong)42,
            new Version(4, 2)
        };
    }

    [Benchmark]
    public void CompareByString()
    {
        for (int i = 0; i < array.Length; i++)
        {
            object obja = array.GetValue(i);
            var typea = obja.GetType();
            CompareUsingString(typea);
        }
    }

    [Benchmark]
    public void CompareUsingTypeofIf()
    {
        for (int i = 0; i < array.Length; i++)
        {
            object obja = array.GetValue(i);
            var typea = obja.GetType();
            CompareUsingTypeofIf(typea);
        }
    }

    [Benchmark]
    public void CompareUsingTypeofIfReturn()
    {
        for (int i = 0; i < array.Length; i++)
        {
            object obja = array.GetValue(i);
            var typea = obja.GetType();
            CompareUsingTypeofIfReturn(typea);
        }
    }

    // The comparison helpers being measured. These are illustrative
    // implementations; the original sample omits their bodies.
    private static bool CompareUsingString(Type type)
    {
        return type.FullName == "System.String";
    }

    private static bool CompareUsingTypeofIf(Type type)
    {
        bool isString = false;
        if (type == typeof(string))
        {
            isString = true;
        }
        return isString;
    }

    private static bool CompareUsingTypeofIfReturn(Type type)
    {
        if (type == typeof(string))
        {
            return true;
        }
        return false;
    }
}

Output:

Console export: CompareObjectTypesBenchmark benchmark class.
| MethodName                 | ItterationCount | Mean  | Min   | Max   |
| -------------------------------------------------------------------- |
| CompareByString            | 10              | 10 ms | 10 ms | 10 ms |
| CompareUsingTypeofIf       | 10              | 3 ms  | 0 ms  | 10 ms |
| CompareUsingTypeofIfReturn | 10              | 5 ms  | 0 ms  | 10 ms |

How to run .NET nanoFramework Benchmark

Benchmark methods must be run on real hardware. (Note: you may receive an OutOfMemory exception if you run many iterations and your hardware doesn't have enough memory.)

  1. Create a benchmark project using the "Blank Application (nanoFramework)" template


  2. Update the Program.cs file to the following:
using System.Threading;
using nanoFramework.Benchmark;

public class Program
{
    public static void Main()
    {
        // Run every benchmark class found in this assembly.
        BenchmarkRunner.Run(typeof(IAssemblyHandler).Assembly);
        Thread.Sleep(Timeout.Infinite);
    }
}

// Empty marker interface, used only to obtain a reference to this assembly.
public interface IAssemblyHandler { }
  3. Add the nanoFramework.Benchmark NuGet package to the project


  4. Lastly, attach a nanoFramework device and run the benchmark project

Attributes

Class

IterationCount

Specifies how many times each benchmark method is invoked during a run. The default is 10.
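For example, the attribute can be applied to a benchmark class like this (a sketch; the class and method names are illustrative):

```csharp
// Run every benchmark method in this class 100 times instead of the default 10.
[IterationCount(100)]
public class StringConcatBenchmark
{
    [Benchmark]
    public void Concat()
    {
        string s = string.Empty;
        for (int i = 0; i < 10; i++)
        {
            s += "x";
        }
    }
}
```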

Logger

Sometimes you may try something the benchmark library does not support. In such situations, instead of debugging the code, you can inject a logger object.

DebugLogger

A wrapper around nanoFramework.Logging.Debug that prints logs to the console.

Methods

Setup

SetupAttribute is used to specify a method that should be invoked only once, before any benchmark methods run. It can be used to initialize collections, objects, etc. The execution time of the setup method is not taken into account when calculating benchmark results.

Note that the setup method must be public and parameterless.

Benchmark

BenchmarkAttribute is used to mark a method that should be invoked as a benchmark.

Note that a benchmark method must be public and parameterless.

Baseline

BaselineAttribute is used to specify the method that should be considered the baseline for calculations. It adds a new column, "Ratio", to the output.
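A minimal sketch, reusing the methods from the example above (bodies omitted) and assuming [Baseline] is applied alongside [Benchmark] on the reference method:

```csharp
[Benchmark]
[Baseline]
public void CompareByString()
{
    // ... as in the example above
}

[Benchmark]
public void CompareUsingTypeofIf()
{
    // ... as in the example above
}
```

The other methods' ratios are then reported relative to the baseline method.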

Console export: CompareObjectTypesBenchmark benchmark class.
| MethodName                 | ItterationCount | Mean   | Ratio  | Min   | Max   |
| ------------------------------------------------------------------------------ |
| CompareByString            | 100             | 10 ms  | 1.0    | 10 ms | 10 ms |
| CompareUsingTypeofIf       | 100             | 5.9 ms | 0.5900 | 0 ms  | 10 ms |
| CompareUsingTypeofIfReturn | 100             | 5.5 ms | 0.5500 | 0 ms  | 10 ms |

Parsers

You can specify parsers as attributes on the benchmark class. Every parser is invoked after the benchmark run, so you can get results in multiple formats.

By default only ConsoleParser is applied.

New parsers can easily be implemented by creating a new class that implements the IResultParser interface. A new attribute that applies the parser also needs to be created.

ConsoleParser

You can use ConsoleParserAttribute to add a parser which prints the data to the console in table format.

Output example

Console export: CompareObjectTypesBenchmark benchmark class.
| MethodName                 | ItterationCount | Mean   | Min  | Max   |
| -------------------------------------------------------------------- |
| CompareByString            | 100             | 8.9 ms | 0 ms | 10 ms |
| CompareUsingTypeofIf       | 100             | 4.1 ms | 0 ms | 10 ms |
| CompareUsingTypeofIfReturn | 100             | 4.2 ms | 0 ms | 10 ms |

CsvParser

You can use CsvParserAttribute to add a parser which prints the data to the console in CSV format.

Output example

CSV export: CompareObjectTypesBenchmark benchmark class.
MethodName;ItterationCount;Mean;Min;Max
CompareByString;100;8.9 ms;0 ms;10 ms
CompareUsingTypeofIf;100;4.1 ms;0 ms;10 ms
CompareUsingTypeofIfReturn;100;4.2 ms;0 ms;10 ms
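Parser attributes can be combined on one class, so a single run emits results in several formats. A sketch, assuming the ConsoleParserAttribute and CsvParserAttribute names described above:

```csharp
[ConsoleParser]
[CsvParser]
public class CompareObjectTypesBenchmark
{
    // benchmark methods as shown earlier
}
```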

Feedback and documentation

For documentation, to provide feedback, report issues, and find out how to contribute, please refer to the Home repo.

Join our Discord community here.

Credits

The list of contributors to this project can be found at CONTRIBUTORS.

License

The nanoFramework Class Libraries are licensed under the MIT license.

Code of Conduct

This project has adopted the code of conduct defined by the Contributor Covenant to clarify expected behaviour in our community. For more information see the .NET Foundation Code of Conduct.

.NET Foundation

This project is supported by the .NET Foundation.