Examples for using ONNX Runtime for machine learning inferencing.
Updated 2024-11-05 01:51:35 +03:00
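As a minimal sketch of what such examples look like, the snippet below runs an exported ONNX model through the ONNX Runtime Python API. The model path ("model.onnx"), input name, and input shape are placeholder assumptions for illustration, not values taken from the examples repository.

```python
import numpy as np
import onnxruntime as ort

# Create an inference session for an exported ONNX model (path is a placeholder).
session = ort.InferenceSession("model.onnx")

# Inspect the model's declared input so we can feed a matching tensor.
input_meta = session.get_inputs()[0]
print(input_meta.name, input_meta.shape, input_meta.type)

# Run inference with a dummy float32 batch (shape assumed for illustration).
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_meta.name: dummy})
print(outputs[0].shape)
```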
An Open Enclave port of the ONNX inference server with data encryption and attestation capabilities to enable confidential inference on Azure Confidential Computing.
Updated 2022-08-29 19:33:49 +03:00
Python Inference Script (PyIS)
Updated 2022-03-01 07:30:18 +03:00
A library for collecting features and performing machine learning inference on those features. It is especially useful when the software components that publish features are strongly decoupled from the components that consume them in machine learning models (see the sketch below).
Updated 2020-11-17 23:53:12 +03:00
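The following is an illustrative sketch of that decoupling idea, not the library's actual API: publishers write named features to a shared store without knowing which models will consume them, and model code later assembles a feature vector from the store. All names below are hypothetical.

```python
from typing import Dict, List

class FeatureStore:
    """Shared store keyed by entity id; publishers and consumers agree only on feature names."""
    def __init__(self) -> None:
        self._features: Dict[str, Dict[str, float]] = {}

    def publish(self, entity_id: str, name: str, value: float) -> None:
        self._features.setdefault(entity_id, {})[name] = value

    def collect(self, entity_id: str, names: List[str]) -> List[float]:
        # Missing features default to 0.0 so consumers tolerate absent publishers.
        return [self._features.get(entity_id, {}).get(n, 0.0) for n in names]

# Publisher component: emits features as a side effect of its own work.
store = FeatureStore()
store.publish("user-42", "clicks_last_7d", 13.0)
store.publish("user-42", "account_age_days", 280.0)

# Consumer component: builds a feature vector and applies a toy linear model.
weights = {"clicks_last_7d": 0.3, "account_age_days": 0.01}
vector = store.collect("user-42", list(weights.keys()))
score = sum(w * x for w, x in zip(weights.values(), vector))
print(score)
```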