The pre- and post-processing library for ONNX Runtime
Updated 2024-11-07 13:57:10 +03:00
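A minimal sketch of how this extensions library is commonly hooked into an ONNX Runtime session in Python. The model file name is a hypothetical placeholder; it assumes the `onnxruntime` and `onnxruntime-extensions` packages are installed.

```python
# Minimal sketch: register the onnxruntime-extensions custom-op library with a session.
# Assumes `pip install onnxruntime onnxruntime-extensions` and a model that uses
# extension operators (the file name below is hypothetical).
import onnxruntime as ort
from onnxruntime_extensions import get_library_path

opts = ort.SessionOptions()
# Make the pre/post-processing custom operators available to the runtime.
opts.register_custom_ops_library(get_library_path())

session = ort.InferenceSession(
    "model_with_custom_ops.onnx",  # hypothetical model path
    sess_options=opts,
    providers=["CPUExecutionProvider"],
)
```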
ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator
Updated 2024-11-06 23:00:04 +03:00
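For reference, the core inference pattern is a session plus a run call. A minimal Python sketch follows; the model path and input shape are assumptions for illustration, not taken from this repository.

```python
# Minimal inference sketch with ONNX Runtime's Python API.
# Assumes `pip install onnxruntime numpy` and an existing "model.onnx"
# whose single input accepts a float32 tensor of shape (1, 3, 224, 224).
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)

# run(None, ...) returns all model outputs as a list of numpy arrays.
outputs = session.run(None, {input_name: dummy_input})
print(outputs[0].shape)
```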
The pre- and post-processing library for ONNX Runtime
Updated 2024-11-06 20:55:16 +03:00
Examples for using ONNX Runtime for machine learning inferencing.
Updated 2024-11-05 01:51:35 +03:00
Open Enclave port of the ONNX runtime for confidential inferencing on Azure Confidential Computing
Updated 2022-12-12 19:28:21 +03:00
An Open Enclave port of the ONNX inference server with data encryption and attestation capabilities to enable confidential inference on Azure Confidential Computing.
Updated 2022-08-29 19:33:49 +03:00