ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
Updated 2024-12-04 05:21:29 +03:00
The pre- and post-processing library for ONNX Runtime
Updated 2024-12-03 12:11:42 +03:00
Examples for using ONNX Runtime for machine learning inferencing.
Updated 2024-11-05 01:51:35 +03:00
Open Enclave port of the ONNX runtime for confidential inferencing on Azure Confidential Computing
Updated 2022-12-12 19:28:21 +03:00
An Open Enclave port of the ONNX inference server, with data encryption and attestation capabilities, enabling confidential inference on Azure Confidential Computing.
Updated 2022-08-29 19:33:49 +03:00