DoWhy is a Python library for causal inference that supports explicit modeling and testing of causal assumptions. DoWhy is based on a unified language for causal inference, combining causal graphical models and potential outcomes frameworks.
Updated 2024-11-21 01:27:34 +03:00
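A minimal sketch of DoWhy's model–identify–estimate–refute workflow; the toy dataset, column names, and the backdoor linear-regression estimator are illustrative assumptions, not part of the description above.

```python
import numpy as np
import pandas as pd
from dowhy import CausalModel

# Toy data: binary treatment "t", outcome "y", one observed confounder "w".
rng = np.random.default_rng(0)
w = rng.normal(size=1000)
t = (w + rng.normal(size=1000) > 0).astype(int)
y = 2.0 * t + w + rng.normal(size=1000)
df = pd.DataFrame({"t": t, "y": y, "w": w})

# 1. Model the causal assumptions explicitly.
model = CausalModel(data=df, treatment="t", outcome="y", common_causes=["w"])

# 2. Identify the target estimand from the implied graph.
estimand = model.identify_effect()

# 3. Estimate the effect with a backdoor estimator.
estimate = model.estimate_effect(estimand, method_name="backdoor.linear_regression")

# 4. Test the estimate's robustness with a placebo-treatment refuter.
refutation = model.refute_estimate(estimand, estimate,
                                   method_name="placebo_treatment_refuter")
print(estimate.value, refutation)
```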
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Updated 2024-11-20 04:04:47 +03:00
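A rough sketch of wrapping a PyTorch model with a DeepSpeed engine, assuming a recent DeepSpeed release and a script launched with the `deepspeed` launcher; the tiny model and inline config are assumptions for illustration only.

```python
import torch
import deepspeed

# Any nn.Module works here; a toy linear model keeps the sketch short.
model = torch.nn.Linear(10, 1)

# Minimal config; real configs typically add ZeRO stages, fp16/bf16, etc.
ds_config = {
    "train_batch_size": 8,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-3}},
}

# The engine wraps the model and handles the distributed details.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model, model_parameters=model.parameters(), config=ds_config
)

x = torch.randn(8, 10).to(engine.device)
y = torch.randn(8, 1).to(engine.device)

loss = torch.nn.functional.mse_loss(engine(x), y)
engine.backward(loss)   # DeepSpeed-managed backward pass
engine.step()           # optimizer step plus gradient handling
```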
ALICE (Automated Learning and Intelligence for Causation and Economics) is a Microsoft Research project aimed at applying Artificial Intelligence concepts to economic decision making. One of its goals is to build a toolkit that combines state-of-the-art machine learning techniques with econometrics in order to bring automation to complex causal inference problems. To date, the ALICE Python SDK (econml) implements orthogonal machine learning algorithms such as the double machine learning work of Chernozhukov et al. This toolkit is designed to measure the causal effect of some treatment variable(s) t on an outcome variable y, controlling for a set of features x.
Updated 2024-11-18 23:05:01 +03:00
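A sketch of estimating a heterogeneous treatment effect with econml's double machine learning estimator, in the spirit of the Chernozhukov et al. work mentioned above; the LinearDML estimator, the random-forest nuisance models, and the synthetic data are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from econml.dml import LinearDML

# Synthetic data: outcome y, treatment t, effect modifiers X, controls W.
rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 2))
W = rng.normal(size=(n, 3))
t = W[:, 0] + rng.normal(size=n)
y = (1.0 + 0.5 * X[:, 0]) * t + W.sum(axis=1) + rng.normal(size=n)

# Double ML: fit nuisance models for y and t, then a final-stage effect model.
est = LinearDML(model_y=RandomForestRegressor(), model_t=RandomForestRegressor())
est.fit(y, t, X=X, W=W)

# Estimated treatment effect per row of X, with confidence intervals.
effects = est.effect(X)
lb, ub = est.effect_interval(X, alpha=0.05)
print(effects[:5])
```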
Hummingbird compiles trained ML models into tensor computation for faster inference.
Updated 2024-11-16 00:52:33 +03:00
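A small sketch of compiling a trained scikit-learn model into tensor computation with Hummingbird, using the PyTorch backend; the random-forest model and random data are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from hummingbird.ml import convert

# Train an ordinary scikit-learn model.
X = np.random.rand(1000, 20).astype(np.float32)
y = np.random.randint(2, size=1000)
skl_model = RandomForestClassifier(n_estimators=50).fit(X, y)

# Compile it into tensor operations (PyTorch backend here).
hb_model = convert(skl_model, "pytorch")

# Same predict API as scikit-learn, but inference now runs as tensor ops;
# hb_model.to("cuda") would move it to GPU.
preds = hb_model.predict(X)
probs = hb_model.predict_proba(X)
```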
Examples for using ONNX Runtime for machine learning inferencing.
Updated 2024-11-05 01:51:35 +03:00
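The basic ONNX Runtime Python inferencing pattern used throughout such examples looks roughly like this; the model path, input name, and input shape are placeholders.

```python
import numpy as np
import onnxruntime as ort

# Load a model; "model.onnx" is a placeholder path.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Inspect the model's expected input name and shape.
input_meta = session.get_inputs()[0]
print(input_meta.name, input_meta.shape)

# Run inference; passing None for outputs returns all model outputs.
x = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_meta.name: x})
print(outputs[0].shape)
```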
Infer.NET is a framework for running Bayesian inference in graphical models
Updated 2024-09-05 02:51:09 +03:00
A DNN inference latency prediction toolkit for accurately modeling and predicting the latency on diverse edge devices.
Updated 2024-07-31 00:16:53 +03:00
Variational inference for hierarchical dynamical systems
Updated 2024-07-25 14:00:08 +03:00
PrivGAN: Protecting GANs from membership inference attacks at low cost
Updated 2024-06-18 01:55:22 +03:00
Schema decoration for inference code
Updated 2024-05-17 20:49:11 +03:00
The InnerEye-Gateway is a Windows service that acts as a DICOM end point to run inference on https://github.com/microsoft/InnerEye-DeepLearning models.
Updated 2024-03-21 12:50:43 +03:00
Enables inference and deployment of InnerEye-DeepLearning (https://github.com/microsoft/InnerEye-deeplearning) models as an async REST API on Azure
Updated 2024-03-21 12:48:29 +03:00
Global type inference for C# 8 nullable reference types
Updated 2024-01-16 23:08:03 +03:00
ONNX Runtime Web benchmark tool
Updated 2023-08-16 03:11:17 +03:00
Sample notebooks for optimized training and inference of Hugging Face models on Azure Databricks
Updated 2023-03-28 19:41:14 +03:00
Self-Supervised Document-to-Document Similarity Ranking via Contextualized Language Models and Hierarchical Inference
Updated 2022-11-28 22:08:28 +03:00
An Open Enclave port of the ONNX inference server with data encryption and attestation capabilities to enable confidential inference on Azure Confidential Computing.
Updated 2022-08-29 19:33:49 +03:00
The AI for Earth API Platform is a distributed infrastructure that provides secure, scalable, and customizable API hosting, designed to handle the needs of long-running/asynchronous machine learning model inference. It is to be used with the AI For Earth API Framework (https://github.com/microsoft/AIforEarth-API-Development).
Updated 2022-08-01 23:32:15 +03:00
Python Inference Script (PyIS)
Updated 2022-03-01 07:30:18 +03:00
Knowledge Distillation as Semiparametric Inference
Updated 2021-05-17 23:15:27 +03:00
A library for collecting features and performing inference of machine learning evaluations based on those features. It is especially useful when the software components that publish features are strongly decoupled from the components that want to use those features in machine learning models.
Updated 2020-11-17 23:53:12 +03:00
Lightweight tool that generates Go structures and TensorFlow inference execution code from a SavedModel
Updated 2020-04-27 20:53:21 +03:00
Sample code for running deterministic variational inference to train Bayesian neural networks
Updated 2018-10-10 12:06:01 +03:00
Demonstration of Jackknife Variational Inference for Variational Autoencoders, related to an ICLR 2018 paper.
Updated 2018-02-21 13:36:23 +03:00