DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Updated 2024-10-17 20:50:55 +03:00
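A minimal sketch of wrapping a PyTorch model with DeepSpeed. The `ds_config.json` path, tensor sizes, and the toy model are placeholders, not part of this listing; a real run is typically started with the `deepspeed` launcher.

```python
# Sketch: hand a PyTorch model to DeepSpeed and train one step with the engine.
# "ds_config.json" is a hypothetical config holding batch size, optimizer, and
# ZeRO settings; in practice this script is launched via the `deepspeed` CLI.
import torch
import deepspeed

model = torch.nn.Linear(1024, 10)

model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config="ds_config.json",
)

# The engine replaces the usual loss.backward() / optimizer.step() calls.
x = torch.randn(8, 1024).to(model_engine.device)
loss = model_engine(x).sum()
model_engine.backward(loss)
model_engine.step()
```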
TorchGeo: datasets, samplers, transforms, and pre-trained models for geospatial data
Updated 2024-10-17 12:09:06 +03:00
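A minimal sketch of loading one of TorchGeo's benchmark datasets and iterating over it with a standard PyTorch DataLoader. EuroSAT is assumed here purely for illustration; it downloads on first use.

```python
# Sketch: load the EuroSAT benchmark dataset via TorchGeo and draw one batch.
# TorchGeo samples are dictionaries with "image" and "label" tensors.
from torch.utils.data import DataLoader
from torchgeo.datasets import EuroSAT

dataset = EuroSAT(root="data", download=True)   # downloads on first run
loader = DataLoader(dataset, batch_size=8, shuffle=True)

batch = next(iter(loader))
print(batch["image"].shape, batch["label"].shape)
```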
ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
Updated 2024-10-17 09:34:07 +03:00
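A minimal inference sketch with ONNX Runtime. The model path and input shape are placeholders for your own exported model; the actual input names can be read back from the session.

```python
# Sketch: run inference on an exported ONNX model with ONNX Runtime.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)  # placeholder input
outputs = session.run(None, {input_name: dummy})
print(outputs[0].shape)
```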
Hummingbird compiles trained ML models into tensor computation for faster inference.
Updated 2024-10-16 21:50:20 +03:00
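A minimal sketch of compiling a trained scikit-learn model into PyTorch tensor operations with Hummingbird; the random training data is purely illustrative.

```python
# Sketch: compile a trained scikit-learn model to tensor computation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from hummingbird.ml import convert

X = np.random.rand(200, 16).astype(np.float32)
y = np.random.randint(0, 2, size=200)

skl_model = RandomForestClassifier(n_estimators=10).fit(X, y)
hb_model = convert(skl_model, "pytorch")   # compiled, tensor-based version
print(hb_model.predict(X[:5]))
```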
Tutel MoE: An Optimized Mixture-of-Experts Implementation
Updated 2024-09-13 14:49:34 +03:00
A DNN inference latency prediction toolkit for accurately modeling and predicting the latency on diverse edge devices.
Updated 2024-07-31 00:16:53 +03:00
CoCosNet v2: Full-Resolution Correspondence Learning for Image Translation
Updated 2024-07-25 14:07:42 +03:00
An open-source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyper-parameter tuning.
Updated 2024-07-03 13:54:08 +03:00
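A minimal sketch of an NNI trial script for hyper-parameter tuning: the tuner supplies parameters and the trial reports its metric back. The training function is a stand-in; the search space and experiment configuration live outside this script.

```python
# Sketch: NNI trial script. Parameters come from the tuner; the final metric
# is reported back so the tuner can propose the next configuration.
import nni

def train_and_evaluate(lr, hidden_size):
    # Placeholder for a real training loop; returns a fake "accuracy".
    return 0.9 - abs(lr - 0.01) + hidden_size * 1e-5

params = {"lr": 0.01, "hidden_size": 128}     # defaults for standalone runs
params.update(nni.get_next_parameter() or {}) # overridden when run under NNI
nni.report_final_result(train_and_evaluate(**params))
```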
UniSpeech - Large Scale Self-Supervised Learning for Speech
Updated 2024-04-05 16:14:48 +03:00
Federated Learning Utilities and Tools for Experimentation
Updated 2024-01-11 22:20:09 +03:00
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
Updated 2024-01-09 18:03:18 +03:00
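A minimal sketch of using loralib: swap a dense layer for its LoRA counterpart, freeze everything except the low-rank adapter weights, and save only the adapter after training. The tiny model and dimensions are illustrative.

```python
# Sketch: replace nn.Linear with loralib's LoRA-enabled Linear and train only
# the rank-r update matrices.
import torch
import loralib as lora

class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = lora.Linear(768, 768, r=8)   # r = LoRA rank

    def forward(self, x):
        return self.proj(x)

model = TinyModel()
lora.mark_only_lora_as_trainable(model)   # freeze all non-LoRA parameters
out = model(torch.randn(2, 768))

# After training, persist just the low-rank adapter weights.
torch.save(lora.lora_state_dict(model), "lora_weights.pt")
```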
Accelerate your Neural Architecture Search (NAS) through fast, reproducible and modular research.
Updated 2023-12-22 22:29:11 +03:00
Maximal update parametrization (µP)
Updated 2023-10-21 06:45:36 +03:00
This repository provides code for machine learning algorithms for edge devices developed at Microsoft Research India.
Updated 2023-07-07 00:42:22 +03:00
End-to-End recipes for pre-training and fine-tuning BERT using Azure Machine Learning Service
Updated 2023-06-12 21:59:00 +03:00
Cross-domain Correspondence Learning for Exemplar-based Image Translation. (CVPR 2020 Oral)
Updated 2022-12-07 08:35:12 +03:00
Large-scale pretraining for dialogue
Updated 2022-10-18 02:41:52 +03:00
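A minimal sketch of generating a reply from a pretrained DialoGPT checkpoint through the Hugging Face transformers library; the prompt and generation settings are illustrative.

```python
# Sketch: single-turn generation with a published DialoGPT checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

prompt_ids = tokenizer.encode("Does money buy happiness?" + tokenizer.eos_token,
                              return_tensors="pt")
reply_ids = model.generate(prompt_ids, max_length=100,
                           pad_token_id=tokenizer.eos_token_id)

# Decode only the newly generated tokens after the prompt.
print(tokenizer.decode(reply_ids[0, prompt_ids.shape[-1]:],
                       skip_special_tokens=True))
```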
MMdnn is a set of tools to help users interoperate among different deep learning frameworks, e.g. model conversion and visualization. Convert models between Caffe, Keras, MXNet, TensorFlow, CNTK, PyTorch, ONNX, and CoreML.
Updated 2022-09-23 02:59:07 +03:00
A PyTorch Graph Neural Network Library
Updated 2022-02-01 20:31:29 +03:00
Machine learning at scale on Azure Machine Learning
Updated 2021-11-18 15:48:14 +03:00
Japanese NLP sample code
Updated 2021-10-19 17:45:33 +03:00
Common PyTorch Modules
Updated 2021-10-18 13:52:28 +03:00
A list of open-source projects from the Microsoft Research NLP Group
Updated 2020-09-30 01:11:02 +03:00