Machine Intelligence PyTorch Module Zoo

This package contains implementations of standalone, commonly reusable PyTorch nn.Modules. To install it, run

pip install mi-module-zoo

Documentation for the library is available at https://microsoft.github.io/mi-module-zoo/.

This library is maintained by the Machine Intelligence group in Microsoft Research.

Modules

A list of the modules follows; for detailed documentation, please check each module's docstring.

  • mi_module_zoo.mlp.construct_mlp() A function that builds an nn.Sequential for a multilayer perceptron.
  • mi_module_zoo.settransformer.SetTransformer The Set Transformer model.
  • mi_module_zoo.settransformer.ISAB An Induced Set Attention Block from the Set Transformer paper.
  • mi_module_zoo.RelationalMultiheadAttention Relational multi-head attention variants, supporting both sparse and dense relationships, including the Shaw et al. (2018), RAT-SQL, and GREAT variants.
  • mi_module_zoo.relationaltransformerlayers.RelationalTransformerEncoderLayer A relational transformer encoder layer that supports both dense and sparse relations among elements. Supports ReZero and a variety of normalization modes.
  • mi_module_zoo.relationaltransformerlayers.RelationalTransformerDecoderLayer A relational transformer decoder layer that supports both dense and sparse relations among encoded-decoded and decoded-decoded elements. Supports ReZero and a variety of normalization modes.
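
To illustrate the kind of helper the list above describes, here is a minimal sketch of an MLP builder in the spirit of construct_mlp(). The function name and signature below are hypothetical, not the library's actual API; consult the construct_mlp() docstring for the real interface.

```python
import torch
from torch import nn

def build_mlp(input_dim: int, hidden_dims: list, out_dim: int,
              activation=nn.ReLU) -> nn.Sequential:
    """Hypothetical stand-in for mi_module_zoo.mlp.construct_mlp():
    stack Linear layers with an activation between hidden layers."""
    layers = []
    dims = [input_dim] + list(hidden_dims)
    for in_d, out_d in zip(dims[:-1], dims[1:]):
        layers.append(nn.Linear(in_d, out_d))
        layers.append(activation())
    layers.append(nn.Linear(dims[-1], out_dim))  # no activation on the output layer
    return nn.Sequential(*layers)

mlp = build_mlp(16, [32, 32], 4)
y = mlp(torch.randn(8, 16))  # output shape: (8, 4)
```

The real construct_mlp() may differ in argument names and in options such as bias handling; this sketch only conveys the general pattern of returning an nn.Sequential.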

Utilities

  • mi_module_zoo.utils.randomutils.set_seed() Set the seed across Python, NumPy, and PyTorch (CPU+CUDA).
  • mi_module_zoo.utils.activationutils.get_activation_fn() Get an activation function by name.
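
For intuition, the snippet below sketches what a cross-library seeding helper like set_seed() typically does. The function name here is hypothetical and the exact behavior is assumed; check the set_seed() docstring for the library's actual semantics.

```python
import random

import numpy as np
import torch

def seed_everything(seed: int) -> None:
    """Hypothetical sketch of mi_module_zoo.utils.randomutils.set_seed():
    seed Python's RNG, NumPy, and PyTorch (CPU and, if present, CUDA)."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)  # seeds the CPU generator
    if torch.cuda.is_available():
        torch.cuda.manual_seed_all(seed)  # seeds all CUDA devices

# Re-seeding makes random draws reproducible:
seed_everything(0)
a = torch.randn(3)
seed_everything(0)
b = torch.randn(3)  # identical to `a`
```

Note that full determinism on GPU may additionally require PyTorch's deterministic-algorithms settings; seeding alone covers the RNG state.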

Developing

To develop in this repository, clone it, install pre-commit, and run

pre-commit install

Releasing to PyPI

To deploy a package to PyPI, create a release on GitHub with a git tag of the form vX.Y.Z. A GitHub Action will automatically build and push the package to PyPI.
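
The snippet below just demonstrates the vX.Y.Z tag convention in a throwaway repository; the version number is illustrative, and in practice you would tag via a GitHub release as described above.

```shell
# Demonstrate the vX.Y.Z tag form that triggers the release Action
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "init"
git tag v1.2.3         # version number is illustrative
git tag --list 'v*'    # prints: v1.2.3
```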

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.