This package features data-science related tasks for developing new recognizers for Presidio. It is used for the evaluation of the entire system, as well as for evaluating specific PII recognizers or PII detection models.
Updated 2024-10-27 15:51:02 +03:00
Hierarchical Transformers for Knowledge Graph Embeddings (EMNLP 2021)
Updated 2024-07-25 14:00:19 +03:00
Federated Learning Utilities and Tools for Experimentation
Updated 2024-01-11 22:20:09 +03:00
XtremeDistil framework for distilling/compressing massive multilingual neural network models to tiny and efficient models for AI at scale
Updated 2023-12-20 23:40:01 +03:00
Maximal update parametrization (µP)
Updated 2023-10-21 06:45:36 +03:00
EsViT: Efficient self-supervised Vision Transformers
Updated 2023-08-28 04:38:18 +03:00
[NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Updated 2023-07-25 17:21:55 +03:00
Research code for CVPR 2021 paper "End-to-End Human Pose and Mesh Reconstruction with Transformers"
Updated 2023-07-07 01:02:19 +03:00
🤗Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
Updated 2023-06-13 00:30:31 +03:00
This is an official implementation of CvT: Introducing Convolutions to Vision Transformers.
Updated 2022-06-22 07:21:09 +03:00
[NeurIPS 2021 Spotlight] Official code for "Focal Self-attention for Local-Global Interactions in Vision Transformers"
Updated 2022-03-27 08:21:56 +03:00
Sample code for Japanese NLP tasks
Updated 2021-10-19 17:45:33 +03:00
A relation-aware semantic parsing model from English to SQL
Updated 2021-06-19 03:42:00 +03:00