This package provides data-science-related tasks for developing new recognizers for Presidio. It is used to evaluate the entire system, as well as specific PII recognizers or PII detection models.
machine-learning
deep-learning
nlp
natural-language-processing
privacy
ner
transformers
pii
named-entity-recognition
spacy
flair
Updated 2024-11-22 01:17:04 +03:00
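In practice, evaluating a PII recognizer comes down to comparing predicted entity spans against gold annotations. The sketch below is a generic span-level precision/recall computation in plain Python, shown only as an illustration; it does not reflect this package's actual API.

# Generic sketch: span-level precision/recall for PII detection.
# Illustration only; not the evaluation API of this package.
def precision_recall(predicted, gold):
    """Each argument is a set of (start, end, entity_type) tuples."""
    true_positives = len(predicted & gold)
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    return precision, recall

gold = {(11, 21, "PERSON"), (46, 58, "PHONE_NUMBER")}
predicted = {(11, 21, "PERSON")}
print(precision_recall(predicted, gold))  # (1.0, 0.5)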
Context-aware, pluggable, and customizable data protection and de-identification SDK for text and images
hacktoberfest
microsoft
python
privacy
transformers
anonymization
anonymization-service
data-anonymization
data-loss-prevention
data-masking
data-protection
de-identification
dlp
pii
pii-anonymization
pii-anonymization-service
presidio
privacy-protection
text-anonymization
Updated 2024-11-22 00:54:06 +03:00
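As a quick orientation, the sketch below shows the typical Presidio flow of analyzing text for PII and then anonymizing the detected spans. It is a minimal example assuming the presidio-analyzer and presidio-anonymizer packages (and a spaCy English model) are installed; defaults are used throughout.

# Minimal sketch: detect PII with presidio-analyzer, then anonymize it.
from presidio_analyzer import AnalyzerEngine
from presidio_anonymizer import AnonymizerEngine

text = "My name is John Smith and my phone number is 212-555-0199."

analyzer = AnalyzerEngine()                      # loads the default NLP engine and recognizers
results = analyzer.analyze(text=text, language="en")

anonymizer = AnonymizerEngine()
anonymized = anonymizer.anonymize(text=text, analyzer_results=results)
print(anonymized.text)                           # PII replaced with entity-type placeholders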
Hierarchical Transformers for Knowledge Graph Embeddings (EMNLP 2021)
Updated 2024-07-25 14:00:19 +03:00
Federated Learning Utilities and Tools for Experimentation
machine-learning
pytorch
simulation
gloo
nccl
personalization
privacy-tools
transformers-models
distributed-learning
federated-learning
Updated 2024-01-11 22:20:09 +03:00
XtremeDistil framework for distilling/compressing massive multilingual neural network models into tiny, efficient models for AI at scale
Updated 2023-12-20 23:40:01 +03:00
EsViT: Efficient self-supervised Vision Transformers
Updated 2023-08-28 04:38:18 +03:00
[NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
deep-learning
natural-language-processing
representation-learning
transformers
language-model
natural-language-understanding
pretraining
contrastive-learning
pretrained-language-model
Updated 2023-07-25 17:21:55 +03:00
Research code for the CVPR 2021 paper "End-to-End Human Pose and Mesh Reconstruction with Transformers"
Updated 2023-07-07 01:02:19 +03:00
🤗Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
Updated 2023-06-13 00:30:31 +03:00
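For reference, named-entity recognition with this library typically takes only a few lines via the pipeline API. The sketch below assumes the transformers package is installed and lets the pipeline fall back to its default English NER model, which is downloaded on first use.

# Minimal sketch: NER with the Hugging Face transformers pipeline API.
from transformers import pipeline

ner = pipeline("token-classification", aggregation_strategy="simple")
print(ner("Hugging Face is based in New York City."))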
This is an official implementation of CvT: Introducing Convolutions to Vision Transformers.
Updated 2022-06-22 07:21:09 +03:00
[NeurIPS 2021 Spotlight] Official code for "Focal Self-attention for Local-Global Interactions in Vision Transformers"
Updated 2022-03-27 08:21:56 +03:00
A relation-aware semantic parsing model that translates English to SQL
Updated 2021-06-19 03:42:00 +03:00