Context-aware, pluggable, and customizable data protection and de-identification SDK for text and images
hacktoberfest
microsoft
python
privacy
transformers
anonymization
anonymization-service
data-anonymization
data-loss-prevention
data-masking
data-protection
de-identification
dlp
pii
pii-anonymization
pii-anonymization-service
presidio
privacy-protection
text-anonymization
Updated 2024-11-22 11:45:47 +03:00
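As a quick illustration of the Presidio SDK above, here is a minimal usage sketch. It assumes the presidio-analyzer and presidio-anonymizer packages plus a spaCy English model are installed; the sample text is illustrative.

```python
# Minimal sketch: detect and anonymize PII with Presidio.
# Assumes `pip install presidio-analyzer presidio-anonymizer`
# and a spaCy English model available to the analyzer.
from presidio_analyzer import AnalyzerEngine
from presidio_anonymizer import AnonymizerEngine

text = "My name is John Smith and my phone number is 212-555-0199."

# Detect PII entities (names, phone numbers, etc.) in the text.
analyzer = AnalyzerEngine()
results = analyzer.analyze(text=text, language="en")

# Replace the detected spans with entity-type placeholders.
anonymizer = AnonymizerEngine()
anonymized = anonymizer.anonymize(text=text, analyzer_results=results)
print(anonymized.text)  # e.g. "My name is <PERSON> and my phone number is <PHONE_NUMBER>."
```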
This package features data-science-related tasks for developing new recognizers for Presidio. It is used to evaluate the entire system, as well as specific PII recognizers or PII detection models.
machine-learning
deep-learning
nlp
natural-language-processing
privacy
ner
transformers
pii
named-entity-recognition
spacy
flair
Updated 2024-11-22 01:17:04 +03:00
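A hedged sketch of the span-level evaluation this package is built for, written in plain Python on top of presidio-analyzer; the gold spans and scoring loop are illustrative, not the package's actual API.

```python
# Illustrative sketch: score a PII detector against gold spans.
# This is NOT presidio-research's API; it only shows the idea of
# span-level precision/recall evaluation.
from presidio_analyzer import AnalyzerEngine

sample = "Contact Jane Doe at 425-555-0100."
gold = {(8, 16, "PERSON"), (20, 32, "PHONE_NUMBER")}  # (start, end, type)

analyzer = AnalyzerEngine()
pred = {(r.start, r.end, r.entity_type)
        for r in analyzer.analyze(text=sample, language="en")}

tp = len(gold & pred)
precision = tp / len(pred) if pred else 0.0
recall = tp / len(gold) if gold else 0.0
print(f"precision={precision:.2f} recall={recall:.2f}")
```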
Tutel MoE: An Optimized Mixture-of-Experts Implementation
Updated 2024-11-18 06:00:53 +03:00
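Tutel's own API is not reproduced here; the sketch below shows the generic top-1 gated mixture-of-experts computation that such a library optimizes, in plain PyTorch.

```python
# Generic top-1 gated MoE layer in plain PyTorch -- a didactic
# sketch of what an optimized MoE library speeds up, not Tutel's API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoE(nn.Module):
    def __init__(self, d_model: int, n_experts: int):
        super().__init__()
        self.gate = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts))

    def forward(self, x):                      # x: (tokens, d_model)
        scores = F.softmax(self.gate(x), dim=-1)
        weight, idx = scores.max(dim=-1)       # top-1 routing
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = idx == e                    # tokens routed to expert e
            if mask.any():
                out[mask] = weight[mask, None] * expert(x[mask])
        return out

moe = SimpleMoE(d_model=64, n_experts=4)
print(moe(torch.randn(10, 64)).shape)  # torch.Size([10, 64])
```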
Ongoing research training transformer language models at scale, including: BERT & GPT-2
Updated 2024-10-18 13:31:05 +03:00
Hierarchical Transformers for Knowledge Graph Embeddings (EMNLP 2021)
Updated 2024-07-25 14:00:19 +03:00
This is a collection of our NAS and Vision Transformer work.
Updated 2024-07-25 13:29:13 +03:00
This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows".
object-detection
image-classification
semantic-segmentation
imagenet
swin-transformer
ade20k
mask-rcnn
mscoco
Updated 2024-07-15 18:00:32 +03:00
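Swin computes self-attention inside fixed-size local windows whose grid is shifted between consecutive blocks. Below is a minimal PyTorch sketch of the window partitioning; the helper name and shapes are illustrative, though the official repo contains a similar utility.

```python
# Sketch of Swin-style window partitioning: split a feature map
# into non-overlapping (ws x ws) windows; shifting the map with
# torch.roll between blocks gives the "shifted windows" variant.
import torch

def window_partition(x: torch.Tensor, ws: int) -> torch.Tensor:
    """x: (B, H, W, C) -> (num_windows * B, ws*ws, C)."""
    B, H, W, C = x.shape
    x = x.view(B, H // ws, ws, W // ws, ws, C)
    return x.permute(0, 1, 3, 2, 4, 5).reshape(-1, ws * ws, C)

x = torch.randn(2, 8, 8, 96)                           # 8x8 map, 96 channels
shifted = torch.roll(x, shifts=(-2, -2), dims=(1, 2))  # shift by ws // 2
print(window_partition(shifted, ws=4).shape)           # (2*4, 16, 96)
```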
Graphormer is a general-purpose deep learning backbone for molecular modeling.
Updated 2024-05-28 09:22:34 +03:00
Federated Learning Utilities and Tools for Experimentation
machine-learning
pytorch
simulation
gloo
nccl
personalization
privacy-tools
transformers-models
distributed-learning
federated-learning
Updated 2024-01-11 22:20:09 +03:00
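A minimal federated-averaging (FedAvg) sketch in PyTorch, illustrating the aggregation step such a simulation framework orchestrates; this is generic FedAvg, not this repo's API.

```python
# Generic FedAvg aggregation step in PyTorch -- illustrates the
# core of federated learning, not this repo's actual API.
import copy
import torch
import torch.nn as nn

def federated_average(client_models: list[nn.Module]) -> nn.Module:
    """Average the parameters of locally trained client models."""
    global_model = copy.deepcopy(client_models[0])
    avg_state = global_model.state_dict()
    for key in avg_state:
        stacked = torch.stack([m.state_dict()[key].float()
                               for m in client_models])
        avg_state[key] = stacked.mean(dim=0)
    global_model.load_state_dict(avg_state)
    return global_model

clients = [nn.Linear(4, 2) for _ in range(3)]  # stand-ins for locally trained models
server = federated_average(clients)
print(server.weight.shape)  # torch.Size([2, 4])
```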
XtremeDistil framework for distilling/compressing massive multilingual neural network models to tiny and efficient models for AI at scale
Updated 2023-12-20 23:40:01 +03:00
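A hedged sketch of the temperature-scaled distillation objective that compression frameworks like this one build on (Hinton-style KL divergence between teacher and student logits; not XtremeDistil's actual code).

```python
# Standard knowledge-distillation loss in PyTorch: soften both logit
# distributions with a temperature T and minimize the KL divergence.
# A generic sketch, not XtremeDistil's code.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T: float = 2.0):
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    # The T**2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

student = torch.randn(8, 30522, requires_grad=True)  # e.g. vocab-sized logits
teacher = torch.randn(8, 30522)
print(distillation_loss(student, teacher).item())
```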
Maximal update parametrization (µP)
Updated 2023-10-21 06:45:36 +03:00
Table Transformer (TATR) is a deep learning model for extracting tables from unstructured documents (PDFs and images). This is also the official repository for the PubTables-1M dataset and GriTS evaluation metric.
Updated 2023-09-07 07:38:34 +03:00
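TATR checkpoints are also published on the Hugging Face Hub, so table detection can be run through the transformers library. A sketch, assuming transformers, timm, and pillow are installed; the image path is illustrative.

```python
# Sketch: run TATR table detection through Hugging Face transformers.
import torch
from PIL import Image
from transformers import AutoImageProcessor, TableTransformerForObjectDetection

processor = AutoImageProcessor.from_pretrained("microsoft/table-transformer-detection")
model = TableTransformerForObjectDetection.from_pretrained(
    "microsoft/table-transformer-detection")

image = Image.open("page.png").convert("RGB")  # a scanned document page
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes to thresholded detections in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])
detections = processor.post_process_object_detection(
    outputs, threshold=0.7, target_sizes=target_sizes)[0]
for label, box in zip(detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], box.tolist())
```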
EsViT: Efficient self-supervised Vision Transformers
Updated 2023-08-28 04:38:18 +03:00
[NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
deep-learning
natural-language-processing
representation-learning
transformers
language-model
natural-language-understanding
pretraining
contrastive-learning
pretrained-language-model
Updated 2023-07-25 17:21:55 +03:00
Investigations on Transformer Visualizations by the Aether Prototyping and Incubation team
Updated 2023-07-19 05:34:54 +03:00
Research code for CVPR 2021 paper "End-to-End Human Pose and Mesh Reconstruction with Transformers"
Updated 2023-07-07 01:02:19 +03:00
🤗Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
Updated 2023-06-13 00:30:31 +03:00
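The library's main entry point is the pipeline API; a minimal example (which model is downloaded by default depends on the installed version).

```python
# Minimal Hugging Face transformers usage via the pipeline API.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Transformers make NLP pipelines remarkably easy."))
# e.g. [{'label': 'POSITIVE', 'score': 0.999...}]
```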
Transformer-based machine reading comprehension in combination with Azure Cognitive Search
Updated 2023-06-13 00:29:56 +03:00
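A hedged sketch of the retrieve-then-read pattern this combination implements: a search service returns candidate passages and an extractive QA model picks the best answer span. The search call below is a hypothetical stub; only the transformers QA pipeline is real.

```python
# Retrieve-then-read sketch: Azure Cognitive Search is stubbed out;
# the extractive reader is a real transformers QA pipeline.
from transformers import pipeline

def search_passages(query: str) -> list[str]:
    """Hypothetical stand-in for an Azure Cognitive Search query."""
    return ["The Swin Transformer was proposed by Microsoft Research "
            "and computes attention within shifted local windows."]

reader = pipeline("question-answering")
question = "Who proposed the Swin Transformer?"
best = max((reader(question=question, context=p)
            for p in search_passages(question)),
           key=lambda a: a["score"])
print(best["answer"], best["score"])
```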
A unified 3D Transformer Pipeline for visual synthesis
Updated 2023-05-29 12:07:41 +03:00
The implementation of DeBERTa
representation-learning
self-attention
deeplearning
bert
transformer-encoder
language-model
natural-language-understanding
roberta
Updated 2023-03-25 13:22:59 +03:00
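DeBERTa checkpoints are available through the transformers library; a minimal sketch of encoding text with the base model, using the checkpoint name Microsoft publishes on the Hugging Face Hub.

```python
# Sketch: encode a sentence with DeBERTa through transformers.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-base")
model = AutoModel.from_pretrained("microsoft/deberta-base")

inputs = tokenizer("Disentangled attention separates content and position.",
                   return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)
print(hidden.shape)
```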
Large-scale pretraining for dialogue
machine-learning
pytorch
transformer
dialogue
data-processing
dialogpt
gpt-2
text-data
text-generation
Updated 2022-10-18 02:41:52 +03:00
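DialoGPT checkpoints are likewise on the Hugging Face Hub; a minimal single-turn generation sketch using the medium checkpoint.

```python
# Sketch: one conversational turn with DialoGPT via transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

# DialoGPT separates dialogue turns with the EOS token.
prompt = "Does money buy happiness?" + tokenizer.eos_token
input_ids = tokenizer.encode(prompt, return_tensors="pt")
reply_ids = model.generate(input_ids, max_length=100,
                           pad_token_id=tokenizer.eos_token_id)
reply = tokenizer.decode(reply_ids[0, input_ids.shape[-1]:],
                         skip_special_tokens=True)
print(reply)
```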
This is an official implementation for "SimMIM: A Simple Framework for Masked Image Modeling".
Updated 2022-09-29 18:17:40 +03:00
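SimMIM's recipe is deliberately simple: mask random patches, replace them with a learnable token, and regress the raw pixels of the masked patches with an L1 loss. A didactic sketch of that masking-and-loss core; shapes are illustrative and the encoder is stubbed out.

```python
# Sketch of the SimMIM idea: random patch masking + L1 loss on the
# pixels of masked patches only. Didactic, not the repo's code.
import torch
import torch.nn.functional as F

patches = torch.randn(2, 196, 768)   # (batch, patches, per-patch pixel dim)
mask = torch.rand(2, 196) < 0.6      # mask roughly 60% of patches
mask_token = torch.zeros(768, requires_grad=True)

# Replace masked patches with the learnable mask token before encoding.
corrupted = torch.where(mask[..., None], mask_token, patches)
reconstructed = corrupted            # stand-in for encoder + prediction head

# L1 reconstruction loss computed on masked patches only.
loss = F.l1_loss(reconstructed[mask], patches[mask])
print(loss.item())
```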
This is an official implementation of CvT: Introducing Convolutions to Vision Transformers.
Updated 2022-06-22 07:21:09 +03:00
FastFormers - highly efficient transformer models for NLU
Updated 2022-05-19 01:07:52 +03:00
[NeurIPS 2021 Spotlight] Official code for "Focal Self-attention for Local-Global Interactions in Vision Transformers"
Updated 2022-03-27 08:21:56 +03:00
CSWin Transformer: A General Vision Transformer Backbone with Cross-Shaped Windows, CVPR 2022
Updated 2022-03-18 04:29:20 +03:00
Ramp up your custom natural language processing (NLP) task: bring your own data, use your preferred frameworks, and take models into production.
nlp
ner
classification
question-answering
dstoolkit
machine-reading-comprehension
summarization
transformer
Updated 2022-01-10 02:07:54 +03:00
Japanese NLP sample code
Updated 2021-10-19 17:45:33 +03:00