This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows".
Updated 2024-07-15 18:00:32 +03:00
We design an effective Relation-Aware Global Attention (RGA) module for CNNs to globally infer the attention.
Updated 2023-06-12 21:55:25 +03:00
SR-CNN
Updated 2022-11-28 22:09:47 +03:00
LQ-Nets: Learned Quantization for Highly Accurate and Compact Deep Neural Networks
Updated 2022-08-30 11:00:20 +03:00
Fast R-CNN Object Detection on Azure using CNTK
Updated 2018-03-24 00:14:57 +03:00
A Python implementation of a CNTK Fast R-CNN evaluation client
Updated 2017-07-04 11:22:40 +03:00