[CVPR 2022 Oral] Restormer: Efficient Transformer for High-Resolution Image Restoration. SOTA for motion deblurring, image deraining, denoising (Gaussian/real data), and defocus deblurring.
Mask Transfiner for High-Quality Instance Segmentation, CVPR 2022
Implementation of the Transformer variant proposed in "Transformer Quality in Linear Time"
Official PyTorch Implementation of Long-Short Transformer (NeurIPS 2021).
[CVPR 2023] IMP: iterative matching and pose estimation with transformer-based recurrent module
[MICCAI 2023] DAE-Former: Dual Attention-guided Efficient Transformer for Medical Image Segmentation
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
[NeurIPS 2022 Spotlight] This is the official PyTorch implementation of "EcoFormer: Energy-Saving Attention with Linear Complexity"
Official PyTorch implementation of our ECCV 2022 paper "Sliced Recursive Transformer"
[ICLR 2022] "Unified Vision Transformer Compression" by Shixing Yu*, Tianlong Chen*, Jiayi Shen, Huan Yuan, Jianchao Tan, Sen Yang, Ji Liu, Zhangyang Wang
Master's thesis with code investigating methods for incorporating long-context reasoning in low-resource languages, without the need to pre-train from scratch. We investigated if multilingual models cou...
[ICCV 2023] Efficient Video Action Detection with Token Dropout and Context Refinement
This repository contains the official code for Energy Transformer, an efficient energy-based Transformer variant for graph classification
Official Implementation of Energy Transformer in PyTorch for Mask Image Reconstruction
A custom Tensorflow implementation of Google's Electra NLP model with compositional embeddings using complementary partitions
Demo code for CVPR 2023 paper "Sparsifiner: Learning Sparse Instance-Dependent Attention for Efficient Vision Transformers"
Source code for the article "How to Create a Chatbot in Python": a chatbot that uses the Reformer, also known as the efficient Transformer, to generate dialogues between two bots.
MetaFormer-Based Global Contexts-Aware Network for Efficient Semantic Segmentation (Accepted by WACV 2024)
Gated Attention Unit (TensorFlow implementation); a minimal sketch of the module follows below.
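The Gated Attention Unit referenced above (proposed in "Transformer Quality in Linear Time") combines a gating branch with squared-ReLU attention computed from a shared low-dimensional query/key projection. The snippet below is a minimal PyTorch sketch of such a module, not the listed repository's code: it assumes single-head quadratic attention, illustrative hyperparameters (expansion factor, query/key width, sequence-length normalization), and omits the paper's relative-position bias.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedAttentionUnit(nn.Module):
    """Minimal Gated Attention Unit (GAU) sketch.

    Follows the general structure from "Transformer Quality in Linear Time":
    a gate U and value V from an expanded projection, plus queries/keys derived
    from one shared low-dimensional projection Z via per-dimension scale/offset.
    Hyperparameters and normalization here are illustrative assumptions.
    """

    def __init__(self, dim: int, expansion: int = 2, qk_dim: int = 128):
        super().__init__()
        hidden = dim * expansion
        self.to_uv = nn.Linear(dim, hidden * 2)   # gate U and value V branches
        self.to_z = nn.Linear(dim, qk_dim)        # shared low-dim base for Q and K
        # cheap per-dimension scale/offset turning Z into Q and K
        self.gamma = nn.Parameter(torch.ones(2, qk_dim))
        self.beta = nn.Parameter(torch.zeros(2, qk_dim))
        self.to_out = nn.Linear(hidden, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        seq_len = x.shape[1]
        u, v = F.silu(self.to_uv(x)).chunk(2, dim=-1)   # (b, n, hidden) each
        z = F.silu(self.to_z(x))                        # (b, n, qk_dim)
        q = z * self.gamma[0] + self.beta[0]
        k = z * self.gamma[1] + self.beta[1]
        # squared-ReLU attention, normalized by sequence length (assumption;
        # the paper also adds a relative-position bias, omitted here)
        sim = torch.einsum("bnd,bmd->bnm", q, k) / seq_len
        attn = F.relu(sim) ** 2
        attended = torch.einsum("bnm,bmd->bnd", attn, v)  # (b, n, hidden)
        return self.to_out(u * attended)                  # gate, then project back


# Usage example (shapes only):
# gau = GatedAttentionUnit(dim=512)
# y = gau(torch.randn(2, 128, 512))   # -> (2, 128, 512)
```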