Unofficial implementation of "Prompt-to-Prompt Image Editing with Cross Attention Control" with Stable Diffusion
PyTorch code for our paper "Image Super-Resolution with Cross-Scale Non-Local Attention and Exhaustive Self-Exemplars Mining" (CVPR 2020).
Official implementation for "Cross-Image Attention for Zero-Shot Appearance Transfer"
CCNet: Criss-Cross Attention for Semantic Segmentation (TPAMI 2020 & ICCV 2019).
A simple cross attention that updates both the source and target in one step (a minimal sketch follows this list)
Official PyTorch implementation of "Dual Cross-Attention for Medical Image Segmentation"
CVPR 2023: Learning to Render Novel Views from Wide-Baseline Stereo Pairs
PyTorch source code for "Stacked Cross Attention for Image-Text Matching" (ECCV 2018)
Video-P2P: Video Editing with Cross-attention Control
Code for "Cross Attention Network for Few-shot Classification" (NeurIPS 2019).
“Zero setup” cross compilation and “cross testing” of Rust crates
Official implementation of the "Attention Swin U-Net: Cross-Contextual Attention Mechanism for Skin Lesion Segmentation" paper
Source code and data for "Cross-relation Cross-bag Attention for Distantly-supervised Relation Extraction"
Cross-Attention in Coupled Unmixing Nets for Unsupervised Hyperspectral Super-Resolution (ECCV 2020, PyTorch).
[CVPR 2019 Oral] Multi-Channel Attention Selection GAN with Cascaded Semantic Guidance for Cross-View Image Translation
All about attention in neural networks: soft attention, attention maps, local and global attention, and multi-head attention (a basic soft-attention sketch appears after this list).
Self-Constraining and Attention-based Hashing Network for Bit-Scalable Cross-Modal Retrieval
Fully-automatic cross-seeding with Torznab
Bilateral Cross-Modality Graph Matching Attention for Feature Fusion in Visual Question Answering
Classification with a ResNet backbone and attention modules: SE (channel attention), BAM (spatial, channel, and joint attention), and CBAM (spatial, channel, and joint attention)
Criss-Cross Attention (2D & 3D) for Semantic Segmentation in pure PyTorch, with a faster and more precise implementation.
Ring Attention implementation with FlashAttention (a single-device sketch of the streaming-softmax core follows below)
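
For the bidirectional cross attention entry above, the "updates both the source and target in one step" idea can be sketched by building a single source-by-target similarity matrix and softmaxing it along both axes. This is a minimal single-head sketch under that assumption; the class and argument names are illustrative, not the repo's actual API.

```python
import torch
from torch import nn

class BidirectionalCrossAttention(nn.Module):
    """Single-head sketch: one shared (src x tgt) similarity matrix,
    softmaxed along both axes, so one forward pass updates both sequences."""

    def __init__(self, dim):
        super().__init__()
        self.scale = dim ** -0.5
        self.qk_src = nn.Linear(dim, dim, bias=False)  # source side of the shared similarity
        self.qk_tgt = nn.Linear(dim, dim, bias=False)  # target side of the shared similarity
        self.v_src = nn.Linear(dim, dim, bias=False)
        self.v_tgt = nn.Linear(dim, dim, bias=False)

    def forward(self, src, tgt):
        # sim[b, i, j]: affinity between source token i and target token j
        sim = torch.einsum('bid,bjd->bij', self.qk_src(src), self.qk_tgt(tgt)) * self.scale
        # source update: attend over target positions (softmax over j)
        src_out = torch.einsum('bij,bjd->bid', sim.softmax(dim=-1), self.v_tgt(tgt))
        # target update: attend over source positions (softmax over i), same matrix
        tgt_out = torch.einsum('bij,bid->bjd', sim.softmax(dim=-2), self.v_src(src))
        return src + src_out, tgt + tgt_out

# usage: both sequences come back updated from a single call
attn = BidirectionalCrossAttention(dim=64)
src, tgt = torch.randn(2, 10, 64), torch.randn(2, 20, 64)
src, tgt = attn(src, tgt)
```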
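For the attention-overview entry, here is a minimal sketch of plain (soft) scaled dot-product attention that also returns the attention map, the quantity the editing and visualization repos above inspect or manipulate. Multi-head attention just runs this in parallel over several projected subspaces. The function name is hypothetical.

```python
import torch

def soft_attention(q, k, v):
    """Soft (scaled dot-product) attention.
    q: (..., q_len, d); k, v: (..., kv_len, d)."""
    scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
    attn_map = scores.softmax(dim=-1)  # each row is a distribution over keys
    return attn_map @ v, attn_map      # weighted values plus the map itself
```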
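And for the ring attention entry, the core trick that lets attention be computed block by block, whether the key/value blocks arrive from devices around a ring or from a loop on one device, is the same streaming softmax that flash attention uses. Below is a single-process sketch of that accumulation with a hypothetical function name; it is not the repo's distributed implementation, but it reproduces full attention exactly.

```python
import torch

def ring_attention_sim(q, k, v, n_blocks=4):
    """Visit key/value blocks one at a time (as if passed around a device
    ring) and merge them with a streaming softmax, so the final result
    equals ordinary full attention."""
    scale = q.size(-1) ** -0.5
    out = torch.zeros_like(q)
    run_max = torch.full(q.shape[:-1], float('-inf'), dtype=q.dtype, device=q.device)
    run_sum = torch.zeros(q.shape[:-1], dtype=q.dtype, device=q.device)
    for k_blk, v_blk in zip(k.chunk(n_blocks, dim=-2), v.chunk(n_blocks, dim=-2)):
        scores = (q @ k_blk.transpose(-2, -1)) * scale        # (..., q_len, blk)
        new_max = torch.maximum(run_max, scores.max(dim=-1).values)
        rescale = torch.exp(run_max - new_max)                # fix up old accumulators
        p = torch.exp(scores - new_max.unsqueeze(-1))
        out = out * rescale.unsqueeze(-1) + p @ v_blk
        run_sum = run_sum * rescale + p.sum(dim=-1)
        run_max = new_max
    return out / run_sum.unsqueeze(-1)

# sanity check: matches plain softmax attention
q, k, v = (torch.randn(1, 8, 16) for _ in range(3))
ref = torch.softmax(q @ k.transpose(-2, -1) / 16 ** 0.5, dim=-1) @ v
assert torch.allclose(ring_attention_sim(q, k, v), ref, atol=1e-6)
```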