Implementation of Linformer for PyTorch
My take on a practical implementation of Linformer for PyTorch.
Reproducing the Linear Multihead Attention introduced in the Linformer paper (Linformer: Self-Attention with Linear Complexity)
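The core idea shared by the implementations listed here is Linformer's low-rank attention: keys and values of sequence length n are projected down to a fixed length k, so the attention matrix is n x k rather than n x n. A minimal single-head sketch of that mechanism (module and parameter names are illustrative, not taken from any of the listed repos):

```python
import torch
import torch.nn as nn


class LinformerSelfAttention(nn.Module):
    """Single-head sketch of Linformer attention.

    Keys/values of length seq_len are projected to a fixed length k,
    so attention costs O(n * k) instead of O(n^2)."""

    def __init__(self, dim, seq_len, k=64):
        super().__init__()
        self.scale = dim ** -0.5
        self.to_q = nn.Linear(dim, dim, bias=False)
        self.to_k = nn.Linear(dim, dim, bias=False)
        self.to_v = nn.Linear(dim, dim, bias=False)
        # Learned sequence-length projections (E for keys, F for values),
        # following the paper's E/F notation.
        self.E = nn.Parameter(torch.randn(seq_len, k) / k)
        self.F = nn.Parameter(torch.randn(seq_len, k) / k)

    def forward(self, x):
        # x: (batch, n, dim)
        q, key, val = self.to_q(x), self.to_k(x), self.to_v(x)
        # Project the sequence dimension n down to k: (batch, k, dim)
        key = torch.einsum('bnd,nk->bkd', key, self.E)
        val = torch.einsum('bnd,nk->bkd', val, self.F)
        # Attention matrix is (batch, n, k) -- linear in n for fixed k.
        attn = torch.softmax(q @ key.transpose(-2, -1) * self.scale, dim=-1)
        return attn @ val  # (batch, n, dim)
```

Note that the learned projections are tied to a fixed seq_len, which is why Linformer implementations typically require a maximum sequence length at construction time.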
List of efficient attention modules
We implement an encoder/decoder seq2seq model that matches the performance (similar perplexity) of a traditional Transformer while reducing the attention time complexity from O(n^2) to O(n).
Vision Xformers
Implementation of faster and more efficient sequence-modeling deep learning algorithms based on fairseq, e.g. Lite-Transformer, Reformer, Linformer, and other efficient variants of the attention mechanism
Study of the Linformer model from Facebook AI