GitHub 中文社区 (GitHub Chinese Community)

©2025 GitHub 中文社区



Search results for "multihead-attention"

Linear-Multihead-Attention
@kuixu

Reproducing the Linear Multihead Attention introduced in Linformer paper (Linformer: Self-Attention with Linear Complexity)

linformer · detr · transformer · attention-mechanism
Python · 76 stars
5 years ago
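The Linformer idea this repo reproduces is compact: project the length-n key and value sequences down to k rows, so the score matrix is (n, k) instead of (n, n) and cost drops from O(n²d) to O(nkd). A minimal NumPy sketch of that math (names, shapes, and initialization are illustrative, not the repo's API):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def linear_attention(Q, K, V, E, F):
    """Linformer-style attention: E and F project the length-n key/value
    sequences down to k rows, so the score matrix is (n, k), not (n, n)."""
    d = Q.shape[-1]
    K_proj = E @ K                        # (k, d)
    V_proj = F @ V                        # (k, d)
    scores = Q @ K_proj.T / np.sqrt(d)    # (n, k)
    return softmax(scores) @ V_proj       # (n, d)

rng = np.random.default_rng(0)
n, d, k = 128, 32, 16
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
E = rng.normal(size=(k, n)) / np.sqrt(n)
F = rng.normal(size=(k, n)) / np.sqrt(n)
out = linear_attention(Q, K, V, E, F)
print(out.shape)  # (128, 32)
```

The paper additionally shares the projections E and F across heads and layers; this sketch shows only the single-head core.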

相关主题

attentiontransformerPyTorchattention-mechanismmultihead-attentionself-attentionTensorflowneural-networkslinformerattention-mechanisms

Google   Bing   GitHub

multihead-siamese-nets
@tlatkowski

#natural-language-processing# Implementation of Siamese Neural Networks built upon a multihead attention mechanism for the text semantic similarity task.

multihead-attention · semantic-similarity · deep-neural-networks
Jupyter Notebook · 182 stars
2 years ago
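The siamese part of this design is simply weight sharing: both texts pass through the same encoder and similarity is computed between the pooled vectors. A minimal NumPy sketch of that structure (the repo uses multihead-attention encoders; this stub just projects and mean-pools, and all names are illustrative):

```python
import numpy as np

def encode(X, W):
    """Shared encoder stub: project tokens, nonlinearity, mean-pool."""
    return np.tanh(X @ W).mean(axis=0)

def siamese_similarity(X1, X2, W):
    """Siamese setup: both texts go through the SAME encoder weights;
    similarity is the cosine of the two pooled vectors."""
    v1, v2 = encode(X1, W), encode(X2, W)
    return v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 8))
X1 = rng.normal(size=(5, 16))   # "sentence" of 5 token embeddings
X2 = rng.normal(size=(7, 16))   # different length is fine after pooling
sim = siamese_similarity(X1, X2, W)
print(-1 <= sim <= 1)  # True
```

Because the encoder weights are shared, a text compared with itself scores exactly 1, which is the property siamese training exploits.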
Stepwise_Monotonic_Multihead_Attention
@keonlee9420

PyTorch implementation of Stepwise Monotonic Multihead Attention, similar to "Enhancing Monotonicity for Robust Autoregressive Transformer TTS".

tts · transformer · attention
Python · 36 stars
4 years ago
Attention_based_MultiHead_model_for_aircraft_engine_RUL_prediction_
@abiodun-ayodeji

Attention-based multihead model for optimized aircraft engine remaining useful life prediction

Jupyter Notebook · 55 stars
1 year ago
Multihead-Attention
@renjunxiang

Multihead Attention for PyTorch

Python · 27 stars
6 years ago
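For readers comparing these implementations, the multi-head computation itself is short: split d_model into heads, run scaled dot-product attention per head, concatenate, and project. A NumPy sketch of that standard pipeline (shapes and names are illustrative, unrelated to this repo's API):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multihead_attention(X, Wq, Wk, Wv, Wo, num_heads):
    """Split d_model into num_heads, attend per head, concat, project."""
    n, d_model = X.shape
    d_head = d_model // num_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv

    def split(M):  # (n, d_model) -> (heads, n, d_head)
        return M.reshape(n, num_heads, d_head).transpose(1, 0, 2)

    Qh, Kh, Vh = split(Q), split(K), split(V)
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)  # (h, n, n)
    heads = softmax(scores) @ Vh                           # (h, n, d_head)
    concat = heads.transpose(1, 0, 2).reshape(n, d_model)
    return concat @ Wo

rng = np.random.default_rng(0)
n, d_model, h = 10, 64, 8
X = rng.normal(size=(n, d_model))
Wq, Wk, Wv, Wo = (rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(4))
out = multihead_attention(X, Wq, Wk, Wv, Wo, h)
print(out.shape)  # (10, 64)
```

In PyTorch this whole block corresponds to a single `torch.nn.MultiheadAttention` call; spelling it out makes the head-splitting reshape explicit.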
FlashMHA
@kyegomez

A simple PyTorch implementation of Flash MultiHead Attention

artificial-intelligence · artificial-neural-networks · attention · attention-mechanisms
Jupyter Notebook · 21 stars
1 year ago
MultiHeadAttention-cpp
@dianhsu

C++ · 7 stars
4 years ago
MultiHead_Attention
@Doreenruirui

Python · 2 stars
7 years ago
acoustic-scene-analysis-with-multihead-self-attention
@KrishnaDN

This repo contains an implementation of the paper "Acoustic Scene Analysis With Multihead Self Attention" by Weimin Wang, Weiran Wang, Ming Sun, and Chao Wang from the Amazon Alexa team.

Python · 4 stars
5 years ago
pytorch-transformer
@akurniawan

Implementation of the "Attention Is All You Need" paper

PyTorch · attention · attention-is-all-you-need · multihead-attention
Python · 33 stars
1 year ago
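One self-contained piece of the "Attention Is All You Need" paper that fits in a few lines is the sinusoidal positional encoding. A NumPy sketch of the published formula (function name is mine, not this repo's):

```python
import numpy as np

def positional_encoding(n_pos, d_model):
    """Sinusoidal encodings from "Attention Is All You Need":
    PE[pos, 2i] = sin(pos / 10000^(2i/d)), PE[pos, 2i+1] = cos(same)."""
    pos = np.arange(n_pos)[:, None]          # (n_pos, 1)
    i = np.arange(d_model // 2)[None, :]     # (1, d_model/2)
    angles = pos / np.power(10000.0, 2 * i / d_model)
    pe = np.zeros((n_pos, d_model))
    pe[:, 0::2] = np.sin(angles)             # even dims: sine
    pe[:, 1::2] = np.cos(angles)             # odd dims: cosine
    return pe

pe = positional_encoding(50, 16)
print(pe.shape)   # (50, 16)
print(pe[0, :4])  # position 0 alternates sin(0)=0, cos(0)=1
```

Each dimension pair oscillates at a different geometric frequency, which is what lets the model attend by relative position.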
ring-flash-attention
@zhuzilin

Ring attention implementation with flash attention

Python · 795 stars
8 days ago
AttentioNN
@zaidalyafeai

All about attention in neural networks. Soft attention, attention maps, local and global attention and multi-head attention.

notebooks · attention · neural-networks
Jupyter Notebook · 231 stars
6 years ago
AttentionResnet
@thanhkaist

Classification with a ResNet backbone and attention modules: SE (channel attention), BAM (spatial, channel, and joint attention), and CBAM (spatial, channel, and joint attention)

Python · 60 stars
5 years ago
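The SE channel-attention module listed here is compact enough to sketch: global-average-pool each channel, pass the result through a two-layer bottleneck, and sigmoid-gate the channels. A toy NumPy version (dense weights stand in for the 1x1 convolutions; the reduction ratio and names are illustrative):

```python
import numpy as np

def se_block(x, W1, W2):
    """Squeeze-and-Excitation channel attention on a (C, H, W) feature map."""
    z = x.mean(axis=(1, 2))            # squeeze: per-channel mean, (C,)
    s = np.maximum(W1 @ z, 0)          # excite, stage 1: reduce + ReLU, (C/r,)
    g = 1 / (1 + np.exp(-(W2 @ s)))    # excite, stage 2: expand + sigmoid, (C,)
    return x * g[:, None, None]        # rescale each channel by its gate

rng = np.random.default_rng(0)
C, r = 16, 4                           # channels, reduction ratio
x = rng.normal(size=(C, 8, 8))
W1 = rng.normal(size=(C // r, C)) * 0.1
W2 = rng.normal(size=(C, C // r)) * 0.1
y = se_block(x, W1, W2)
print(y.shape)  # (16, 8, 8)
```

BAM and CBAM extend the same recipe with a spatial branch; the channel branch is essentially this block.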
flash-attention
@Dao-AILab

Fast and memory-efficient exact attention

Python · 18.28k stars
1 day ago
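FlashAttention's speed comes from fused, IO-aware CUDA kernels, but the math it relies on, an online softmax computed over key/value tiles so the full (n, n) score matrix is never materialized, can be sketched in NumPy. This is a toy illustration of that trick only, not the library's API:

```python
import numpy as np

def naive_attention(Q, K, V):
    s = Q @ K.T / np.sqrt(Q.shape[-1])
    e = np.exp(s - s.max(-1, keepdims=True))
    return (e / e.sum(-1, keepdims=True)) @ V

def tiled_attention(Q, K, V, block=16):
    """Online-softmax tiling: stream over key/value blocks, keeping a
    running row max m and running denominator l per query, and rescale
    the partial output whenever the max improves."""
    n, d = Q.shape
    scale = 1 / np.sqrt(d)
    out = np.zeros((n, d))
    m = np.full(n, -np.inf)                      # running row max
    l = np.zeros(n)                              # running softmax denominator
    for j in range(0, n, block):
        S = Q @ K[j:j + block].T * scale         # (n, b) scores for this tile
        m_new = np.maximum(m, S.max(axis=1))
        correction = np.exp(m - m_new)           # rescale old partial sums
        P = np.exp(S - m_new[:, None])
        l = l * correction + P.sum(axis=1)
        out = out * correction[:, None] + P @ V[j:j + block]
        m = m_new
    return out / l[:, None]

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(64, 32)) for _ in range(3))
assert np.allclose(tiled_attention(Q, K, V), naive_attention(Q, K, V))
```

The tiled result matches the naive computation exactly, which is why FlashAttention is "exact attention" rather than an approximation.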
attention
Jianlin Su (苏剑林) @bojone

Some attention implementations

Python · 1.45k stars
6 years ago
ResNeSt
@zhanghang1989

#computer-science# ResNeSt: Split-Attention Networks

deep-learning · resnet · resnest · PyTorch · detectron-models
Python · 3.26k stars
3 years ago
Image_Segmentation
@LeeJunHyun

PyTorch implementation of U-Net, R2U-Net, Attention U-Net, and Attention R2U-Net.

Python · 2.93k stars
2 years ago
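The attention gate that distinguishes Attention U-Net from plain U-Net uses the decoder's gating signal to decide, per spatial position, how much of the encoder skip feature to pass through. A simplified NumPy sketch (real gates use 1x1 convolutions and resampling; the dense weights and names here are mine):

```python
import numpy as np

def attention_gate(x, g, Wx, Wg, psi):
    """Attention U-Net gate: x is the encoder skip feature, g the decoder
    gating signal, both (C, H, W) at the same resolution here."""
    q = np.einsum('oc,chw->ohw', Wx, x) + np.einsum('oc,chw->ohw', Wg, g)
    q = np.maximum(q, 0)                                        # ReLU
    alpha = 1 / (1 + np.exp(-np.einsum('c,chw->hw', psi, q)))   # (H, W) in (0, 1)
    return x * alpha[None, :, :]                                # gate the skip path

rng = np.random.default_rng(0)
C, Ci, H, W_ = 8, 4, 6, 6
x = rng.normal(size=(C, H, W_))
g = rng.normal(size=(C, H, W_))
Wx = rng.normal(size=(Ci, C)) * 0.1
Wg = rng.normal(size=(Ci, C)) * 0.1
psi = rng.normal(size=Ci)
out = attention_gate(x, g, Wx, Wg, psi)
print(out.shape)  # (8, 6, 6)
```

The scalar map alpha is what the paper visualizes as the attention coefficients over the image.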
ringattention
@haoliuhl

Large Context Attention

large-language-models · long-context · memory-efficient · transformers
Python · 717 stars
6 months ago
Neighborhood-Attention-Transformer
@SHI-Labs

Neighborhood Attention Transformer, arXiv 2022 / CVPR 2023. Dilated Neighborhood Attention Transformer, arXiv 2022

PyTorch
Python · 1.12k stars
1 year ago
Various-Attention-mechanisms
@monk1337

This repository contains various types of attention mechanisms, such as Bahdanau, soft, additive, and hierarchical attention, in PyTorch, TensorFlow, and Keras.

attention-mechanism · attention · Keras · PyTorch · attention-model
Python · 126 stars
4 years ago
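Of the mechanisms this repo collects, Bahdanau-style additive attention is the easiest to show end to end: score each position with a small feed-forward network, softmax over positions, and take the weighted sum. A NumPy sketch (parameter names and sizes are illustrative):

```python
import numpy as np

def additive_attention(query, keys, Wq, Wk, v):
    """Bahdanau-style additive attention:
    score(q, k_i) = v . tanh(Wq q + Wk k_i), softmax over positions,
    then a weighted sum of the keys (used as values here)."""
    scores = np.tanh(query @ Wq + keys @ Wk) @ v   # (n,)
    e = np.exp(scores - scores.max())
    weights = e / e.sum()                          # (n,), sums to 1
    return weights @ keys, weights

rng = np.random.default_rng(0)
d, a, n = 8, 16, 5                                 # model dim, attn dim, positions
query = rng.normal(size=d)
keys = rng.normal(size=(n, d))
Wq = rng.normal(size=(d, a))
Wk = rng.normal(size=(d, a))
v = rng.normal(size=a)
context, weights = additive_attention(query, keys, Wq, Wk, v)
print(context.shape)  # (8,)
```

Unlike dot-product attention, the score here comes from a learned tanh layer, which is why it is called "additive".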
MoH
@SkyworkAI

MoH: Multi-Head Attention as Mixture-of-Head Attention

attention · dit · large-language-models · mixture-of-experts · moe
Python · 259 stars
8 months ago
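A toy sketch of the routing idea behind mixture-of-head attention (not the paper's exact formulation): a per-token router scores the heads, keeps only the top-k, and mixes those heads' outputs with the normalized router weights. All names and shapes below are illustrative:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mixture_of_heads(head_outputs, router_logits, top_k=2):
    """head_outputs: (h, n, d); router_logits: (n, h).
    Per token, keep the top_k heads and mix them with softmax weights."""
    h, n, d = head_outputs.shape
    order = np.argsort(router_logits, axis=1)      # ascending per token
    masked = np.full_like(router_logits, -np.inf)  # -inf => weight 0 after softmax
    rows = np.arange(n)[:, None]
    topk_idx = order[:, -top_k:]
    masked[rows, topk_idx] = router_logits[rows, topk_idx]
    w = softmax(masked, axis=1)                    # (n, h), zero off the top-k
    return np.einsum('nh,hnd->nd', w, head_outputs)

rng = np.random.default_rng(0)
h, n, d = 8, 10, 16
heads = rng.normal(size=(h, n, d))
logits = rng.normal(size=(n, h))
out = mixture_of_heads(heads, logits, top_k=2)
print(out.shape)  # (10, 16)
```

The MoE-style payoff is that each token only pays for its selected heads; standard multi-head attention is the special case where every head gets equal weight.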
Attention-OCR
@da03

Visual Attention based OCR

Python · 1.12k stars
7 years ago
EANet
MenghaoGuo @MenghaoGuo

External Attention Network

Python · 403 stars
3 years ago
AoANet
@husthuaan

Code for the paper "Attention on Attention for Image Captioning" (ICCV 2019)

image-captioning · attention-mechanism · iccv2019
Python · 333 stars
4 years ago
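The AoA module itself is small: from the query and the attention result it computes an "information" vector and a sigmoid gate, and returns their elementwise product, so the model can suppress attention results that are irrelevant to the query. A NumPy sketch (single vector, no batching; names are mine):

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def attention_on_attention(q, v_hat, Wi, Wg):
    """AoA module: concatenate the query q with the attention result v_hat,
    compute an information vector and a sigmoid gate, return gate * info."""
    cat = np.concatenate([q, v_hat])  # (2d,)
    info = Wi @ cat                   # (d,) candidate information
    gate = sigmoid(Wg @ cat)          # (d,) in (0, 1): how much to keep
    return gate * info

rng = np.random.default_rng(0)
d = 16
q = rng.normal(size=d)
v_hat = rng.normal(size=d)            # output of an ordinary attention step
Wi = rng.normal(size=(d, 2 * d)) * 0.1
Wg = rng.normal(size=(d, 2 * d)) * 0.1
out = attention_on_attention(q, v_hat, Wi, Wg)
print(out.shape)  # (16,)
```

Stacking this on top of standard attention is the "attention on attention" of the title.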
attention-module
@Jongchan

Official PyTorch code for "BAM: Bottleneck Attention Module (BMVC2018)" and "CBAM: Convolutional Block Attention Module (ECCV2018)"

Python · 2.16k stars
2 years ago
LSTM-Attention
@negar-rostamzadeh

LSTM-Attention

Python · 74 stars
8 years ago
awesome-fast-attention (archived)
@Separius

List of efficient attention modules

transformer · attention · Awesome Lists · reformer · longformer
Python · 1.01k stars
4 years ago
GAT
@PetarV-

Graph Attention Networks (https://arxiv.org/abs/1710.10903)

graph-attention-networks · attention-mechanism · self-attention · Tensorflow · neural-networks
Python · 3.39k stars
3 years ago
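The core of a GAT layer also fits in a few lines: score each edge with a learned linear form over the transformed endpoint features, apply LeakyReLU, softmax over each node's neighbors, and aggregate. A single-head NumPy sketch following the paper's scoring rule (the weights and the toy graph are illustrative):

```python
import numpy as np

def gat_layer(X, A, W, a_src, a_dst):
    """Single-head graph attention layer (Velickovic et al.):
    e_ij = LeakyReLU(a_src . (W x_i) + a_dst . (W x_j)),
    softmax over each node's neighbors, then weighted aggregation."""
    H = X @ W                                          # (n, d_out)
    e = (H @ a_src)[:, None] + (H @ a_dst)[None, :]    # (n, n) pairwise scores
    e = np.where(e > 0, e, 0.2 * e)                    # LeakyReLU(slope 0.2)
    e = np.where(A > 0, e, -np.inf)                    # mask non-edges
    w = np.exp(e - e.max(axis=1, keepdims=True))
    w = w / w.sum(axis=1, keepdims=True)               # attention over neighbors
    return w @ H

rng = np.random.default_rng(0)
n, d_in, d_out = 6, 8, 4
X = rng.normal(size=(n, d_in))
# path graph with self-loops as the adjacency matrix
A = np.eye(n) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
W = rng.normal(size=(d_in, d_out))
a_src = rng.normal(size=d_out)
a_dst = rng.normal(size=d_out)
out = gat_layer(X, A, W, a_src, a_dst)
print(out.shape)  # (6, 4)
```

The full model runs several such heads in parallel and concatenates them, exactly the multi-head pattern from the transformer entries above applied to graph neighborhoods.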