List of efficient attention modules
Abstractive and extractive text summarization using Transformers.
Master's thesis with code investigating methods for incorporating long-context reasoning in low-resource languages without the need to pre-train from scratch. We investigated if multilingual models cou...
2020 AI研习社 financial user review classification competition
#NLP# Convert pretrained RoBERTa models to various long-document Transformer models
Longformer Encoder Decoder model for the legal domain, trained for long document abstractive summarization task.
#NLP# Using Transformers for text classification.
#NLP# This GitHub repository implements a novel approach for detecting Initial Public Offering (IPO) underpricing using pre-trained Transformers. The models, extended to handle large S-1 filings, leverage b...
#NLP# Industrial text scoring using multimodal deep natural language processing 🚀 | Code for IEA AIE 2022 paper
#NLP# [13th Tobigs Conference] YoYAK - Yes or Yes, attention with gap-sentences for long Korean sequences
#NLP# Kaggle NLP competition - top 2% solution (36/2060)
#NLP# This project applies the Longformer model to sentiment analysis using the IMDB movie review dataset. The Longformer model, introduced in "Longformer: The Long-Document Transformer," tackles long docum...
Fine-tuned Longformer for Summarization of Machine Learning Articles
A summarization website that can generate summaries from either YouTube videos or PDF files.
Project as part of COMP34812: Natural Language Understanding
A hyperpartisan news article classification system using BERT-based techniques. The goal was to leverage state-of-the-art transformer models like BERT, RoBERTa, and Longformer to accurately classify n...
#NLP# This project was developed for a Kaggle competition focused on detecting Personally Identifiable Information (PII) in student writing. The primary objective was to build a robust model capable of iden...
Focus - Understanding contextual retrievability.
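Most of the repositories above build on Longformer-style efficient attention, which replaces full quadratic self-attention with a sliding local window plus a few globally-attending tokens. The following is a minimal, hedged sketch of that attention pattern as a boolean mask; it is an illustration of the general idea, not code from any listed project, and the function name `longformer_mask` is hypothetical.

```python
# Sketch of the sliding-window + global attention pattern used by
# Longformer-style efficient attention modules (illustrative only).

def longformer_mask(seq_len, window, global_positions=()):
    """Return a seq_len x seq_len boolean mask: True where attention is allowed.

    Each token attends to neighbours within `window` positions on each side;
    tokens in `global_positions` attend to, and are attended by, every token.
    """
    # Local band: token i sees tokens j with |i - j| <= window.
    mask = [[abs(i - j) <= window for j in range(seq_len)]
            for i in range(seq_len)]
    for g in global_positions:
        for j in range(seq_len):
            mask[g][j] = True   # global token attends everywhere
            mask[j][g] = True   # every token attends to the global token
    return mask

mask = longformer_mask(8, window=1, global_positions=[0])
# Token 3 sees its window (2, 3, 4) plus the global token at position 0.
print([j for j, ok in enumerate(mask[3]) if ok])  # → [0, 2, 3, 4]
```

With a fixed window this mask has O(seq_len × window) allowed entries instead of O(seq_len²), which is what lets these projects scale BERT-class models to long documents.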