Chinese generative pre-trained language models
Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
A repo to explore different NLP tasks which can be solved using T5
Tools and scripts for experimenting with Transformers: BERT, T5...
Summarization task using BART and T5 models (see the text-to-text generation sketch after this list).
Japanese T5 model
Question Generation using Google T5 and Text2Text
The GNUVario source code for the TTGO-T5
Kernel for Lenovo K3-Note (K50-T5)
Fine-tune a T5 transformer model using PyTorch & Transformers🤗 (see the fine-tuning sketch after this list)
Device tree for Lenovo K3-Note (K50-T5)
Fast & Simple repository for pre-training and fine-tuning T5-style models
Fill in masked spans with the Mengzi T5 pretrained model
Demo of the T5 model on various pre-trained tasks.
Prompt Fine-tuning on GLM, BART and Flan-T5.
A paraphrase generator built on the T5 model that produces paraphrased English sentences.
Zero-shot NLU & NLG based on the mengzi-t5-base-mt pretrained model
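Several of the entries above (summarization, question generation, paraphrasing) reduce to T5's text-to-text interface. The sketch below shows that general pattern with the Hugging Face Transformers library; the `t5-small` checkpoint, the `summarize:` prefix, and the input text are illustrative placeholders, not what any particular repo above ships.

```python
# Minimal sketch: text-to-text generation with a T5-style checkpoint.
# "t5-small" is an example checkpoint; swap in the checkpoint a given repo uses.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-small"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# T5 selects the task via a text prefix; "summarize:" is one of the prefixes
# used during its original pre-training.
text = ("summarize: T5 casts every NLP task as text-to-text, so summarization, "
        "paraphrasing and question generation all reduce to feeding a prefixed "
        "input string and decoding the generated output string.")
inputs = tokenizer(text, return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, max_new_tokens=60, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Only the prefix (or prompt) and the checkpoint change between these tasks; the load-encode-generate-decode pattern stays the same.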
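For the fine-tuning repos (e.g. the PyTorch & Transformers item referenced above), the core is a plain seq2seq training loop. This is a minimal sketch assuming the Hugging Face Transformers library; the toy translation pairs, `t5-small`, and the hyperparameters are placeholders for illustration only.

```python
# Minimal sketch: fine-tuning a T5 model with plain PyTorch + Transformers.
import torch
from torch.utils.data import DataLoader
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-small"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Toy (input, target) pairs; a real run would load these from a dataset.
pairs = [
    ("translate English to German: Hello world", "Hallo Welt"),
    ("translate English to German: How are you?", "Wie geht es dir?"),
]

def collate(batch):
    inputs = tokenizer([x for x, _ in batch], padding=True, return_tensors="pt")
    labels = tokenizer([y for _, y in batch], padding=True, return_tensors="pt").input_ids
    labels[labels == tokenizer.pad_token_id] = -100  # ignore padding in the loss
    inputs["labels"] = labels
    return inputs

loader = DataLoader(pairs, batch_size=2, shuffle=True, collate_fn=collate)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

model.train()
for epoch in range(3):
    for batch in loader:
        loss = model(**batch).loss  # seq2seq cross-entropy computed by the model
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```

The same loop applies to any of the T5-style checkpoints listed above (Mengzi-T5, Flan-T5, Japanese T5), since they share the seq2seq interface; only the tokenizer, data, and checkpoint name change.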