#Computer Science# OpenMMLab Pre-training Toolbox and Benchmark
#Natural Language Processing# UniLM: a large-scale self-supervised pre-trained model across tasks, languages, and modalities
#Natural Language Processing# Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Grounded Language-Image Pre-training
Self-Supervised Speech Pre-training and Representation Learning Toolkit
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Video PreTraining (VPT): Learning to Act by Watching Unlabeled Online Videos
Strategies for Pre-training Graph Neural Networks
#Natural Language Processing# ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators (see the first sketch after this list)
Multi-modality pre-training
#Large Language Model# PhoGPT: Generative Pre-training for Vietnamese (2023)
TAPEX: Table Pre-training via Learning a Neural SQL Executor (ICLR 2022 paper; state-of-the-art table pre-training model; see the second sketch after this list)
A short tutorial on ELMo training (pre-trained, training on new data, incremental training)
Chinese Transformer Generative Pre-Training Model
Unsupervised Pre-training for Person Re-identification (LUPerson)
Code for ALBEF: a new vision-language pre-training method
MASS: Masked Sequence to Sequence Pre-training for Language Generation (see the third sketch after this list)
Unified-Modal Speech-Text Pre-Training for Spoken Language Processing
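The ELECTRA entry above names its core idea: pre-train the text encoder as a discriminator that detects replaced tokens, rather than as a masked-language-model generator. Below is a minimal PyTorch sketch of that replaced-token-detection objective, not the repository's code; the tiny encoder, toy vocabulary size, and module names are illustrative assumptions (the 50x discriminator weight follows the ELECTRA paper).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM, MASK_ID = 1000, 64, 0  # toy sizes; real ELECTRA uses a BERT-style transformer

class TinyEncoder(nn.Module):
    """Stand-in for a transformer encoder (illustrative only)."""
    def __init__(self, out_dim):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, DIM)
        self.head = nn.Linear(DIM, out_dim)
    def forward(self, ids):
        return self.head(self.emb(ids))

generator = TinyEncoder(VOCAB)   # small masked LM that proposes replacement tokens
discriminator = TinyEncoder(1)   # main encoder: classifies each token as original vs. replaced

def electra_step(ids, mask_prob=0.15):
    # 1) Mask a random subset of positions and train the generator to fill them in.
    masked = torch.rand(ids.shape) < mask_prob
    gen_logits = generator(ids.masked_fill(masked, MASK_ID))
    gen_loss = F.cross_entropy(gen_logits[masked], ids[masked])
    # 2) Sample the generator's predictions to build a corrupted input (no grad through sampling).
    samples = torch.distributions.Categorical(logits=gen_logits[masked].detach()).sample()
    corrupted = ids.clone()
    corrupted[masked] = samples
    # 3) The discriminator labels every token: 0 = original, 1 = replaced.
    #    Positions where the generator happened to sample the true token count as original.
    labels = (corrupted != ids).float()
    disc_logits = discriminator(corrupted).squeeze(-1)
    disc_loss = F.binary_cross_entropy_with_logits(disc_logits, labels)
    return gen_loss + 50.0 * disc_loss  # ELECTRA up-weights the discriminator loss

loss = electra_step(torch.randint(1, VOCAB, (8, 128)))
loss.backward()
```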
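The TAPEX entry describes a table pre-training model that learns to imitate a SQL executor. The sketch below uses the Hugging Face Transformers port of TAPEX; the checkpoint name and the toy table are assumptions for illustration, not the repository's own example.

```python
import pandas as pd
from transformers import TapexTokenizer, BartForConditionalGeneration

# Checkpoint name is an assumption; TAPEX checkpoints are published under the "microsoft" org on the Hub.
ckpt = "microsoft/tapex-base-finetuned-wtq"
tokenizer = TapexTokenizer.from_pretrained(ckpt)
model = BartForConditionalGeneration.from_pretrained(ckpt)

# Toy table (all cell values as strings) and a natural-language question over it.
table = pd.DataFrame({"city": ["Hanoi", "Da Nang"], "population": ["8.1 million", "1.2 million"]})
query = "which city has the larger population?"

# TAPEX flattens the table and the question into one sequence and generates the answer,
# behaving like the neural SQL executor it was pre-trained to imitate.
encoding = tokenizer(table=table, query=query, return_tensors="pt")
answer_ids = model.generate(**encoding)
print(tokenizer.batch_decode(answer_ids, skip_special_tokens=True))
```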
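The MASS entry describes masked sequence-to-sequence pre-training: a contiguous span of the encoder input is masked, and the decoder is trained to reconstruct exactly that span. The sketch below builds one such training pair; the special-token ids and the simplification of feeding the decoder only the shifted span are assumptions, not the repository's exact preprocessing.

```python
import torch

MASK_ID, PAD_ID = 3, 0  # assumed special-token ids

def mass_pair(sent, mask_ratio=0.5):
    """Build one MASS training example from a 1-D tensor of token ids.

    Encoder input: the sentence with a contiguous span replaced by [MASK].
    Decoder target: exactly that span; decoder input is the span shifted right.
    """
    n = sent.size(0)
    span = max(1, int(n * mask_ratio))           # MASS masks roughly half the sentence
    start = torch.randint(0, n - span + 1, (1,)).item()
    enc_input = sent.clone()
    enc_input[start:start + span] = MASK_ID
    target = sent[start:start + span]
    dec_input = torch.cat([torch.tensor([PAD_ID]), target[:-1]])
    return enc_input, dec_input, target

enc_input, dec_input, target = mass_pair(torch.tensor([11, 12, 13, 14, 15, 16, 17, 18]))
print(enc_input, dec_input, target)
```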