#NLP# UniLM: a large-scale self-supervised pre-trained model unified across tasks, languages, and modalities
#NLP# An open-source framework for prompt-learning.
A Lite BERT for Self-Supervised Learning of Language Representations; large-scale Chinese pre-trained ALBERT models
#Data Warehouse# Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpus, and leaderboard
#NLP# Pre-trained Chinese ELECTRA models
[ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pretraining on any convolutional network; PyTorch implementation of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling"
[CVPR 2021] Involution: Inverting the Inherence of Convolution for Visual Recognition, a brand-new neural operator
#Computer Science# [ICML 2023] Official PyTorch implementation of Global Context Vision Transformers