#Computer Science# Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
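A minimal sketch of how loralib is typically wired into a PyTorch model, following the usage pattern shown in the repository README; the layer sizes and rank below are illustrative, not prescribed:

```python
import torch
import torch.nn as nn
import loralib as lora

# Toy model whose output projection is a LoRA-augmented linear layer:
# the frozen weight W is supplemented by trainable rank-16 matrices A and B.
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    lora.Linear(256, 10, r=16),  # only the rank-16 update is trained
)

# Freeze all parameters except the LoRA ones before fine-tuning.
lora.mark_only_lora_as_trainable(model)

# After training, checkpoint only the (small) LoRA weights.
torch.save(lora.lora_state_dict(model), "lora_ckpt.pt")
```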
#Natural Language Processing# Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm series of models)
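A minimal loading sketch, assuming the `hfl/chinese-bert-wwm-ext` checkpoint published on the Hugging Face hub (see the repository for the full list of released models):

```python
from transformers import BertTokenizer, BertModel

# The wwm models are loaded with the standard BERT classes.
tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm-ext")
model = BertModel.from_pretrained("hfl/chinese-bert-wwm-ext")

inputs = tokenizer("使用整词掩码预训练的中文BERT", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768)
```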
#自然语言处理#BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
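A minimal sketch of rendering BertViz's head view in a Jupyter notebook; the model and input sentence are placeholders:

```python
from transformers import BertTokenizer, BertModel
from bertviz import head_view

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)

inputs = tokenizer("The cat sat on the mat", return_tensors="pt")
attention = model(**inputs).attentions  # tuple of per-layer attention tensors
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])

# Renders an interactive view of every attention head (notebook only).
head_view(attention, tokens)
```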
#Natural Language Processing# Awesome Pretrained Chinese NLP Models: a curated collection of high-quality Chinese pre-trained models, large models, multimodal models, and large language models
ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations, with large-scale pre-trained Chinese ALBERT models
RoBERTa for Chinese: pre-trained Chinese RoBERTa models
The implementation of DeBERTa (Decoding-enhanced BERT with Disentangled Attention)
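A minimal sketch of loading a DeBERTa checkpoint through Hugging Face `transformers` rather than the repository's own training code; the `microsoft/deberta-base` model id is an assumption based on the publicly released checkpoints:

```python
import torch
from transformers import DebertaTokenizer, DebertaModel

tokenizer = DebertaTokenizer.from_pretrained("microsoft/deberta-base")
model = DebertaModel.from_pretrained("microsoft/deberta-base")

inputs = tokenizer("DeBERTa uses disentangled attention.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768)
```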
CLUENER2020: Fine-Grained Named Entity Recognition for Chinese
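A minimal sketch of converting one CLUENER2020 record to character-level BIO tags; the `{"text": ..., "label": {category: {entity: [[start, end], ...]}}}` layout follows the format documented in the repository, and the file name is a placeholder:

```python
import json

def to_bio(record):
    """Turn one CLUENER-style record into (characters, BIO tags)."""
    text = record["text"]
    tags = ["O"] * len(text)
    for category, entities in record.get("label", {}).items():
        for spans in entities.values():
            for start, end in spans:  # inclusive character offsets
                tags[start] = f"B-{category}"
                for i in range(start + 1, end + 1):
                    tags[i] = f"I-{category}"
    return list(text), tags

with open("train.json", encoding="utf-8") as f:
    for line in f:
        chars, tags = to_bio(json.loads(line))
        print(list(zip(chars, tags))[:5])
        break
```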