A distilled RoBERTa-wwm base model, distilled from RoBERTa-wwm with RoBERTa-wwm-large as the teacher
#NLP# Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series)
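As a quick orientation for this wwm series: the HFL Chinese *-wwm checkpoints (including chinese-roberta-wwm-ext) use the BERT architecture, so they are loaded with the Bert* classes rather than the Roberta* classes. A minimal fill-mask sketch, assuming the Hugging Face transformers library:

```python
# Minimal loading sketch for an HFL Chinese wwm checkpoint.
# Note: despite the "roberta" in the name, the model uses the BERT architecture,
# so Bert* classes (not Roberta*) are the right loaders.
from transformers import BertTokenizer, BertForMaskedLM, pipeline

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext")
model = BertForMaskedLM.from_pretrained("hfl/chinese-roberta-wwm-ext")

fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill("今天天气真[MASK]。"))  # top predictions for the masked position
```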
Chinese emotion recognition on Weibo posts based on the RoBERTa-wwm-ext model
Fine-tune pretrained language models for multi-label classification (can load well-known open-source TF-format models such as BERT, RoBERTa, BERT-wwm, and ALBERT)
A personal text classifier built on Google's open-source BERT (fine-tuning based); it can freely load well-known pretrained NLP language models: BERT, BERT-wwm, RoBERTa, ALBERT, and ERNIE 1.0
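For the two fine-tuning entries above, a minimal multi-label classification sketch, assuming the Hugging Face transformers library (PyTorch here, rather than the TF-format loaders those repos use); the checkpoint name and label count are placeholders:

```python
# Multi-label fine-tuning sketch: BCE-with-logits loss over multi-hot targets.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

model_name = "hfl/chinese-roberta-wwm-ext"   # swap in BERT-wwm / ALBERT / ERNIE as needed
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertForSequenceClassification.from_pretrained(
    model_name,
    num_labels=6,                                  # e.g. six emotion labels (placeholder)
    problem_type="multi_label_classification",     # BCE-with-logits instead of softmax
)

batch = tokenizer(["这部电影太好看了!"], return_tensors="pt", padding=True)
labels = torch.tensor([[1., 0., 0., 1., 0., 0.]])  # multi-hot targets
loss = model(**batch, labels=labels).loss
loss.backward()  # an optimizer step would follow in a real training loop
```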
PyTorch - ALBERT Large v2, BERT Base Uncased, BERT Large Uncased WWM fine-tuned on SQuAD, DistilRoBERTa Base, RoBERTa Base SQuAD2, RoBERTa Large SQuAD2
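For the SQuAD-style checkpoints in that entry, a short extractive-QA sketch, assuming the transformers pipeline API and the public deepset/roberta-base-squad2 checkpoint (any of the listed SQuAD-tuned models works the same way):

```python
# Extractive question answering with a SQuAD2-tuned RoBERTa model.
from transformers import pipeline

qa = pipeline("question-answering", model="deepset/roberta-base-squad2")
result = qa(
    question="What does WWM stand for?",
    context="Whole Word Masking (WWM) masks all subword pieces of a word together during pre-training.",
)
print(result["answer"], result["score"])  # answer span and confidence
```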
chinese-roberta-wwm-ext-large with the MLM parameters fixed
Chinese NER task with BERT/RoBERTa/MacBERT/BERT-wwm for Keras
Pretrain your own BERT/RoBERTa model with whole-word-masking MLM and a modified tokenizer (both Chinese and English).
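A minimal whole-word-masking MLM sketch for that kind of pretraining, assuming the transformers DataCollatorForWholeWordMask (which groups WordPiece sub-tokens by their "##" prefix; Chinese WWM additionally needs a word-segmentation reference, omitted here), with the checkpoint name as a placeholder:

```python
# Whole-word-masking MLM: all pieces of a word are masked together.
from transformers import BertTokenizer, BertForMaskedLM, DataCollatorForWholeWordMask

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
collator = DataCollatorForWholeWordMask(tokenizer=tokenizer, mlm_probability=0.15)

encodings = [tokenizer("tokenization splits words into subword pieces")]
batch = collator(encodings)   # padded input_ids with whole words masked + MLM labels
loss = model(**batch).loss    # plug into a Trainer / training loop for real pretraining
```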
Chinese coreference resolution using the chinese-roberta-wwm-ext pretrained model on the CLUE WSC2020 dataset
Chinese pretrained RoBERTa models: RoBERTa for Chinese
Chinese coreference resolution: based on HFL's pretrained model chinese-roberta-wwm-ext, trained and evaluated on the CLUE WSC2020 dataset
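One common way the WSC2020 entries above frame coreference is as binary classification over a sentence with the candidate span and the pronoun marked; a minimal sketch assuming transformers and the hfl/chinese-roberta-wwm-ext checkpoint (the marking symbols here are illustrative, not necessarily what these repos use):

```python
# WSC-style coreference as binary sentence classification.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

model_name = "hfl/chinese-roberta-wwm-ext"
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Mark the candidate antecedent with 【】 and the pronoun with 《》, then classify.
text = "【小明】把书放回了书架,因为《他》已经读完了。"
batch = tokenizer(text, return_tensors="pt")
logits = model(**batch).logits
print(torch.softmax(logits, dim=-1))  # P(not coreferent), P(coreferent)
```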
RoBERTa models for Polish
RoBERTa training for SQuAD
Replication package for RoBERTa
Roberta Lab connector for ev3dev
Named Entity Recognition with Pretrained XLM-RoBERTa
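A short inference sketch for XLM-RoBERTa NER, assuming the transformers pipeline API; the checkpoint name is a placeholder for any XLM-R token-classification model:

```python
# Token-classification (NER) with an XLM-RoBERTa checkpoint.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="Davlan/xlm-roberta-base-ner-hrl",  # placeholder XLM-R NER checkpoint
    aggregation_strategy="simple",            # merge sub-tokens into entity spans
)
print(ner("Angela Merkel visited Beijing in 2014."))
```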
BERT / RoBERTa named entity recognition (NER)
A PyTorch implementation of a BiLSTM/BERT/RoBERTa (+CRF) model for Named Entity Recognition.
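A compact BiLSTM-CRF skeleton of the kind that entry describes, assuming PyTorch and the pytorch-crf package (imported as torchcrf); vocabulary and tag sizes are placeholders, and a BERT/RoBERTa encoder can replace the embedding + BiLSTM stack while keeping the same CRF head:

```python
# BiLSTM-CRF skeleton for sequence labeling (NER).
import torch
import torch.nn as nn
from torchcrf import CRF  # from the pytorch-crf package

class BiLSTMCRF(nn.Module):
    def __init__(self, vocab_size=21128, tag_size=9, emb_dim=128, hidden=256):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.lstm = nn.LSTM(emb_dim, hidden // 2, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(hidden, tag_size)        # per-token emission scores
        self.crf = CRF(tag_size, batch_first=True)   # transition scores + Viterbi decoding

    def forward(self, input_ids, tags=None, mask=None):
        emissions = self.fc(self.lstm(self.emb(input_ids))[0])
        if tags is not None:                          # training: negative log-likelihood
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        return self.crf.decode(emissions, mask=mask)  # inference: best tag sequence

model = BiLSTMCRF()
ids = torch.randint(1, 21128, (2, 10))
print(model(ids))  # decoded tag indices for a random batch
```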