#Natural Language Processing# Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm series of models)
A text classification template based on pre-trained models (BERT, BERT-wwm); ranked 4th of 2,735 on leaderboard A of the CCF BDCI news sentiment analysis competition.
A fine-tuning-based text classifier built on Google's open-source BERT that can freely load well-known pre-trained language models such as BERT, BERT-wwm, RoBERTa, ALBERT, and ERNIE 1.0.
Fine-tunes pre-trained language models for multi-label classification tasks (can load well-known open-source TensorFlow-format models such as BERT, RoBERTa, BERT-wwm, and ALBERT).
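The three entries above all fine-tune a pre-trained Chinese checkpoint for classification. As a rough illustration only (none of these repos necessarily use this code path), here is a minimal sketch with the Hugging Face transformers library; the checkpoint name hfl/chinese-bert-wwm-ext and the label count are assumptions.

```python
# Minimal sketch (not the repos' own code): load a Chinese pre-trained
# checkpoint for multi-label classification with Hugging Face transformers.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

model_name = "hfl/chinese-bert-wwm-ext"  # could also be a RoBERTa-wwm or ALBERT checkpoint
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertForSequenceClassification.from_pretrained(
    model_name,
    num_labels=5,                                # assumed label count
    problem_type="multi_label_classification",   # uses BCE-with-logits loss during training
)

inputs = tokenizer("这部电影的特效和剧情都很出色", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.sigmoid(logits)  # independent per-label probabilities
print(probs)
```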
Chinese whole word masking (wwm) and n-gram masking based on jieba segmentation.
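As a rough sketch of the idea behind jieba-based whole word masking (not the repo's actual code): jieba segments the sentence into words, non-initial characters are tagged with the "##" continuation marker, and the masker then replaces whole words rather than single characters. The function names and the 15% masking probability below are illustrative.

```python
# Sketch: derive whole-word-masking candidates from jieba segmentation,
# then mask entire words at once instead of individual characters.
import random
import jieba

def wwm_candidates(text):
    tokens = []
    for word in jieba.cut(text):
        for i, ch in enumerate(word):
            tokens.append(ch if i == 0 else "##" + ch)  # "##" marks a word continuation
    return tokens

def mask_whole_words(tokens, mask_prob=0.15):
    output = list(tokens)
    i = 0
    while i < len(tokens):
        j = i + 1
        while j < len(tokens) and tokens[j].startswith("##"):  # span of one segmented word
            j += 1
        if random.random() < mask_prob:
            for k in range(i, j):
                output[k] = "[MASK]"
        i = j
    return output

print(mask_whole_words(wwm_candidates("使用全词遮罩预训练中文模型")))
```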
We released BERT-wwm, a Chinese pre-trained model based on Whole Word Masking, along with models closely related to this technique.
Fine-tunes BERT on Chinese-domain data without the TensorFlow Estimator API, applying character-level masking and whole-word masking (wwm) separately.
Technical Innovation Award solution from the CCKS 2019 Chinese short-text entity linking competition.
From jieba word segmentation to BERT-wwm: a step-by-step introduction to Chinese NLP.
A distilled RoBERTa-wwm base model, distilled from RoBERTa-wwm-large (the large model serves as the teacher).
BERT-NER (nert-bert) with Google BERT: https://github.com/google-research.
KG-BERT: BERT for Knowledge Graph Completion
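KG-BERT treats a knowledge-graph triple as a text sequence and lets BERT score its plausibility. The sketch below only illustrates that serialization: the bert-base-uncased checkpoint, the example triple, and the two-segment encoding are assumptions (the paper builds the input from entity and relation descriptions, each separated by [SEP]).

```python
# Sketch of the KG-BERT idea: serialize a (head, relation, tail) triple as a
# single BERT input and classify whether the triple is plausible.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

head, relation, tail = "Steve Jobs", "founded", "Apple Inc."
# KG-BERT feeds "[CLS] head [SEP] relation [SEP] tail [SEP]"; here relation and
# tail share one segment id to keep the sketch short.
inputs = tokenizer(head, f"{relation} [SEP] {tail}", return_tensors="pt")
with torch.no_grad():
    score = torch.softmax(model(**inputs).logits, dim=-1)[0, 1]  # P(triple is plausible)
print(float(score))
```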
BERT-related papers
State-of-the-Art Text Embeddings (sentence embeddings with BERT and XLNet)
A Keras TensorFlow 2.0 implementation of BERT, ALBERT and adapter-BERT.
chinese-roberta-wwm-ext-large with repaired MLM (masked language model) parameters
Google AI 2018 BERT pytorch implementation
BERT distillation (distillation experiments based on BERT)
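For context on the two distillation entries above, here is a generic sketch of a logit-distillation loss (soft targets from a teacher such as RoBERTa-wwm-large, combined with cross-entropy on gold labels). The temperature, weighting, and toy tensors are assumptions, not the repos' settings.

```python
# Sketch of soft-target knowledge distillation: KL divergence between
# temperature-softened teacher and student logits, mixed with hard-label CE.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients stay comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# toy example: 4 samples, 3 classes
student = torch.randn(4, 3, requires_grad=True)
teacher = torch.randn(4, 3)
labels = torch.tensor([0, 2, 1, 0])
print(distillation_loss(student, teacher, labels))
```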
Transformer-related optimization, including BERT and GPT.
BERT classification and BERT-DSSM implementation with Keras.
bert-base-chinese example
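For the last entry, a minimal masked-language-model example with bert-base-chinese via the Hugging Face transformers pipeline (one possible way to use the checkpoint, not necessarily the repo's code):

```python
# Load bert-base-chinese and predict the masked character.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-chinese")
for pred in fill("北京是中国的[MASK]都。"):
    print(pred["token_str"], round(pred["score"], 3))
```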