Keras implementation of Transformer models
BertViz: Visualize attention in NLP models (BERT, GPT-2, BART, etc.)
PyTorch implementation of Google AI's 2018 BERT
Code for the paper "Fine-tune BERT for Extractive Summarization"
Leveraging BERT and class-based TF-IDF (c-TF-IDF) to create easily interpretable topics
Pre-training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series)
Implementation of BERT that can load the official pre-trained models for feature extraction and prediction
BERT-NER (nert-bert) with Google BERT (https://github.com/google-research)
KG-BERT: BERT for Knowledge Graph Completion
BERT-related papers
State-of-the-art text embeddings (sentence embeddings with BERT & XLNet)
A Keras TensorFlow 2.0 implementation of BERT, ALBERT and adapter-BERT.
BERT distillation (distillation experiments based on BERT)
Transformer related optimization, including BERT, GPT
BERT classification and BERT-DSSM implementation with Keras
bert-base-chinese example
BERT as a language model, forked from https://github.com/google-research/bert
Bert-based models(BERT, MTB, CP) for relation extraction.
BERTScore for text generation