A wrapper around tensor2tensor to flexibly train, interact, and generate data for neural chatbots.
An implementation of 'Attention Is All You Need' with a Chinese corpus
Datasets I have created for scientific summarization, and a trained BertSum model
An open-source neural machine translation system developed by the Natural Language Processing Group at Nanjing University.
#Computer Science# Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
New Kaggle competition (baseline): a BERT-based fine-tuning approach plus a tensor2tensor-based Transformer Encoder approach
Tensor2tensor experiment with SpecAugment
Tesseract and Transformer
Grammatical Error Correction Based on Tensor2Tensor
Vietnamese Diacritic Restoration using Transformer Sequence-to-Sequence Model
#Computer Science# Unofficial implementation of the Universal Transformer (https://arxiv.org/abs/1807.03819)
#Natural Language Processing# Transfer learning for NLP with the Transformer, based on tensor2tensor
#Natural Language Processing# Text Sentiment Classification (Computational Intelligence Lab, ETH Zurich, 2018)
A trained Transformer model for speech recognition