#Natural Language Processing# A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
#Natural Language Processing# A collection of datasets that pair questions with SQL queries
#Natural Language Processing# A Japanese tokenizer based on recurrent neural networks
#Natural Language Processing# Simple Solution for Multi-Criteria Chinese Word Segmentation
#Natural Language Processing# A frame-semantic parsing system based on a softmax-margin SegRNN
Source code for an ACL 2017 paper on Chinese word segmentation
#Natural Language Processing# BiLSTM-CRF for sequence labeling in DyNet
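In a BiLSTM-CRF tagger, the BiLSTM produces per-token emission scores and the CRF layer adds tag-transition scores, with the best tag sequence recovered by Viterbi decoding. As a rough, framework-free illustration (not the repository's own code; score values here are arbitrary), the decoding step can be sketched in plain Python:

```python
def viterbi(emissions, transitions):
    """Return the highest-scoring tag path.

    emissions: list of per-timestep lists, emissions[t][j] = score of tag j at step t
    transitions: transitions[i][j] = score of moving from tag i to tag j
    """
    n_tags = len(emissions[0])
    # Scores of the best path ending in each tag at the current step.
    scores = list(emissions[0])
    backptrs = []  # backptrs[t][j] = best previous tag for tag j at step t+1
    for emit in emissions[1:]:
        new_scores, ptrs = [], []
        for j in range(n_tags):
            best_i = max(range(n_tags), key=lambda i: scores[i] + transitions[i][j])
            new_scores.append(scores[best_i] + transitions[best_i][j] + emit[j])
            ptrs.append(best_i)
        scores = new_scores
        backptrs.append(ptrs)
    # Follow back-pointers from the best final tag.
    best_last = max(range(n_tags), key=lambda j: scores[j])
    path = [best_last]
    for ptrs in reversed(backptrs):
        path.append(ptrs[path[-1]])
    return list(reversed(path))

# Two tags, two timesteps: emissions favor tag 0 then tag 1.
print(viterbi([[1.0, 0.0], [0.0, 1.0]], [[0.0, 0.0], [0.0, 0.0]]))  # → [0, 1]
```

During training the CRF instead computes a log-partition over all paths (forward algorithm); decoding at test time reduces to the max-product recursion above.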
Source code for an ACL 2016 paper on Chinese word segmentation
An Implementation of Transformer (Attention Is All You Need) in DyNet
Code for paper "End-to-End Reinforcement Learning for Automatic Taxonomy Induction", ACL 2018
#Natural Language Processing# Deep Recurrent Generative Decoder for Abstractive Text Summarization in DyNet
#Natural Language Processing# Transition-based joint syntactic dependency parser and semantic role labeler using a stack LSTM RNN architecture
#Computer Science# Code for the paper "Extreme Adaptation for Personalized Neural Machine Translation"
#Natural Language Processing# Source code for the paper "Morphological Inflection Generation with Hard Monotonic Attention"
#Computer Science# An attentional NMT model in DyNet
See http://github.com/onurgu/joint-ner-and-md-tagger — a Bi-LSTM-based sequence tagger, implemented in both TensorFlow and DyNet, which can utilize several sources of information about ea...
#自然语言处理#DyNet implementation of stack LSTM experiments by Grefenstette et al.
Selective Encoding for Abstractive Sentence Summarization in DyNet