Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
XLNet: Generalized Autoregressive Pretraining for Language Understanding
🆕 Demo code for the Natural Language Understanding Service.
Samples for the Language Understanding Intelligent Service (LUIS)
This tutorial aims to demonstrate the practical fundamentals of using LanguageExt through step-by-step lessons that introduce and then build on concepts.
#Natural Language Processing# Official implementations for various pre-training models of the ERNIE family, covering Language Understanding & Generation, Multimodal Understanding & Generation, and beyond.
Multi-Task Deep Neural Networks for Natural Language Understanding
PyTorch code for BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation
A python chatbot framework with Natural Language Understanding and Artificial Intelligence.
JGLUE: Japanese General Language Understanding Evaluation
Adversarial Training for Natural Language Understanding
Measuring Massive Multitask Language Understanding | ICLR 2021
LUKE -- Language Understanding with Knowledge-based Embeddings
Chinese Biomedical Language Understanding Evaluation benchmark (ChineseBLUE)
DeepSeek-VL: Towards Real-World Vision-Language Understanding
NEZHA: Neural Contextualized Representation for Chinese Language Understanding
Spoken Language Understanding (SLU)/Slot Filling in Keras
🤗 ParsBERT: Transformer-based Model for Persian Language Understanding
datasets of natural language understanding and dialogue state tracking
Python toolkit for the Chinese Language Understanding Evaluation (CLUE) benchmark
mPLUG-DocOwl: Modularized Multimodal Large Language Model for Document Understanding