#NLP# State-of-the-art Natural Language Processing for Jax, PyTorch and TensorFlow
#Computer Science# A flexible package for multimodal deep learning that combines tabular data with text and images using Wide and Deep models in PyTorch
#Computer Science# My implementation of the original Transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently includes IWSLT pretrained models.
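The core operation of the Transformer this repo implements is scaled dot-product attention, softmax(QK^T / sqrt(d_k))V. A minimal dependency-free sketch on plain nested lists (the repo itself works with PyTorch tensors):

```python
import math

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention from "Attention Is All You Need":
    softmax(Q.K^T / sqrt(d_k)) . V, on nested Python lists."""
    d_k = len(K[0])
    # Attention scores: Q.K^T, scaled by sqrt(d_k)
    scores = [[sum(q * k for q, k in zip(qrow, krow)) / math.sqrt(d_k)
               for krow in K] for qrow in Q]
    # Row-wise softmax (shifted by the row max for numerical stability)
    weights = []
    for row in scores:
        m = max(row)
        exps = [math.exp(s - m) for s in row]
        total = sum(exps)
        weights.append([e / total for e in exps])
    # Each output row is the attention-weighted sum of the value vectors
    return [[sum(w * vrow[j] for w, vrow in zip(wrow, V))
             for j in range(len(V[0]))] for wrow in weights]

queries = [[1.0, 0.0]]
keys = [[1.0, 0.0], [0.0, 1.0]]
values = [[1.0, 2.0], [3.0, 4.0]]
print(scaled_dot_product_attention(queries, keys, values))
# ~ [[1.660, 2.660]]: the query attends mostly to the first key
```

The same computation, batched and run per attention head, is what a multi-head attention layer does inside the full model.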
#NLP# 💁 Awesome Treasure of Transformers: models for Natural Language Processing, with papers, videos, blogs, and official repos along with Colab notebooks. 🛫☑️
Minimalist NMT for educational purposes
#NLP# Automatically split your PyTorch models across multiple GPUs for training & inference
#NLP# Based on the PyTorch-Transformers library by HuggingFace. To be used as a starting point for employing Transformer models in text classification tasks. Contains code to easily train BERT, XLNet, RoBERTa...
#NLP# Implementation of an automated question-answering system over a medical knowledge graph, covering knowledge-graph construction, pipeline-based QA over the graph, and the front end. Entity recognition (dictionary lookup + BERT-CRF), entity linking (matching with Sentence-BERT), intent recognition (question words + a domain-word dictionary).
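The dictionary half of that entity-recognition step can be sketched as a greedy longest-match lookup (a hypothetical helper, not the repo's code; the actual system combines this with a BERT-CRF model):

```python
def dictionary_ner(text, entity_dict):
    """Greedy longest-match dictionary lookup: at each position, match the
    longest dictionary entry and record (term, type, start offset)."""
    # Try longer terms first so e.g. a full drug name beats a prefix of it
    entries = sorted(entity_dict, key=len, reverse=True)
    found, i = [], 0
    while i < len(text):
        for term in entries:
            if text.startswith(term, i):
                found.append((term, entity_dict[term], i))
                i += len(term)  # skip past the matched entity
                break
        else:
            i += 1  # no entity starts here; advance one character
    return found

# Toy medical dictionary: "阿司匹林" = aspirin (drug), "头痛" = headache (symptom)
print(dictionary_ner("头痛可以吃阿司匹林吗",
                     {"阿司匹林": "drug", "头痛": "symptom"}))
# [('头痛', 'symptom', 0), ('阿司匹林', 'drug', 5)]
```

A pure dictionary matcher misses unseen surface forms, which is why the repo backs it up with a learned BERT-CRF tagger.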
#Computer Science# Minimal implementation of Decision Transformer: Reinforcement Learning via Sequence Modeling in PyTorch for MuJoCo control tasks in OpenAI Gym
#Computer Science# HugsVision is an easy-to-use HuggingFace wrapper for state-of-the-art computer vision
#NLP# Label data using HuggingFace's transformers and automatically get a prediction service
Implementation of the paper "Video Action Transformer Network"
This shows how to fine-tune the BERT language model and use PyTorch-Transformers for text classification
#NLP# State-of-the-art NLP through transformer models in a modular design with consistent APIs.
#NLP# Wonderful Matrices to Build Small Language Models
#Computer Science# A little Python application to auto-tag your photos with the power of machine learning.
#Computer Science# A better PyTorch data loader capable of custom image operations and image subsets
#NLP# Instructions for converting a BERT TensorFlow model to work with HuggingFace's pytorch-transformers and spaCy. This walk-through uses DeepPavlov's RuBERT as an example.
Generative Pretrained Transformer 2 (GPT-2) for Language Modeling using the PyTorch-Transformers library.
#NLP# Utilizing web scraping and state-of-the-art NLP to generate TV show episode summaries.