#Large Language Models#PhoGPT: Generative Pre-training for Vietnamese (2023)
#Natural Language Processing#An autoregressive language model like ChatGPT.
#Large Language Models#A custom GPT based on [Zero To Hero](https://karpathy.ai/zero-to-hero.html) that uses tiktoken, intended to support Transformer-model education and to reverse engineer GPT models from scratch.
HELM-GPT: de novo macrocyclic peptide design using generative pre-trained transformer
Drawing inspiration from Andrej Karpathy’s iconic lecture, "Let’s Build GPT: From Scratch, in Code, Spelled Out", this project takes you on an immersive journey into the inner workings of GPT. Step-by...
#Large Language Models#Simple GPT app that uses the falcon-7b-instruct model with a Flask front-end.
#Computer Science#ToyGPT, inspired by Andrej Karpathy’s GPT from scratch, builds a toy generative pre-trained transformer at its most basic level, using a simple bigram language model with attention to help educate on ...
#Natural Language Processing#An industrial project on NLP in finance applications.
#Natural Language Processing#Repository for personal experiments.
GPT-1 | Generative Pre-trained Transformer 1
#Natural Language Processing#Repository for all things Natural Language Processing.
#Large Language Models#PyTorch implementation of GPT from scratch.
A GPT model built from scratch to generate text.
#Natural Language Processing#A Generatively Pretrained Transformer that generates Shakespeare-esque quotes.