#Natural Language Processing#KakaoBrain KoGPT (Korean Generative Pre-trained Transformer)
#Large Language Models#This repository contains hand-curated resources for Prompt Engineering, with a focus on Generative Pre-trained Transformers (GPT), ChatGPT, PaLM, etc.
#Large Language Models#The PyTorch implementation of Generative Pre-trained Transformers (GPTs) using Kolmogorov-Arnold Networks (KANs) for language modeling
Official code, datasets and checkpoints for "Timer: Generative Pre-trained Transformers Are Large Time Series Models" (ICML 2024)
The PyTorch implementation of fine-tuning GPT-2 (Generative Pre-trained Transformer 2) for dialogue generation.
Large Language Models (LLMs) and Generative Pre-trained Transformers (GPTs) for Legal
Chinese generative pre-trained Transformer model
(CVPR2023/TPAMI2024) Integrally Pre-Trained Transformer Pyramid Networks -- A Hierarchical Vision Transformer for Masked Image Modeling
Implementation of "SpikeGPT: Generative Pre-trained Language Model with Spiking Neural Networks"
Third-party PyTorch implementation of the Image Processing Transformer (Pre-Trained Image Processing Transformer, arXiv:2012.00364v2)
Code associated with the "Data Augmentation using Pre-trained Transformer Models" paper
CyBERTron-LM is a project that collects pre-trained Transformer-based models.
Modified implementation of DCGAN focused on generative art. Includes pre-trained models for landscapes, nude portraits, and other subjects.
Pre-trained Language Models
Portuguese pre-trained BERT models
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Open Language Pre-trained Model Zoo