#NLP# Chinese version of GPT-2 training code, using a BERT tokenizer.
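As a minimal sketch of this general approach (pairing a Chinese BERT tokenizer with a GPT-2 language model via Hugging Face transformers), the snippet below is illustrative only; the bert-base-chinese checkpoint and context length are assumptions, not the repo's own code:

```python
# Sketch (not the repo's own code): pair a Chinese BERT tokenizer with a
# GPT-2 language-model head using Hugging Face transformers.
from transformers import BertTokenizerFast, GPT2Config, GPT2LMHeadModel

# A Chinese BERT vocabulary gives character-level coverage that GPT-2's
# byte-level BPE lacks for Chinese text; "bert-base-chinese" is illustrative.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")

# Build a GPT-2 model whose embedding table matches the BERT vocabulary size.
config = GPT2Config(vocab_size=tokenizer.vocab_size, n_positions=512)
model = GPT2LMHeadModel(config)

# Tokenize a sample sentence and run a forward pass with LM labels.
batch = tokenizer("今天天气很好", return_tensors="pt")
outputs = model(input_ids=batch["input_ids"], labels=batch["input_ids"])
print(outputs.loss)
```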
#NLP# GPT-2 for Chinese chitchat (implements the MMI idea from DialoGPT)
An implementation of model-parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library, able to scale up to full GPT-3 sizes (and possibly beyond).
JavaScript BPE Encoder / Decoder for GPT-2 / GPT-3
An implementation of GPT-2 training with TPU support
Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts
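A sketch of the typical fine-tune/generate flow with this package (gpt_2_simple); the corpus file name, step count, and sampling settings are placeholders, and exact arguments may vary by version:

```python
# Sketch of fine-tuning GPT-2 on new text and sampling from the result.
# "corpus.txt" is a placeholder for your own training data.
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name="124M")   # fetch the pretrained 124M checkpoint

sess = gpt2.start_tf_sess()
gpt2.finetune(sess, "corpus.txt", model_name="124M", steps=500)  # retrain on new text

# Sample a continuation from the fine-tuned model.
gpt2.generate(sess, length=100, temperature=0.8, prefix="Once upon a time")
```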
#LLM# Private chat with a local GPT over documents, images, video, etc. 100% private, Apache 2.0. Supports oLLaMa, Mixtral, llama.cpp, and more. Demo: https://gpt.h2o.ai/ https://gpt-docs.h2o.ai/
Retrain GPT-2 in Colab
Prompt tuning toolkit for GPT-2 and GPT-Neo
PyTorch Implementation of OpenAI GPT-2
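As a hedged illustration of what such implementations provide (loading pretrained GPT-2 weights and sampling a continuation), here is a sketch using the Hugging Face transformers API rather than the linked repo's own code:

```python
# Sketch using Hugging Face transformers (not the linked repo's API):
# load pretrained GPT-2 weights and sample a text continuation.
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

input_ids = tokenizer.encode("The meaning of life is", return_tensors="pt")
output_ids = model.generate(
    input_ids,
    max_length=50,
    do_sample=True,      # sample instead of greedy decoding
    top_k=40,            # restrict sampling to the 40 most likely tokens
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```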
GPT-2 Telegram Chat bot
PHP BPE Text Encoder / Decoder for GPT-2 / GPT-3
OpenAI GPT-2 Flask API
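A minimal sketch of what such a service might look like, wrapping a GPT-2 generation call (as in the example above) in a Flask endpoint; the /generate route and the "prompt" field are illustrative assumptions, not the linked project's API:

```python
# Illustrative Flask endpoint around GPT-2 generation; route name and
# request/response fields are assumptions, not the linked project's API.
from flask import Flask, request, jsonify
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

app = Flask(__name__)
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

@app.route("/generate", methods=["POST"])
def generate():
    prompt = request.get_json().get("prompt", "")
    input_ids = tokenizer.encode(prompt, return_tensors="pt")
    output_ids = model.generate(
        input_ids, max_length=80, do_sample=True,
        pad_token_id=tokenizer.eos_token_id,
    )
    text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
    return jsonify({"text": text})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```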
Load GPT-2 checkpoint and generate texts
GPT-2 based essay writing AI
Simple text generator built on an OpenAI GPT-2 PyTorch implementation
Deploy OpenAI's GPT-2 to production
tf.keras implementation of OpenAI GPT-2
The fastest JavaScript BPE Tokenizer Encoder Decoder for OpenAI's GPT-2 / GPT-3 / GPT-4 / GPT-4o. Port of OpenAI's tiktoken with additional features.
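To illustrate the BPE encode/decode round trip these tokenizers perform, here is a short sketch using OpenAI's tiktoken in Python; the JavaScript port's own function names will differ:

```python
# Illustration of the BPE encode/decode round trip with OpenAI's tiktoken.
import tiktoken

enc = tiktoken.get_encoding("gpt2")          # GPT-2 byte-level BPE vocabulary
ids = enc.encode("Hello, GPT-2 tokenization!")
print(ids)                                   # list of token ids
print(enc.decode(ids))                       # round-trips back to the input text
```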