#LLM#RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be directly trained like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". So it's combining the best of RNN a...
#Computer Science#Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
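The core idea behind LoRA can be sketched in a few lines of NumPy: freeze the pretrained weight and learn only a low-rank update. The shapes, names, and scaling below are illustrative assumptions, not the loralib API.

```python
import numpy as np

# Minimal sketch of the LoRA idea (hypothetical shapes, not loralib's code):
# instead of updating a frozen weight W (d_out x d_in), learn a low-rank
# update B @ A with rank r << min(d_out, d_in).
d_out, d_in, r = 768, 768, 8
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-init
alpha = 16.0                                # scaling hyperparameter

def lora_forward(x):
    # base path plus scaled low-rank path; with B = 0 at init,
    # the adapted layer starts out identical to the frozen one
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
assert np.allclose(lora_forward(x), W @ x)  # identity at initialization

# trainable parameters drop from d_out*d_in to r*(d_in + d_out)
full_params, lora_params = d_out * d_in, r * (d_in + d_out)
```

The zero initialization of `B` is what makes fine-tuning start from the pretrained model's behavior; only `A` and `B` (here ~12K parameters instead of ~590K) would receive gradients.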
AI Code Completions: an all-language autocompleter (https://tabnine.com/)
This repository contains demos I made with the Transformers library by HuggingFace.
An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library, able to scale up to full GPT-3 sizes (and possibly more!).
#NLP#Chinese version of GPT2 training code, using BERT tokenizer.
[NeurIPS 2024 Best Paper][GPT beats diffusion🔥] [scaling laws in visual generation📈] Official impl. of "Visual Autoregressive Modeling: Scalable Image Generation via Next-Scale Prediction". An *ultr...
#NLP#Awesome Pretrained Chinese NLP Models: a high-quality collection of Chinese pre-trained models, large models, multimodal models, and large language models
#NLP#An unnecessarily tiny implementation of GPT-2 in NumPy.
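The heart of such a pure-NumPy GPT-2 is causal self-attention, which fits in a handful of lines. This is a generic sketch of that computation, not the repository's actual code; the function names are illustrative.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def causal_self_attention(q, k, v):
    # scaled dot-product attention with a causal mask, GPT-2 style:
    # each position may only attend to itself and earlier positions
    n, d = q.shape
    scores = q @ k.T / np.sqrt(d)
    mask = np.triu(np.ones((n, n), dtype=bool), k=1)
    scores[mask] = -1e10  # block attention to future positions
    return softmax(scores) @ v

rng = np.random.default_rng(0)
n, d = 4, 8
q, k, v = (rng.standard_normal((n, d)) for _ in range(3))
out = causal_self_attention(q, k, v)
# the first position can only attend to itself, so its output is v[0]
assert np.allclose(out[0], v[0])
```

Stacking this attention with layer norm, an MLP block, and learned embeddings is essentially all a minimal GPT-2 forward pass needs.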
#NLP#Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
#NLP#GPT2 for Chinese chitchat: a GPT2 model for Chinese casual dialogue (implements DialoGPT's MMI idea)
#NLP#Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
#LLM#Build, customize and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJ...
#NLP#Kashgari is a production-level NLP transfer learning framework built on top of tf.keras for text-labeling and text-classification, includes Word2Vec, BERT, and GPT2 language embeddings.
#NLP#Toolkit for Machine Learning, Natural Language Processing, and Text Generation, in TensorFlow. This is part of the CASL project: http://casl-project.ai/
#Computer Science#Large-scale pretraining for dialogue
#LLM#Simple UI for LLM model finetuning
A Large-scale Chinese Short-Text Conversation Dataset and Chinese pre-training dialog models
#Computer Science#Guide to using pre-trained large language models of source code
#NLP#🦄 State-of-the-Art Conversational AI with Transfer Learning