#Large Language Model# An RWKV management and startup tool: fully automated, only 8 MB, and provides an OpenAI-compatible API. RWKV is a large language model that is fully open source and available for comm...
#Large Language Model# INT4/INT5/INT8 and FP16 inference on CPU for the RWKV language model
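Low-bit inference like the above rests on weight quantization. As a minimal sketch (not this repository's actual scheme), symmetric per-tensor INT8 quantization stores each weight tensor as signed 8-bit integers plus one float scale; the function names here are hypothetical:

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor INT8 quantization: choose a scale so the
    largest magnitude maps to 127, then round to signed 8-bit ints."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize_int8(q, scale):
    """Recover an approximation of the original float weights."""
    return q.astype(np.float32) * scale

w = np.random.randn(64, 64).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize_int8(q, scale)
print(np.max(np.abs(w - w_hat)))  # rounding error, at most scale / 2
```

Real runtimes typically quantize per row or per block for better accuracy, and INT4/INT5 variants pack multiple values per byte, but the scale-and-round idea is the same.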
A role-playing web UI for RWKV, built with Gradio
Vision-RWKV: Efficient and Scalable Visual Perception with RWKV-Like Architectures
Framework agnostic python runtime for RWKV models
A Torch-free C++ RWKV implementation using 8-bit quantization, written in CUDA/HIP/Vulkan for maximum compatibility and minimal dependencies
RWKV-v2-RNN trained on the Pile. See https://github.com/BlinkDL/RWKV-LM for details.
📖 — Notebooks related to RWKV
LLaMa/RWKV ONNX models, quantization, and test cases
RWKV infctx trainer, for training arbitrary context sizes, to 10k and beyond!
Gradio UI for RWKV LLM
BlinkDL's RWKV-v4 running in the browser
RWKV (Receptance Weighted Key Value) is an RNN with Transformer-level performance
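The RNN formulation referenced above comes from RWKV's WKV operator: each output is an exponentially weighted average of past values, maintained as a running numerator/denominator pair so the sequence can be processed one token at a time. A naive per-channel sketch in the spirit of RWKV-v4 (no numerical-stability tricks, illustration only):

```python
import numpy as np

def wkv_recurrence(k, v, w, u):
    """Naive RWKV-style WKV recurrence, one channel group at a time.
    k, v: (T, C) key/value sequences; w: (C,) positive decay; u: (C,) bonus
    applied only to the current token. Returns a (T, C) output array."""
    T, C = k.shape
    a = np.zeros(C)               # running numerator of the weighted average
    b = np.zeros(C)               # running denominator (sum of weights)
    out = np.zeros((T, C))
    decay = np.exp(-w)
    for t in range(T):
        e = np.exp(u + k[t])                 # current token gets the u bonus
        out[t] = (a + e * v[t]) / (b + e)    # weighted average of values so far
        a = decay * a + np.exp(k[t]) * v[t]  # fold current token into the state
        b = decay * b + np.exp(k[t])
    return out
```

Because the state is just `(a, b)` per channel, inference cost per token is constant in sequence length; that is what lets RWKV run as an RNN while being trained like a Transformer. Production kernels add max-subtraction for numerical stability, omitted here.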
Enhancing LangChain prompts to work better with RWKV models
This project aims to make RWKV accessible to everyone through a Hugging Face-like interface, while keeping it close to the R&D RWKV branch of the code.
The nanoGPT-style implementation of RWKV Language Model - an RNN with GPT-level LLM performance.
MultilingualShareGPT, the free multi-language corpus for LLM training