#Large Language Model#Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any... (see the client-swap sketch after this list)
Stable Diffusion and Flux in pure C/C++
#Large Language Model#INT4/INT5/INT8 and FP16 inference on CPU for the RWKV language model
#Large Language Model#Calculate tokens/s & GPU memory requirements for any LLM. Supports llama.cpp/ggml/bnb/QLoRA quantization (a back-of-the-envelope memory estimate is sketched after this list)
#Computer Science#Suno AI's Bark model in C/C++ for fast text-to-speech generation
#Android#Whisper Dart is a cross-platform library for Dart and Flutter that converts audio to text / speech to text / runs inference with OpenAI models
#Large Language Model#Self-evaluating interview for AI coders
#Computer Science#Port of MiniGPT4 in C++ (4-bit, 5-bit, 6-bit, 8-bit, and 16-bit CPU inference with GGML)
CLIP inference in plain C/C++ with no extra dependencies
#Large Language Model#Large Language Models for All, 🦙 Cult and More. Stay in touch!
Vision Transformer (ViT) inference in plain C/C++ with ggml
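
For the Xinference entry above, a minimal sketch of the advertised single-line swap, assuming a locally running Xinference server exposing its OpenAI-compatible endpoint; the base URL, port, and model name below are illustrative assumptions, not values taken from the list.

```python
# Minimal sketch: point the OpenAI Python client at a local Xinference server
# instead of api.openai.com. Base URL, port, and model name are assumed values.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:9997/v1",  # assumed local Xinference endpoint
    api_key="not-needed-locally",         # placeholder; a local server typically ignores it
)

response = client.chat.completions.create(
    model="my-local-llm",  # hypothetical model name registered with Xinference
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```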
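
For the tokens/s & GPU memory calculator entry, here is a back-of-the-envelope sketch of the kind of estimate such a tool produces: weight memory is roughly parameter count times bytes per parameter at the chosen precision, ignoring KV cache and activation overhead. The bytes-per-parameter table and helper below are illustrative assumptions, not the calculator's actual code.

```python
# Rough weight-memory estimate: params × bytes per param, ignoring KV cache
# and activation overhead. Values are illustrative, not the tool's own logic.
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weight_memory_gib(n_params_billion: float, dtype: str) -> float:
    """GiB needed just to hold the model weights at the given precision."""
    return n_params_billion * 1e9 * BYTES_PER_PARAM[dtype] / (1024 ** 3)

for dtype in ("fp16", "int8", "int4"):
    print(f"7B weights at {dtype}: ~{weight_memory_gib(7, dtype):.1f} GiB")
# fp16 ≈ 13.0 GiB, int8 ≈ 6.5 GiB, int4 ≈ 3.3 GiB (weights only)
```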