#LLM#Gitleaks is an open-source SAST (Static Application Security Testing) command-line tool that scans Git repositories to prevent secrets such as passwords, API keys, and access tokens from being hardcoded into source code.
#LLM#This project shares the technical principles behind large language models along with hands-on experience (LLM engineering and LLM application deployment).
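At its core, Gitleaks matches a set of rule regexes against repository content. A minimal Python sketch of that idea follows; the rule names and patterns here are simplified assumptions for illustration, not Gitleaks' actual rule set.

```python
import re

# Illustrative secret-scanning rules in the spirit of Gitleaks.
# These two patterns are simplified assumptions, not Gitleaks' real rules.
RULES = {
    "aws-access-key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "generic-api-key": re.compile(r"(?i)api[_-]?key\s*=\s*['\"][0-9a-zA-Z]{16,}['\"]"),
}

def scan(text: str) -> list[tuple[str, str]]:
    """Return (rule_name, matched_string) pairs found in the given text."""
    findings = []
    for name, pattern in RULES.items():
        for match in pattern.findall(text):
            findings.append((name, match))
    return findings
```

The real tool additionally walks Git history and supports allowlists, but the detection step reduces to rule matching like this.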
#Learning & Skill Improvement#Low-code framework for building custom LLMs, neural networks, and other AI models
#Computer Science#SkyPilot: Run AI and batch jobs on any infra (Kubernetes or 14+ clouds). Get unified execution, cost savings, and high GPU availability via a simple interface.
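SkyPilot jobs are described in a task YAML and launched with the `sky` CLI. A minimal sketch, where the script, requirements file, and accelerator choice are illustrative assumptions:

```yaml
# task.yaml — hypothetical SkyPilot task spec
resources:
  accelerators: A100:1   # request one A100 on whichever infra has capacity

setup: |
  pip install -r requirements.txt

run: |
  python train.py --epochs 10
```

Launched with `sky launch -c mycluster task.yaml`; SkyPilot then picks a cloud/region with available GPUs at the lowest cost.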
Efficient Triton Kernels for LLM Training
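The point of such fused kernels is to compute an op like RMSNorm in a single pass instead of materializing intermediates in GPU memory. A NumPy sketch of the fused computation (illustrative only; the actual project implements this as Triton GPU kernels):

```python
import numpy as np

def rmsnorm(x: np.ndarray, weight: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    # Compute the normalization scale and apply it in one pass, mirroring
    # what a fused kernel avoids writing out as intermediate tensors.
    scale = 1.0 / np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)
    return x * scale * weight
```

On a GPU, fusing the reduction and the scaling into one kernel cuts memory traffic, which is typically the bottleneck for these elementwise-plus-reduction ops.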
#LLM#An efficient, flexible and full-featured toolkit for fine-tuning LLMs (InternLM2, Llama3, Phi3, Qwen, Mistral, ...)
#LLM#H2O LLM Studio - a framework and no-code GUI for fine-tuning LLMs. Documentation: https://docs.h2o.ai/h2o-llmstudio/
#LLM#Code examples and resources for DBRX, a large language model developed by Databricks
#Computer Science#dstack is a lightweight, open-source alternative to Kubernetes & Slurm, simplifying AI container orchestration with multi-cloud & on-prem support. It natively supports NVIDIA, AMD, TPU, and Intel accelerators.
#LLM#MoBA: Mixture of Block Attention for Long-Context LLMs
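The core idea of block attention is that each query attends only to a few key/value blocks selected by a gate over pooled block representations, instead of the full sequence. A simplified NumPy sketch of that idea (mean-pooled block keys, top-k gating); this is an illustration of the technique, not the MoBA reference implementation:

```python
import numpy as np

def block_attention(q, k, v, block_size=4, top_k=2):
    """Each query attends only to the top_k key blocks whose mean-pooled
    keys score highest against it. Illustrative sketch, not MoBA's code."""
    n, d = k.shape
    n_blocks = n // block_size
    k_blocks = k[: n_blocks * block_size].reshape(n_blocks, block_size, d)
    block_keys = k_blocks.mean(axis=1)                 # (n_blocks, d) pooled keys
    gate = q @ block_keys.T                            # (m, n_blocks) gating scores
    chosen = np.argsort(gate, axis=-1)[:, -top_k:]     # top_k blocks per query
    out = np.zeros((q.shape[0], d))
    for i in range(q.shape[0]):
        # Gather the key/value rows of the selected blocks only.
        idx = np.concatenate([np.arange(b * block_size, (b + 1) * block_size)
                              for b in chosen[i]])
        scores = q[i] @ k[idx].T / np.sqrt(d)
        w = np.exp(scores - scores.max())
        w /= w.sum()
        out[i] = w @ v[idx]
    return out
```

With top_k equal to the number of blocks this reduces to dense attention; with top_k fixed, cost grows with top_k * block_size rather than sequence length, which is what makes the scheme attractive for long contexts.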
DLRover: An Automatic Distributed Deep Learning System
#Blockchain#Nvidia GPU exporter for Prometheus using the nvidia-smi binary
#Computer Science#Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models
LLM-PowerHouse: Unleash LLMs' potential through curated tutorials, best practices, and ready-to-use code for custom training and inferencing.
#LLM#LLM (Large Language Model) Fine-Tuning
#LLM#irresponsible innovation. Try now at https://chat.dev/
#LLM#Repo for fine-tuning Causal LLMs
USP: Unified (a.k.a. Hybrid, 2D) Sequence-Parallel Attention for Long-Context Transformer Model Training and Inference
#LLM#The official repo of the Aquila2 series proposed by BAAI, including pretrained & chat large language models.