Ongoing research training transformer language models at scale, including: BERT & GPT-2
#Computer Science# Supercharge Your Model Training
DeepRacer workshop content. This Guidance demonstrates how software developers can use an Amazon SageMaker Notebook instance to directly train and evaluate AWS DeepRacer models with full control
#Computer Science# A WebGL accelerated JavaScript library for training and deploying ML models.
#Natural Language Processing# LLM training code for Databricks foundation models
Library for training machine learning models with privacy guarantees for the training data
YOLO model compression and multi-dataset training
Training PyTorch models with differential privacy
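The differential-privacy entry above boils down to per-sample gradient clipping plus calibrated noise at each optimizer step. A minimal sketch, assuming an Opacus-style `PrivacyEngine` API (the entry itself does not name the library, so the import and hyperparameters here are illustrative):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine  # assumed dependency; the listing does not name Opacus

# Toy model and data; replace with the real training setup.
model = nn.Linear(20, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
data = TensorDataset(torch.randn(256, 20), torch.randint(0, 2, (256,)))
loader = DataLoader(data, batch_size=32)

# Wrap model, optimizer, and loader so every step clips per-sample
# gradients and adds Gaussian noise before the parameter update.
privacy_engine = PrivacyEngine()
model, optimizer, loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=loader,
    noise_multiplier=1.1,   # noise scale relative to the clipping norm
    max_grad_norm=1.0,      # per-sample gradient clipping threshold
)

criterion = nn.CrossEntropyLoss()
for x, y in loader:
    optimizer.zero_grad()
    criterion(model(x), y).backward()
    optimizer.step()
```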
Modeling, training, eval, and inference code for OLMo
#Natural Language Processing# Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Minimalistic large language model 3D-parallelism training
Prune a model while finetuning or training.
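As a concrete illustration of pruning while fine-tuning, a minimal sketch using PyTorch's built-in `torch.nn.utils.prune` (an assumption for illustration; the repo above may implement its own pruning scheme):

```python
import torch
from torch import nn
import torch.nn.utils.prune as prune

# Toy network; in practice this would be the model being fine-tuned.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Zero out the 30% smallest-magnitude weights in each Linear layer.
# The mask is re-applied on every forward pass, so subsequent training
# steps only update the surviving weights.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)

x, y = torch.randn(32, 128), torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()

# After fine-tuning, fold the mask into the weight tensor permanently.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.remove(module, "weight")
```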
Data used for LSTM model training
Generate text images for training deep learning OCR models
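The synthetic-data entry above is essentially "render labeled strings to images". A minimal sketch with Pillow (my own illustration, not the repo's pipeline; file names and fonts are placeholders):

```python
from PIL import Image, ImageDraw, ImageFont

def render_text_image(text: str, path: str, size=(256, 48)) -> None:
    """Render one training sample: black text on a white grayscale canvas."""
    img = Image.new("L", size, color=255)
    draw = ImageDraw.Draw(img)
    font = ImageFont.load_default()      # swap in real .ttf fonts for variety
    draw.text((8, 12), text, fill=0, font=font)
    img.save(path)

# Each image is saved alongside its source string, which becomes the OCR label.
for i, word in enumerate(["training", "model", "ocr"]):
    render_text_image(word, f"sample_{i}.png")
```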
Chinese Transformer Generative Pre-Training Model
Examples for using ONNX Runtime for model training.
Multi-modal Content Creation Model Training Infrastructure including the FACT model (AI Choreographer) implementation.
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
OpenCV face recognition model training
SD-Trainer. LoRA & Dreambooth training scripts & GUI, using kohya-ss's trainer, for diffusion models.
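For context on what LoRA scripts like the one above train, a minimal sketch of a low-rank adapter wrapped around a frozen linear layer (a generic illustration of the LoRA idea, not kohya-ss's implementation):

```python
import torch
from torch import nn

class LoRALinear(nn.Module):
    """Frozen base Linear plus a trainable low-rank update: W x + scale * B A x."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False          # only the adapter is trained
        self.lora_a = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scale

layer = LoRALinear(nn.Linear(768, 768), rank=8)
out = layer(torch.randn(4, 768))             # gradients flow only to lora_a / lora_b
```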
The code for the bark-voicecloning model. Training and inference.
Vessel classification: feature generation and model training/inference.
Model training for math character recognition
#Computer Science# Cramming the training of a (BERT-type) language model into limited compute.
Code and model for the paper "Improving Language Understanding by Generative Pre-Training"
Stable Diffusion XL training and inference as a cog model