#Natural Language Processing#Data processing for and with foundation models! 🍎 🍋 🌽 ➡️ ➡️🍸 🍹 🍷
#Natural Language Processing#Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
#Computer Science#Papers about pretraining and self-supervised learning on Graph Neural Networks (GNNs).
#Large Language Model#Awesome resources for in-context learning and prompt engineering: mastery of LLMs such as ChatGPT, GPT-3, and FlanT5, with up-to-date, cutting-edge updates.
#Computer Science#Code for the TKDE paper "Self-supervised learning on graphs: Contrastive, generative, or predictive"
#Computer Science#An open-source, knowledgeable large language model framework.
Awesome list for research on CLIP (Contrastive Language-Image Pre-Training).
#Natural Language Processing#Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
#Computer Science#Unified Training of Universal Time Series Forecasting Transformers
A professionally curated list of Large (Language) Models and Foundation Models (LLM, LM, FM) for Time Series, Spatiotemporal, and Event Data.
#Natural Language Processing#Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Research code for ECCV 2020 paper "UNITER: UNiversal Image-TExt Representation Learning"
Code for ICLR 2020 paper "VL-BERT: Pre-training of Generic Visual-Linguistic Representations".
#Awesome# Large Language Model-enhanced Recommender System Papers
#Natural Language Processing#[ICLR 2024] Sheared LLaMA: Accelerating Language Model Pre-training via Structured Pruning
[NeurIPS 2020] "Graph Contrastive Learning with Augmentations" by Yuning You, Tianlong Chen, Yongduo Sui, Ting Chen, Zhangyang Wang, Yang Shen
#Web Crawler#Official repository for "Craw4LLM: Efficient Web Crawling for LLM Pretraining"
Code for the KDD'20 paper "Generative Pre-Training of Graph Neural Networks"
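
Several of the repositories listed above (CLIP, GraphCL, and the TKDE self-supervised-learning survey) revolve around contrastive pre-training, whose core is an InfoNCE-style objective. Below is a minimal, self-contained PyTorch sketch of that loss; the function name, shapes, and temperature value are illustrative assumptions and are not taken from any of the listed repositories.

```python
# Minimal InfoNCE-style contrastive loss, the objective family used (in various
# forms) by CLIP-like and GraphCL-like pre-training. Illustrative sketch only.
import torch
import torch.nn.functional as F


def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.07) -> torch.Tensor:
    """z1, z2: (batch, dim) embeddings of two views/modalities of the same items."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature                      # (batch, batch) similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)    # positives lie on the diagonal
    # Symmetric cross-entropy over rows and columns (e.g. image->text and text->image)
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))


if __name__ == "__main__":
    # Example usage with random embeddings standing in for two encoders' outputs
    a, b = torch.randn(8, 128), torch.randn(8, 128)
    print(info_nce_loss(a, b).item())
```

Each listed framework wraps this idea differently (different encoders, augmentations, and negative-sampling schemes), but the diagonal-positive cross-entropy shown here is the common core.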