#NLP# UniLM: a large-scale self-supervised pre-trained model spanning tasks, languages, and modalities
#NLP# An Open-Source Framework for Prompt-Learning.
ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations; massive Chinese pre-trained ALBERT models
#NLP# Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpus, and leaderboard
#NLP# Pre-trained Chinese ELECTRA models
#Computer Science# [ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pretraining on any convolutional network; PyTorch implementation of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling...
[CVPR 2021] Involution: Inverting the Inherence of Convolution for Visual Recognition, a brand new neural operator
#Computer Science# An Open-Sourced Knowledgeable Large Language Model Framework.
#NLP# Must-read Papers on Knowledge Editing for Large Language Models.
#Computer Science# [ICLR 2024] Official PyTorch implementation of FasterViT: Fast Vision Transformers with Hierarchical Attention
#Computer Science# Official Repository for the Uni-Mol Series Methods
[MICCAI 2019 Young Scientist Award] [MEDIA 2020 Best Paper Award] Models Genesis
PyTorch code for "Prototypical Contrastive Learning of Unsupervised Representations"
#Computer Science# Self-supervised contrastive learning for time series via time-frequency consistency
#NLP# Eden AI: simplifies the use and deployment of AI technologies through a single API that connects to the best available AI engines
#Computer Science# [ICML 2023] Official PyTorch implementation of Global Context Vision Transformers
#NLP# PERT: Pre-training BERT with Permuted Language Model
A work in progress building out MLOps solutions in Rust
[KDD'2024] "UrbanGPT: Spatio-Temporal Large Language Models"
[ICLR 2024 Oral] Supervised Pre-Trained 3D Models for Medical Image Analysis (9,262 CT volumes + 25 annotated classes)