Polynomial Learning Rate Decay Scheduler for PyTorch
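The polynomial decay policy this scheduler implements can be sketched in plain Python (a minimal sketch with illustrative names, not the repo's actual API): the rate falls from `base_lr` to `end_lr` over `max_steps` following `(1 - t/T)^power`.

```python
def poly_decay_lr(step, base_lr=0.1, end_lr=0.0, max_steps=100, power=2.0):
    """Polynomial decay: lr(t) = (base_lr - end_lr) * (1 - t/T)^power + end_lr."""
    step = min(step, max_steps)  # clamp so the rate never drops below end_lr
    frac = 1.0 - step / max_steps
    return (base_lr - end_lr) * frac ** power + end_lr
```

With `power=1.0` this reduces to linear decay; larger powers decay faster early on.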
PyTorch cyclic cosine decay learning rate scheduler
Gradually-Warmup Learning Rate Scheduler for PyTorch
Cyclic learning rate TensorFlow implementation.
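The triangular policy behind cyclic learning rates (Smith, 2015) is a short formula; here is a framework-free sketch with illustrative parameter names, not the listed implementation's API:

```python
def triangular_clr(step, base_lr=0.001, max_lr=0.006, step_size=2000):
    """LR rises linearly from base_lr to max_lr and back every 2*step_size steps."""
    cycle = step // (2 * step_size)
    x = abs(step / step_size - 2 * cycle - 1)  # goes 1 -> 0 -> 1 within a cycle
    return base_lr + (max_lr - base_lr) * max(0.0, 1.0 - x)
```

The rate peaks at `max_lr` mid-cycle (`step = step_size, 3*step_size, ...`) and returns to `base_lr` at each cycle boundary.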
It contains data augmentation, strided convolutions, batch normalization, Leaky ReLU, global average pooling, L2 regularization, learning rate decay, He initialization, TensorBoard, and model save/restore
PyTorch implementation of CNNs for CIFAR benchmark
Transfer Learning using state-of-the-art CNN architectures (ResNet34 and Xception). Class engineering, learning rate/weight decay tuning and one-cycle policy are implemented.
PyTorch learning rate scheduler CosineAnnealingWarmRestarts with initial linear warmup for n steps, followed by weight decay in consecutive cycles
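The warmup-then-restarts schedule described above can be sketched without PyTorch (illustrative names and defaults, not this scheduler's API): a linear ramp for `warmup_steps`, then SGDR-style cosine annealing repeating every `t0` steps.

```python
import math

def warmup_cosine_restarts(step, base_lr=0.1, warmup_steps=10, t0=50):
    """Linear warmup, then cosine annealing with warm restarts every t0 steps."""
    if step < warmup_steps:
        return base_lr * (step + 1) / warmup_steps  # linear ramp to base_lr
    t = (step - warmup_steps) % t0  # position within the current cosine cycle
    return 0.5 * base_lr * (1 + math.cos(math.pi * t / t0))
```

Each restart resets the rate to `base_lr`; variants that decay the peak (or the weights) across cycles scale the returned value per cycle.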
Implements deep learning techniques using Google TensorFlow, covering deep neural networks with a fully connected network using SGD and ReLUs; regularization with a multi-layer neural network us...
Learning Rate Warmup in PyTorch
A learning rate range test implementation in PyTorch
Decoupled Weight Decay Regularization (ICLR 2019)
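The core idea of the paper above (AdamW, Loshchilov & Hutter) fits in two lines; this is a minimal sketch with illustrative names, contrasting L2 regularization folded into the gradient with decoupled weight decay applied directly to the weights:

```python
def sgd_step_l2(w, grad, lr=0.1, wd=0.01):
    """Classic L2 regularization: the decay term enters through the gradient."""
    return w - lr * (grad + wd * w)

def sgd_step_decoupled(w, grad, lr=0.1, wd=0.01):
    """Decoupled weight decay: shrink the weights separately from the gradient step."""
    return w - lr * grad - lr * wd * w
```

For plain SGD the two updates coincide; the distinction matters for adaptive methods like Adam, where the L2 term gets rescaled by the adaptive denominator while decoupled decay does not.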
On the Variance of the Adaptive Learning Rate and Beyond
Adaptive and Momental Bounds for Adaptive Learning Rate Methods.
Python Implementation of Decay Replay Mining (DREAM)
PyTorch optimizers with sparse momentum and weight decay
Chinese translation of the book Deep Learning
Sample code for learning redux with hooks by creating an exchange rate calculator
ASP.NET Core rate limiting middleware
Rate limiter middleware
Simple TensorFlow implementation of "Adaptive Gradient Methods with Dynamic Bound of Learning Rate" (ICLR 2019)
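The "dynamic bound" in that paper (AdaBound) clips the per-parameter step size between bounds that both converge to a final rate, so training moves from Adam-like to SGD-like behavior. A hedged sketch with illustrative names (`step` starts at 1):

```python
def clipped_lr(adam_lr, step, final_lr=0.1, gamma=1e-3):
    """Clip an adaptive step size between bounds that converge to final_lr."""
    lower = final_lr * (1.0 - 1.0 / (gamma * step + 1.0))  # rises toward final_lr
    upper = final_lr * (1.0 + 1.0 / (gamma * step))        # falls toward final_lr
    return min(max(adam_lr, lower), upper)
```

Early in training the bounds are loose and the adaptive rate passes through unchanged; as `step` grows, both bounds squeeze toward `final_lr`.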
API Rate Limit Decorator
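A rate-limit decorator of the kind listed above typically keeps a sliding window of recent call times and sleeps when the budget is exhausted. This is an illustrative sketch, not the listed project's API; the injectable `clock`/`sleep` parameters are an assumption added here to make it testable.

```python
import time
from functools import wraps

def rate_limited(calls=5, period=1.0, clock=time.monotonic, sleep=time.sleep):
    """Allow at most `calls` invocations per `period` seconds, sleeping otherwise."""
    def decorator(fn):
        timestamps = []  # call times inside the current window
        @wraps(fn)
        def wrapper(*args, **kwargs):
            now = clock()
            # drop timestamps that have fallen out of the window
            while timestamps and now - timestamps[0] >= period:
                timestamps.pop(0)
            if len(timestamps) >= calls:
                sleep(period - (now - timestamps[0]))  # wait for the oldest to expire
                now = clock()
                timestamps.pop(0)
            timestamps.append(now)
            return fn(*args, **kwargs)
        return wrapper
    return decorator
```

This version blocks the caller; other designs raise an exception or return a 429-style error instead of sleeping.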
#ComputerScience# MIT Deep Learning in PDF format
Basic rate-limiting middleware for the Express web server