On the Variance of the Adaptive Learning Rate and Beyond
RAdam implemented in Keras & TensorFlow
Ranger - a synergistic optimizer using RAdam (Rectified Adam), Gradient Centralization and LookAhead in one codebase
Radamsa fuzzer ported to Rust
RAdam optimizer for Keras
An Android port of radamsa fuzzer
Simple Tensorflow implementation of "On The Variance Of The Adaptive Learning Rate And Beyond"
Radamsa fuzzer extension for Burp Suite
Adapted from Su Jianlin's code and ported to Python 3.6: a reading-comprehension model built on dilated convolutions, mixed character-word embeddings, the RAdam optimizer, and Baidu Baike word vectors
2019 Fall semester, Dynamic Simulations class: visualizing optimizers in MATLAB - GD, SGD, Momentum, Adagrad, Adadelta, RMSProp, Adam, NAdam, RAdam
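Several of the entries above implement the variance rectification that gives RAdam its name. As a quick reference, here is a minimal pure-Python sketch of the rectification factor from the paper; the function name and the use of `None` to signal the warmup phase are my own choices, not taken from any of the listed repositories:

```python
import math

def radam_rectification(t, beta2=0.999):
    """Return the RAdam rectification factor r_t for step t (1-indexed),
    or None while the approximated SMA length rho_t is <= 4 -- the warmup
    phase, where RAdam falls back to an unrectified momentum update."""
    rho_inf = 2.0 / (1.0 - beta2) - 1.0  # maximum length of the approximated SMA
    rho_t = rho_inf - 2.0 * t * beta2**t / (1.0 - beta2**t)
    if rho_t <= 4.0:
        return None  # adaptive-lr variance is intractable; skip rectification
    # variance rectification term as defined in the RAdam paper
    return math.sqrt(
        (rho_t - 4.0) * (rho_t - 2.0) * rho_inf
        / ((rho_inf - 4.0) * (rho_inf - 2.0) * rho_t)
    )
```

The factor starts undefined for the first few steps, then grows monotonically toward 1, which is what produces RAdam's built-in warmup behavior.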