PyHessian is a PyTorch library for second-order-based analysis and training of neural networks (see the usage sketch after this list)
ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning (see the training sketch after this list)
PyTorch implementation of preconditioned stochastic gradient descent (Kron and affine preconditioners, low-rank approximation preconditioner, and more)
A C++ interface to formulate and solve linear, quadratic and second-order cone problems.
Distributed K-FAC Preconditioner for PyTorch
An implementation of the PSGD Kron second-order optimizer for PyTorch
FEDL: Federated Learning algorithm using TensorFlow (Transactions on Networking, 2021)
This repository implements FEDL using PyTorch
TensorFlow implementation of preconditioned stochastic gradient descent
PyTorch implementation of the Hessian-free optimizer
Hessian-based stochastic optimization in TensorFlow and Keras
Implementation of PSGD optimizer in JAX
Compatible Intrinsic Triangulations (SIGGRAPH 2022)
This package is dedicated to high-order optimization methods. All the methods can be used similarly to standard PyTorch optimizers.
Second-order methods for federated learning, implemented in PyTorch (IEEE Transactions on Parallel and Distributed Systems, 2022)
LIBS2ML: A Library for Scalable Second Order Machine Learning Algorithms
Minimalist deep learning library with first- and second-order optimization algorithms, made for educational purposes
Subsampled Riemannian trust-region (RTR) algorithms
Prototyping of matrix-free Newton methods in Julia
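
The PyHessian entry above is representative of the Hessian-analysis tools in this list. Below is a minimal sketch of computing the top Hessian eigenvalue and a trace estimate on one batch, assuming the `pyhessian` package from that repository is installed; the `hessian` class and its `eigenvalues`/`trace` methods follow the repository's README, but treat the exact signatures as assumptions. The toy model and data are placeholders.

```python
# Minimal PyHessian sketch (assumption: pyhessian installed from the PyHessian repo).
import torch
import torch.nn as nn
from pyhessian import hessian

# Toy model and a single batch of synthetic data for the Hessian computation.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
criterion = nn.CrossEntropyLoss()
inputs = torch.randn(64, 10)
targets = torch.randint(0, 2, (64,))

# Build the Hessian computation module on this batch (CPU here).
hessian_comp = hessian(model, criterion, data=(inputs, targets), cuda=False)

# Top eigenvalue via power iteration and trace via Hutchinson's estimator;
# both rely only on Hessian-vector products, never the full Hessian matrix.
top_eigenvalues, top_eigenvectors = hessian_comp.eigenvalues()
trace_estimates = hessian_comp.trace()
print("top eigenvalue:", top_eigenvalues[-1])
print("trace estimate:", sum(trace_estimates) / len(trace_estimates))
```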
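For the ADAHESSIAN entry, the training loop differs from first-order optimizers mainly in that the backward pass must keep the graph so the optimizer can form Hutchinson estimates of the Hessian diagonal. The sketch below uses the `Adahessian` class from the third-party `torch_optimizer` package rather than the original repository's own optimizer class; the usage pattern is the same, but the package choice and hyperparameters are assumptions.

```python
# Hedged AdaHessian sketch (assumption: pip-installed torch_optimizer package).
import torch
import torch.nn as nn
import torch_optimizer as optim

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
criterion = nn.CrossEntropyLoss()
optimizer = optim.Adahessian(model.parameters(), lr=1.0, hessian_power=1.0)

x = torch.randn(64, 10)
y = torch.randint(0, 2, (64,))

optimizer.zero_grad()
loss = criterion(model(x), y)
# create_graph=True keeps the autograd graph so the optimizer can take
# Hessian-vector products for its diagonal (Hutchinson) estimate.
loss.backward(create_graph=True)
optimizer.step()
print("loss:", loss.item())
```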