Metadata-Version: 2.1
Name: abel-pytorch
Version: 0.0.1
Summary: ABEL Scheduler
Home-page: https://github.com/tourdeml/abel-pytorch
Author: Vaibhav Balloli
Author-email: balloli.vb@gmail.com
License: MIT
Keywords: learning rate,pytorch
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Description-Content-Type: text/markdown

# How to decay your Learning Rate (PyTorch)

PyTorch implementation of the `ABEL` learning-rate scheduler, which adjusts the learning rate based on the weight norm. If you find this work interesting, consider starring the repository, and if you use it in your research, please cite it!

[Original paper](https://arxiv.org/pdf/2103.12682v1.pdf)

[Docs](https://abel-pytorch.readthedocs.io/en/latest/)
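To give a feel for the idea behind the scheduler: per the paper, ABEL decays the learning rate when the total weight norm "bounces", i.e. its per-epoch change flips from decreasing to increasing. The following is a minimal, hypothetical sketch of that rule only; `ABELSketch` and `weight_norm` are illustrative names, not this package's API (use `from abel import ABEL` as shown under Usage).

```python
import torch

def weight_norm(params):
    # Total L2 norm over all parameters; ABEL watches this quantity per epoch.
    return torch.sqrt(sum(p.detach().pow(2).sum() for p in params)).item()

class ABELSketch:
    """Illustrative sketch (not the package's implementation): multiply the
    learning rate by `decay_factor` whenever the per-epoch change in the
    weight norm flips from negative to positive (a 'bounce')."""

    def __init__(self, optimizer, decay_factor):
        self.optimizer = optimizer
        self.decay_factor = decay_factor
        self.prev_norm = None   # weight norm at the previous step
        self.prev_delta = 0.0   # previous change in weight norm

    def step(self):
        params = [p for g in self.optimizer.param_groups for p in g["params"]]
        norm = weight_norm(params)
        if self.prev_norm is not None:
            delta = norm - self.prev_norm
            # Bounce: the norm was shrinking and has started growing again.
            if self.prev_delta < 0 and delta > 0:
                for g in self.optimizer.param_groups:
                    g["lr"] *= self.decay_factor
            self.prev_delta = delta
        self.prev_norm = norm
```

The actual scheduler in this repository handles further details (e.g. the final decay phase described in the paper), so treat this only as a reading aid for the core rule.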

## Installation

Work in progress: the package is not yet available on PyPI. Once published, it can be installed with:
```
pip install abel-pytorch
```

## Usage

```python
import torch
from torch import nn, optim
from torchvision.models import resnet18

from abel import ABEL

model = resnet18()
optimizer = optim.SGD(model.parameters(), lr=1e-3)  # named to avoid shadowing the `optim` module
scheduler = ABEL(optimizer, 0.9)  # 0.9 is the decay factor

for i, (images, labels) in enumerate(trainloader):
  # forward pass, loss computation, and backward pass...
  optimizer.step()
  scheduler.step()
```

## Cite the original paper:
```
@article{lewkowycz2021decay,
  title={How to decay your learning rate},
  author={Lewkowycz, Aitor},
  journal={arXiv preprint arXiv:2103.12682},
  year={2021}
}
```

## Cite this work:
```
@misc{abel2021pytorch,
  author = {Vaibhav Balloli},
  title = {A PyTorch implementation of ABEL},
  year = {2021},
  howpublished = {\url{https://github.com/tourdeml/abel-pytorch}}
}
```

