
@j-min
Created June 25, 2017 14:07
exp_lr_scheduler.py
# http://pytorch.org/tutorials/beginner/transfer_learning_tutorial.html

def exp_lr_scheduler(optimizer, epoch, init_lr=0.001, lr_decay_epoch=7):
    """Decay learning rate by a factor of 0.1 every lr_decay_epoch epochs."""
    lr = init_lr * (0.1 ** (epoch // lr_decay_epoch))

    # Print only at the epochs where the rate steps down (and at epoch 0).
    if epoch % lr_decay_epoch == 0:
        print('LR is set to {}'.format(lr))

    # Apply the decayed rate to every parameter group of the optimizer.
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

    return optimizer
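
A minimal sketch of how the scheduler above would be called once per epoch in a training loop. The function is copied here so the sketch is self-contained, and `DummyOptimizer` is a hypothetical stand-in for a `torch.optim` optimizer; the only assumption is that the object exposes a `param_groups` list of dicts with an `'lr'` key, which is the interface the scheduler relies on.

```python
def exp_lr_scheduler(optimizer, epoch, init_lr=0.001, lr_decay_epoch=7):
    """Decay learning rate by a factor of 0.1 every lr_decay_epoch epochs."""
    lr = init_lr * (0.1 ** (epoch // lr_decay_epoch))
    if epoch % lr_decay_epoch == 0:
        print('LR is set to {}'.format(lr))
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
    return optimizer

class DummyOptimizer:
    """Hypothetical stand-in for a torch.optim optimizer: just the
    param_groups interface the scheduler touches."""
    def __init__(self, lr):
        self.param_groups = [{'lr': lr}]

optimizer = DummyOptimizer(lr=0.001)
lrs = []
for epoch in range(15):
    optimizer = exp_lr_scheduler(optimizer, epoch,
                                 init_lr=0.001, lr_decay_epoch=7)
    lrs.append(optimizer.param_groups[0]['lr'])

# Epochs 0-6 run at 0.001, epochs 7-13 at 0.0001, epoch 14 at 1e-05.
```

Note the step decay comes from the integer division `epoch // lr_decay_epoch`, so the rate is constant within each 7-epoch window and drops by 10x at epochs 7, 14, and so on. In recent PyTorch versions the same schedule is available built in as `torch.optim.lr_scheduler.StepLR(optimizer, step_size=7, gamma=0.1)`.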