Optimization Algorithms
== Applying ==

'''Learning rate schedule with warmup and cosine decay:'''

<syntaxhighlight lang="python">
import math

import torch
import torch.nn as nn

class WarmupCosineSchedule(torch.optim.lr_scheduler._LRScheduler):
    """Linear warmup to the base learning rate, then cosine decay to eta_min."""

    def __init__(self, optimizer, warmup_steps, total_steps, eta_min=0.0, last_epoch=-1):
        self.warmup_steps = warmup_steps
        self.total_steps = total_steps
        self.eta_min = eta_min
        super().__init__(optimizer, last_epoch)

    def get_lr(self):
        step = self.last_epoch
        if step < self.warmup_steps:
            # Linear warmup from 0 to base_lr
            return [base_lr * step / max(1, self.warmup_steps)
                    for base_lr in self.base_lrs]
        # Cosine decay from base_lr to eta_min; clamp progress at 1.0 so the
        # rate stays at eta_min if training runs past total_steps
        progress = min(1.0, (step - self.warmup_steps)
                            / max(1, self.total_steps - self.warmup_steps))
        return [self.eta_min + (base_lr - self.eta_min) * 0.5 * (1 + math.cos(math.pi * progress))
                for base_lr in self.base_lrs]

# Standard transformer training recipe
model = nn.TransformerEncoder(...)
optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=3e-4,
    betas=(0.9, 0.999),   # momentum coefficients
    eps=1e-8,
    weight_decay=0.1,     # decoupled L2 regularization
)
scheduler = WarmupCosineSchedule(optimizer, warmup_steps=1000, total_steps=100000)

# Training step (how the loss is computed depends on the task and model head)
for batch in dataloader:
    loss = model(batch)
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)  # gradient clipping
    optimizer.step()
    scheduler.step()
    optimizer.zero_grad()
</syntaxhighlight>

; Optimizer selection guide
: '''Standard deep learning''' → AdamW (the default for transformers, NLP, and vision)
: '''ResNet / CNN training''' → SGD with momentum (0.9) and cosine decay, the classic ImageNet recipe (see the sketch below)
: '''Very large batch''' → LARS (vision) or LAMB (NLP) for stable large-batch training (a simplified LARS sketch follows)
: '''RNN / LSTM''' → Adam with gradient clipping (max_norm=5.0)
: '''Fast research iteration''' → Adam with a liberal learning rate (1e-3) and no schedule initially
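Before a long run it is worth sanity-checking the shape of the <code>WarmupCosineSchedule</code> defined above. A minimal sketch (the throwaway <code>nn.Linear</code> stands in for a real model) that prints the learning rate around the warmup boundary and at the end of decay, where it should approach <code>eta_min</code>:

<syntaxhighlight lang="python">
# Step a probe optimizer through the full schedule and spot-check the
# learning rate at the warmup boundary and at the end of decay.
probe = torch.optim.AdamW(nn.Linear(4, 4).parameters(), lr=3e-4)
sched = WarmupCosineSchedule(probe, warmup_steps=1000, total_steps=100000)
for step in range(100000):
    probe.step()   # no gradients, so this is a no-op, but it keeps the call order correct
    sched.step()
    if step in (0, 999, 1000, 50000, 99999):
        print(step, sched.get_last_lr()[0])
</syntaxhighlight>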
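For the ResNet / CNN row in the guide, the classic ImageNet recipe pairs SGD with momentum and cosine annealing, using only built-in PyTorch pieces. A minimal sketch (the learning rate and weight decay shown are common defaults, not prescriptions):

<syntaxhighlight lang="python">
import torch
import torchvision

model = torchvision.models.resnet50()
optimizer = torch.optim.SGD(
    model.parameters(),
    lr=0.1,               # often scaled linearly with batch size
    momentum=0.9,
    weight_decay=1e-4,
    nesterov=True,
)
# Cosine decay over 90 epochs, stepped once per epoch
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=90)

for epoch in range(90):
    # ... one pass over the training set: forward, loss, backward, optimizer.step() ...
    scheduler.step()
</syntaxhighlight>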
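For the very-large-batch row, LARS rescales each layer's update by a trust ratio proportional to ||w|| / ||∇w||, so layers whose gradients are large relative to their weights take proportionally smaller steps. A simplified sketch of that idea as a <code>torch.optim.Optimizer</code> subclass (<code>SimpleLARS</code> is a hypothetical name, and this omits the exclusions for biases and normalization parameters that production implementations apply):

<syntaxhighlight lang="python">
import torch

class SimpleLARS(torch.optim.Optimizer):
    """Simplified LARS sketch: SGD with momentum plus a per-parameter
    trust ratio. Illustrative only, not a reference implementation."""

    def __init__(self, params, lr=1.0, momentum=0.9, weight_decay=1e-4, trust_coef=0.001):
        defaults = dict(lr=lr, momentum=momentum,
                        weight_decay=weight_decay, trust_coef=trust_coef)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self):
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is None:
                    continue
                g = p.grad.add(p, alpha=group["weight_decay"])  # fold in L2 term
                w_norm = p.norm().item()
                g_norm = g.norm().item()
                # Trust ratio ||w|| / ||g||, guarded against zero norms
                if w_norm > 0 and g_norm > 0:
                    local_lr = group["trust_coef"] * w_norm / g_norm
                else:
                    local_lr = 1.0
                state = self.state[p]
                if "momentum_buffer" not in state:
                    state["momentum_buffer"] = torch.zeros_like(p)
                buf = state["momentum_buffer"]
                buf.mul_(group["momentum"]).add_(g, alpha=local_lr * group["lr"])
                p.sub_(buf)
</syntaxhighlight>

In practice, large-batch training usually relies on a vetted library implementation of LARS or LAMB rather than a hand-rolled one.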