Commit f922973a authored by Neta Zmora's avatar Neta Zmora

Bug fix: set the overall loss when not using a compression scheduler

If compression_scheduler==None, then we need to set the value of
losses[OVERALL_LOSS_KEY] (so it is the same as losses[OBJECTIVE_LOSS_KEY]).
This was overlooked.
parent bc982f7e
```diff
@@ -509,6 +509,8 @@ def train(train_loader, model, criterion, optimizer, epoch,
                 if lc.name not in losses:
                     losses[lc.name] = tnt.AverageValueMeter()
                 losses[lc.name].add(lc.value.item())
+        else:
+            losses[OVERALL_LOSS_KEY].add(loss.item())
         # Compute the gradient and do SGD step
         optimizer.zero_grad()
```
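The pattern this commit fixes can be sketched as follows. This is a simplified illustration, not the actual Distiller source: `AverageValueMeter` below is a minimal stand-in for `tnt.AverageValueMeter`, and `record_losses` and the `agg_loss` dict shape are hypothetical names introduced for the example.

```python
# Sketch of the loss-bookkeeping bug: before the fix, the else branch was
# missing, so losses[OVERALL_LOSS_KEY] was never updated when no
# compression scheduler was in use.
OVERALL_LOSS_KEY = 'Overall Loss'
OBJECTIVE_LOSS_KEY = 'Objective Loss'

class AverageValueMeter:
    """Minimal stand-in for tnt.AverageValueMeter: tracks a running mean."""
    def __init__(self):
        self.sum = 0.0
        self.n = 0

    def add(self, value):
        self.sum += value
        self.n += 1

    def mean(self):
        return self.sum / self.n if self.n else 0.0

def record_losses(losses, loss, agg_loss=None):
    """Record per-batch losses; agg_loss is None when training without a
    compression scheduler (hypothetical helper for illustration)."""
    losses[OBJECTIVE_LOSS_KEY].add(loss)
    if agg_loss is not None:
        # With a scheduler, the overall loss and its named components
        # (e.g. regularization terms) come from the scheduler.
        losses[OVERALL_LOSS_KEY].add(agg_loss['overall'])
        for name, value in agg_loss['components'].items():
            losses.setdefault(name, AverageValueMeter()).add(value)
    else:
        # The fix: without a scheduler, the overall loss is simply the
        # objective loss, so record it under OVERALL_LOSS_KEY too.
        losses[OVERALL_LOSS_KEY].add(loss)

losses = {OVERALL_LOSS_KEY: AverageValueMeter(),
          OBJECTIVE_LOSS_KEY: AverageValueMeter()}
record_losses(losses, 0.5)
```

With the `else` branch in place, both meters report the same mean when no scheduler is used, which is what the commit message requires.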