    Bug fix: remove softmax layer from model loading code · 0a8a3b31
    Neta Zmora authored
    We should only add softmax when we explicitly require it (as when
    exporting to ONNX), because CrossEntropyLoss implicitly computes
    softmax on the logits it receives as input.
    
    This code was left there by mistake and should never have been
    pushed to git.
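The pitfall the commit fixes can be sketched in a few lines of PyTorch. `nn.CrossEntropyLoss` applies log-softmax to its input internally, so feeding it logits that have already passed through a softmax layer effectively applies softmax twice and distorts both the loss value and the gradients. The tensor values below are illustrative, not from the original model:

```python
import torch
import torch.nn as nn

logits = torch.tensor([[2.0, -1.0, 0.5]])  # raw model outputs (illustrative)
target = torch.tensor([0])

ce = nn.CrossEntropyLoss()

# Correct: feed raw logits; CrossEntropyLoss computes log-softmax internally.
loss_correct = ce(logits, target)

# Buggy: a trailing softmax layer means log-softmax is then computed on
# probabilities rather than logits, giving a different (wrong) loss.
loss_double = ce(torch.softmax(logits, dim=1), target)

# Equivalent to the internal computation: log-softmax followed by NLLLoss.
loss_manual = nn.NLLLoss()(torch.log_softmax(logits, dim=1), target)

print(loss_correct.item(), loss_double.item(), loss_manual.item())
```

A softmax is still needed when the exported model must emit probabilities (e.g. for ONNX inference consumers), which is why the commit keeps it as an explicit, opt-in step rather than part of the default model-loading path.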