Thank you for your contribution! I have a question: in keep_autoaugment.py there is this code:
```python
images_half.requires_grad = True
if self.early:
    preds = model(images_half, True)
else:
    preds = model(images_half)
score, _ = torch.max(preds, 1)
score.mean().backward()
slc_, _ = torch.max(torch.abs(images_half.grad), dim=1)
```
Is the `torch.max` operation differentiable? In my project it seems to stop the backward pass. I would appreciate your help with this question!
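For reference, `torch.max` along a dimension is differentiable with respect to the returned *values* (the gradient routes to the argmax positions; the returned indices carry no gradient). A minimal sketch demonstrating this, using a small hypothetical tensor rather than the actual model outputs:

```python
import torch

# torch.max(x, dim) is differentiable w.r.t. the max values: the
# gradient flows only to each row's argmax position, zero elsewhere.
# The integer indices it also returns are non-differentiable, but that
# does not block backward through the values.
x = torch.tensor([[1.0, 3.0, 2.0],
                  [4.0, 0.5, 4.5]], requires_grad=True)

score, idx = torch.max(x, dim=1)  # score: [3.0, 4.5]
score.mean().backward()

# d(mean of row maxima)/dx is 1/2 at each row's argmax, 0 elsewhere.
print(x.grad)
# tensor([[0.0000, 0.5000, 0.0000],
#         [0.0000, 0.0000, 0.5000]])
```

If backward stops in your project, the cause is usually upstream of `torch.max`, e.g. a tensor created without `requires_grad=True`, a `.detach()` call, or computation inside `torch.no_grad()`.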