Inoutseg fix #25
Conversation
…to change it, and also using argmax instead of sum and inverting in the correct order, to avoid amplifying the effect of the transformation out of proportion.
Pull request overview
This PR fixes issues with the in/out segmentation region selection logic and adds the mix_prob parameter to transform classes. The fix addresses the incorrect handling of multi-channel segmentation masks and reorders operations in transform application.
Changes:
- Fixed the `_apply_region_mode` function to correctly handle multi-channel segmentation masks by using `argmax` before applying mode inversion (see the sketch after this list)
- Added the `mix_prob` parameter to the `RandomInverseGPU` and `RandomHistogramEqualizationGPU` classes
- Reordered operations in `RandomConvTransformGPU` to apply `mix_prob` mixing before region selection
- Updated configuration values for various transform parameters
- Code formatting improvements (trailing whitespace removal, function signature formatting)
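
For reference, a minimal sketch of the argmax-based region selection described in the first bullet. The signature, the `invert` flag, and the foreground convention are assumptions for illustration, not the PR's exact implementation:

```python
import torch

def _apply_region_mode(img: torch.Tensor, seg: torch.Tensor, invert: bool) -> torch.Tensor:
    # Collapse the multi-channel segmentation (C, ...) to per-voxel labels
    # with argmax instead of summing channels.
    labels = seg.argmax(dim=0)
    # Foreground = any non-background class (assumed convention).
    mask = labels > 0
    # Apply the mode inversion only after the argmax-based mask is built.
    if invert:
        mask = ~mask
    return img * mask
```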
Reviewed changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| auglab/transforms/gpu/contrast.py | Fixed segmentation mask handling logic, added mix_prob parameter support, reordered transform operations |
| auglab/transforms/gpu/transforms.py | Added mix_prob parameter to transform instantiations, formatting improvements |
| auglab/configs/transform_params_gpu.json | Updated transform configuration parameters including probabilities and in_seg/out_seg values |
```python
# Mix with original based on mix_prob
if torch.rand(1).item() < self.mix_prob:
    alpha = torch.rand(1, device=input.device)
    channel_data[b] = alpha * orig[b] + (1 - alpha) * channel_data[b]
```
Copilot AI, Jan 14, 2026:
The mix_prob operation is applied per batch element inside the loop, which means each batch item gets a different random alpha value. This differs from the RandomConvTransformGPU where mixing happens once for all batch elements. Consider whether this per-batch randomization is intentional or if consistency across transforms is desired.
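
For clarity, a minimal sketch of the two strategies being compared; the function names and tensor shapes are assumptions for illustration:

```python
import torch

def mix_per_element(orig: torch.Tensor, out: torch.Tensor, mix_prob: float) -> torch.Tensor:
    # Per-batch-element mixing: each item draws its own gate and alpha
    # (the behavior described for this snippet).
    for b in range(out.shape[0]):
        if torch.rand(1).item() < mix_prob:
            alpha = torch.rand(1, device=out.device)
            out[b] = alpha * orig[b] + (1 - alpha) * out[b]
    return out

def mix_batchwise(orig: torch.Tensor, out: torch.Tensor, mix_prob: float) -> torch.Tensor:
    # Batch-wide mixing: a single gate and alpha shared by all items
    # (the behavior attributed to RandomConvTransformGPU).
    if torch.rand(1).item() < mix_prob:
        alpha = torch.rand(1, device=out.device)
        out = alpha * orig + (1 - alpha) * out
    return out
```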
@NathanMolinier that is actually a good question: what behavior do we want? We should be consistent.
Both are fine and should have a similar impact on training. I would not worry too much about this.
| "in_seg": 0, | ||
| "out_seg": 0, |
Copilot AI, Jan 14, 2026:
The in_seg and out_seg values are set to 0 (integer) instead of 0.0 (float) for FunctionTransform, while all other transforms use float notation (0.0, 0.1, etc.). For consistency and to avoid potential type issues, use 0.0 for float values throughout the configuration.
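
As a quick check of why the notation can matter once the file is parsed, a small sketch using the standard library (only the two keys from the excerpt above are used):

```python
import json

params = json.loads('{"in_seg": 0, "out_seg": 0.0}')
print(type(params["in_seg"]))   # <class 'int'>:   0 parses as an integer
print(type(params["out_seg"]))  # <class 'float'>: 0.0 parses as a float
# Code that assumes floats (e.g. isinstance(value, float) checks) would
# treat the two notations differently.
```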