Imaging: Infrastructure, Polarization, Order Parameter and other helpers. #121
base: dev
Conversation
…y for larger images). Also filtering instead of clipping (fixed bug where u and v values would explode)
Updated polarization calculation. Priority given to reference_radius, followed by max_neighbours. Added docstrings in lattice.py. Removed Union from dataset.py and lattice.py (this was causing merge conflicts in dataset.py).
…test was passing on local.
The notebook is attached here: drift_corr_22.ipynb

Updated notebook with new terminology and defaults:
cophus left a comment
Looks pretty good, though very long! As long as we carefully test on different polarization cases, this is a great PR.
return dr, dc, amp, disp_cap_px
...
def site_colors(number):
I don't like the order of `site_colors`, and I don't see why some of the `plot_atoms` methods start with blue and green instead of red and blue. Suggested colors:
(1.00, 0.00, 0.00), # 0: red
(0.00, 0.70, 1.00), # 1: lighter blue
(0.00, 0.70, 0.00), # 2: green with lower perceptual brightness
(1.00, 0.00, 1.00), # 3: magenta
(1.00, 0.70, 0.00), # 4: orange
(0.00, 0.00, 1.00), # 5: full blue
I made this change. Will commit and push with all the other changes.
I was getting a little confused about what would happen if I used something other than RGB, but after playing around a bit, I think I've gotten a better idea of colormaps now.
return self
...
def measure_polarization(
I think we need to plot some kind of a legend here (can be an optional flag) to show the relationship between the sublattices and the reference points.
Also, `plot_polarization_vectors = True` isn't plotting arrows like my original function, but blocky triangles instead.
return self
...
# --- Plotting Functions ---
def plot_polarization_vectors(
The default plotting should definitely be arrows, not triangles
I just checked, the default is still arrows. In this case, the tail of the arrow is so small that we only see the head, which is why it appears as a triangle.
This code and its output show that scaling the vectors makes them appear better:
pol = lattice.measure_polarization(
    measure_ind=0,
    reference_ind=1,
    min_neighbours=2,
    max_neighbours=6,
    reference_radius=25,
    plot_polarization_vectors=True,
    length_scale=10.0,
)
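For readers following the arrows-vs-triangles discussion, here is a minimal sketch of the same idea using matplotlib's `quiver` (the `plot_scaled_vectors` helper and its parameters are stand-ins for illustration, not the actual quantem API): multiplying tiny displacements by a length scale is what keeps the arrow tails long enough to see.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs without a display
import matplotlib.pyplot as plt

def plot_scaled_vectors(x, y, du, dv, length_scale=10.0):
    # Hypothetical helper: scale displacements by length_scale so the arrow
    # tails are visible; otherwise only the triangular heads show up.
    fig, ax = plt.subplots()
    ax.quiver(
        x, y, du * length_scale, dv * length_scale,
        angles="xy", scale_units="xy", scale=1.0,  # draw vectors in data units
        width=0.003,
    )
    ax.set_aspect("equal")
    return fig, ax

# Tiny displacements, like sub-pixel polarization vectors
rng = np.random.default_rng(0)
x, y = np.meshgrid(np.arange(5.0), np.arange(5.0))
du = rng.normal(0.0, 0.02, x.shape)
dv = rng.normal(0.0, 0.02, x.shape)
fig, ax = plot_scaled_vectors(x.ravel(), y.ravel(), du.ravel(), dv.ravel())
```

With `scale=1.0` and `scale_units="xy"` the arrows are drawn in data coordinates, so `length_scale` directly controls their on-plot length.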
return fig, ax
...
def plot_polarization_image(
Fixed this. This is the new plot. It looks much closer in shape to the original image.
Here are the debugging stats:
Square tiles: True
Pixel size a: 16, b: 16
Canvas size: 590 x 410 pixels
Image size: 576 x 576 pixels

Square tiles: False
Pixel size a: 14, b: 19
Canvas size: 526 x 476 pixels
Image size: 576 x 576 pixels

square_tiles = False gives a much better result. It is not perfect, but that is primarily due to pixel_size being rounded to an int.
Also, image_size includes padding and edge atoms are not measured, so the canvas for the tiled image is expected to be slightly smaller.
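A toy calculation (with made-up tile counts, not the actual numbers from this run) shows how rounding the per-tile pixel size to an int makes the tiled canvas drift away from the source image size:

```python
# Illustrative only: integer rounding of the per-tile pixel size
# shrinks (or grows) the tiled canvas relative to the source image.
image_size = 576                      # source image width in pixels
exact_fit = image_size / 36           # 16.0 px per tile divides evenly
tile_px = round(image_size / 41)      # 14.048... rounds to 14 px per tile
canvas = 41 * tile_px                 # tiled canvas width
print(canvas, image_size - canvas)    # -> 574 2
```

With an exact divisor (16.0) the canvas matches the image; with a non-integer tile size the rounding error accumulates across every tile.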
num_components = num_phases
...
# ========== Combined Plot: Scatter overlaid on Contour ==========
if plot_gmm_visualization:
is it `plot_gmm_classification` or `plot_gmm_visualization`?
It is `plot_gmm_visualization`.
I made a typo in the notebook; I think I changed it between commits and forgot to update it in both places.
return out
...
def calculate_order_parameter(
The automated order parameter stuff is cool, but unstable. Running multiple times gives different results:
[screenshots: two runs producing different classifications]
So maybe we should add an option to run with restarts, i.e. run it 10 times and keep the best one. We could also add a verbose output option to print the centers, so the user can then use those as starting guesses for manual placement. And finally, the coloring can be made more stable by, for example, sorting by theta or du or some other variable.
I added the verbose and run_with_restarts options. It is much more stable now. Will commit along with the other changes. Also will write pytests for it.
The error metric I used to identify the best fit was:
error = 1 - mean of probabilities of the best-fit Gaussian
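As a sketch of that metric (assuming `probabilities` is an (n_points, n_components) responsibility matrix; `gmm_fit_error` is a hypothetical name, not the actual quantem function):

```python
import numpy as np

def gmm_fit_error(probabilities):
    # error = 1 - mean over points of the probability of each point's
    # best-fit (highest-responsibility) Gaussian component.
    best = probabilities.max(axis=1)  # probability of the winning component
    return 1.0 - best.mean()

# Confident assignments give a small error; ambiguous ones a large error.
confident = np.array([[0.95, 0.05], [0.02, 0.98]])
ambiguous = np.array([[0.55, 0.45], [0.48, 0.52]])
print(gmm_fit_error(confident))  # 1 - (0.95 + 0.98) / 2 = 0.035
print(gmm_fit_error(ambiguous))  # 1 - (0.55 + 0.52) / 2 = 0.465
```

Low error means most points sit firmly inside one component, which is why it works as a restart-selection criterion.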
This is an example of the updated function call:
lattice = lattice.calculate_order_parameter(
    pol,
    num_phases=2,
    # phase_polarization_peak_array=np.array([
    #     [-0.05, -0.06],
    #     [-0.07, -0.15],
    # ]),
    refine_means=True,
    run_with_restarts=True,
    num_restarts=10,
    verbose=True,
    plot_gmm_classification=True,
    plot_order_parameter=True,
    theta_shift_deg=135,
)
The verbose output is as follows:
Restart 1/10:
Means:
[[-0.05009812 -0.09428304]
[-0.04911226 -0.05252744]]
Error: 0.2764
Restart 2/10:
Means:
[[-0.07159821 -0.11598569]
[-0.0319477 -0.05124588]]
Error: 0.0994
Restart 3/10:
Means:
[[-0.07552166 -0.12696803]
[-0.03684867 -0.05695215]]
Error: 0.0831
Restart 4/10:
Means:
[[-0.05594809 -0.04905914]
[-0.0472936 -0.09287718]]
Error: 0.2146
Restart 5/10:
Means:
[[-0.03199476 -0.05130583]
[-0.07165027 -0.11609169]]
Error: 0.0992
Restart 6/10:
Means:
[[-0.07159617 -0.11598093]
[-0.03194538 -0.05124338]]
Error: 0.0994
Restart 7/10:
Means:
[[-0.04768286 -0.09615319]
[-0.05406971 -0.04775179]]
Error: 0.2077
Restart 8/10:
Means:
[[-0.04591268 -0.06583382]
[-0.06921173 -0.15349995]]
Error: 0.0446
Restart 9/10:
Means:
[[-0.07172052 -0.1164266 ]
[-0.03218292 -0.05143669]]
Error: 0.0991
Restart 10/10:
Means:
[[-0.06921186 -0.15347165]
[-0.0459066 -0.06581662]]
Error: 0.0446
Best results after restarts:
Means:
[[-0.04591268 -0.06583382]
[-0.06921173 -0.15349995]]
Error: 0.0446
Looking at the commented out initial guess and the final best fit, the results are pretty stable and good.
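The restart loop described above can be sketched as follows (the `fit_gmm` callable and the `toy_fit` stand-in are hypothetical, not the actual quantem TorchGMM interface):

```python
import numpy as np

def fit_with_restarts(fit_gmm, data, num_restarts=10, verbose=False, seed=0):
    # Run the fitter several times with different seeds and keep the fit
    # with the smallest error. fit_gmm is any callable returning
    # (means, probabilities).
    best = None
    rng = np.random.default_rng(seed)
    for i in range(num_restarts):
        means, probabilities = fit_gmm(data, seed=int(rng.integers(1 << 31)))
        error = 1.0 - probabilities.max(axis=1).mean()
        if verbose:
            print(f"Restart {i + 1}/{num_restarts}: error={error:.4f}")
        if best is None or error < best[0]:
            best = (error, means)
    return best

# Toy fitter whose quality depends on the seed, just to exercise the loop
def toy_fit(data, seed):
    q = 0.5 + 0.5 * ((seed % 97) / 96)      # pseudo "fit quality" in [0.5, 1]
    probs = np.full((len(data), 2), 1.0 - q)
    probs[:, 0] = q
    return np.zeros((2, 2)), probs

error, means = fit_with_restarts(toy_fit, np.zeros((5, 2)), num_restarts=5, verbose=True)
```

Keeping only the lowest-error fit is what makes repeated runs converge to the same answer even when individual GMM fits land in different local optima.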
src/quantem/imaging/lattice.py (outdated)
# Second: Overlay scatter points with classification colors
point_colors = create_colors_from_probabilities(
    probabilities, num_components, scatter_colours
)  # FIXED: pass scatter_colours
please remove useless comments
Removed unnecessary comments.
src/quantem/imaging/lattice.py (outdated)
):
    """
    Build and return an RGB superpixel image indexed by integer (a,b), colored by
    the same JCh cyclic mapping used for polarization vectors.
what does "JCh cyclic mapping" mean?
JCh cycling refers to building a cyclic (wrap-around) colormap by tracing a path in the J-C-h coordinates of a color-appearance model, typically CIECAM02.
J is perceptual lightness (how light/dark a color appears).
C is chroma (colorfulness/saturation).
h is hue angle (position around the color wheel, 0-360°).
This is the response given by AI. I asked ChatGPT for help with the docstring, and apparently this is the technical term for the type of colormap we are using to determine the color from the angle and magnitude.
I can remove this term or simplify it a bit if you'd like.
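For intuition, here is a much simpler HSV-based stand-in (not the actual CIECAM02 JCh mapping used in the PR) that shows the two key properties: hue wraps around with the vector angle, and saturation tracks the magnitude.

```python
import colorsys
import math

def vector_to_rgb(du, dv, mag_max=1.0):
    # Hypothetical illustration: hue encodes the polarization angle
    # (cyclic, so angle 0 and angle 2*pi give the same color) and
    # saturation encodes the magnitude. HSV is used here as a crude
    # stand-in for the perceptually uniform JCh path.
    angle = math.atan2(dv, du) % (2 * math.pi)
    hue = angle / (2 * math.pi)                   # 0..1, wraps around
    sat = min(math.hypot(du, dv) / mag_max, 1.0)  # 0..1
    return colorsys.hsv_to_rgb(hue, sat, 1.0)

print(vector_to_rgb(1.0, 0.0))   # (1.0, 0.0, 0.0): red at angle 0
print(vector_to_rgb(-1.0, 0.0))  # (0.0, 1.0, 1.0): cyan at angle pi
```

The JCh version does the same thing but traces the hue path in a color-appearance model so that all angles have similar perceived brightness, which plain HSV does not guarantee.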
I suggest you explain this in simpler language
src/quantem/imaging/lattice.py (outdated)
# Create a smooth transition function
def smooth_transition(x):
    return 4 * x**3 - 3 * x**4
Shouldn't this be 3x^2 - 2x^3 for a symmetric sigmoid function?
I made this change. Will commit and push with all the other changes.
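A quick check of the two polynomials (smoothstep 3x^2 - 2x^3 versus the original 4x^3 - 3x^4) shows why the suggested version is the symmetric one: it satisfies f(x) + f(1 - x) = 1, while the original does not.

```python
def smooth_transition_old(x):
    return 4 * x**3 - 3 * x**4   # original: not symmetric about x = 0.5

def smooth_transition(x):
    return 3 * x**2 - 2 * x**3   # smoothstep: symmetric sigmoid on [0, 1]

# Smoothstep is symmetric: f(x) + f(1 - x) == 1; the original is not.
print(smooth_transition(0.5))      # -> 0.5 (midpoint maps to midpoint)
print(smooth_transition_old(0.5))  # -> 0.3125 (midpoint is skewed)
```

Both polynomials have zero derivative at 0 and 1, so the transition stays smooth at the endpoints either way; only the symmetry differs.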
tests/imaging/test_lattice.py (outdated)
assert lattice is not None
...
def test_lattice_with_complex_numbers(self):
Why are we supporting complex numbers here? Raises a warning in pytest too:
tests/imaging/test_lattice.py::TestLatticeRobustness::test_lattice_with_complex_numbers
/Users/cophus/repos/quantem/src/quantem/imaging/lattice.py:69: ComplexWarning: Casting complex values to real discards the imaginary part
arr = arr.astype(float, copy=False)
I was also debating whether to support complex numbers or not. I kept it as I thought we could use it for training models in the future on simulated datasets. Although, in that case I should probably switch to taking the absolute value of the data in the case of complex numbers.
No absolute values please - real! Also you should never need to write .real() [or .abs()]
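One way to follow that advice, sketched here with a hypothetical helper (not the actual lattice.py code): reject complex input outright instead of casting it and silently discarding the imaginary part, which is what triggers numpy's ComplexWarning.

```python
import numpy as np

def as_real_array(arr):
    # Hypothetical helper: coerce input to a float array, refusing
    # complex data rather than truncating it to its real part.
    arr = np.asarray(arr)
    if np.iscomplexobj(arr):
        raise TypeError("Lattice expects real-valued image data, got complex")
    return arr.astype(float, copy=False)

print(as_real_array([1, 2, 3]).dtype)  # -> float64
try:
    as_real_array(np.array([1 + 2j]))
except TypeError as e:
    print(e)
```

An explicit error keeps the failure visible at the call site, whereas the current `astype(float)` cast only emits a warning and continues with truncated data.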
…alize_order_parameter() and pytest for Lattice(AutoSerialize).
Heads-up: you'll need to pull dev into the PR to enable the updated automated checks.
…nd cleaned pytests. Removed unnecessary pytests.
…Added plot_atoms_2d().
@cophus I have added all the changes requested by you. This PR is ready for review. Here is the updated notebook.



This PR has all the basic infrastructure for imaging and lattice, integration with Dataset2d, along with updates to Drift Correction.
The current method involves defining the lattice using origin, u and v vectors first, and then adding atoms to it. Identifying atoms first and defining the lattice from those positions is WIP. Once ready, I will modify this so that either method can be chosen as required.
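The origin/u/v definition described above can be sketched as follows (the `lattice_sites` helper is hypothetical, for illustration only, not the quantem Lattice API):

```python
import numpy as np

def lattice_sites(origin, u, v, na, nb):
    # Hypothetical helper: generate candidate atom positions
    # r = origin + i*u + j*v for i in [0, na) and j in [0, nb).
    origin, u, v = (np.asarray(a, dtype=float) for a in (origin, u, v))
    i, j = np.meshgrid(np.arange(na), np.arange(nb), indexing="ij")
    return origin + i[..., None] * u + j[..., None] * v  # shape (na, nb, 2)

sites = lattice_sites(origin=[10.0, 10.0], u=[16.0, 0.0], v=[0.0, 16.0], na=3, nb=3)
print(sites[2, 1])  # -> [42. 26.], i.e. origin + 2*u + 1*v
```

Each generated site can then be refined against the image, which is the "define lattice first, add atoms after" workflow the PR currently uses.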
Added functions to measure physical polarization and to calculate an order parameter based on polarization for phase identification.
Created a local torch-based Gaussian Mixture Model for the order parameter calculation. This avoids any additional dependencies and can be relocated if required (depending on demand).
Added multiple plotting and helper functions to create various plots, like the bathroom-tile polarization plot, the polarization vector plot, a custom 2-phase colorbar and a 3-phase color triangle.
Also added pytests for Lattice and TorchGMM.