Hi, I am using XCIST / CatSim to simulate CT scans with a simple phantom containing water and Fe inserts. I observed an unexpected issue in the sinogram when using lower tube voltages (80 kVp and 100 kVp):
The sinogram (.prep) values corresponding to the metal (Fe) regions appear clipped/flattened at the top.
Additional observation:
Even after increasing the tube current to 500,000 mA, the clipping persists, which suggests the issue is not caused by photon (quantum) noise alone.
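To make the "not noise" reasoning concrete, here is a toy numpy sketch (the epsilon floor and all numbers are purely hypothetical, not taken from the XCIST code) showing that a floor applied to the transmission before the log transform produces exactly this signature: a flat cap in the prep data whose height is independent of tube current.

```python
import numpy as np

# Toy numbers only (not XCIST internals): if the normalized transmission
# T = I/I0 is floored at some epsilon before the log transform, the prep
# value -ln(max(T, eps)) saturates at a ceiling of -ln(eps), and that
# ceiling does not move no matter how high the tube current is set.
eps = 1.0e-6                         # hypothetical transmission floor
mu_t = np.linspace(0.0, 25.0, 500)   # true line integrals, incl. opaque Fe paths

T = np.exp(-mu_t)                    # ideal (noise-free) transmission
prep = -np.log(np.maximum(T, eps))

print(prep.max(), -np.log(eps))      # prep caps at -ln(eps) ~= 13.8
```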
My Questions:
(1) Is this behavior expected in XCIST simulations at low kVp?
(2) Could it be caused by:
- photon starvation (I ≈ 0)?
- an internal lower bound / clipping applied to the detected intensity before the log transform?
- numerical-stability handling (e.g., a log floor)?
(3) If so, where in the pipeline is this handled (e.g., detector model, log conversion, preprocessing)?
(4) Is there a way to:
- disable or adjust this clipping?
- access the raw (unclipped) projection values? (A sketch of how I currently inspect the .prep output follows this list.)
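For reference, this is roughly how I measure the cap in the prep output. The file layout is an assumption (raw float32 ordered as views × detector rows × detector columns), and the filename and geometry numbers are placeholders, so it may need adjusting to the actual scanner/protocol config.

```python
import numpy as np

# Assumption: the .prep file is raw float32, ordered (views, det rows, det cols).
# The shape and filename below are placeholders; replace with your settings.
n_views, n_rows, n_cols = 1000, 16, 900
prep = np.fromfile("my_scan.prep", dtype=np.float32).reshape(n_views, n_rows, n_cols)

print("max prep value:", prep.max())
# If this maximum is a flat plateau that stays the same at 80 and 100 kVp and at
# very high mA, it looks like a fixed cap (e.g., -ln(eps)) rather than noise.
print("fraction of samples at the cap:", np.mean(prep >= prep.max() - 1e-3))
```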