
Fix Reshape with 0-dims and rank change (issue #2104)#2142

Open
kali wants to merge 2 commits into main from fix-reshape-zero

Conversation

@kali kali commented Apr 17, 2026

compute_shape_with_tf_rules used a volume-matching approach to replace 0s in the shape spec, which failed when consecutive input dims of value 1 caused the iterator not to advance. Replaced it with simple positional substitution per the ONNX spec: shape[i] = 0 means copy input[i].

Fixes Moonshine TTS model loading where RoPE reshapes like [1,52,8,32] → [0,0,8,16,2] produced [1,1,8,16,2] instead of [1,52,8,16,2].
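The positional rule can be sketched in isolation (illustrative Rust, not tract's actual code; `resolve_reshape` is a hypothetical helper):

```rust
// Illustrative sketch of the ONNX Reshape rule: shape[i] == 0 copies
// input[i] positionally; a single -1 is inferred from the remaining volume.
fn resolve_reshape(input: &[i64], spec: &[i64]) -> Vec<i64> {
    let mut out: Vec<i64> = spec
        .iter()
        .enumerate()
        .map(|(i, &d)| if d == 0 { input[i] } else { d })
        .collect();
    // Infer the -1 dim, if any, from the leftover volume.
    if let Some(pos) = out.iter().position(|&d| d == -1) {
        let known: i64 = out.iter().filter(|&&d| d != -1).product();
        let volume: i64 = input.iter().product();
        out[pos] = volume / known;
    }
    out
}

fn main() {
    // The Moonshine RoPE case from this PR: positional substitution
    // yields [1, 52, 8, 16, 2] rather than the buggy [1, 1, 8, 16, 2].
    assert_eq!(
        resolve_reshape(&[1, 52, 8, 32], &[0, 0, 8, 16, 2]),
        vec![1, 52, 8, 16, 2]
    );
    println!("{:?}", resolve_reshape(&[1, 52, 8, 32], &[0, 0, 8, 16, 2]));
}
```

Because substitution is positional rather than volume-driven, repeated 1-dims in the input can no longer desynchronize the mapping.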

@kali force-pushed the fix-reshape-zero branch from 2337bd0 to f2222b3 on April 17, 2026 at 15:26
ONNX Shape/Size outputs are declared TDim in tract so symbolic
dims survive shape-plumbing chains. The Cast loader already
rewrites Cast(to=i64) to Cast(to=TDim) to keep that invariant
when the exporter inserts an explicit int64 round-trip.

Some exporters (e.g. Moonshine) cast shape values to int32
instead. Widen the same rewrite to i32 so those chains also
stay in TDim; without this, the first Cast(TDim->i32) loses
the symbol and downstream Reshape lowering bails with
"shape input is variable".
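The shape of the widened rewrite can be sketched as follows (the `Dtype` enum and `rewrite_cast_target` are hypothetical stand-ins for illustration, not tract's real types or API):

```rust
// Hypothetical sketch of the loader rewrite described above.
#[derive(Debug, PartialEq, Clone, Copy)]
enum Dtype {
    I32,
    I64,
    TDim,
}

// Rewrite integer Cast targets to TDim so symbolic dims survive
// exporter-inserted integer round-trips. Previously only i64 was
// handled; the fix widens the rule to i32 (e.g. Moonshine's exporter).
fn rewrite_cast_target(to: Dtype) -> Dtype {
    match to {
        Dtype::I32 | Dtype::I64 => Dtype::TDim,
        other => other,
    }
}

fn main() {
    assert_eq!(rewrite_cast_target(Dtype::I32), Dtype::TDim);
    assert_eq!(rewrite_cast_target(Dtype::I64), Dtype::TDim);
    println!("ok");
}
```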
