
fix: reject incompatible decimal precision/scale in native_datafusion scan#4090

Merged
andygrove merged 3 commits into apache:main from andygrove:fix-issue-4089-decimal-precision
Apr 27, 2026
Conversation

@andygrove
Member

@andygrove andygrove commented Apr 25, 2026

Which issue does this PR close?

Closes #4089.

Rationale for this change

When the native_datafusion scan reads a Parquet column whose physical type is Decimal(p1, s1) under a requested read schema of Decimal(p2, s2) with s2 < s1, the existing schema adapter falls through to Spark's Cast expression. Cast happily truncates fractional digits, producing wrong values silently. Spark's vectorized reader rejects this with SchemaColumnConvertNotSupportedException, and native_iceberg_compat already does the same via TypeUtil.checkParquetType. The native scan should match.
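The silent truncation can be illustrated with a small standalone sketch (not Comet code): Decimal128 stores an unscaled 128-bit integer, so widening the scale multiplies by a power of ten exactly, while narrowing it divides away fractional digits.

```rust
// Illustrative sketch of Decimal128 rescaling; not taken from Comet or Arrow.
fn rescale(unscaled: i128, from_scale: u32, to_scale: u32) -> i128 {
    if to_scale >= from_scale {
        // Widening: multiply by a power of ten; exact as long as it fits in i128.
        unscaled * 10i128.pow(to_scale - from_scale)
    } else {
        // Narrowing: integer division discards fractional digits.
        unscaled / 10i128.pow(from_scale - to_scale)
    }
}

fn main() {
    // 123.45 as Decimal(10, 2) widened to scale 4 is exactly 123.4500.
    assert_eq!(rescale(12_345, 2, 4), 1_234_500);
    // Narrowed to scale 0 it becomes 123, silently dropping the .45.
    assert_eq!(rescale(12_345, 2, 0), 123);
}
```

This is the asymmetry the PR relies on: widening is lossless, narrowing is not.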

What changes are included in this PR?

native/core/src/parquet/schema_adapter.rs: in replace_with_spark_cast, add a guard before the existing branches that returns DataFusionError::Plan when both physical_type and target_type are Decimal128 and the target scale is smaller than the source scale.

The check is intentionally narrow: it rejects only scale narrowing. Precision-only changes and scale widening (e.g. Decimal(10, 2) read as Decimal(10, 4)) still fall through to Cast, since those conversions are lossless.
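A minimal standalone sketch of the guard's shape follows. The real code in schema_adapter.rs matches on Arrow's DataType and returns DataFusionError::Plan; the enum, function name, and error type here are simplified stand-ins.

```rust
/// Simplified stand-in for Arrow's DataType; the real guard matches on
/// arrow_schema::DataType::Decimal128 inside replace_with_spark_cast.
#[derive(Debug, PartialEq)]
enum DataType {
    Decimal128(u8, i8),
    Int32,
}

/// Reject only the lossy case: both sides are Decimal128 and the target
/// scale is smaller than the source scale. Everything else falls through
/// to the existing cast branches.
fn check_decimal_compat(physical: &DataType, target: &DataType) -> Result<(), String> {
    if let (DataType::Decimal128(p1, s1), DataType::Decimal128(p2, s2)) = (physical, target) {
        if s2 < s1 {
            return Err(format!(
                "cannot read Decimal({p1}, {s1}) as Decimal({p2}, {s2}): \
                 narrowing the scale would silently truncate values"
            ));
        }
    }
    Ok(())
}

fn main() {
    // Decimal(10, 2) read as Decimal(5, 0): scale narrows from 2 to 0 -> rejected.
    assert!(check_decimal_compat(&DataType::Decimal128(10, 2), &DataType::Decimal128(5, 0)).is_err());
    // Scale widening and precision-only changes still pass through.
    assert!(check_decimal_compat(&DataType::Decimal128(10, 2), &DataType::Decimal128(10, 4)).is_ok());
    assert!(check_decimal_compat(&DataType::Decimal128(10, 2), &DataType::Decimal128(12, 2)).is_ok());
    // Non-decimal pairs are untouched by the guard.
    assert!(check_decimal_compat(&DataType::Int32, &DataType::Decimal128(5, 0)).is_ok());
}
```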

How are these changes tested?

Added a focused test to ParquetReadSuite: native_datafusion rejects incompatible decimal precision/scale. It writes Decimal(10, 2) data, reads it under Decimal(5, 0) (scale narrowed from 2 to 0), forces spark.comet.scan.impl=native_datafusion and spark.sql.sources.useV1SourceList=parquet, and asserts collect() raises SparkException. Verified against ParquetReadV1Suite (44 tests, all pass; 1 pre-existing test ignored).

The behavior is also covered by the per-impl matrix added in #4087 (decimal(10,2) read as decimal(5,0): native_datafusion), whose assertion will need flipping from "succeeds" to "throws" once that PR merges.

@andygrove andygrove added the correctness and bug labels Apr 25, 2026
… scan

The native_datafusion Spark physical expression adapter previously fell
through to a Spark Cast for decimal-to-decimal type changes, which
silently rescales or truncates values that should have raised an error.
Mirror Spark's TypeUtil.isDecimalTypeMatched (Spark 3.x rule) by
rejecting reads where the target precision is smaller than the source
precision or the scales differ.

Closes apache#4089.
@andygrove andygrove force-pushed the fix-issue-4089-decimal-precision branch from 1194d82 to 99e1235 on April 26, 2026 13:04
@mbutrovich
Contributor

In a case where we expect an exception to be generated anyway, can we catch this at CometScanRule rather than going all the way to serialization and native operators?

@andygrove
Member Author

> In a case where we expect an exception to be generated anyway, can we catch this at CometScanRule rather than going all the way to serialization and native operators?

I think the issue is that we do not know the types of all the parquet files until runtime?

@mbutrovich
Contributor

mbutrovich commented Apr 26, 2026

> I think the issue is that we do not know the types of all the parquet files until runtime?

IIRC from looking at this a while back, Spark has read the physical schema already, but thrown it away by the time our Comet rules run with no good way to get it again.

I'm not opposed to handling it this way, just wanted to think through if we could catch it earlier. It's also a fairly uncommon scenario.

@andygrove
Member Author

> I'm not opposed to handling it this way, just wanted to think through if we could catch it earlier. It's also a fairly uncommon scenario.

I do think this is an edge case that is fairly unlikely IRL, because it only happens when the user provides a schema that is incompatible with the file schema.

@mbutrovich
Contributor

Thanks @andygrove! I double-checked Spark's logic and the scale-narrowing guard looks right for preventing silent truncation.

Spark's isDecimalTypeMatched (line 1169 of ParquetVectorUpdaterFactory.java) requires exact scale match, so scale widening like Decimal(10,2) -> Decimal(10,4) would also be rejected by Spark's vectorized reader but passes through here. The PR description explains why that's fine (lossless cast), which makes sense. Would it be worth noting that in the code comment so future readers don't have to check the PR description?
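The difference between the two predicates can be written out explicitly. This is an illustrative sketch with made-up function names; the Spark side is based on the commit-message summary of isDecimalTypeMatched, not on Spark source.

```rust
/// Per the commit message, Spark's vectorized reader rejects reads where the
/// target precision shrinks or the scales differ at all.
fn spark_rejects(p1: u8, s1: i8, p2: u8, _s2_precision_unused: u8, s2: i8) -> bool {
    let _ = (p1, _s2_precision_unused); // keep all four decimal parameters visible
    p2 < p1 || s2 != s1
}

/// This PR's guard is deliberately narrower: only scale narrowing is lossy,
/// so only that case is rejected; lossless casts still go through Cast.
fn comet_rejects(s1: i8, s2: i8) -> bool {
    s2 < s1
}

fn main() {
    // Decimal(10, 2) -> Decimal(5, 0): both reject (lossy truncation).
    assert!(spark_rejects(10, 2, 5, 5, 0) && comet_rejects(2, 0));
    // Decimal(10, 2) -> Decimal(10, 4): Spark rejects, Comet allows (lossless widening).
    assert!(spark_rejects(10, 2, 10, 10, 4) && !comet_rejects(2, 4));
}
```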

Per review feedback, document that scale widening (e.g. Decimal(10,2)
read as Decimal(10,4)) is also rejected by Spark's vectorized reader
via isDecimalTypeMatched but is allowed here because the cast is
lossless. Reference Spark's ParquetVectorUpdaterFactory check directly
in the comment so future readers don't need to consult the PR
description.
@andygrove
Member Author

Expanded the comment in replace_with_spark_cast to document both the precision-only and scale-widening cases and to reference isDecimalTypeMatched directly. Pushed in fcbec76.

Contributor

@mbutrovich mbutrovich left a comment


Thanks @andygrove!

…l-precision

# Conflicts:
#	native/core/src/parquet/schema_adapter.rs
#	spark/src/test/scala/org/apache/comet/parquet/ParquetReadSuite.scala
@andygrove andygrove merged commit 4b086bb into apache:main Apr 27, 2026
119 checks passed
@andygrove andygrove deleted the fix-issue-4089-decimal-precision branch April 27, 2026 22:41
@andygrove
Member Author

Merged. Thanks @mbutrovich

andygrove added a commit to andygrove/datafusion-comet that referenced this pull request Apr 28, 2026
…4090, apache#4091

Cases 4 (Decimal(10,2)->Decimal(5,0)) and 6 (STRING->INT) now throw
SparkException on native_datafusion after the schema adapter rejection
fixes landed on main. Update assertions and the behavior matrix.

Labels

bug, correctness

Development

Successfully merging this pull request may close these issues.

native_datafusion: incompatible decimal precision/scale read silently succeeds without value validation

2 participants