
test: enable ignored 4.0 tests, enable ansi mode#3454

Merged
parthchandra merged 4 commits into apache:main from parthchandra:spark-4.0-support
Mar 17, 2026

Conversation

@parthchandra
Contributor

Enabling tests that fail on 4.0, and enabling ANSI mode, to see what still fails.
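For context (not part of this PR's diff): Spark 4.0 turns `spark.sql.ansi.enabled` on by default, which changes runtime semantics. For example, an invalid cast returns NULL with ANSI off but raises an error with ANSI on. A minimal sketch of the difference, assuming a local SparkSession:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[1]").getOrCreate()

// With ANSI off (the Spark 3.x default), an invalid cast yields NULL.
spark.conf.set("spark.sql.ansi.enabled", "false")
spark.sql("SELECT CAST('abc' AS INT)").show() // NULL

// With ANSI on (the Spark 4.0 default), the same cast throws at runtime,
// which is why tests that passed on 3.x can start failing under 4.0.
spark.conf.set("spark.sql.ansi.enabled", "true")
// spark.sql("SELECT CAST('abc' AS INT)").show() // would throw a cast error
```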

@parthchandra parthchandra marked this pull request as draft February 9, 2026 11:28
@parthchandra parthchandra changed the title from "[DRAFT] test: enable ignored 4.0 tests, enable ansi mode" to "test: [DRAFT] enable ignored 4.0 tests, enable ansi mode" on Feb 9, 2026
@parthchandra parthchandra force-pushed the spark-4.0-support branch 2 times, most recently from eedc50a to e08426c on March 9, 2026 18:18
@parthchandra parthchandra marked this pull request as ready for review March 9, 2026 19:41
@parthchandra parthchandra changed the title from "test: [DRAFT] enable ignored 4.0 tests, enable ansi mode" to "test: enable ignored 4.0 tests, enable ansi mode" on Mar 9, 2026
Comment thread dev/diffs/4.0.1.diff
Comment thread dev/diffs/4.0.1.diff Outdated
Comment on lines 3666 to 3670
+  */
+ protected def enableCometAnsiMode: Boolean = {
+   val v = System.getenv("ENABLE_COMET_ANSI_MODE")
    v != null && v.toBoolean
+   if (v != null) v.toBoolean else true
+ }
Member

This method can be removed. We no longer have a config in Comet for this.

Member

(of course, can be done in a separate PR)

Member

Actually, nm, it looks like this is for enabling Spark ANSI mode, not Comet ANSI mode, so maybe the method name and env var name are just confusing. I will take another look at this tomorrow.

Contributor Author

removed

Contributor Author

Spark 4 enables ANSI mode by default. This may be leftover from 3.5.
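An aside on the env-var parsing discussed in this thread: Scala's `String.toBoolean` throws `IllegalArgumentException` for anything other than "true"/"false" (case-insensitive). A tolerant variant, as a sketch only (the helper name and behavior are illustrative, not Comet's API), could use `toBooleanOption` from Scala 2.13+:

```scala
// Hypothetical helper, not from the Comet codebase: fall back to a default
// when the variable is unset or is not a valid boolean literal.
def envFlag(name: String, default: Boolean): Boolean =
  sys.env.get(name).flatMap(_.toBooleanOption).getOrElse(default)

// envFlag("ENABLE_COMET_ANSI_MODE", default = true)
// yields true when the variable is unset or malformed, matching the
// "default on" behavior the changed line above appears to aim for.
```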

Contributor

@kazuyukitanimura kazuyukitanimura left a comment

Thank you @parthchandra

Comment thread dev/diffs/4.0.1.diff Outdated
Comment thread dev/diffs/4.0.1.diff Outdated
Comment thread dev/diffs/4.0.1.diff
@parthchandra
Contributor Author

@kazuyukitanimura @andygrove thank you for looking at this PR. I found an issue with this diff. Let me change this to draft.

@parthchandra parthchandra marked this pull request as draft March 10, 2026 22:36
@parthchandra parthchandra marked this pull request as ready for review March 11, 2026 23:14
@parthchandra
Contributor Author

@andygrove @kazuyukitanimura this is ready for review again.

Comment thread dev/diffs/4.0.1.diff Outdated
Comment on lines +401 to +412
- val aqePlanRoot = findNodeInSparkPlanInfo(inMemoryScanNode.get,
-   _.nodeName.contains("ResultQueryStage"))
- aqePlanRoot.get.children.head.nodeName == "AQEShuffleRead"
+ aqeNode.get.children.head.nodeName == "AQEShuffleRead" ||
+   (aqeNode.get.children.head.nodeName.contains("WholeStageCodegen") &&
+     aqeNode.get.children.head.children.head.nodeName == "ColumnarToRow" &&
+     aqeNode.get.children.head.children.head.children.head.nodeName == "InputAdapter" &&
+     aqeNode.get.children.head.children.head.children.head.children.head.nodeName ==
+       "AQEShuffleRead")
+ // Spark 4.0 wraps results in ResultQueryStage. The coalescing indicator is AQEShuffleRead
+ // as the direct child of InputAdapter.
+ // AdaptiveSparkPlan -> ResultQueryStage -> WholeStageCodegen ->
+ //   CometColumnarToRow -> InputAdapter -> AQEShuffleRead (if coalesced)
+ val resultStage = aqeNode.get.children.head  // ResultQueryStage
+ val wsc = resultStage.children.head          // WholeStageCodegen
+ val c2r = wsc.children.head                  // ColumnarToRow or CometColumnarToRow
+ val inputAdapter = c2r.children.head         // InputAdapter
+ inputAdapter.children.head.nodeName == "AQEShuffleRead"
Contributor

This new test is not equivalent because some of the nodeName tests are skipped.
Should the change be something like

...
(aqeNode.get.children.head.children.head.nodeName == "ColumnarToRow" ||
aqeNode.get.children.head.children.head.nodeName == "CometColumnarToRow") &&
...

Contributor Author

Added the check for the node names. Apart from the CometColumnarToRow, the test was failing because there is an additional ResultQueryStage in Spark 4.0.
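The review above settles on asserting the exact wrapper chain. A looser alternative, shown here only as a sketch and not what the PR does, is a recursive search over the plan-info tree, which tolerates whatever wrapper nodes Spark inserts between the AQE root and the shuffle read. `PlanNode` below is a self-contained stand-in for Spark's `SparkPlanInfo`, using only the two fields involved:

```scala
// Stand-in for org.apache.spark.sql.execution.SparkPlanInfo
// (only nodeName and children are used here).
case class PlanNode(nodeName: String, children: Seq[PlanNode])

// Depth-first search for the first node matching the predicate.
def findNode(root: PlanNode, p: PlanNode => Boolean): Option[PlanNode] =
  if (p(root)) Some(root)
  else root.children.iterator
    .map(findNode(_, p))
    .collectFirst { case Some(n) => n }

// Coalescing check that ignores how many wrappers (ResultQueryStage,
// WholeStageCodegen, ColumnarToRow/CometColumnarToRow, InputAdapter)
// sit between the AQE root and the shuffle read.
def isCoalesced(aqeRoot: PlanNode): Boolean =
  findNode(aqeRoot, _.nodeName == "AQEShuffleRead").isDefined
```

The PR instead asserts each node name explicitly, which is stricter (it catches plan-shape regressions) at the cost of being brittle across Spark versions.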

Comment thread dev/diffs/4.0.1.diff
Member

@andygrove andygrove left a comment

LGTM Thanks @parthchandra

Contributor

@kazuyukitanimura kazuyukitanimura left a comment

Thank you @parthchandra

@parthchandra parthchandra merged commit 47a5fb6 into apache:main Mar 17, 2026
157 of 158 checks passed
@parthchandra
Contributor Author

Merged. Thank you @andygrove @kazuyukitanimura


3 participants