Spark: Remove Spark 2 test assumptions for write projection #15823
huaxingao merged 5 commits into apache:main
Conversation
(force-pushed 988f88d to 90ab25c)
@Ruobing1997 I think there are more places where Spark 2 is assumed. Can you check and remove all of them?

Of course! Let me take a look 👁️

It seems the partial column write is not supported 🤔 apparently it is only supported in Spark 4.1.
(force-pushed f5a629e to 7c9aeef)
@@ -433,10 +427,6 @@ public void testPartitionedFanoutCreateWithTargetFileSizeViaOption2() {
If these tests don't work on 3.4/3.5/4.0, should we remove them from those versions entirely rather than keeping dead tests? And why do these tests only work with Spark 2 and Spark 4.1?
I agree with removing these tests... they seem to be dead code under the current logic 🤔
And regarding "why do these tests only work with Spark 2 and Spark 4.1?":
I found this issue: https://issues.apache.org/jira/browse/SPARK-51290, which allows partial column writes. It fills default values for missing columns in DSv2 writes, which seems to be why our tests pass on 4.1.
And I have a question: since I noticed this was enabled in Spark 4.1, is there any chance it could be backported, or is it considered a new feature?
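To illustrate the behavior SPARK-51290 describes, here is a minimal, self-contained Java sketch (the class and method names are illustrative assumptions, not Iceberg or Spark APIs): when a write supplies only a subset of a table's columns, the missing columns are filled from the table's default values rather than the write failing.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of default-value fill for partial-column DSv2 writes.
// Not Spark code: it only models the idea that user-supplied column values
// are merged over the table's declared defaults.
public class PartialWriteSketch {
    static Map<String, Object> fillDefaults(Map<String, Object> row,
                                            Map<String, Object> defaults) {
        // Start from the table defaults, then let user-supplied values win.
        Map<String, Object> complete = new LinkedHashMap<>(defaults);
        complete.putAll(row);
        return complete;
    }

    public static void main(String[] args) {
        // Illustrative schema: id (no default), data (default "n/a"), ts (default 0L)
        Map<String, Object> defaults = new LinkedHashMap<>();
        defaults.put("id", null);
        defaults.put("data", "n/a");
        defaults.put("ts", 0L);

        // A partial-column write supplying only "id"
        Map<String, Object> row = Map.of("id", 1);

        System.out.println(fillDefaults(row, defaults));
        // prints {id=1, data=n/a, ts=0}
    }
}
```

Before Spark 4.1, a DSv2 write with a missing column would instead be rejected at analysis time, which matches why these tests fail on 3.4/3.5/4.0.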
Thanks @huaxingao for approving the PR. Are we good to merge it? ❤️

Thanks @Ruobing1997 for the PR! Thanks @manuzhang for the review!
😋 Looking forward to contributing more! |
Remove test assumptions for Spark 2
Closes #15821