
[spark] Fix required configuration check for Paimon SparkSession extensions #7365

Open

lsm1 wants to merge 1 commit into apache:master from lsm1:features/fix-check-conf

Conversation


lsm1 commented Mar 8, 2026

Purpose

Fix false failures in Paimon's required Spark configuration check by making the check read the current SparkSession's configuration instead of a temporary/active SQLConf. With this change, Spark SQL operations (including creating and querying views across session restarts) no longer incorrectly report a missing spark.sql.extensions setting.
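The idea behind the fix can be sketched as follows. This is an illustrative standalone model, not Paimon's actual implementation: the class and helper names are hypothetical, and the session configuration is represented as a plain map instead of a real SparkSession; only the extension class name `org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions` is taken from the real project.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: the required-conf check reads spark.sql.extensions
// from the current session's configuration (modeled here as a Map), rather
// than from a temporary/active SQLConf that may not reflect the session.
public class RequiredConfCheck {
    static final String EXTENSIONS_KEY = "spark.sql.extensions";
    static final String PAIMON_EXTENSIONS =
            "org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions";

    // Returns true when the Paimon extension class is registered in the
    // given session configuration (spark.sql.extensions is comma-separated).
    static boolean hasPaimonExtensions(Map<String, String> sessionConf) {
        String value = sessionConf.getOrDefault(EXTENSIONS_KEY, "");
        for (String ext : value.split(",")) {
            if (ext.trim().equals(PAIMON_EXTENSIONS)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put(EXTENSIONS_KEY, PAIMON_EXTENSIONS);
        System.out.println(hasPaimonExtensions(conf));          // true
        System.out.println(hasPaimonExtensions(new HashMap<>())); // false
    }
}
```

Checking the session's own configuration is what keeps the check stable across restarts: a freshly started spark-sql session that loads the extension via spark-defaults still carries the key in its session conf, whereas a transient SQLConf snapshot may not.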

Tests

  • Spark SQL manual verification:
     create table paimon.default.p1 (
         k int,
         v string
     ) USING paimon
     tblproperties (
         'primary-key' = 'k'
     );
     insert into table paimon.default.p1 select 1, '2';
     create view v2 as select * from paimon.default.p1;
     -- restart the spark-sql session
     select * from v2;
  • UT: added/updated cases in paimon-spark-ut covering the required
    configuration check with a temporary SQLConf.

API and Format

Documentation

Generative AI tooling

@cxzl25 (Contributor) commented Mar 11, 2026

Could you please help take a look at this issue? It might be preventing Spark from querying views created in Paimon.

@Zouxxyy @JingsongLi

#5327
