[SPARK-28343][SQL][TEST] Enabling cartesian product and ansi mode for PostgreSQL testing #25109
Conversation
|
@maropu I didn't enable `spark.sql.parser.ansi.enabled` because:
spark-sql> set spark.sql.parser.ansi.enabled=true;
spark.sql.parser.ansi.enabled true
spark-sql> select 1 as false;
Error in query:
no viable alternative at input 'false'(line 1, pos 12)
== SQL ==
select 1 as false
------------^^^
spark-sql> select 1 as minus;
Error in query:
no viable alternative at input 'minus'(line 1, pos 12)
== SQL ==
select 1 as minus
------------^^^
Is this what we expected? |
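For reference, a minimal sketch of the workaround this PR later applies to the affected test queries, namely backtick-quoting the reserved words, assuming an ANSI-enabled spark-sql session:

```sql
-- Sketch only: backtick-quoted identifiers are accepted even with
-- spark.sql.parser.ansi.enabled=true, which is how the PostgreSQL test
-- queries below work around the reserved keywords.
set spark.sql.parser.ansi.enabled=true;
select 1 as `false`;
select 1 as `minus`;
```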
|
ur, looks bad.... I checked the reserved keywords in PostgreSQL again. How about Oracle and the other databases? Can they accept that query, too? |
|
Test build #107509 has finished for PR 25109 at commit
|
|
Teradata, Vertica, Oracle, SQL Server and DB2 can accept these queries. |
|
retest this please |
|
oh, I see.... If so, I think Spark might need to accept these queries with ansi=true... |
|
IMO we need to make Spark compatible with these DBMSs with ansi=true. So, how about setting ansi=true for the PostgreSQL tests by default and turning off the ansi mode for these queries like this (see the sketch below)? Then, filing a JIRA for these unsupported behaviours? |
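A rough sketch of what that could look like in a sql-tests input file follows; the inline config toggling is an assumption for illustration, not necessarily what this PR ends up committing:

```sql
-- Assumed illustration of the suggestion above: keep ANSI mode on for the file,
-- and temporarily switch it off around statements the ANSI parser cannot handle yet.
set spark.sql.parser.ansi.enabled=true;

-- ... queries that already parse under ANSI mode ...

set spark.sql.parser.ansi.enabled=false;
SELECT '' AS five, q1, q2, q1 - q2 AS minus FROM INT8_TBL;
set spark.sql.parser.ansi.enabled=true;
```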
|
Test build #107521 has finished for PR 25109 at commit
|
|
Test build #107626 has finished for PR 25109 at commit
|
|
Test build #107628 has finished for PR 25109 at commit
|
|
|
  SELECT '' AS five, q1, q2, q1 + q2 AS plus FROM INT8_TBL;
- SELECT '' AS five, q1, q2, q1 - q2 AS minus FROM INT8_TBL;
+ SELECT '' AS five, q1, q2, q1 - q2 AS `minus` FROM INT8_TBL;
Please add SPARK-28349 before this line.
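Presumably a short comment above the quoted alias, along these lines; the exact wording is an assumption, taking SPARK-28349 to track the alias limitation discussed earlier in this thread:

```sql
-- [SPARK-28349] `minus` cannot be used as a bare column alias under ANSI mode, so it is backtick-quoted here.
SELECT '' AS five, q1, q2, q1 - q2 AS `minus` FROM INT8_TBL;
```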
|
Thank you for the swift update according to the decision, @wangyum! cc @gatorsmile and @maropu. |
|
Test build #107634 has finished for PR 25109 at commit
|
dongjoon-hyun left a comment
[SPARK-28343][SQL][TEST] Enabling cartesian product and ansi mode for PostgreSQL testing

## What changes were proposed in this pull request?

This pr enables `spark.sql.crossJoin.enabled` and `spark.sql.parser.ansi.enabled` for PostgreSQL test.

## How was this patch tested?

manual tests: Run `test.sql` in [pgSQL](https://github.com/apache/spark/tree/master/sql/core/src/test/resources/sql-tests/inputs/pgSQL) directory and in [inputs](https://github.com/apache/spark/tree/master/sql/core/src/test/resources/sql-tests/inputs) directory:

```sql
cat <<EOF > test.sql
create or replace temporary view t1 as select * from (values(1), (2)) as v (val);
create or replace temporary view t2 as select * from (values(2), (1)) as v (val);
select t1.*, t2.* from t1 join t2;
EOF
```

Closes apache#25109 from wangyum/SPARK-28343.

Authored-by: Yuming Wang <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
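As a usage note on the description above, here is a hedged sketch of reproducing the manual test in a spark-sql session with the two configs this PR enables; the inline set commands are an assumption about one way to run it, not the PR's exact harness:

```sql
-- Sketch: with spark.sql.crossJoin.enabled=false the final query is rejected as an
-- implicit cartesian product; with the flag on (as this PR sets for the PostgreSQL
-- tests) it runs and returns 2 x 2 = 4 rows.
set spark.sql.crossJoin.enabled=true;
set spark.sql.parser.ansi.enabled=true;
create or replace temporary view t1 as select * from (values(1), (2)) as v (val);
create or replace temporary view t2 as select * from (values(2), (1)) as v (val);
select t1.*, t2.* from t1 join t2;
```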