[SPARK-29957][TEST] Reset MiniKDC's default enctypes to fit jdk8/jdk11 #26594
Conversation
cc @wangyum @dongjoon-hyun

Thank you @AngersZhuuuu

ok to test

Test build #114072 has finished for PR 26594 at commit

retest this please

Test build #114083 has finished for PR 26594 at commit

Strange... I can run these failed UTs locally with JDK 8/JDK 11 successfully, but they fail on Jenkins...

Retest this please.
For this one, I can pass a failed UT by running it alone, but it fails when all UTs run together.

@AngersZhuuuu, I also tested your PR locally. BTW, the first failure was

BTW, if you can find a corresponding Apache Hadoop JIRA issue, that will be more persuasive.
pom.xml (Outdated)

    <slf4j.version>1.7.16</slf4j.version>
    <log4j.version>1.2.17</log4j.version>
    <hadoop.version>2.7.4</hadoop.version>
    <miniKdc.version>3.2.0</miniKdc.version>
BTW, `miniKdc` -> `minikdc`. As you see, Spark-introduced properties are not camelcase, like `codahale` and `htmlunit`. If needed, we may use `-`, but I prefer `minikdc` here in this case.
Thank you for your careful explanation. Updated.
Hadoop JIRA: https://issues.apache.org/jira/browse/HADOOP-12911. And in Flink: apache/flink#9622. And when I tested hadoop-2.7.2's MiniKdc locally, the Kerberos debug error message was "read message stream failed, message can't match".
Please summarize them into the PR description~

Done
Hmm. It seems that I tested this only on Mac with AdoptOpenJDK 8 and 11. How about your environment? I'm wondering whether we need to validate this on Linux or not, because of the following error message.

Test build #114327 has finished for PR 26594 at commit
I tested this on Mac with Oracle JDK 8 and JDK 11. I can pass this UT when running it alone. My guess is that other tests affected the environment variables.

Ya. Let me know if you find the root cause. Thanks.
Test build #114329 has finished for PR 26594 at commit

Test build #114339 has finished for PR 26594 at commit

Test build #114340 has finished for PR 26594 at commit

Test build #114658 has finished for PR 26594 at commit

retest this please

Test build #114677 has finished for PR 26594 at commit
I think the test comment is not yet resolved: https://github.com/apache/spark/pull/26594/files#r350680194

@gaborgsomogyi @wangyum How about the current way?
Test build #114848 has finished for PR 26594 at commit

Test build #114850 has finished for PR 26594 at commit

Test build #114851 has finished for PR 26594 at commit
The approach looks good; there are still no tests...
This is in UT code...; I really don't know how to add a UT for this, so I handle it in an if condition when there are no
Test build #114884 has finished for PR 26594 at commit
    // scalastyle:off println
    writer.println(krb5confStr)
    // scalastyle:on println
    writer.close()
Shall we use `Files.write(content, file, StandardCharsets.UTF_8)` instead of `println`?
    - val writer = new PrintWriter(kdc.getKrb5conf)
    - // scalastyle:off println
    - writer.println(krb5confStr)
    - // scalastyle:on println
    - writer.close()
    + Files.write(krb5confStr, kdc.getKrb5conf, StandardCharsets.UTF_8)
Done thanks.
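For reference, a minimal self-contained sketch of the suggestion above. It assumes Guava's `com.google.common.io.Files.write(CharSequence, File, Charset)` overload, which matches the argument order in the diff; `writeKrb5Conf` is a hypothetical helper name, not the PR's actual method.

```scala
import java.io.File
import java.nio.charset.StandardCharsets

import com.google.common.io.Files

// Hypothetical helper (not the exact PR code): write the generated
// krb5.conf contents in one call, avoiding PrintWriter/println and the
// scalastyle:off/on escapes it requires in Spark's build.
def writeKrb5Conf(krb5confStr: String, krb5conf: File): Unit = {
  Files.write(krb5confStr, krb5conf, StandardCharsets.UTF_8)
}
```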
Test build #114926 has finished for PR 26594 at commit
dongjoon-hyun left a comment

+1, LGTM. Merged to master. Thank you, @AngersZhuuuu and all.
Thanks, everyone! This is a great fix!
Closes apache#26594 from AngersZhuuuu/minikdc-3.2.0.

Lead-authored-by: angerszhu <[email protected]>
Co-authored-by: AngersZhuuuu <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
What changes were proposed in this pull request?

Hadoop JIRA: https://issues.apache.org/jira/browse/HADOOP-12911

In this JIRA, the author proposed replacing the original Apache Directory project, which is no longer maintained (though the JIRA does not say it fails on JDK 11), with Apache Kerby, a Java binding that fits the Java version.

And in Flink: apache/flink#9622, the author shows why hadoop-2.7.2's `MiniKdc` fails with JDK 11: new encryption types, `aes128-cts-hmac-sha256-128` and `aes256-cts-hmac-sha384-192` (for Kerberos 5), were added in Java 11 and are enabled by default. Spark with hadoop-2.7's `MiniKdc` does not support these encryption types and does not work well when they are enabled, which results in the authentication failure.

And when I tested hadoop-2.7.2's MiniKdc locally, the Kerberos debug error message was "read message stream failed, message can't match".
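As a rough sketch of the direction this implies (the enctype values below are assumptions for illustration, not the exact PR settings): a regenerated krb5.conf `[libdefaults]` section that pins the encryption types to one supported by both JDK 8 and JDK 11, so the new JDK 11 defaults are never negotiated.

```scala
// A minimal sketch, assuming these property values: pin the Kerberos
// encryption types to aes128-cts-hmac-sha1-96, which both JDK 8 and
// JDK 11 support, overriding the JDK 11 defaults listed above.
val krb5confStr: String =
  """[libdefaults]
    |  default_tkt_enctypes = aes128-cts-hmac-sha1-96
    |  default_tgs_enctypes = aes128-cts-hmac-sha1-96
    |  permitted_enctypes = aes128-cts-hmac-sha1-96
    |""".stripMargin
```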
Why are the changes needed?

Support JDK 11 with hadoop-2.7.

Does this PR introduce any user-facing change?

No.

How was this patch tested?

Existing UT.