
Conversation

@sunhaibotb
Contributor

What is the purpose of the change

This pull request bumps MiniKdc to 3.2.0 to fix the failure of YARNSessionFIFOSecuredITCase on Java 11.

YARNSessionFIFOSecuredITCase fails because authentication fails when the YARN client requests access authorization from the ResourceManager, and the subsequent retries make the test time out. Java 11 added two new Kerberos 5 encryption types, aes128-cts-hmac-sha256-128 and aes256-cts-hmac-sha384-192, which are enabled by default. The version of MiniKdc currently used by Flink does not support these encryption types and does not work correctly when they are enabled, which results in the authentication failure.
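
For context, here is a minimal sketch, assuming the hadoop-minikdc `MiniKdc` API, of how an embedded KDC is typically stood up in a test. The class and principal names below are illustrative, not Flink's actual test code:

```java
import java.io.File;
import java.util.Properties;
import org.apache.hadoop.minikdc.MiniKdc;

public class MiniKdcSketch {
    public static void main(String[] args) throws Exception {
        // Default KDC settings (realm, port, enabled features).
        Properties conf = MiniKdc.createConf();
        File workDir = new File("target/minikdc");
        workDir.mkdirs();

        MiniKdc kdc = new MiniKdc(conf, workDir);
        kdc.start(); // writes a krb5.conf into workDir

        // A principal and keytab for the test client. A Java 11 client then
        // negotiates with the enctypes its JDK enables by default; an old
        // MiniKdc that lacks aes128-cts-hmac-sha256-128 and
        // aes256-cts-hmac-sha384-192 rejects the exchange.
        File keytab = new File(workDir, "test.keytab");
        kdc.createPrincipal(keytab, "client/localhost");

        System.out.println("Realm: " + kdc.getRealm());
        kdc.stop();
    }
}
```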

Brief change log

  • Bumps MiniKdc to 3.2.0

Verifying this change

This change is already covered by existing tests, such as YARNSessionFIFOSecuredITCase.

Does this pull request potentially affect one of the following parts:

  • Dependencies (does it add or upgrade a dependency): (yes / no)
  • The public API, i.e., is any changed class annotated with @Public(Evolving): (yes / no)
  • The serializers: (yes / no / don't know)
  • The runtime per-record code paths (performance sensitive): (yes / no / don't know)
  • Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Yarn/Mesos, ZooKeeper: (yes / no / don't know)
  • The S3 file system connector: (yes / no / don't know)

Documentation

  • Does this pull request introduce a new feature? (yes / no)
  • If yes, how is the feature documented? (not applicable / docs / JavaDocs / not documented)

@flinkbot
Collaborator

flinkbot commented Sep 5, 2019

Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
to review your pull request. We will use this comment to track the progress of the review.

Automated Checks

Last check on commit 6b31336 (Wed Oct 16 08:34:21 UTC 2019)

Warnings:

  • 1 pom.xml file was touched: check for build and licensing issues.
  • No documentation files were touched! Remember to keep the Flink docs up to date!

Mention the bot in a comment to re-run the automated checks.

Review Progress

  • ❓ 1. The [description] looks good.
  • ❓ 2. There is [consensus] that the contribution should go into Flink.
  • ❓ 3. Needs [attention] from.
  • ❓ 4. The change fits into the overall [architecture].
  • ❓ 5. Overall code [quality] is good.

Please see the Pull Request Review Guide for a full explanation of the review process.

Details
The bot tracks the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required.

Bot commands
The @flinkbot bot supports the following commands:

  • @flinkbot approve description to approve one or more aspects (aspects: description, consensus, architecture and quality)
  • @flinkbot approve all to approve all aspects
  • @flinkbot approve-until architecture to approve everything until architecture
  • @flinkbot attention @username1 [@username2 ..] to require somebody's attention
  • @flinkbot disapprove architecture to remove an approval you gave earlier

@flinkbot
Collaborator

flinkbot commented Sep 5, 2019

CI report:

@zentol
Contributor

zentol commented Sep 5, 2019

Have you verified that this works with all Hadoop versions we explicitly support?

@sunhaibotb
Contributor Author

Yes. I used Java 11 and the following Hadoop versions to verify the flink-yarn-tests module, and it works well. @zentol

  • Hadoop 2.4.1
  • Hadoop 2.6.5
  • Hadoop 2.7.5
  • Hadoop 2.8.3

@zentol
Contributor

zentol commented Sep 5, 2019

MiniKdc isn't just used by flink-yarn-tests but also by the Kafka and filesystem connectors, along with all users of SecureTestEnvironment. Did you verify these as well?
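
As a rough illustration of the shared dependency: tests in these modules point the JVM's Kerberos machinery at the embedded KDC's generated configuration. This is a hedged sketch using the hadoop-minikdc API and standard JDK system properties; the helper class name is made up and is not Flink's actual SecureTestEnvironment API:

```java
import java.io.File;
import org.apache.hadoop.minikdc.MiniKdc;

// Illustrative helper, not Flink's SecureTestEnvironment.
final class SecureTestSetupSketch {
    static void pointJvmAtKdc(MiniKdc kdc) {
        // MiniKdc writes a krb5.conf into its working directory; tests point
        // the JDK's Kerberos implementation at it via this standard property.
        File krb5conf = kdc.getKrb5conf();
        System.setProperty("java.security.krb5.conf", krb5conf.getAbsolutePath());
    }
}
```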

@sunhaibotb
Contributor Author

sunhaibotb commented Sep 6, 2019

No, the other modules using MiniKdc were only tested with Hadoop 2.4.1. I'll verify them all today.

@sunhaibotb
Contributor Author

sunhaibotb commented Sep 6, 2019

I used Java 11 and the Hadoop versions explicitly supported by Flink to verify that the following modules, which depend on MiniKdc, work well.

  • flink-yarn-tests
  • flink-connectors/flink-connector-filesystem
  • flink-connectors/flink-connector-kafka-0.9
  • flink-connectors/flink-connector-kafka-base
  • flink-test-utils-parent/flink-test-utils

@zentol

@zentol
Contributor

zentol commented Sep 6, 2019

Could you rebase the PR and remove the FailsOnJava11 category from the YARNSessionFIFOSecuredITCase?

@zentol zentol self-assigned this Sep 6, 2019
@sunhaibotb
Contributor Author

I updated the PR and the Travis build passed. Thanks for reviewing, @zentol.

@sunhaibotb
Contributor Author

@zentol is on vacation. Can you help me merge this PR, @tillrohrmann?

@zentol zentol merged commit d35f699 into apache:master Oct 2, 2019
@sunhaibotb sunhaibotb deleted the FLINK-13516 branch October 6, 2019 01:36
dongjoon-hyun pushed a commit to apache/spark that referenced this pull request Dec 6, 2019
### What changes were proposed in this pull request?

Hadoop JIRA: https://issues.apache.org/jira/browse/HADOOP-12911
In that JIRA, the author proposed replacing the original Apache Directory project, which is no longer maintained (though the JIRA does not claim it fails on JDK 11), with Apache Kerby, a Java Kerberos library.

In Flink, apache/flink#9622 explains why hadoop-2.7.2's `MiniKdc` fails with JDK 11: Java 11 added the new Kerberos 5 encryption types `aes128-cts-hmac-sha256-128` and `aes256-cts-hmac-sha384-192`, which are enabled by default. Spark with hadoop-2.7's `MiniKdc` does not support these encryption types and does not work correctly when they are enabled, which results in the authentication failure.

When I tested hadoop-2.7.2's MiniKdc locally, the Kerberos debug error message was "read message stream failed, message can't match".
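
To surface that message yourself, the JDK's standard Kerberos debug switch can be enabled before the first login attempt; a minimal sketch (the surrounding test code is elided):

```java
public class Krb5DebugSketch {
    public static void main(String[] args) {
        // Standard JDK switch, equivalent to passing
        // -Dsun.security.krb5.debug=true on the command line; prints the
        // enctype negotiation where the mismatch surfaces.
        System.setProperty("sun.security.krb5.debug", "true");
        // ... run the Kerberos login / test code here ...
    }
}
```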

### Why are the changes needed?
Support JDK 11 with hadoop-2.7.

### Does this PR introduce any user-facing change?
NO

### How was this patch tested?
Existing unit tests.

Closes #26594 from AngersZhuuuu/minikdc-3.2.0.

Lead-authored-by: angerszhu <[email protected]>
Co-authored-by: AngersZhuuuu <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
attilapiros pushed a commit to attilapiros/spark that referenced this pull request Dec 6, 2019