
Conversation

@Myasuka (Member) commented Sep 13, 2016

What changes were proposed in this pull request?

Currently, building Spark generates a spark-version-info.properties file that is merged into spark-core_2.11-*.jar. However, the script build/spark-build-info, which generates this file, can only be executed in a bash environment.
Without this file, errors like the one below occur when submitting a Spark application, breaking the submission phase right at the beginning.

ERROR ApplicationMaster: Uncaught exception: 
org.apache.spark.SparkException: Exception thrown in awaitResult: 
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:194)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:394)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:247)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:759)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:66)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:757)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.util.concurrent.ExecutionException: Boxed Error
    at scala.concurrent.impl.Promise$.resolver(Promise.scala:55)
    at scala.concurrent.impl.Promise$.scala$concurrent$impl$Promise$$resolveTry(Promise.scala:47)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:244)
    at scala.concurrent.Promise$class.tryFailure(Promise.scala:112)
    at scala.concurrent.impl.Promise$DefaultPromise.tryFailure(Promise.scala:153)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:648)
Caused by: java.lang.ExceptionInInitializerError
    at org.apache.spark.package$.<init>(package.scala:91)
    at org.apache.spark.package$.<clinit>(package.scala)
    at org.apache.spark.SparkContext$$anonfun$3.apply(SparkContext.scala:187)
    at org.apache.spark.SparkContext$$anonfun$3.apply(SparkContext.scala:187)
    at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
    at org.apache.spark.SparkContext.logInfo(SparkContext.scala:76)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:187)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2287)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:822)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:814)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:814)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:630)
Caused by: org.apache.spark.SparkException: Error while locating file spark-version-info.properties
    at org.apache.spark.package$SparkBuildInfo$.liftedTree1$1(package.scala:75)
    at org.apache.spark.package$SparkBuildInfo$.<init>(package.scala:61)
    at org.apache.spark.package$SparkBuildInfo$.<clinit>(package.scala)
    ... 19 more
Caused by: java.lang.NullPointerException
    at java.util.Properties$LineReader.readLine(Properties.java:434)
    at java.util.Properties.load0(Properties.java:353)
    at java.util.Properties.load(Properties.java:341)
    at org.apache.spark.package$SparkBuildInfo$.liftedTree1$1(package.scala:64)
    ... 21 more
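The root cause at the bottom of this trace can be reproduced outside Spark: when a resource is missing from the classpath, getResourceAsStream returns null, and java.util.Properties.load fails with a NullPointerException inside LineReader.readLine, exactly as shown above. A minimal Java sketch (the class name is illustrative, not Spark's actual code):

```java
import java.io.InputStream;
import java.util.Properties;

public class MissingResourceDemo {
    public static void main(String[] args) throws Exception {
        // When the resource is absent from the classpath, getResourceAsStream returns null.
        InputStream in = MissingResourceDemo.class
                .getResourceAsStream("/spark-version-info.properties");
        Properties props = new Properties();
        try {
            // Properties.load(null) fails inside LineReader.readLine, matching the trace above.
            props.load(in);
            System.out.println("loaded " + props.size() + " properties");
        } catch (NullPointerException e) {
            System.out.println("NullPointerException: spark-version-info.properties not on classpath");
        }
    }
}
```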

I added build/spark-build-info.ps1 to generate the spark-version-info.properties file on Windows, and modified core/pom.xml and project/SparkBuild.scala to support this scenario when building Spark with Maven or sbt.
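For context, the bash script that the PowerShell port mirrors writes a small set of build metadata into spark-version-info.properties. A simplified sketch (the property keys and fallbacks here follow my reading of build/spark-build-info and should be treated as illustrative, not the script's exact contents):

```shell
#!/usr/bin/env bash
set -e

# Simplified sketch of the metadata build/spark-build-info emits (keys illustrative).
write_build_info() {
  local dir="$1" version="$2"
  mkdir -p "$dir"
  {
    echo "version=$version"
    echo "user=$USER"
    echo "revision=$(git rev-parse HEAD 2>/dev/null || echo unknown)"
    echo "branch=$(git rev-parse --abbrev-ref HEAD 2>/dev/null || echo unknown)"
    echo "date=$(date -u +%Y-%m-%dT%H:%M:%SZ)"
    echo "url=$(git config --get remote.origin.url 2>/dev/null || echo unknown)"
  } > "$dir/spark-version-info.properties"
}

# Demo: write into a temporary "extra-resources" directory, as the Maven build does.
out_dir="$(mktemp -d)/extra-resources"
write_build_info "$out_dir" "2.1.0-SNAPSHOT"
cat "$out_dir/spark-version-info.properties"
```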

How was this patch tested?

Tested on my local Windows 10 machine; it generated spark-version-info.properties under the core/target/extra-resources folder as expected.

@srowen (Member) commented Sep 13, 2016

I don't think we support building on Windows. I think it's fine to make minor changes to accommodate Windows no matter what, but this is non-trivial.

@felixcheung (Member) commented Sep 16, 2016

+@HyukjinKwon, we do support Windows for R.

@HyukjinKwon (Member) commented Sep 16, 2016

Thanks for cc'ing me, @felixcheung. Actually, I took a look and wanted to leave a comment, but I didn't because I couldn't find a concrete reason or reference to support my opinion, and I know @srowen is an expert in this area.

FWIW, since I am already here: I tend to agree with Sean. It would be nice if all tests passed on Windows and pre-built releases worked fine on Windows, but that does not necessarily mean we should support a proper build on Windows. Also, I know we mention that we support Windows in http://spark.apache.org/docs/latest/#downloading, but I don't think that refers to building Spark on Windows.

I am willing to take a close look if Sean/committers approve and confirm that it is worth adding.

@SparkQA commented Sep 16, 2016

Test build #3274 has finished for PR 15078 at commit ffd219b.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@jodersky (Member) commented:
How widespread is PowerShell on Windows machines? I assume modern versions ship with it, but I have also heard that the latest version comes with a bash-like shell. Does the current build work in that new environment?

@HyukjinKwon (Member) commented Sep 28, 2016

FWIW, doesn't PowerShell allow executing bash? (I haven't tried it myself, but I think I have seen some usage and documentation before.)

(Oh, this is what @jodersky pointed out right above.)

@felixcheung (Member) commented Sep 29, 2016

I'm not sure older Windows versions have PowerShell installed by default.

It looks like Bash for Windows is Windows 10 only: https://msdn.microsoft.com/en-us/commandline/wsl/install_guide

@HyukjinKwon (Member) commented:
Based on the discussion above, I remain against this PR.

@Myasuka (Member, Author) commented Sep 29, 2016

@jodersky Actually, Bash on Windows only exists in the Windows 10 Anniversary Update. Moreover, this feature needs to be manually enabled to take effect.

And I don't think the bash environment in Bash on Windows works well and stably enough to be used for production builds in the short term.

@srowen (Member) commented Oct 4, 2016

Let's close this PR for now.

srowen added a commit to srowen/spark that referenced this pull request Oct 12, 2016
@asfgit asfgit closed this in eb69335 Oct 12, 2016
@guoxiaolongzte commented Apr 21, 2017

On Linux, running

mvn -Dtest=none -DwildcardSuites=org.apache.spark.deploy.rest.StandaloneRestSubmitSuite test

reports the errors below. I want to know why. Thank you! @HyukjinKwon
StandaloneRestSubmitSuite:
*** RUN ABORTED ***
java.lang.ExceptionInInitializerError:
at org.apache.spark.package$.<init>(package.scala:91)
at org.apache.spark.package$.<clinit>(package.scala)
at org.apache.spark.deploy.rest.RestSubmissionClient.constructSubmitRequest(RestSubmissionClient.scala:179)
at org.apache.spark.deploy.rest.StandaloneRestSubmitSuite$$anonfun$1.apply$mcV$sp(StandaloneRestSubmitSuite.scala:58)
at org.apache.spark.deploy.rest.StandaloneRestSubmitSuite$$anonfun$1.apply(StandaloneRestSubmitSuite.scala:54)
at org.apache.spark.deploy.rest.StandaloneRestSubmitSuite$$anonfun$1.apply(StandaloneRestSubmitSuite.scala:54)
at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
at org.scalatest.Transformer.apply(Transformer.scala:22)
...
Cause: org.apache.spark.SparkException: Error while locating file spark-version-info.properties
at org.apache.spark.package$SparkBuildInfo$.liftedTree1$1(package.scala:75)
at org.apache.spark.package$SparkBuildInfo$.<init>(package.scala:61)
at org.apache.spark.package$SparkBuildInfo$.<clinit>(package.scala)
at org.apache.spark.package$.<init>(package.scala:91)
at org.apache.spark.package$.<clinit>(package.scala)
at org.apache.spark.deploy.rest.RestSubmissionClient.constructSubmitRequest(RestSubmissionClient.scala:179)
at org.apache.spark.deploy.rest.StandaloneRestSubmitSuite$$anonfun$1.apply$mcV$sp(StandaloneRestSubmitSuite.scala:58)
at org.apache.spark.deploy.rest.StandaloneRestSubmitSuite$$anonfun$1.apply(StandaloneRestSubmitSuite.scala:54)
at org.apache.spark.deploy.rest.StandaloneRestSubmitSuite$$anonfun$1.apply(StandaloneRestSubmitSuite.scala:54)
at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
...
Cause: java.lang.NullPointerException:
at java.util.Properties$LineReader.readLine(Properties.java:434)
at java.util.Properties.load0(Properties.java:353)
at java.util.Properties.load(Properties.java:341)
at org.apache.spark.package$SparkBuildInfo$.liftedTree1$1(package.scala:64)
at org.apache.spark.package$SparkBuildInfo$.<init>(package.scala:61)
at org.apache.spark.package$SparkBuildInfo$.<clinit>(package.scala)
at org.apache.spark.package$.<init>(package.scala:91)
at org.apache.spark.package$.<clinit>(package.scala)
at org.apache.spark.deploy.rest.RestSubmissionClient.constructSubmitRequest(RestSubmissionClient.scala:179)
at org.apache.spark.deploy.rest.StandaloneRestSubmitSuite$$anonfun$1.apply$mcV$sp(StandaloneRestSubmitSuite.scala:58)
...

@HyukjinKwon (Member) commented:
@guoxiaolongzte, let's ask this question on the mailing list. To my knowledge, you should build first before running tests, and the errors are probably related to that.

zifeif2 pushed a commit to zifeif2/spark that referenced this pull request Nov 22, 2025
Closes apache#15303
Closes apache#15078
Closes apache#15080
Closes apache#15135
Closes apache#14565
Closes apache#12355
Closes apache#15404

Author: Sean Owen <[email protected]>

Closes apache#15451 from srowen/CloseStalePRs.