[SPARK-26362][CORE] Remove 'spark.driver.allowMultipleContexts' to disallow multiple creation of SparkContexts #23311
Conversation
adding @srowen, @JoshRosen, @rxin

Honestly I think we can remove this. It's been bad practice for years, and keeping the support means it stays in Spark for years. This mode doesn't really work.

Yea, I actually wanted to remove this but made it deprecated in case some people have a different view. +1 for just removing it. Let me update it tomorrow if there's no comment against removing it.
+1 on removing it.
Both are not public APIs. I'm going to exclude this in MiMa.
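For readers unfamiliar with MiMa: Spark keeps its binary-compatibility exclusions in project/MimaExcludes.scala. A minimal sketch of what such an entry looks like; the excluded symbol name below is a hypothetical placeholder, since the actual filtered symbols are not quoted in this thread:

```
// project/MimaExcludes.scala (sketch only; the excluded symbol is a
// hypothetical placeholder, not the actual entry from this PR).
import com.typesafe.tools.mima.core._

lazy val v30excludes = Seq(
  // [SPARK-26362] Remove 'spark.driver.allowMultipleContexts':
  // suppress MiMa errors for removed private[spark] members.
  ProblemFilters.exclude[DirectMissingMethodProblem](
    "org.apache.spark.SparkContext.someRemovedInternalMethod")
)
```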
cc @jiangxb1987
+1 on removing it, and IMHO it might be better to also deprecate it in the active 2.x line. (A sketch of what that deprecation entry could look like follows.)
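Deprecations of this kind are registered in SparkConf's companion object. A rough sketch of the shape of such an entry; the version string and message are placeholders, not the actual 2.x entry:

```
// Sketch of how SparkConf records deprecated keys (shape only; the
// version and message below are illustrative placeholders).
private val deprecatedConfigs: Map[String, DeprecatedConfig] = {
  val configs = Seq(
    DeprecatedConfig("spark.driver.allowMultipleContexts", "2.4.0",
      "Multiple SparkContexts in a single JVM were never officially " +
        "supported; see SPARK-2243.")
  )
  Map(configs.map(cfg => (cfg.key, cfg)): _*)
}
```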
retest this please

Test build #100126 has finished for PR 23311 at commit

Test build #100131 has finished for PR 23311 at commit

retest this please

Test build #100142 has finished for PR 23311 at commit

Merged to master.

late LGTM :)
[SPARK-26362][CORE] Remove 'spark.driver.allowMultipleContexts' to disallow multiple creation of SparkContexts

## What changes were proposed in this pull request?

Multiple SparkContexts are discouraged, and Spark has warned about them for the last 4 years; see SPARK-4180. They can cause arbitrary and mysterious errors; see SPARK-2243. Honestly, I didn't even know Spark still allowed this; it appears never to have been officially supported (see SPARK-2243). I believe now is a good time to remove this configuration.

## How was this patch tested?

Each doc was manually checked, and the change was manually tested:

```
$ ./bin/spark-shell --conf=spark.driver.allowMultipleContexts=true
...
scala> new SparkContext()
org.apache.spark.SparkException: Only one SparkContext should be running in this JVM (see SPARK-2243).The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:939)
...
org.apache.spark.SparkContext$.$anonfun$assertNoOtherContextIsRunning$2(SparkContext.scala:2435)
at scala.Option.foreach(Option.scala:274)
at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2432)
at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2509)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:112)
... 49 elided
```

Closes #23311 from HyukjinKwon/SPARK-26362.

Authored-by: Hyukjin Kwon <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
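To make the new behavior concrete, here is a minimal sketch (not from the PR itself) of what a driver program hits after this change; it assumes a Spark 3.x build on the classpath, and the object name is hypothetical:

```
// Sketch only: after SPARK-26362 a second SparkContext in the same JVM
// always fails, and setting the removed configuration
// spark.driver.allowMultipleContexts no longer changes anything.
import org.apache.spark.{SparkConf, SparkContext, SparkException}

object SecondContextSketch {  // hypothetical name
  def main(args: Array[String]): Unit = {
    val first = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("first"))
    try {
      val conf = new SparkConf()
        .setMaster("local[2]")
        .setAppName("second")
        .set("spark.driver.allowMultipleContexts", "true") // now a no-op
      new SparkContext(conf) // throws: only one SparkContext per JVM
    } catch {
      case e: SparkException => println(s"As expected: ${e.getMessage}")
    } finally {
      first.stop()
    }
  }
}
```

The supported way to obtain a context remains SparkContext.getOrCreate (or SparkSession.builder.getOrCreate), which reuses the already-running instance instead of constructing a second one.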
…ingle JVM
In Spark 3, the option to run multiple Spark contexts in a single JVM is no longer supported (see apache/spark#23311). As this mode had not been recommended for a long time (see https://issues.apache.org/jira/browse/SPARK-2243), I removed this option for Spark 2.4 as well. The InProcessContextSupervisor (previously LocalContextSupervisor) still exists, but it should only be used for testing and local development, not in a production environment.