
Commit 5048b02

Remove spylon-kernel from all images. (#1729)
* Remove scala
* [pre-commit.ci] auto fixes from pre-commit.com hooks; for more information, see https://pre-commit.ci
* Remove scala from web
* Remove scala from specifics
* Remove scala and spylon

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
1 parent: c9c7ba8

File tree

7 files changed, +4 -101 lines changed

- all-spark-notebook/Dockerfile
- all-spark-notebook/README.md
- docs/using/selecting.md
- docs/using/specifics.md
- tests/all-spark-notebook/data/local_spylon.ipynb
- tests/all-spark-notebook/test_spark_notebooks.py
- tests/base-notebook/test_packages.py


all-spark-notebook/Dockerfile

Lines changed: 0 additions & 8 deletions
@@ -42,11 +42,3 @@ RUN arch=$(uname -m) && \
     mamba clean --all -f -y && \
     fix-permissions "${CONDA_DIR}" && \
     fix-permissions "/home/${NB_USER}"
-
-# Spylon-kernel
-RUN mamba install --quiet --yes 'spylon-kernel' && \
-    mamba clean --all -f -y && \
-    python -m spylon_kernel install --sys-prefix && \
-    rm -rf "/home/${NB_USER}/.local" && \
-    fix-permissions "${CONDA_DIR}" && \
-    fix-permissions "/home/${NB_USER}"

all-spark-notebook/README.md

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-# Jupyter Notebook Python, Scala, R, Spark Stack
+# Jupyter Notebook Python, R, Spark Stack
 
 [![docker pulls](https://img.shields.io/docker/pulls/jupyter/all-spark-notebook.svg)](https://hub.docker.com/r/jupyter/all-spark-notebook/)
 [![docker stars](https://img.shields.io/docker/stars/jupyter/all-spark-notebook.svg)](https://hub.docker.com/r/jupyter/all-spark-notebook/)

docs/using/selecting.md

Lines changed: 1 addition & 2 deletions
@@ -175,15 +175,14 @@ communities.
 [Dockerfile commit history](https://github.com/jupyter/docker-stacks/commits/master/all-spark-notebook/Dockerfile) |
 [Docker Hub image tags](https://hub.docker.com/r/jupyter/all-spark-notebook/tags/)
 
-`jupyter/all-spark-notebook` includes Python, R, and Scala support for Apache Spark.
+`jupyter/all-spark-notebook` includes Python and R support for Apache Spark.
 
 - Everything in `jupyter/pyspark-notebook` and its ancestor images
 - [IRKernel](https://irkernel.github.io/) to support R code in Jupyter notebooks
 - [rcurl](https://cran.r-project.org/web/packages/RCurl/index.html),
   [sparklyr](https://spark.rstudio.com),
   [ggplot2](https://ggplot2.tidyverse.org)
   packages
-- [spylon-kernel](https://github.com/vericast/spylon-kernel) to support Scala code in Jupyter notebooks
 
 ### Image Relationships
 
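A quick way to confirm the narrowed kernel set in a rebuilt image is to list the installed kernelspecs from Python. A minimal sketch, assuming it runs inside the container (where `jupyter_client` is already installed):

```python
from jupyter_client.kernelspec import KernelSpecManager

# List the installed Jupyter kernels; after this commit the
# all-spark-notebook image should show python3 and ir, with no spylon entry.
specs = KernelSpecManager().find_kernel_specs()
print(sorted(specs))
```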
docs/using/specifics.md

Lines changed: 1 addition & 37 deletions
@@ -76,7 +76,7 @@ docker run -it --rm jupyter/pyspark-notebook:spark-2.4.7 pyspark --version
 
 ### Usage Examples
 
-The `jupyter/pyspark-notebook` and `jupyter/all-spark-notebook` images support the use of [Apache Spark](https://spark.apache.org/) in Python, R, and Scala notebooks.
+The `jupyter/pyspark-notebook` and `jupyter/all-spark-notebook` images support the use of [Apache Spark](https://spark.apache.org/) in Python and R notebooks.
 The following sections provide some examples of how to get started using them.
 
 #### Using Spark Local Mode
@@ -144,24 +144,6 @@ sdf_len(sc, 100, repartition = 1) %>%
 # 5050
 ```
 
-##### Local Mode in Scala
-
-Spylon kernel instantiates a `SparkContext` for you in variable `sc` after you configure Spark
-options in a `%%init_spark` magic cell.
-
-```python
-%%init_spark
-# Configure Spark to use a local master
-launcher.master = "local"
-```
-
-```scala
-// Sum of the first 100 whole numbers
-val rdd = sc.parallelize(0 to 100)
-rdd.sum()
-// 5050
-```
-
 #### Connecting to a Spark Cluster in Standalone Mode
 
 Connection to Spark Cluster on **[Standalone Mode](https://spark.apache.org/docs/latest/spark-standalone.html)** requires the following set of steps:
@@ -235,24 +217,6 @@ sdf_len(sc, 100, repartition = 1) %>%
 # 5050
 ```
 
-##### Standalone Mode in Scala
-
-Spylon kernel instantiates a `SparkContext` for you in variable `sc` after you configure Spark
-options in a `%%init_spark` magic cell.
-
-```python
-%%init_spark
-# Configure Spark to use a local master
-launcher.master = "spark://master:7077"
-```
-
-```scala
-// Sum of the first 100 whole numbers
-val rdd = sc.parallelize(0 to 100)
-rdd.sum()
-// 5050
-```
-
 ### Define Spark Dependencies
 
 ```{note}
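The removed Scala walkthroughs computed the sum of the first 100 whole numbers; the same computation remains available from the Python kernel. A minimal PySpark sketch, assuming a local master (substitute `spark://master:7077` for the standalone case):

```python
from pyspark.sql import SparkSession

# "local" mirrors the removed local-mode example; a standalone cluster
# would use "spark://master:7077" instead.
spark = SparkSession.builder.master("local").getOrCreate()

# Sum of the first 100 whole numbers
rdd = spark.sparkContext.parallelize(range(101))
print(rdd.sum())  # 5050
```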

tests/all-spark-notebook/data/local_spylon.ipynb

Lines changed: 0 additions & 51 deletions
This file was deleted.

tests/all-spark-notebook/test_spark_notebooks.py

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@
 @pytest.mark.parametrize(
     "test_file",
     # TODO: add local_sparklyr
-    ["local_pyspark", "local_spylon", "local_sparkR", "issue_1168"],
+    ["local_pyspark", "local_sparkR", "issue_1168"],
 )
 def test_nbconvert(container: TrackedContainer, test_file: str) -> None:
     """Check if Spark notebooks can be executed"""

tests/base-notebook/test_packages.py

Lines changed: 0 additions & 1 deletion
@@ -55,7 +55,6 @@
     "pytables": "tables",
     "scikit-image": "skimage",
     "scikit-learn": "sklearn",
-    "spylon-kernel": "spylon_kernel",
     # R
     "randomforest": "randomForest",
     "rcurl": "RCurl",
