
Commit 8e804c5 (1 parent: 9529e3d)

IPython low-level output capture and forward documentation

File tree: 1 file changed (+21 -0 lines)


docs/using/specifics.md

Lines changed: 21 additions & 0 deletions
@@ -12,6 +12,27 @@ This page provides details about features specific to one or more images.
Note that every new Spark context is created on an incrementing port (i.e. 4040, 4041, 4042, etc.), so it might be necessary to open multiple ports.
For example: `docker run -d -p 8888:8888 -p 4040:4040 -p 4041:4041 jupyter/pyspark-notebook`.
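If you expect to create several Spark contexts, Docker can also publish a whole range of ports in a single `-p` flag; a minimal sketch (the range `4040-4045` is an arbitrary choice, widen it as needed):

```shell
# Publish the notebook port plus a small range of Spark UI ports.
# Each new Spark context takes the next free port in this range.
docker run -d -p 8888:8888 -p 4040-4045:4040-4045 jupyter/pyspark-notebook
```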

#### IPython low-level output capture and forward

Spark images (`pyspark-notebook` and `all-spark-notebook`) are configured to disable IPython low-level output capture and forward system-wide.
The rationale behind this choice is that Spark logs can be verbose, especially at startup, when Ivy is used to load additional jars.
Those logs are still available, but only in the container's logs.

If you want them to appear in the notebook, you can override this configuration in a user-level IPython kernel profile.
To do so, uncomment the following line in your `~/.ipython/profile_default/ipython_kernel_config.py` and restart the kernel.
```python
c.IPKernelApp.capture_fd_output = True
```
If you do not have an IPython profile yet, you can create one by running the following command.
```bash
ipython profile create
# [ProfileCreate] Generating default config file: '/home/jovyan/.ipython/profile_default/ipython_config.py'
# [ProfileCreate] Generating default config file: '/home/jovyan/.ipython/profile_default/ipython_kernel_config.py'
```
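As an alternative to editing the generated file by hand, you could append the override from the shell; a minimal sketch, assuming the default profile path shown above:

```shell
# Append the override to the default kernel profile created by
# `ipython profile create`, then restart the kernel for it to take effect.
echo "c.IPKernelApp.capture_fd_output = True" \
  >> ~/.ipython/profile_default/ipython_kernel_config.py
```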
### Build an Image with a Different Version of Spark

You can build a `pyspark-notebook` image (and also the downstream `all-spark-notebook` image) with a different version of Spark by overriding the default value of the following arguments at build time.
