
Commit 30d03e8

fix.
1 parent 916fa8b commit 30d03e8

File tree: 4 files changed, +112 -33 lines changed

docs/source/conf.py

Lines changed: 10 additions & 11 deletions
@@ -15,20 +15,20 @@
 from pathlib import Path
 from typing import TYPE_CHECKING

-import pytask
+import pytask_parallel

 if TYPE_CHECKING:
     import sphinx


 # -- Project information ---------------------------------------------------------------

-project = "pytask"
+project = "pytask_parallel"
 author = "Tobias Raabe"
 copyright = f"2020, {author}"  # noqa: A001

 # The version, including alpha/beta/rc tags, but not commit hash and datestamps
-release = version("pytask")
+release = version("pytask_parallel")
 # The short X.Y version.
 version = ".".join(release.split(".")[:2])

@@ -74,7 +74,7 @@
 copybutton_prompt_text = r"\$ |>>> |In \[\d\]: "
 copybutton_prompt_is_regexp = True

-_repo = "https://github.com/pytask-dev/pytask"
+_repo = "https://github.com/pytask-dev/pytask-parallel"
 extlinks = {
     "pypi": ("https://pypi.org/project/%s/", "%s"),
     "issue": (f"{_repo}/issues/%s", "#%s"),

@@ -86,6 +86,7 @@
     "click": ("https://click.palletsprojects.com/en/8.0.x/", None),
     "coiled": ("https://docs.coiled.io/", None),
     "dask": ("https://docs.dask.org/en/stable/", None),
+    "distributed": ("https://distributed.dask.org/en/stable/", None),
     "python": ("https://docs.python.org/3.10", None),
 }

@@ -142,15 +143,13 @@ def linkcode_resolve(domain: str, info: dict[str, str]) -> str:  # noqa: C901, P

     linespec = f"#L{lineno}-L{lineno + len(source) - 1}" if lineno else ""

-    fn = os.path.relpath(fn, start=Path(pytask.__file__).parent)
+    fn = os.path.relpath(fn, start=Path(pytask_parallel.__file__).parent)

-    if "+" in pytask.__version__:
-        return (
-            f"https://github.com/pytask-dev/pytask/blob/main/src/pytask/{fn}{linespec}"
-        )
+    if "+" in pytask_parallel.__version__:
+        return f"https://github.com/pytask-dev/pytask-parallel/blob/main/src/pytask_parallel/{fn}{linespec}"
     return (
-        f"https://github.com/pytask-dev/pytask/blob/"
-        f"v{pytask.__version__}/src/pytask/{fn}{linespec}"
+        f"https://github.com/pytask-dev/pytask-parallel/blob/"
+        f"v{pytask_parallel.__version__}/src/pytask_parallel/{fn}{linespec}"
     )

docs/source/custom_executors.md

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
 # Custom Executors

-```{important}
+```{caution}
 The interface for custom executors is rudimentary right now. Please, give some feedback
 if you managed to implement a custom executor or have suggestions for improvement.

@@ -9,7 +9,7 @@ could be helpful to other people. Start by creating an issue or a draft PR.
 ```

 pytask-parallel allows you to use your parallel backend as long as it follows the
-interface defined by {class}`concurrent.futures.Executor`.
+interface defined by {class}`~concurrent.futures.Executor`.

 In some cases, adding a new backend can be as easy as registering a builder function
 that receives some arguments (currently only `n_workers`) and returns the instantiated
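
As a rough illustration of the builder-function approach described in the hunk above, a
minimal sketch follows. Only the builder signature (`n_workers`) comes from the docs in
this diff; the registration call in the comment is an assumed name, not confirmed by
this commit.

```python
from __future__ import annotations

from concurrent.futures import Executor, ProcessPoolExecutor


def build_my_executor(n_workers: int) -> Executor:
    """Build an executor that follows the concurrent.futures.Executor interface.

    Any Executor-compatible object works here, e.g. a plain ProcessPoolExecutor
    or a third-party drop-in replacement.
    """
    return ProcessPoolExecutor(max_workers=n_workers)


# Hypothetical registration hook (name assumed; see the pytask-parallel docs):
# registry.register_parallel_backend("my_backend", build_my_executor)
```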

docs/source/dask.md

Lines changed: 5 additions & 5 deletions
@@ -1,22 +1,22 @@
 # Dask

-```{important}
+```{caution}
 Currently, the dask backend can only be used if your workflow code is organized in a
 package due to how pytask imports your code and dask serializes task functions
 ([issue](https://github.com/dask/distributed/issues/8607)).
 ```

 Dask is a flexible library for parallel and distributed computing. You probably know it
 from its {class}`dask.dataframe` that allows lazy processing of big data. Here, we use
-{class}`dask.distributed` that provides an interface similar to
-{class}`concurrent.futures.Executor` to parallelize our execution.
+{mod}`distributed` that provides an interface similar to
+{class}`~concurrent.futures.Executor` to parallelize our execution.

 There are a couple of ways in how we can use dask.

 ## Local

-By default, using dask as the parallel backend will launch a {class}`dask.LocalCluster`
-with processes on your local machine.
+By default, using dask as the parallel backend will launch a
+{class}`distributed.LocalCluster` with processes on your local machine.

 `````{tab-set}
 ````{tab-item} CLI
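
For context, a minimal sketch of the objects the corrected reference points to: a
`distributed.LocalCluster` with worker processes plus a `Client` that submits work
through an Executor-like interface. This is an illustration of what the local dask
backend roughly builds on; the worker count and the submitted function are assumptions,
not code from this commit.

```python
from distributed import Client, LocalCluster

if __name__ == "__main__":
    # Start a local cluster with worker processes, as the dask backend does by default.
    cluster = LocalCluster(n_workers=2, processes=True)
    client = Client(cluster)

    # Submit work much like with a concurrent.futures.Executor.
    future = client.submit(sum, [1, 2, 3])
    print(future.result())  # prints 6

    client.close()
    cluster.close()
```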

docs/source/quickstart.md

Lines changed: 95 additions & 15 deletions
@@ -15,32 +15,56 @@ $ conda install -c conda-forge pytask-parallel

 ## Usage

-To parallelize the execution of your workflow using the default backend,
-[loky](https://loky.readthedocs.io/), pass an integer greater than 1 or `'auto'` to the
-command-line interface. By default, only one worker is used.
+When the plugin is only installed and pytask executed, the tasks are not run in
+parallel.
+
+For parallelization with the default backend [loky](https://loky.readthedocs.io/), you need to launch multiple workers.
+
+`````{tab-set}
+````{tab-item} CLI
+:sync: cli

 ```console
-$ pytask -n 2
-$ pytask --n-workers 2
+pytask -n 2
+pytask --n-workers 2

 # Starts os.cpu_count() - 1 workers.
-$ pytask -n auto
+pytask -n auto
 ```
+````
+````{tab-item} Configuration
+:sync: configuration
+
+```toml
+[tool.pytask.ini_options]
+n_workers = 2
+
+# Starts os.cpu_count() - 1 workers.
+n_workers = "auto"
+```
+````
+`````

 To use a different backend, pass the `--parallel-backend` option. The following command
 will execute the workflow with one worker and the loky backend.

+`````{tab-set}
+````{tab-item} CLI
+:sync: cli
+
 ```console
 pytask --parallel-backend loky
 ```
-
-The options can also be specified in the configuration file.
+````
+````{tab-item} Configuration
+:sync: configuration

 ```toml
 [tool.pytask.ini_options]
-n_workers = 2
 parallel_backend = "loky"
 ```
+````
+`````

 ## Backends

@@ -54,7 +78,9 @@ If you parallelize the execution of your tasks using two or more workers, do not

 ### loky

-There are multiple backends available. The default is the backend provided by loky which aims to be a more robust implementation of {class}`multiprocessing.pool.Pool` and in {class}`concurrent.futures.ProcessPoolExecutor`.
+There are multiple backends available. The default is the backend provided by loky which
+aims to be a more robust implementation of {class}`~multiprocessing.pool.Pool` and in
+{class}`~concurrent.futures.ProcessPoolExecutor`.

 ```console
 pytask --parallel-backend loky

@@ -66,14 +92,49 @@ explanation of what CPU- or IO-bound means.)

 ### `concurrent.futures`

-You can use the values `threads` and `processes` to use the {class}`concurrent.futures.ThreadPoolExecutor` or the {class}`concurrent.futures.ProcessPoolExecutor` respectively.
+You can use the values `threads` and `processes` to use the
+{class}`~concurrent.futures.ThreadPoolExecutor` or the
+{class}`~concurrent.futures.ProcessPoolExecutor` respectively.

-The `ThreadPoolExecutor` might be an interesting option for you if you have many IO-bound tasks and you do not need to create many expensive processes.
+The {class}`~concurrent.futures.ThreadPoolExecutor` might be an interesting option for
+you if you have many IO-bound tasks and you do not need to create many expensive
+processes.
+
+`````{tab-set}
+````{tab-item} CLI
+:sync: cli

 ```console
 pytask --parallel-backend threads
+```
+````
+````{tab-item} Configuration
+:sync: configuration
+
+```toml
+[tool.pytask.ini_options]
+parallel_backend = "threads"
+```
+````
+`````
+
+`````{tab-set}
+````{tab-item} CLI
+:sync: cli
+
+```console
 pytask --parallel-backend processes
 ```
+````
+````{tab-item} Configuration
+:sync: configuration
+
+```toml
+[tool.pytask.ini_options]
+parallel_backend = "processes"
+```
+````
+`````

 ```{important}
 Capturing warnings is not thread-safe. Therefore, warnings cannot be captured reliably

@@ -82,18 +143,37 @@ when tasks are parallelized with `--parallel-backend threads`.

 ### dask + coiled

-dask and coiled together provide the option to execute your workflow on cloud providers like AWS, GCP or Azure. Check out the [dedicated guide](dask.md) if you are interested in that.
+dask and coiled together provide the option to execute your workflow on cloud providers
+like AWS, GCP or Azure. Check out the [dedicated guide](dask.md) if you are interested
+in that.

 Using the default mode, dask will spawn multiple local workers to process the tasks.

+`````{tab-set}
+````{tab-item} CLI
+:sync: cli
+
 ```console
 pytask --parallel-backend dask
 ```
+````
+````{tab-item} Configuration
+:sync: configuration
+
+```toml
+[tool.pytask.ini_options]
+parallel_backend = "dask"
+```
+````
+`````

 ### Custom executors

-You can also use any custom executor that implements the {class}`concurrent.futures.Executor` interface. Read more about it in [](custom_executors.md).
+You can also use any custom executor that implements the
+{class}`~concurrent.futures.Executor` interface. Read more about it in
+[](custom_executors.md).

 ```{important}
-Please, consider contributing your executor to pytask-parallel if you believe it could be helpful to other people. Start by creating an issue or a draft PR.
+Please, consider contributing your executor to pytask-parallel if you believe it could
+be helpful to other people. Start by creating an issue or a draft PR.
 ```
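
To make the quickstart usage above concrete, here is a hypothetical task module whose
tasks are independent of each other and can therefore run on separate workers once
`pytask -n 3` (or the `n_workers` setting) is used. The file name, task names, and the
`Product` annotation follow the public pytask API but are illustrative assumptions, not
part of this commit.

```python
# content of task_example.py -- a hypothetical module with three independent tasks.
# Because no task depends on another, pytask-parallel can execute them concurrently.
from pathlib import Path
from typing import Annotated

from pytask import Product


def task_write_a(path: Annotated[Path, Product] = Path("a.txt")) -> None:
    path.write_text("a")


def task_write_b(path: Annotated[Path, Product] = Path("b.txt")) -> None:
    path.write_text("b")


def task_write_c(path: Annotated[Path, Product] = Path("c.txt")) -> None:
    path.write_text("c")
```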
