Concurrent futures: max_tasks_per_child doesn't work #115831

Closed as not planned

Description

@tiedl25

Bug report

Bug description:

If I set a value for max_tasks_per_child in the ProcessPoolExecutor, no new processes are created.
After a process has finished its tasks, it is killed, but instead of a new process being spawned to replace it, nothing happens and the application is stuck.
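For context, here is a minimal reproducer sketch of the behaviour described above (not from the report; the `worker` function and the timeout value are illustrative). On an affected build the `result()` calls would hang; on a build where worker replacement works, the script completes and each task runs in a fresh process:

```python
import concurrent.futures
import multiprocessing
import os

def worker(x):
    # Return the worker's PID so we can see whether a fresh process ran the task.
    return os.getpid(), x * x

if __name__ == "__main__":
    # max_tasks_per_child requires the "spawn" or "forkserver" start method.
    ctx = multiprocessing.get_context("spawn")
    with concurrent.futures.ProcessPoolExecutor(
        max_workers=1, max_tasks_per_child=1, mp_context=ctx
    ) as pool:
        futures = [pool.submit(worker, i) for i in range(3)]
        # On an affected build this is where the application gets stuck;
        # the timeout turns an indefinite hang into a TimeoutError instead.
        results = [f.result(timeout=60) for f in futures]
    squares = [sq for _, sq in results]
    pids = {pid for pid, _ in results}
    print(squares)    # [0, 1, 4]
    print(len(pids))  # distinct PIDs when dying workers are replaced
```

With max_workers=1 and max_tasks_per_child=1, every task forces the single worker to exit, so each submission depends on the executor spawning a replacement process.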
I found a temporary workaround: add self._broken as a condition in the following function in process.py:

    def _adjust_process_count(self):
        # if there's an idle process, we don't need to spawn a new one.
        if self._idle_worker_semaphore.acquire(blocking=False) and self._broken:
            return

        process_count = len(self._processes)
        if process_count < self._max_workers:
            # Assertion disabled as this codepath is also used to replace a
            # worker that unexpectedly dies, even when using the 'fork' start
            # method. That means there is still a potential deadlock bug. If a
            # 'fork' mp_context worker dies, we'll be forking a new one when
            # we know a thread is running (self._executor_manager_thread).
            #assert self._safe_to_dynamically_spawn_children or not self._executor_manager_thread, 'https://github.com/python/cpython/issues/90622'
            self._spawn_process()

CPython versions tested on:

3.11

Operating systems tested on:

Linux
