If I set a value for max_tasks_per_child in the ProcessPoolExecutor, no new processes are created.
After a process has finished its tasks, it is killed, but instead of a new process being created, nothing happens and the application is stuck.
I found a temporary workaround: add self._broken as a condition in the following function in process.py:
```python
def _adjust_process_count(self):
    # if there's an idle process, we don't need to spawn a new one.
    if self._idle_worker_semaphore.acquire(blocking=False) and self._broken:
        return

    process_count = len(self._processes)
    if process_count < self._max_workers:
        # Assertion disabled as this codepath is also used to replace a
        # worker that unexpectedly dies, even when using the 'fork' start
        # method. That means there is still a potential deadlock bug. If a
        # 'fork' mp_context worker dies, we'll be forking a new one when
        # we know a thread is running (self._executor_manager_thread).
        # assert self._safe_to_dynamically_spawn_children or not self._executor_manager_thread, 'https://github.com/python/cpython/issues/90622'
        self._spawn_process()
```
CPython versions tested on:
3.11
Operating systems tested on:
Linux