Deadlock in create_subprocess_exec when using Semaphore and asyncio.subprocess.PIPE #115787


Closed
Timmmm opened this issue Feb 21, 2024 · 1 comment
Labels
type-bug An unexpected behavior, bug, or error

Comments


Timmmm commented Feb 21, 2024

Bug report

Bug description:

import asyncio

async def _stream_subprocess(id: int, command: "list[str]"):
    proc = await asyncio.create_subprocess_exec(*command, stdout=asyncio.subprocess.PIPE)
    await proc.wait()
    print(f"{id}: Done")


async def run(id: int, command: "list[str]"):
    print(f"{id}: Running")
    await _stream_subprocess(id, command)
    print(f"{id}: Throwing")

    raise RuntimeError("failed")


async def run_with_parallelism_limit(id: int, command: "list[str]", limit: asyncio.Semaphore):
    async with limit:
        print(f"{id}: Calling run")
        await run(id, command)
        print(f"{id}: Run finished")


async def main():
    sem = asyncio.Semaphore(1)
    await asyncio.gather(
        run_with_parallelism_limit(0, ["python", "--version"], sem),
        run_with_parallelism_limit(1, ["python", "--version"], sem),
    )


if __name__ == "__main__":
    asyncio.run(main())

Output:

0: Calling run
0: Running
0: Done
0: Throwing
1: Calling run
1: Running

And then it deadlocks. If you press Ctrl-C, it appears to be stuck in Runner._cancel_all_tasks(). Unfortunately, debugpy couldn't even interrupt it at that point.

If you remove stdout=asyncio.subprocess.PIPE, it doesn't deadlock.
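
Not from the report, but one hedged workaround sketch for Pythons that predate asyncio.TaskGroup: passing return_exceptions=True to asyncio.gather makes it wait for every task and return exceptions as results, so no still-pending task is left for asyncio.run() to cancel at shutdown. sys.executable is used here in place of "python":

```python
import asyncio
import sys


async def run(id: int, command: "list[str]") -> None:
    # Same shape as the reproducer: spawn with a PIPE, wait, then fail.
    proc = await asyncio.create_subprocess_exec(
        *command, stdout=asyncio.subprocess.PIPE
    )
    await proc.wait()
    raise RuntimeError(f"{id}: failed")


async def main() -> "list[object]":
    sem = asyncio.Semaphore(1)

    async def limited(id: int) -> None:
        async with sem:
            await run(id, [sys.executable, "--version"])

    # return_exceptions=True: failures are collected as results instead of
    # cancelling the sibling task, so both coroutines run to completion.
    return await asyncio.gather(limited(0), limited(1), return_exceptions=True)


results = asyncio.run(main())
print([type(r).__name__ for r in results])  # ['RuntimeError', 'RuntimeError']
```

The trade-off is that exceptions no longer propagate automatically; the caller has to inspect the results list.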

CPython versions tested on:

3.9, 3.11

Operating systems tested on:

Linux, Windows

graingert (Contributor) commented Oct 22, 2024

The problem seems to be that asyncio.gather leaks its remaining tasks when one of them throws, and asyncio.run then hangs in Runner._cancel_all_tasks(). If you use a TaskGroup instead, the issue doesn't occur:

import asyncio


async def _stream_subprocess(id: int, command: "list[str]"):
    proc = await asyncio.create_subprocess_exec(
        *command, stdout=asyncio.subprocess.PIPE
    )
    await proc.wait()
    print(f"{id}: Done")


async def run(id: int, command: "list[str]"):
    print(f"{id}: Running")
    await _stream_subprocess(id, command)
    print(f"{id}: Throwing")

    raise RuntimeError("failed")


async def run_with_parallelism_limit(
    id: int, command: "list[str]", limit: asyncio.Semaphore
):
    async with limit:
        print(f"{id}: Calling run")
        await run(id, command)
        print(f"{id}: Run finished")


async def main():
    sem = asyncio.Semaphore(1)
    async with asyncio.TaskGroup() as tg:
        tg.create_task(run_with_parallelism_limit(0, ["python", "--version"], sem))
        tg.create_task(run_with_parallelism_limit(1, ["python", "--version"], sem))


if __name__ == "__main__":
    asyncio.run(main())

This issue is a duplicate of #103847

@graingert closed this as not planned (duplicate) on Oct 22, 2024