🐛 [Bug] Failed to use stoi to get the engine ID in TRTEngine function #982

Closed
@bowang007

Description

Bug Description

Running inference produces this error:

INFO: [Torch-TensorRT - Debug Build] - [MemUsageChange] TensorRT-managed allocation in IExecutionContext creation: CPU +0, GPU +132, now: CPU 0, GPU 540 (MiB)
terminate called after throwing an instance of 'std::invalid_argument'
  what():  stoi
Aborted (core dumped)

Debugged and found that it fails at this line: https://github.com/NVIDIA/Torch-TensorRT/blob/c95229144432d96f4bdaa71fb1d242242d42bc29/core/runtime/TRTEngine.cpp#L63

To Reproduce

MaskRCNN model from detectron2.

Expected behavior

We should figure out why a numeric ID cannot be parsed from this engine's name (`std::stoi` throws `std::invalid_argument` when the string has no leading digits), and handle the failure gracefully instead of crashing.

Metadata

Labels

bug (Something isn't working), release: patch (This change needs to go out as a patch to the current version)
