fix: support c10::List type of output in MarkOutputs #419
Conversation
Signed-off-by: inocsin <[email protected]>
I am a bit concerned because, say, we have a TensorList Value %out1. The output of the TensorRT engine could have something like 10 outputs associated with that one Value. If there is an additional Tensor Value output %out2, the output of the engine will have 11 tensors, but we don't really have a way to determine which are supposed to be in a list and which should be by themselves. Right now the operator interface for all engines is trt::execute_engine(TRTEngine, Tensor[]) -> Tensor[]. @peri044 I think we discussed this regarding tuples as well. We would need to store more information about output formatting and change the graph construction step in compile.cpp to take the output of trt::execute_engine and reconstruct the expected output format, but that is a much larger PR, I believe.
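For illustration, a rough sketch of the re-packing step such a change in compile.cpp would need. None of this is in the PR: RepackListOutput, the list_size bookkeeping, and the assumption that the list elements come first in the flat engine output are all hypothetical, standing in for the extra output-formatting metadata that would have to be recorded at compile time.

```cpp
#include <torch/csrc/jit/ir/ir.h>
#include <vector>

// Hypothetical: given the flat Value*s produced by the trt::execute_engine
// node, rebuild the Tensor[] the original graph returned.
void RepackListOutput(
    torch::jit::Graph& g,
    const std::vector<torch::jit::Value*>& engine_outputs,
    size_t list_size) {
  // The first `list_size` engine outputs originally belonged to the Tensor[] value
  std::vector<torch::jit::Value*> list_elems(
      engine_outputs.begin(), engine_outputs.begin() + list_size);

  // Re-create the list node so the graph returns the structure the user expects
  auto list_node = g.createList(c10::TensorType::get(), list_elems);
  g.appendNode(list_node);
  g.registerOutput(list_node->output());

  // Any remaining tensors stay as standalone outputs
  for (size_t i = list_size; i < engine_outputs.size(); i++) {
    g.registerOutput(engine_outputs[i]);
  }
}
```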
core/conversion/conversion.cpp
Outdated
          ctx->logger, "Marking Output " << out->debugName() << " named " << name << " in engine (ctx.MarkOutput)");
      ctx->num_outputs += 1;
    } else {
      TRTORCH_THROW_ERROR("Unknown output type. Only a single tensor or a TensorList type is supported.");
We should make this Error a bit more clear, something like
Unsupported output type, only Tensors or unwrapped collections of Tensors can be marked as engine outputs but found type: [Print type here]
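For instance, something along these lines (a sketch only; it assumes TRTORCH_THROW_ERROR accepts stream-style messages as used elsewhere in this file, and that `out` is the torch::jit::Value* being marked):

```cpp
// out->type()->str() prints the JIT type name of the offending output value
TRTORCH_THROW_ERROR(
    "Unsupported output type, only Tensors or unwrapped collections of Tensors "
    "can be marked as engine outputs but found type: "
    << out->type()->str());
```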
core/conversion/conversion.cpp
Outdated
@@ -173,24 +173,34 @@ void AddInputs(ConversionCtx* ctx, at::ArrayRef<const torch::jit::Value*> inputs
 }
 #endif
 }
 void MarkOutputsOfIvalue(ConversionCtx* ctx, c10::IValue out_ivalue, const torch::jit::Value* out) {
rename to MarkIValueOutputs
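For reference, a minimal sketch of what the renamed helper could look like. This is not the PR's code: it assumes the includes already present in conversion.cpp, GetEngineTensor is a hypothetical stand-in for however the conversion context resolves the converted ITensor behind an IValue, and the ctx->net->markOutput call plus the output naming scheme are assumptions based on the snippets above.

```cpp
// Hypothetical helper: return the nvinfer1::ITensor* the converter produced
// for this IValue, or nullptr if it does not hold a tensor.
nvinfer1::ITensor* GetEngineTensor(ConversionCtx* ctx, const c10::IValue& out_ivalue);

void MarkIValueOutputs(ConversionCtx* ctx, c10::IValue out_ivalue, const torch::jit::Value* out) {
  if (out_ivalue.isList()) {
    // Recurse so each element of a Tensor list becomes its own engine output
    for (c10::IValue elem : out_ivalue.toListRef()) {
      MarkIValueOutputs(ctx, elem, out);
    }
    return;
  }

  nvinfer1::ITensor* out_tensor = GetEngineTensor(ctx, out_ivalue);
  if (out_tensor) {
    auto name = std::string("output_") + std::to_string(ctx->num_outputs);
    out_tensor->setName(name.c_str());
    ctx->net->markOutput(*out_tensor);
    LOG_INFO(
        ctx->logger, "Marking Output " << out->debugName() << " named " << name << " in engine (ctx.MarkOutput)");
    ctx->num_outputs += 1;
  } else {
    // Clearer error wording, per the suggestion above
    TRTORCH_THROW_ERROR(
        "Unsupported output type, only Tensors or unwrapped collections of Tensors can be marked as engine outputs but found type: "
        << out->type()->str());
  }
}
```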
Signed-off-by: inocsin <[email protected]>
Currently this PR can be put on hold, as it involves a broader change across other components like graph synthesis, as described in #428.
This PR has not seen activity for 90 days. Remove the stale label or comment, or this will be closed in 10 days.
Description
Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
Fixes # (issue)
Type of change
Please delete options that are not relevant and/or add your own.
Support c10::List<c10::IValue> type of output in void MarkOutputs(ConversionCtx* ctx, at::ArrayRef<const torch::jit::Value*> outputs)
Checklist: