1 file changed: +2 −2 lines changed
@@ -184,7 +184,7 @@ inputs. Each inference thread invokes a JIT interpreter that executes the ops
 of a model inline, one by one. This parameter sets the size of this thread
 pool. The default value of this setting is the number of cpu cores. Please refer
 to [this](https://pytorch.org/docs/stable/notes/cpu_threading_torchscript_inference.html)
-document for learning how to set this parameter properly.
+document on how to set this parameter properly.

 The section of model config file specifying this parameter will look like:
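The config block itself is elided from this excerpt; for illustration, a minimal sketch of such a `parameters` entry in a Triton `config.pbtxt`, assuming the parameter name `INTER_OP_THREAD_COUNT` used by the Triton PyTorch backend (the value shown is arbitrary):

```
parameters: {
  key: "INTER_OP_THREAD_COUNT"
  value: {
    string_value: "4"
  }
}
```

Note that Triton model-config parameter values are passed as strings, so the thread count is quoted even though it is numeric.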
@@ -204,7 +204,7 @@ within the ops (intra-op parallelism). This can be useful in many cases, including
 element-wise ops on large tensors, convolutions, GEMMs, embedding lookups and
 others. The default value for this setting is the number of CPU cores. Please refer
 to [this](https://pytorch.org/docs/stable/notes/cpu_threading_torchscript_inference.html)
-document for learning how to set this parameter properly.
+document on how to set this parameter properly.

 The section of model config file specifying this parameter will look like:
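As above, the config block is elided here; a minimal sketch of the analogous `parameters` entry, assuming the parameter name `INTRA_OP_THREAD_COUNT` used by the Triton PyTorch backend (the value shown is arbitrary):

```
parameters: {
  key: "INTRA_OP_THREAD_COUNT"
  value: {
    string_value: "4"
  }
}
```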