TF-TRT failed to build engine on Jetson Nano


Roberto Canale

Feb 25, 2021, 8:21:25 AM2/25/21
to TensorFlow Community Testing

Hello, I am using a Jetson Nano with JetPack 4.3, TensorFlow 2.3.1 and TensorRT 7.1.3.
I have a Keras model that I converted to a TF-TRT model.
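For context, here is a minimal sketch of how such a conversion is typically done with the TF-TRT `TrtGraphConverterV2` API, assuming the Keras model was first exported as a SavedModel (the paths `saved_model_dir` and `trt_saved_model_dir` are hypothetical). On a Jetson Nano a failed engine build is often a memory issue, so the sketch also shows capping the workspace size:

```python
# Hedged sketch: TF-TRT conversion of a SavedModel, assuming TF 2.x with
# TensorRT support. Paths below are placeholders, not from the original post.
from tensorflow.python.compiler.tensorrt import trt_convert as trt

# The Jetson Nano shares ~4 GB between CPU and GPU; a smaller workspace
# budget and FP16 precision reduce the memory needed to build engines.
params = trt.DEFAULT_TRT_CONVERSION_PARAMS._replace(
    precision_mode=trt.TrtPrecisionMode.FP16,  # FP16 halves activation memory
    max_workspace_size_bytes=1 << 26,          # 64 MB TensorRT workspace
)

def convert(saved_model_dir: str, output_dir: str) -> None:
    """Convert a SavedModel to a TF-TRT SavedModel."""
    converter = trt.TrtGraphConverterV2(
        input_saved_model_dir=saved_model_dir,
        conversion_params=params,
    )
    converter.convert()         # rewrites TRT-compatible subgraphs as TRTEngineOps
    converter.save(output_dir)  # engines themselves are built lazily at first inference

if __name__ == "__main__":
    convert("saved_model_dir", "trt_saved_model_dir")
```

Note that `convert()` only marks the subgraphs; the actual engine build happens the first time inference runs on the device, which is why build failures show up at inference time.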

When performing inference on the model, I get the following error:

TF-TRT Warning: Engine creation for PartitionedCall/TRTEngineOp_0_0 failed. The native segment will be used instead. Reason: Internal: Failed to build TensorRT engine

During inference I get:

W tensorflow/compiler/tf2tensorrt/kernels/trt_engine_op.cc:629] TF-TRT Warning: Engine retrieval for input shapes: [[1,100,68,3]] failed. Running native segment for PartitionedCall/TRTEngineOp_0_0

What does this error mean?

It seems like TRT is not building engines, yet inference still works.
I ran the same inference on another PC (TF 2.4.1 and TRT 7.2) and do not get this error there. I also compared the inference results between the Keras model and the TF-TRT model, and they are identical in both cases (on the Jetson Nano with the error, and on the PC without it).

Why are my results the same? How do I solve this? Thank you!
