Shape inference for "tf.Select", "tf.Tile", and similar ops doesn't work well, causing incompatible types.


mofheka

Mar 18, 2021, 10:12:11 PM
to MLIR
Raw TF dialect IR (800 is the batch size):
%275 = "tf.Tile"(%266, %274) {device = ""} : (tensor<800x1xi1>, tensor<2xi32>) -> tensor<800x?xi1>
%276 = "tf.ZerosLike"(%271) {device = ""} : (tensor<?x256xf32>) -> tensor<?x256xf32>
%277 = "tf.Select"(%275, %276, %271) {device = ""} : (tensor<800x?xi1>, tensor<?x256xf32>, tensor<?x256xf32>) -> tensor<800x256xf32>

The pass threw the following error:
/home/hejia/Documents/RhinoCradle-mlir/build/tem_graphdef/tem_new_graph_def.tf_dialect.mlir:283:12: error: 'mhlo.select' op inferred type(s) 'tensor<?x256xf32>' are incompatible with return type(s) of operation 'tensor<800x256xf32>'
    %277 = "tf.Select"(%275, %276, %271) {device = ""} : (tensor<800x?xi1>, tensor<?x256xf32>, tensor<?x256xf32>) -> tensor<800x256xf32>
           ^
/home/hejia/Documents/RhinoCradle-mlir/build/tem_graphdef/tem_new_graph_def.tf_dialect.mlir:283:12: note: see current operation: %3489 = "mhlo.select"(%679, %685, %678) : (tensor<800x256xi1>, tensor<?x256xf32>, tensor<?x256xf32>) -> tensor<800x256xf32>
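
If I read the note correctly, the result type inferred for mhlo.select comes from the still-dynamic value operands (tensor<?x256xf32>), which clashes with the static return type tensor<800x256xf32> carried over from the original tf.Select. With the operand types refined as sketched above, I would expect the legalized op to look like this instead (my sketch, reusing the SSA names from the note):

%3489 = "mhlo.select"(%679, %685, %678) : (tensor<800x256xi1>, tensor<800x256xf32>, tensor<800x256xf32>) -> tensor<800x256xf32>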

