# TF 1.x contrib API; InputFnOps lives in tf.contrib.learn
import tensorflow as tf
from tensorflow.contrib.learn.python.learn.utils.input_fn_utils import InputFnOps

def _serving_input_fn(cont_feature_columns, cat_feature_columns):
    # One [None, 1] placeholder per feature: float64 for continuous columns, string for categorical
    cont_feature_placeholders = {column: tf.placeholder(dtype=tf.float64, shape=[None, 1], name=column) for column in cont_feature_columns}
    cat_feature_placeholders = {column: tf.placeholder(dtype=tf.string, shape=[None, 1], name=column) for column in cat_feature_columns}
    feature_placeholders = dict(list(cont_feature_placeholders.items()) + list(cat_feature_placeholders.items()))
    # Features are fed straight from the placeholders, with no preprocessing
    features = {column: tensor for column, tensor in feature_placeholders.items()}
    label = None
    return InputFnOps(features, label, feature_placeholders)
where each placeholder tensor has shape [None, 1].
When I try to use the TensorFlow Serving Java API, I can't get it to work with this SavedModel (savedmodel/DNNClassifierIris).
This is somewhat outside the scope of JPMML, but I need to be able to show the difference in scores (if any) between TensorFlow Serving and JPMML, along with any performance differences.
Any help sorting out how to build the TensorProto inputs for this model would be great...
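For what it's worth, here is how I picture the per-feature payload the Java client would have to assemble. This is only a plain-Python sketch of the TensorProto fields (dtype, shape dims, flat values); the helper and the Iris feature names below are assumptions for illustration, not taken from the actual model:

```python
# Illustration only: build_tensor_spec and the feature names are made up for
# this sketch. Each TensorProto a Serving client constructs must carry a
# dtype, a shape of [batch, 1] matching the placeholders above, and the
# row-major flattened values.

def build_tensor_spec(values, dtype):
    # Mirrors the TensorProto fields: dtype, tensor_shape dims, flat content.
    return {
        "dtype": dtype,            # DT_DOUBLE for continuous, DT_STRING for categorical
        "dims": [len(values), 1],  # matches the [None, 1] placeholder shape
        "values": list(values),    # a [batch, 1] column flattens to the values in order
    }

# A request carrying a batch of two examples for two continuous features
request_inputs = {
    "sepal_length": build_tensor_spec([5.1, 6.2], "DT_DOUBLE"),
    "sepal_width":  build_tensor_spec([3.5, 2.9], "DT_DOUBLE"),
}
```

The point being: each feature gets its own tensor keyed by the placeholder name, with dims [batch, 1] rather than a single flat vector.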
Thanks,
Ron