How to create a SavedModel in TensorFlow 2 with functions mapping to the Classify and Predict endpoints

Ryan Wheeler

Jan 25, 2020, 4:32:13 PM
to Discuss
A similar question was asked recently, but I didn't see a clear answer.

Previously, when exporting Estimators to SavedModels, we could control what was returned from the gRPC endpoints of TensorFlow Serving -- namely, Classify could be used to return the prediction scores as well as the label vocabulary.

1) I am specifically looking for how to create a TF2 SavedModel that can return the label vocabulary and scores, as `_classification_output` previously did. Below is a simple example with two different tf.functions simulating the behavior. In a real model I would use the label vocabulary and lookup tables, much like the heads in Estimator did.
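
For context, a rough sketch of the Estimator-era pattern I mean (the tensor values are illustrative only): a head produced a ClassificationOutput carrying both the scores and the string label vocabulary, which TensorFlow Serving exposed on the Classify endpoint.

import tensorflow as tf

# Illustrative tensors, not real model outputs.
scores = tf.constant([[0.1, 0.9]])
classes = tf.constant([["foo", "bar"]])
output = tf.estimator.export.ClassificationOutput(scores=scores, classes=classes)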

2) I am looking for how to call the different signature_defs created in a SavedModel and have TensorFlow Serving return different results based on which signature_def is passed in.

A simple example: 

import os
import shutil

import tensorflow as tf

class MyModel(tf.Module):

    @tf.function(input_signature=(tf.TensorSpec([None], dtype=tf.int32),))
    def predict(self, x):
        return x + 1

    @tf.function(input_signature=(tf.TensorSpec([None], dtype=tf.int32),))
    def classify(self, x):
        # Return a (scores, label) tuple to simulate a classification output.
        return (x + 1, "foo")

model = MyModel()
export_dir = "test/00000123"
if os.path.exists(export_dir):
    shutil.rmtree(export_dir)

# The default serving signature maps to predict; a second named
# signature maps to classify.
signatures = {tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
                  model.predict.get_concrete_function(tf.constant([1])),
              "classify": model.classify.get_concrete_function(tf.constant([1]))}

tf.saved_model.save(model, export_dir, signatures=signatures)


When inspecting this model with `saved_model_cli show --dir test/00000123 --all`:

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is:

signature_def['classify']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['x'] tensor_info:
        dtype: DT_INT32
        shape: (-1)
        name: classify_x:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['output_0'] tensor_info:
        dtype: DT_INT32
        shape: (-1)
        name: PartitionedCall:0
    outputs['output_1'] tensor_info:
        dtype: DT_STRING
        shape: ()
        name: PartitionedCall:1
  Method name is: tensorflow/serving/predict

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['x'] tensor_info:
        dtype: DT_INT32
        shape: (-1)
        name: serving_default_x:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['output_0'] tensor_info:
        dtype: DT_INT32
        shape: (-1)
        name: PartitionedCall_1:0
  Method name is: tensorflow/serving/predict

Defined Functions:
  Function Name: 'classify'
    Option #1
      Callable with:
        Argument #1
          x: TensorSpec(shape=(None,), dtype=tf.int32, name='x')

  Function Name: 'predict'
    Option #1
      Callable with:
        Argument #1
          x: TensorSpec(shape=(None,), dtype=tf.int32, name='x')

I noticed a few things --

The `classify` signature_def is still mapped to `Method name is: tensorflow/serving/predict`.

Question 1) Is it possible to export a tf.function to a different TensorFlow Serving gRPC endpoint? (This was straightforward with Estimators.)

Question 2) Is it possible to curl the above `classify` signature_def?

When curling the default signature:

(.venv) ❱ curl -d '{"signature_name": "serving_default","instances": [1]}' \
{
    "predictions": [1.0
    ]
}%

However, curling the `classify` signature exported in the signatures of the SavedModel:
(.venv) ❱ curl -d '{"signature_name": "classify","instances": [1]}' \
{ "error": "Serving signature name: \"classify\" not found in signature def" }%
TensorFlow Serving reports that the signature name `classify` is not found in the list of signature_defs; however, the saved_model_cli output above shows otherwise.
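
For reference, the full request shape I'm attempting (a sketch; the model name "test" and port 8501 are just what I'm using locally, and the curl commands above were truncated):

import json
import requests

# POST to the REST API, selecting the non-default signature by name.
url = "http://localhost:8501/v1/models/test:predict"
payload = {"signature_name": "classify", "instances": [1]}
response = requests.post(url, data=json.dumps(payload))
print(response.json())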


Taylor Smith

Feb 7, 2020, 12:17:37 PM
to Discuss
Any solution to this?

Paige Bailey

Feb 8, 2020, 4:38:25 PM
to Taylor Smith, Allen Lavoie, Kathy Wu, Discuss
Adding +Allen Lavoie and +Kathy Wu to help answer.

Ryan Wheeler

Feb 8, 2020, 5:11:46 PM
to Discuss
A co-worker (Taylor, above) hashed some of this out on Friday using tf-nightly (it was not possible in the current TensorFlow 2 release):

import os
import shutil

import tensorflow as tf

class MyModel(tf.Module):
    def __init__(self, vocabulary_path):
        super(MyModel, self).__init__()
        
        initializer = tf.lookup.TextFileInitializer(
            vocabulary_path,
            tf.string,
            tf.lookup.TextFileIndex.WHOLE_LINE,
            tf.int64,
            tf.lookup.TextFileIndex.LINE_NUMBER)
        self.table = tf.lookup.StaticVocabularyTable(initializer, num_oov_buckets=1)
        
    @tf.function(input_signature=[tf.TensorSpec([None], dtype=tf.string)])
    def predict(self, x):
        print("Tracing predict")
        val = self.table.lookup(x)
        return {"prediction": val}
        
    @tf.function(input_signature=[tf.TensorSpec([None], dtype=tf.string)])
    def classify(self, x):
        print("Tracing classify")
        val = self.table.lookup(x)
        return {"classification": val + 100}
        
        
vocabulary_path = "/tmp/vocab.txt"
with open(vocabulary_path, "w") as vocabulary_file:
    vocabulary_file.write("a\nb\nc\n")
    
model = MyModel(vocabulary_path)

export_dir = "my-sick-servable/00000123"
if os.path.exists(export_dir):
    shutil.rmtree(export_dir)
    
input_spec = tf.TensorSpec([None], dtype=tf.string)
pred_fn = model.predict.get_concrete_function(input_spec)
clf_fn = model.classify.get_concrete_function(input_spec)

# Post 2.1.0?
options = tf.saved_model.SaveOptions(function_aliases={
    'classify': model.classify,
    'predict': model.predict,
})
    
signatures = {
    tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY: pred_fn,
    "classify": clf_fn,
    "predict": pred_fn,
}

tf.saved_model.save(model, export_dir, signatures=signatures, options=options)

This verified that we are able to control the key-value pairs of tensors returned from TensorFlow Serving (in the same manner Estimators did in TensorFlow 1.x):

import json
import requests

headers = {"content-type": "application/json"}

def classify(instances):
    data = json.dumps({"signature_name": "classify", "instances": instances})
    return requests.post('http://localhost:8501/v1/models/guh:predict', data=data)

def predict(instances):
    data = json.dumps({"signature_name": "predict", "instances": instances})
    return requests.post('http://localhost:8501/v1/models/guh:predict', data=data)

print(f"'classify' signature: {json.loads(classify(['a', 'b']).content)}")
print(f"'predict' signature: {json.loads(predict(['a', 'b']).content)}")
# 'classify' signature: {'predictions': [100, 101]}
# 'predict' signature: {'predictions': [0, 1]}


We are hoping that we can wrap a Keras model in a tf.Module that provides an interface allowing us to create TensorFlow 2 Keras models with the same gRPC signature as Estimators (so they can be interchanged in production with no client-side changes), something like:

import os
import shutil

import tensorflow as tf

class ServingExporter(tf.Module):
    """
    A wrapper around a Keras model that provides
    a compatible signature for TensorFlow Serving.
    """
    def __init__(self, model, label_vocabulary):
        super(ServingExporter, self).__init__()
        self.classes = tf.constant(label_vocabulary)
        self.model = model

    @tf.function(input_signature=[tf.TensorSpec([None], dtype=tf.string)])
    def predict(self, x):
        print("Tracing predict")
        scores = self.model(x, training=False)
        # Return both the scores and the label vocabulary, mirroring
        # what the Estimator Classify endpoint returned.
        return {"scores": scores, "classes": self.classes}


model = ...  # a trained Keras model
compat_model = ServingExporter(model, ["foo", "bar", "qux", "baz"])

export_dir = "my-sick-servable/00000123"
if os.path.exists(export_dir):
    shutil.rmtree(export_dir)

input_spec = tf.TensorSpec([None], dtype=tf.string)
pred_fn = compat_model.predict.get_concrete_function(input_spec)

# Post 2.1.0?
options = tf.saved_model.SaveOptions(function_aliases={
    'predict': compat_model.predict,
})

signatures = {
    tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY: pred_fn,
}

tf.saved_model.save(compat_model, export_dir, signatures=signatures, options=options)
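
A quick sanity check we can run after export (assuming the save above succeeds): load the SavedModel back and confirm the default signature returns the expected keys.

# Load the exported model and call its default serving signature.
loaded = tf.saved_model.load(export_dir)
infer = loaded.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
out = infer(x=tf.constant(["some input"]))
print(out.keys())  # expect: dict_keys(['scores', 'classes'])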


Question: Are the above the correct idioms for explicitly controlling what is returned by TensorFlow Serving?




Allen Lavoie

Feb 10, 2020, 4:24:29 PM
to Ryan Wheeler, Discuss
Response inline

No, function_aliases will not do anything for TensorFlow Serving's method names (classify/regress). It's a new TPU-related thing.


So if you need to change method names, the export process can be export -> change method names -> serve.
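
One way to script the "change method names" step (a rough sketch only, editing the SignatureDef protos in saved_model.pb directly; this is not an official API, and the path and signature name below are taken from the example above):

import os
from tensorflow.core.protobuf import saved_model_pb2

export_dir = "my-sick-servable/00000123"  # illustrative path
pb_path = os.path.join(export_dir, "saved_model.pb")

saved_model = saved_model_pb2.SavedModel()
with open(pb_path, "rb") as f:
    saved_model.ParseFromString(f.read())

# Rewrite the method name of the 'classify' SignatureDef so Serving
# routes it to the Classify endpoint rather than Predict. Note that the
# Classify API also expects specific input/output keys, so a method-name
# rewrite alone may not be sufficient for every model.
sig = saved_model.meta_graphs[0].signature_def["classify"]
sig.method_name = "tensorflow/serving/classify"

with open(pb_path, "wb") as f:
    f.write(saved_model.SerializeToString())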

For background, there's some disagreement about where this information lives going forward (roughly does it need to live in the SavedModel and be configured at export, or is it part of TensorFlow Serving's configuration).
 

