I think this is just an interaction between Docker and the notebook. "Entering event loop" means the model server is up and ready to field requests. I think you need to run the docker command with the -d switch to run the container in daemon mode, so that it completes and allows the next cell to run.
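For example, something like this (same mount path and model name as in your command below) should return immediately and leave the model server running in the background:

!docker run -d -t --rm -p 8501:8501 \
-v "/home/jupyter/.../saved_models/:/models/ea/1" \
-e MODEL_NAME=ea \
tensorflow/serving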
On Mon, Jun 1, 2020 at 21:00 'Irene Giannoumis' via TensorFlow Extended (TFX) <t...@tensorflow.org> wrote:
+Pedram Pejman to see if he can help with this
On Mon, Jun 1, 2020 at 8:55 PM Cassie Leong <cassi...@lendlease.com> wrote:
Hi everyone, I've been having an issue when I run Docker to create the REST endpoint for serving the model generated by the TFX Trainer. I run the following code in the GCP Notebook, but the cell never finishes executing; it just takes a long time and I never see the end of it. Is it normal for this to take so long, or is there something that I missed? It always stops at this line: "[evhttp_server.cc : 238] NET_LOG: Entering the event loop ..."
!docker run -t --rm -p 8501:8501 \
-v "/home/jupyter/.../saved_models/:/models/ea/1" \
-e MODEL_NAME=ea \
tensorflow/serving
!curl -d '{"inputs": {"examples": [{"inputs/0": ["A"], "inputs/1": ["B"]}]}}' \
-X POST http://localhost:8502/v1/models/ea:predict
{ "error": "JSON Value: {\n \"Country_Code_xf\": [\n \"US\"\n ],\n \"Project_Type_xf\": [\n \"Delivery\"\n ]\n} not formatted correctly for base64 data" }
$ saved_model_cli show --dir /path/to/saved_model_dir --all
signature_def['serving_default']:
The given SavedModel SignatureDef contains the following input(s):
inputs['examples'] tensor_info:
dtype: DT_STRING
shape: (-1)
name: serving_default_examples:0
The given SavedModel SignatureDef contains the following output(s):
outputs['outputs'] tensor_info:
dtype: DT_FLOAT
shape: (-1, 1)
name: StatefulPartitionedCall:0
Method name is: tensorflow/serving/predict
WARNING:tensorflow:From /opt/conda/lib/python3.7/site-packages/tensorflow_core/python/ops/resource_variable_ops.py:1786: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.
Instructions for updating:
If using Keras pass *_constraint arguments to layers.
Defined Functions:
Function Name: '__call__'
Option #1
Callable with:
Argument #1
DType: list
Value: [TensorSpec(shape=(None, 3), dtype=tf.float32, name='inputs/0'), TensorSpec(shape=(None, 8), dtype=tf.float32, name='inputs/1')]
Argument #2
DType: bool
Value: True
Argument #3
DType: NoneType
Value: None
Without knowing the details of your specific setup and code, generally speaking you will base64 encode requests that would otherwise be sent up as bytes (say a serialized structure, image data, etc.). So, based on what I think you're saying, you probably do want to base64 encode what you're sending up. Other types of data have native representations in JSON; how TF types map to JSON is documented in the TensorFlow Serving REST API docs.

It really depends on your serving function. You can see exactly what's expected by your model using the saved_model_cli:

$ saved_model_cli show --dir /path/to/saved_model_dir --all

That will tell you what the serving function signature(s) look like, and you can use that to determine what you need to send in your request to the model server. Details of the SavedModel CLI can be found in the TensorFlow SavedModel guide.
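To make the JSON side concrete, here is a rough illustration of the difference (the input names below are made up, not taken from your model, and the default 8501 port is assumed): numeric tensors go in natively as JSON numbers, while inputs that carry raw bytes (a serialized proto, image data, etc.) must be base64 encoded and wrapped in a "b64" object.

# numeric input, sent natively as JSON numbers:
!curl -d '{"instances": [{"some_float_input": [1.0, 2.0, 3.0]}]}' \
-X POST http://localhost:8501/v1/models/<model_name>:predict

# raw-bytes input, sent base64 encoded ("aGVsbG8=" is just "hello"):
!curl -d '{"instances": [{"some_bytes_input": {"b64": "aGVsbG8="}}]}' \
-X POST http://localhost:8501/v1/models/<model_name>:predict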
It's pretty tough to debug what's happening without a working example, but your model does take a string argument, and based on what you're saying and the error, that implies a base64 encoded input. Try taking the chunk of data it appears to expect (Country Code / Project Type), base64 encoding it first, and then passing it up as row data:

{"instances": [{"b64": "<base64 encoded string of your structured data>"}]}

Check out the ResNet example I posted earlier, which does this with image data.
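In case it helps, here is a rough sketch of producing that base64 string from a serialized tf.Example and sending it. Run it as a shell script (e.g. in a %%bash cell or a terminal) rather than with a leading !, since it spans multiple commands. The raw feature names Country_Code and Project_Type are my guess based on the transformed _xf names in your error message; substitute whatever your serving signature actually parses. The port and model name match your earlier curl command.

B64=$(python - <<'EOF'
import base64
import tensorflow as tf

# Build one tf.Example with the (assumed) raw feature names, serialize it,
# and print its base64 encoding so the shell can splice it into the request.
example = tf.train.Example(features=tf.train.Features(feature={
    "Country_Code": tf.train.Feature(bytes_list=tf.train.BytesList(value=[b"US"])),
    "Project_Type": tf.train.Feature(bytes_list=tf.train.BytesList(value=[b"Delivery"])),
}))
print(base64.b64encode(example.SerializeToString()).decode())
EOF
)

curl -d "{\"instances\": [{\"b64\": \"${B64}\"}]}" \
-X POST http://localhost:8502/v1/models/ea:predict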
!docker run -t --rm -d -p 8502:8501 \
-v "gs://hostedkfp-default-4hco57fcpj/tfx_pipeline_output/egress_access_pipeline/Trainer/model/92/serving_model_dir/:/models/ea2/1" \
-e MODEL_NAME=ea2 \
tensorflow/serving
docker: Error response from daemon: invalid mode: /models/ea2/2.
See 'docker run --help'.