Hi Vaibhav,

Yes, you can use WIT with any Python predictor. The API you are looking for is set_custom_predict_fn(). Here is an example notebook that uses this API to set Keras models for text prediction: WIT text demo.

Please let me know if you have any further questions!

Best
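For reference, a minimal sketch of that wiring, assuming a scikit-learn style classifier. Here my_sklearn_model, FEATURE_KEYS, and test_examples are hypothetical placeholders for your own model, feature names, and list of tf.Example protos; the WIT calls (WitConfigBuilder, set_custom_predict_fn, WitWidget) are the real notebook API, but see the linked demo for the full data flow:

    # Sketch: plugging a non-TensorFlow classifier into WIT via set_custom_predict_fn.
    import numpy as np
    from witwidget.notebook.visualization import WitConfigBuilder, WitWidget

    FEATURE_KEYS = ['age', 'hours_per_week']  # hypothetical feature names

    def custom_predict_fn(examples):
        # `examples` is a list of tf.train.Example protos; convert them into the
        # 2-D numeric array the model expects.
        rows = [[ex.features.feature[k].float_list.value[0] for k in FEATURE_KEYS]
                for ex in examples]
        # Return class scores, one row per example.
        return my_sklearn_model.predict_proba(np.array(rows))

    config_builder = (WitConfigBuilder(test_examples)  # test_examples: list of tf.Examples
                      .set_custom_predict_fn(custom_predict_fn))
    WitWidget(config_builder, height=800)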
On Mon, Aug 19, 2019 at 4:13 PM <vsin...@gmail.com> wrote:
Hi,

I am wondering if WitConfigBuilder can be used with ML libraries other than TensorFlow. The API seems to have methods like set_estimator_and_feature_spec() that accept only TF-style classifiers and feature specs.

Regards,
Vaibhav Singh
You are correct that currently this method only accepts tf.Examples. So you will need to convert your data to tf.Examples to pass to WitConfigBuilder, then convert them back from tf.Examples in your custom_predict_fn before sending them to your model. I know it's not ideal. In the future, I could imagine a change where, if you provide JSON objects to WitConfigBuilder instead of tf.Examples, then custom_predict_fn would be given the examples in JSON format rather than as tf.Examples. But currently that isn't how it works.
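A rough sketch of that round trip, assuming the data starts as a pandas DataFrame of numeric features (the DataFrame and both helper names here are illustrative, not part of the WIT API):

    # Pack DataFrame rows into tf.train.Example protos for WitConfigBuilder,
    # then unpack them again inside the custom predict function.
    import numpy as np
    import tensorflow as tf

    def df_to_examples(df):
        examples = []
        for _, row in df.iterrows():
            ex = tf.train.Example()
            for col in df.columns:
                ex.features.feature[col].float_list.value.append(float(row[col]))
            examples.append(ex)
        return examples

    def examples_to_array(examples, columns):
        # Inverse of df_to_examples: recover a plain 2-D float array for the model.
        return np.array([[ex.features.feature[c].float_list.value[0] for c in columns]
                         for ex in examples])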

Do you see filled in inference results in the bottom left when you click on an example dot in the visualization? Do you see any error text in the top-right of the visualization?

The top-right shows that an error occurred while calling your function: "Columns must be supplied". It's possible that your custom predict function isn't converting the tf.Examples to the appropriate format before sending them to the model. Perhaps print out your examples in your custom prediction function after converting from tf.Example but before sending them to the model, to see if the format has an issue?

As for what the custom prediction function should return, it should be a 2-D array of numbers, such as:

[[example0Class0Score, example0Class1Score], [example1Class0Score, example1Class1Score], ...]
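As an illustration of that debugging step, a hedged sketch of inspecting the converted examples inside the custom prediction function before calling a hypothetical model (FEATURE_KEYS and my_model are placeholders; passing explicit column names is just one possible fix if your model expects a DataFrame with named columns):

    import pandas as pd

    def custom_predict_fn(examples):
        rows = [[ex.features.feature[k].float_list.value[0] for k in FEATURE_KEYS]
                for ex in examples]
        # Supplying column names explicitly can avoid column-related errors when
        # the model expects a DataFrame with named columns.
        df = pd.DataFrame(rows, columns=FEATURE_KEYS)
        print(df.head())  # inspect the converted examples before predicting
        preds = my_model.predict_proba(df)
        return preds  # 2-D: (num_examples, num_classes)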
Thanks for the update! Glad that the column issue is taken care of.

From the snapshot you sent of the standalone test, it seems your custom_predict_fn is still not returning its data in the right format. It will always be passed an array of examples, much like your first call "custom_predict(test_examples)". But your custom predict function looks to be returning its results as a 1-D array of numbers, as shown in your screenshot. Instead, it needs to return a 2-D array of numbers, where the outer array has one entry for each example in the provided test_examples.

So if test_examples has length 2 (meaning it contains two examples to run through the model), your model is a binary classifier, and it returns a positive-class score of 0.3 for the first example and 0.9 for the second, then the returned result should be formatted as [[0.7, 0.3], [0.1, 0.9]].

If instead your model is a regression model (so it returns a single number for each example, not a set of class scores), then your 1-D array is the right format for the return value of custom_predict_fn, but you must call .set_model_type('regression') on WitConfigBuilder to put it in regression mode instead of the default classification mode.
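To make the two return shapes concrete, an illustrative sketch (the predictor names are hypothetical; only the WIT builder calls are real):

    # Binary classification: one [negative_score, positive_score] pair per example.
    def classification_predict_fn(examples):
        positive_scores = my_classifier_scores(examples)  # hypothetical, 1-D
        return [[1.0 - p, p] for p in positive_scores]    # 2-D, shape (n, 2)

    # Regression: a single number per example is fine, but the builder must be
    # switched into regression mode.
    config_builder = (WitConfigBuilder(test_examples)
                      .set_custom_predict_fn(my_regression_predict_fn)  # returns 1-D
                      .set_model_type('regression'))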