Understanding failure of loadFrozenModel


Amaru T W

May 15, 2018, 7:47:12 AM
to TensorFlow.js Discussion
So currently I'm trying to get the pipeline TF model -> export -> convert -> load & predict with tfjs going, and I'm now in the load & predict phase. I have the MobileNet demo working and am trying to do the same with my own trained model. I'm using the MNIST CNN TF tutorial as the code basis for my model in Python, and when I try to load the model with loadFrozenModel in tfjs, I get the following error in the console:

Unhandled promise rejection Error: "Constructing tensor of shape (800) should match the length of values (723)"

First, I should say I'm new to JS; I have an ML background and am still getting used to how things work in the JS world. So maybe I need to handle the promise properly, but where is the error getting the values 800 and 723 from?
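On the promise-handling side, one way to make the failure readable is to await the load inside try/catch. A minimal sketch, with `loader` standing in for a promise-returning load function like loadFrozenModel (the wrapper itself is hypothetical):

```javascript
// Minimal sketch: await any promise-returning loader inside try/catch so
// a rejection is logged instead of escaping as "Unhandled promise
// rejection". `loader` is a stand-in for e.g. loadFrozenModel.
async function loadOrReport(loader, ...urls) {
  try {
    return await loader(...urls);
  } catch (err) {
    console.error('model load failed:', err.message);
    return null;
  }
}
```

With tfjs-converter this would be called roughly as `const model = await loadOrReport(loadFrozenModel, MODEL_URL, WEIGHTS_URL);`.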

Nikhil Thorat

May 15, 2018, 9:45:32 AM
to Amaru T W, TensorFlow.js Discussion, Ping Yu
Could you show us some code / link us to the model?

--
You received this message because you are subscribed to the Google Groups "TensorFlow.js Discussion" group.
To unsubscribe from this group and stop receiving emails from it, send an email to tfjs+unsubscribe@tensorflow.org.
Visit this group at https://groups.google.com/a/tensorflow.org/group/tfjs/.
To view this discussion on the web visit https://groups.google.com/a/tensorflow.org/d/msgid/tfjs/b36c1497-9646-431c-b932-ade762d78133%40tensorflow.org.

Erwin Carpio

May 15, 2018, 9:55:35 AM
to TensorFlow.js Discussion
I'm also having trouble with loadFrozenModel(modelURL, weightsURL);

The .pb and weights.json files get loaded, but I get this error:

node_modules\core-js\library\modules\_object-dp.js:9 Unhandled promise rejection TypeError: Cannot read property 'loadWeights' of undefined
    at FrozenModel.<anonymous> (node_modules\@tensorflow\tfjs-layers\dist\callbacks.js:446)
    at step (node_modules\@tensorflow\tfjs-layers\dist\callbacks.js:340)
    at Object.next (node_modules\@tensorflow\tfjs-layers\dist\callbacks.js:321)
    at fulfilled (node_modules\@tensorflow\tfjs-layers\dist\callbacks.js:312)

It could be the shards (though the shards are already in the same folder as the weights.json and .pb files).
I've also tried changing the "paths" in the weights.json file to both absolute and relative paths, but it still throws the error.
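For what it's worth, the shard paths in the manifest are fetched relative to the manifest file's own URL (at least in the converter versions of this era; treat that as an assumption). A small sketch, using the standard URL class, for checking what a "paths" entry actually resolves to; the host and file names are just examples:

```javascript
// Sketch: resolve a shard path the way a relative fetch would -- against
// the manifest's URL. Useful for checking what URL a "paths" entry in
// weights_manifest.json actually points at.
function resolveShardUrl(manifestUrl, shardPath) {
  return new URL(shardPath, manifestUrl).toString();
}

// Example (hypothetical host/paths):
console.log(resolveShardUrl('http://localhost:8080/model/weights_manifest.json',
                            'group1-shard1of1'));
// -> http://localhost:8080/model/group1-shard1of1
```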

Nikhil Thorat

May 15, 2018, 10:04:40 AM
to Erwin Carpio, TensorFlow.js Discussion
If you could post a link to the model / code that would be really helpful! :)


Erwin Carpio

May 15, 2018, 10:11:48 AM
to TensorFlow.js Discussion
Here's a link to my GitHub repo with the index.js for the currently non-working project I'm trying to write. It's currently a mishmash of the various tutorials and files from the tensorflow.js examples. I'm just trying to figure out how to get it to work before I start on my own project idea.


Btw, I used the retrain.py from the Emoji Scavenger Hunt Google experiment to train and build the model in my Python environment (I didn't use Docker, but the devs on the Emoji Scavenger Hunt forums were nice enough to link me to the original retrain.py, which I plugged into my Python tensorflow-gpu environment). The model works in Python.

I then converted the model (with tensorflowjs_converter) into the weights.json and model.pb files plus the shards, and put them in a model folder.

Now, in tensorflow.js, I used an async function to load the converted model:

async function runMod() {
  const MODEL_URL = "model/tensorflowjs_model.pb";
  const WEIGHTS_URL = "model/weights_manifest.json";

  console.log(MODEL_URL);
  console.log(WEIGHTS_URL);
  console.log('before loadFrozenModel');

  const model = await loadFrozenModel(MODEL_URL, WEIGHTS_URL);

  console.log('after loadFrozenModel');
}

runMod();

All the console.logs work except the one after

  const model = await loadFrozenModel(MODEL_URL,WEIGHTS_URL);

so I've narrowed it down to this line. Then I get the error:

node_modules\core-js\library\modules\_object-dp.js:9 Unhandled promise rejection TypeError: Cannot read property 'loadWeights' of undefined
    at FrozenModel.<anonymous> (node_modules\@tensorflow\tfjs-layers\dist\callbacks.js:446)
    at step (node_modules\@tensorflow\tfjs-layers\dist\callbacks.js:340)
    at Object.next (node_modules\@tensorflow\tfjs-layers\dist\callbacks.js:321)
    at fulfilled (node_modules\@tensorflow\tfjs-layers\dist\callbacks.js:312)


So could it still be the shards?

Nikhil Thorat

May 15, 2018, 11:17:26 AM
to Erwin Carpio, TensorFlow.js Discussion


Amaru T W

May 15, 2018, 11:46:52 AM
to TensorFlow.js Discussion, tuem...@googlemail.com, pi...@google.com
On Tuesday, May 15, 2018 at 3:45:32 PM UTC+2, Nikhil Thorat wrote:
Could you show us some code / link us to the model?

So the code in my JS file is just the following; you can use the model links provided in there:
 
import * as tf from '@tensorflow/tfjs';
import {loadFrozenModel} from '@tensorflow/tfjs-converter';
const MODEL_FILE_URL = 'https://dl.dropboxusercontent.com/s/n59dk9kkdnomu5b/tensorflowjs_model.pb';
const WEIGHT_MANIFEST_FILE_URL = 'https://dl.dropboxusercontent.com/s/n2tb8by6rr8mlyg/weights_manifest.json';
const model = (async () => {
  await loadFrozenModel(MODEL_FILE_URL, WEIGHT_MANIFEST_FILE_URL);
})();

I saved the model in Python with tf.train.Saver() and froze it with:
def freeze_saved_model():
    with tf.Session() as sess:
        saver = tf.train.import_meta_graph(some_filepath + '.meta', clear_devices=True)
        saver.restore(sess, some_model_folder_path)
        graph = tf.get_default_graph()
        input_graph_def = graph.as_graph_def()
        output_node_names = 'output'
        output_graph_def = tf.graph_util.convert_variables_to_constants(sess, input_graph_def, output_node_names.split(","))
        output_graph = "NeuralNetwork/model/mnist-model.pb"
        with tf.gfile.GFile(output_graph, "wb") as f:
            f.write(output_graph_def.SerializeToString())

Nikhil Thorat

May 15, 2018, 12:41:33 PM
to Amaru T W, TensorFlow.js Discussion, Ping Yu
I think the issue is that the .pb file has a content type of text. You may have to change your Dropbox settings so it's served as a binary file.
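One quick way to check what the host actually serves is to look at the Content-Type header. A sketch using fetch (some hosts ignore HEAD requests, so a `curl -I` from the command line is an alternative); the regex is only a rough heuristic:

```javascript
// Rough heuristic: does a Content-Type header look like a binary
// protobuf rather than text/HTML? (Hosts that force text/plain or serve
// an HTML interstitial will corrupt the .pb download.)
function looksBinary(contentType) {
  return /octet-stream|protobuf/i.test(contentType || '');
}

// Sketch of the actual check (needs a browser or fetch-capable runtime):
async function checkPbContentType(url) {
  const res = await fetch(url, { method: 'HEAD' });
  const ct = res.headers.get('content-type');
  console.log(url, '->', ct, looksBinary(ct) ? '(looks binary)' : '(suspicious)');
  return ct;
}
```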


Erwin Carpio

May 16, 2018, 5:12:48 AM
to TensorFlow.js Discussion, radedj...@gmail.com
Hi, I uninstalled the tensorflowjs converter and installed version 0.2.0. Here's the new error:


node_modules\core-js\library\modules\es6.promise.js:110 Unhandled promise rejection Error: Constructing tensor of shape (512) should match the length of values (146)
    at Object.assert (node_modules\@tensorflow\tfjs-core\dist\util.js:57)
    at new Tensor (node_modules\@tensorflow\tfjs-core\dist\tensor.js:159)
    at Function.Tensor.make (node_modules\@tensorflow\tfjs-core\dist\tensor.js:175)
    at Object.ArrayOps.tensor (node_modules\@tensorflow\tfjs-core\dist\ops\array_ops.js:68)
    at node_modules\@tensorflow\tfjs-core\dist\weights_loader.js:148
    at Array.forEach (<anonymous>)
    at node_modules\@tensorflow\tfjs-core\dist\weights_loader.js:130
    at Array.forEach (<anonymous>)
    at Object.<anonymous> (node_modules\@tensorflow\tfjs-core\dist\weights_loader.js:115)
    at step (node_modules\@tensorflow\tfjs-core\dist\weights_loader.js:32)

Erwin Carpio

May 16, 2018, 5:21:37 AM
to TensorFlow.js Discussion, radedj...@gmail.com

UPDATE:

after trying 0.2.0, I checked the Emoji Scavenger Hunt GitHub repo. They were using version 0.2.1, so I used that instead, and here's the new error:

Unhandled promise rejection TypeError: Cannot read property 'loadWeights' of undefined
    at FrozenModel.<anonymous> (node_modules\@tensorflow\tfjs-converter\dist\executor\frozen_model.js:145)
    at step (node_modules\@tensorflow\tfjs-converter\dist\executor\frozen_model.js:32)
    at Object.next (node_modules\@tensorflow\tfjs-converter\dist\executor\frozen_model.js:13)
    at fulfilled (node_modules\@tensorflow\tfjs-converter\dist\executor\frozen_model.js:4)

Maybe the different errors per version can help. Thanks again for the advice and links.

Amaru T W

May 17, 2018, 5:57:24 PM
to TensorFlow.js Discussion, tuem...@googlemail.com, pi...@google.com
So I set up a server to serve the files.

The .pb is served as binary now, and the error changes slightly:

"Constructing tensor of shape (800) should match the length of values (210)"

So again: where do these values come from?
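For what it's worth, the two numbers appear to come from different places: the first is the element count implied by the shape declared in weights_manifest.json, and the second is how many elements actually arrived for that weight (fetched bytes divided by the dtype's byte size). A hypothetical helper, not part of tfjs, illustrating the arithmetic:

```javascript
// Hypothetical helper (not part of tfjs): compare a manifest entry's
// declared element count with the elements actually received.
const BYTES_PER_ELEMENT = { float32: 4, int32: 4, bool: 1 };

function elementCount(shape) {
  return shape.reduce((a, b) => a * b, 1);
}

function checkWeight(entry, fetchedBytes) {
  const declared = elementCount(entry.shape);
  const received = fetchedBytes / BYTES_PER_ELEMENT[entry.dtype];
  return { declared, received, ok: declared === received };
}

// e.g. a conv kernel of shape [5, 5, 1, 32] declares 800 float32 values
// (3200 bytes); a shard truncated to 840 bytes yields only 210 values --
// the exact mismatch in the error above.
console.log(checkWeight({ shape: [5, 5, 1, 32], dtype: 'float32' }, 840));
```

So a mismatch like 800 vs 210 usually means the shard bytes on the wire don't match the manifest: a truncated download, the wrong file, or a host mangling the binary.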


Erwin Carpio

May 18, 2018, 9:31:36 AM
to TensorFlow.js Discussion, radedj...@gmail.com

Hmm, so I've finally gotten my TensorFlow Python model from MobileNet converted and loaded in the browser.

No more errors there. I removed all the Babel and bundler setup and am using raw JavaScript now.

I followed your link at

https://github.com/tensorflow/tfjs/issues/292

and just downgraded to tensorflowjs converter version 0.2.0.

No errors except when I do a model.predict(); here's the code:

https://pastebin.com/szH21LeE

The error log says model.predict is not a function.
I did a console.log(model); and it's not empty, so the model loads. I just don't know why I can't predict with it. I also checked the Object.prototype of model and don't see a predict function. Any suggestions? Thanks in advance.



Nikhil Thorat

May 18, 2018, 9:45:32 AM
to Erwin Carpio, TensorFlow.js Discussion, Ping Yu
Hi Erwin,

Unfortunately there are some API differences between loading a frozen model and loading a layers model. For frozen models, we use "execute"; check out this guide: https://github.com/tensorflow/tfjs-converter

We're going to consolidate these APIs soon and get the converter into the union package so this confusion goes away.

+Ping


Erwin Carpio

May 18, 2018, 10:48:22 AM
to TensorFlow.js Discussion, radedj...@gmail.com, pi...@google.com
Hi Nikhil, thanks for the really fast reply. You're all probably busy with the project, so I really appreciate that you take the time to answer questions.

OK, I tried model.execute and got an error.


The changed code is this:


async function loadModal() {
  const model = await tf_converter.loadFrozenModel(MODEL_URL, WEIGHTS_URL);

  const predictedClass = tf.tidy(() => {
    const img = document.getElementById('pic');

    console.log(img);
    console.log(model);

    const predictions = model.execute({input: tf.fromPixels(img)});

    return predictions.print();
  });

  console.log(predictedClass);
}

loadModal();



And the error is this:



tfjs-co...@0.2.0:1 Uncaught (in promise) Error: Tensor is disposed.
    at e.throwIfDisposed (tfjs-co...@0.2.0:1)
    at e.as4D (tfjs-co...@0.2.0:1)
    at e.conv2d (tfjs-co...@0.2.0:1)
    at tfjs-co...@0.2.0:1
    at Object.e.tidy (tfjs-co...@0.2.0:1)
    at Object.r.value [as conv2d] (tfjs-co...@0.2.0:1)
    at Object.r.executeOp (tfjs-co...@0.2.0:1)
    at Object.r.executeOp (tfjs-co...@0.2.0:1)
    at tfjs-co...@0.2.0:1
    at Array.reduce (<anonymous>)






Maybe it can help you or others. More power to this project!

By the way, I noticed your MobileNet demo on GitHub is up (I used to get a 404 page). Is that the new one? Or do you suggest I just wait for the new union release, since the APIs are likely to change during unification of the code? When you say union release, does that mean tf-core, tfjs, and tf-converter will all be released under one package as tfjs.js? Thanks.



Erwin Carpio

May 19, 2018, 8:07:38 AM
to TensorFlow.js Discussion, tuem...@googlemail.com
I decided to switch over to the Keras wrapper with the TensorFlow backend; it seems more straightforward. I retrained MobileNetV2, saved to a Keras .h5, converted it to model.json, and loaded it in the browser. Now I'm getting an error:


node_modules\core-js\library\modules\es6.promise.js:110 Unhandled promise rejection Error: Constructing tensor of shape (144) should match the length of values (137)
    at Object.a [as assert] (tfjs.js:1)
    at new e (tfjs.js:1)
    at Function.e.make (tfjs.js:1)
    at Object.e.tensor (tfjs.js:1)
    at tfjs.js:1
    at Array.forEach (<anonymous>)
    at tfjs.js:1
    at Array.forEach (<anonymous>)
    at Object.<anonymous> (tfjs.js:1)
    at r (tfjs.js:1)


Could it be my image sizes? I batch-resized my training and validation data to 244x244 pixels, but it seems some got converted to 244x180 or 240x244 to keep aspect ratios. Or is it something else?


Erwin Carpio

May 20, 2018, 11:18:17 AM
to TensorFlow.js Discussion, radedj...@gmail.com, pi...@google.com
Hi, I just wanted to ask if the tfjs file served at

got updated or changed last night, May 19-20 Eastern time, around midnight.

I recently ditched all my package bundlers, removed Babel, and rewrote much of my project in raw JavaScript with direct script tags to tfjs. Then I switched to Keras as well and saved/converted the .h5 file to the model.json and shards.

Running with raw JavaScript just works! It started predicting... then at around midnight last night in my time zone, it suddenly stopped working.
The new error asked if I had changed my backend in the middle of the program, since old converted files will no longer work with the new TensorFlow.
Luckily I had manually saved a copy of


and linked to that local copy.

My JavaScript started working again. So was there an update to the served CDN file? Is this the new union file that will consolidate the model.execute and model.predict API issues, as well as the compatibility and dependency issues of tf-converter with tf-core?

Thanks for any advice. I just want to know whether it's worthwhile to upgrade my Python TF and converter, retrain, and reconvert to use the newly served JS, or to just stick with the old one for now if it's not yet the new union package.


Nikhil Thorat

May 20, 2018, 1:22:57 PM
to Erwin Carpio, TensorFlow.js Discussion, Ping Yu
Just curious, what is the error, and where is it happening? Could you post a code pen with the breaking example?

Yes, the link you have always updates to the latest version. If you want to avoid it automatically updating, you should pin your version, like this (note the version in the URL):
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tf...@0.10.3"></script>

We're making a few changes to the way we bundle. Previously every script bundle had its own symbol, now they all augment a "tf" object. So "tf.loadFrozenModel" is available when you load the new converter (not yet published, I commented on the other PR).




Kevin Scott

May 22, 2018, 6:36:35 PM
to TensorFlow.js Discussion, radedj...@gmail.com, pi...@google.com
I have a similar question: it looks like when I convert a Keras saved model (hdf5), I get:

group1-shard1of1
group2-shard1of1
group3-shard1of1
model.json

I'm trying to understand how to use these alongside loadFrozenModel. From what I can tell, loadFrozenModel expects separate model and weight files, yet my conversion only produces a model with shards.

Am I missing something somewhere? I'm not super familiar with hdf5 files but I'd love to learn more!

Erwin Carpio

May 23, 2018, 7:19:54 AM
to TensorFlow.js Discussion, radedj...@gmail.com, pi...@google.com
Hi, I think you're trying to use loadFrozenModel for a saved Keras model,
but loadFrozenModel is for the SavedModel directory format from vanilla/raw TensorFlow.
See this link.

Once you've converted the Keras H5 model to model.json, you must use tf.loadModel.
See this link for Keras.

So basically:

tf.loadModel for Keras models,

and loadFrozenModel for vanilla TensorFlow models saved in the SavedModel directory format.
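That rule of thumb can be spelled out as a tiny hypothetical helper (file names as they appear in this thread):

```javascript
// Hypothetical: infer which tfjs loader matches a converted artifact
// from the files the converter produced.
function pickLoader(files) {
  if (files.includes('model.json')) {
    return 'tf.loadModel';        // Keras-converted model
  }
  if (files.some(f => f.endsWith('.pb')) && files.includes('weights_manifest.json')) {
    return 'loadFrozenModel';     // frozen TensorFlow graph
  }
  return 'unknown';
}

console.log(pickLoader(['group1-shard1of1', 'model.json']));
// -> tf.loadModel
```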

Erwin Carpio

May 23, 2018, 7:31:32 AM
to TensorFlow.js Discussion, radedj...@gmail.com, pi...@google.com
Hi Nikhil Thorat

Sorry I took so long to reply. I got buried in work at the clinic/hospital and haven't been able to touch the code in days.

Here's the error.


Uncaught (in promise) Error: WebGL backend: No data found for this tensor. Did you change your backend in the middle of the program? New backends can't use Tensors created with previous backends
    at t.throwIfNoData (tfjs:1)
    at t.uploadToGPU (tfjs:1)
    at tfjs:1
    at Array.map (<anonymous>)
    at t.compileAndRun (tfjs:1)
    at t.add (tfjs:1)
    at ip.engine.runKernel.a (tfjs:1)
    at t.runKernel (tfjs:1)
    at t.add (tfjs:1)
    at tfjs:1

I'll try to git push the code in a while.

Erwin Carpio

May 23, 2018, 8:00:55 AM
to TensorFlow.js Discussion, radedj...@gmail.com, pi...@google.com
Hi, here's the raw2.html I'm currently using, devoid of any bundlers, using raw JavaScript.


In the HEAD of the HTML I've currently set it to the latest tfjs:

    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>

This produces the error log I showed earlier.

But if I use the old tfjs, which I backed up on local disk and also pushed to the Git repo in the root/tf folder, using

<script type="text/javascript" src="./tf/tfjs.js"></script>


the page runs and predicts again.

It takes a really long time to load the model and shards, though, even though the model is on my own local computer.

Hope the error log helps.
Really sorry for the delayed reply.

Kevin Scott

May 23, 2018, 10:01:36 AM
to TensorFlow.js Discussion, radedj...@gmail.com, pi...@google.com
Thank you, Erwin, for your help; I missed that reference. It looks to be working now.

Cheers,
Kevin

Amaru T W

May 31, 2018, 5:50:33 AM
to TensorFlow.js Discussion
Funny how my thread got hijacked and my question went unanswered. I'm going to throw this in again:

So I set up a server to serve the files:
http://18.217.47.130/tensorflowjs_model.pb
http://18.217.47.130/weights_manifest.json
The .pb is served as binary now, and the error changes slightly:

"Constructing tensor of shape (800) should match the length of values (210)"

So again: where do these values come from?