Loading an LSTM works fine with browser tfjs, but not with tfjs in Node.js


Jessica Van Brummelen

Sep 26, 2018, 9:20:19 PM
to TensorFlow.js Discussion
Hi there,

I'm trying to run a pre-trained LSTM network with Node.js so that I can send HTTP requests to it and generate text.

The pretrained network is currently being loaded from Amazon S3, where I stored it. This works fine when I use it in the browser, but when I try to load the exact same model from S3 in Node.js (with the same version of tfjs, 0.13.1, and the exact same line of code), I get the following error:
RangeError: byte length of Float32Array should be a multiple of 4
    at typedArrayConstructByArrayBuffer (<anonymous>)
    at new Float32Array (native)
    at _loop_1 (/work/lstm_model_loader_v2/node_modules/@tensorflow/tfjs/node_modules/@tensorflow/tfjs-core/dist/io/io_utils.js:102:30)
    at Object.decodeWeights (/work/lstm_model_loader_v2/node_modules/@tensorflow/tfjs/node_modules/@tensorflow/tfjs-core/dist/io/io_utils.js:132:9)
    at /work/lstm_model_loader_v2/node_modules/@tensorflow/tfjs/node_modules/@tensorflow/tfjs-layers/dist/models.js:137:58
    at step (/work/lstm_model_loader_v2/node_modules/@tensorflow/tfjs/node_modules/@tensorflow/tfjs-layers/dist/models.js:42:23)
    at Object.next (/work/lstm_model_loader_v2/node_modules/@tensorflow/tfjs/node_modules/@tensorflow/tfjs-layers/dist/models.js:23:53)
    at fulfilled (/work/lstm_model_loader_v2/node_modules/@tensorflow/tfjs/node_modules/@tensorflow/tfjs-layers/dist/models.js:14:58)
    at <anonymous>
    at process._tickCallback (internal/process/next_tick.js:182:7)
(node:8433) UnhandledPromiseRejectionWarning: Error: Load model first.
    at LoadableLSTMTextGenerator.lstmLayerSizes (/work/lstm_model_loader_v2/ui.js:85:13)
    at Server.<anonymous> (/work/lstm_model_loader_v2/ui.js:358:61)
    at <anonymous>
    at process._tickCallback (internal/process/next_tick.js:182:7)
(node:8433) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:8433) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

The line of code is this one: this.model = await tf.loadModel('https://s3.amazonaws.com/...restOfPath...');

From the error message, it looks like the weight file for my model is the wrong size. However, the exact same model loads fine in the browser using the same version of tfjs and essentially the same loading code, so the weight file itself should be the correct size.
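For context, this particular RangeError is thrown by JavaScript's own Float32Array constructor whenever the underlying buffer's byte length isn't a multiple of 4 (each float32 occupies 4 bytes), so it usually indicates the weight bytes arrived truncated or otherwise altered in transit (e.g., by content encoding), rather than the file on S3 being wrong. A minimal illustration of the constructor's behavior:

```js
// A Float32Array view is only valid when the buffer's byte length
// is a multiple of 4 (each float32 occupies 4 bytes).
const ok = new Float32Array(new ArrayBuffer(8)); // 8 / 4 = 2 elements: fine

let threw = false;
try {
  new Float32Array(new ArrayBuffer(7)); // 7 is not a multiple of 4
} catch (e) {
  threw = e instanceof RangeError;
}
console.log(ok.length, threw); // 2 true
```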

I tried this with and without requiring tfjs-node, and I get the same error message both ways. 


Someone else ran into the "byte length multiple of 4" error here: https://github.com/tensorflow/tfjs/issues/218
I tried changing my version of tfjs to 0.10.3, like one user did, but I get the same error.

Anyone know how to fix this?

Jessica Van Brummelen

Sep 26, 2018, 9:29:28 PM
to TensorFlow.js Discussion
The pretrained model is attached.
nietzsche.weights.bin
nietzsche.json

Shanqing Cai

Sep 26, 2018, 10:22:09 PM
to TensorFlow.js Discussion
Thanks for attaching the json and binary weight files. I tested them on my machine and the model seems to work. Below is my package.json:

```
{
  "dependencies": {
    "@tensorflow/tfjs-node": "0.1.17"
  }
}
```

And below is my js file: 

```js
const tf = require('@tensorflow/tfjs');
require('@tensorflow/tfjs-node');

(async () => {
  const model = await tf.loadModel('file://./nietzsche/nietzsche.json');
  model.summary();
})();
```

The model seems to be loaded properly and the printed summary looks like:

_________________________________________________________________
Layer (type)                 Output shape              Param #   
=================================================================
lstm_LSTM1 (LSTM)            [null,128]                109056    
_________________________________________________________________
dense_Dense1 (Dense)         [null,84]                 10836     
=================================================================
Total params: 119892
Trainable params: 119892
Non-trainable params: 0
_________________________________________________________________


Can you try the same and see if it works for you?

Thanks,
Shanqing

Jessica Van Brummelen

Sep 27, 2018, 10:46:54 AM
to TensorFlow.js Discussion
Thanks for testing, Shanqing.

I tried the same, and it worked, so I started from that base code and rebuilt my original code on top of it.

Now I'm getting an error on this line:
tf.multinomial(preds, 1, null, true).dataSync()[0];

that says "ERROR: Failed to generate text: TF Node backend does not support normalized logits passed to multinomial, Error: TF Node backend does not support normalized logits passed to multinomial
    at NodeJSKernelBackend.multinomial (/work/test_lstm_weights/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:862:19)..."

Does that mean that with the Node version of tfjs, I can't pass normalized logits to tf.multinomial?

The same line works fine with the regular version of tfjs. The 'preds' variable is the same in both versions:
preds:Tensor
    [0.000011, 7e-7, 0.0000261, ..., 6e-7, 0.0000011, 8e-7]

(The entire function is here:

function sample(preds, temperature) {
  return tf.tidy(() => {
    const logPreds = tf.div(tf.log(preds), temperature);
    const expPreds = tf.exp(logPreds);
    const sumExpPreds = tf.sum(expPreds);
    preds = tf.div(expPreds, sumExpPreds);
    console.log('preds: ' + preds);
    // Treat preds as the probabilities of a multinomial distribution and
    // randomly draw a sample from the distribution.
    return tf.multinomial(preds, 1, null, true).dataSync()[0];
  });
}
)
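For readers following along: the four lines before the multinomial call are a temperature-scaled softmax; numerically they raise each probability to the power 1/temperature and renormalize. A plain-JavaScript sketch of the same arithmetic, with made-up example values:

```js
// Plain-JS sketch of what sample()'s normalization computes:
// softmax(log(p) / T), which equals p^(1/T) renormalized.
// The probabilities below are illustrative only.
const temperature = 0.5;
const preds = [0.7, 0.2, 0.1];

const logPreds = preds.map(p => Math.log(p) / temperature);
const expPreds = logPreds.map(Math.exp);           // = p^(1/T)
const sum = expPreds.reduce((a, b) => a + b, 0);
const scaled = expPreds.map(e => e / sum);

// T < 1 sharpens the distribution toward the most likely class:
console.log(scaled.map(p => p.toFixed(3))); // [ '0.907', '0.074', '0.019' ]
```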

I'm not super familiar with deep learning and TensorFlow (this code is based on the lstm example here), but my guess is that I shouldn't be doing this (which looks like it's normalizing the preds) when using tfjs-node:

const logPreds = tf.div(tf.log(preds), temperature);
const expPreds = tf.exp(logPreds);
const sumExpPreds = tf.sum(expPreds);
preds = tf.div(expPreds, sumExpPreds);

What would be a way around this? Can I just directly feed preds into tf.multinomial, or would that ruin the lstm? Can I use the normal tfjs instead of tfjs-node so that I can still normalize?

Shanqing Cai

Sep 27, 2018, 10:49:49 AM
to jess.van...@gmail.com, Nick Kreeger, TensorFlow.js Discussion
+Nick Kreeger, who may have more thoughts on this. But this seems to be a current limitation of tf.multinomial in tfjs-node.

--
You received this message because you are subscribed to the Google Groups "TensorFlow.js Discussion" group.
To unsubscribe from this group and stop receiving emails from it, send an email to tfjs+uns...@tensorflow.org.
Visit this group at https://groups.google.com/a/tensorflow.org/group/tfjs/.
To view this discussion on the web visit https://groups.google.com/a/tensorflow.org/d/msgid/tfjs/d1cacc77-d582-4640-989d-354ceef711d9%40tensorflow.org.


--
Shanqing Cai
Software Engineer
Google

Jessica Van Brummelen

Sep 27, 2018, 10:54:10 AM
to TensorFlow.js Discussion
Edit: I tried removing what I thought was the "normalization" code:

const logPreds = tf.div(tf.log(preds), temperature);
const expPreds = tf.exp(logPreds);
const sumExpPreds = tf.sum(expPreds);
preds = tf.div(expPreds, sumExpPreds);

but I still get the same error: 
ERROR: Failed to generate text: TF Node backend does not support normalized logits passed to multinomial, Error: TF Node backend does not support normalized logits passed to multinomial

Why can't I use tf.multinomial with tfjs-node the way I can with regular tfjs?

Shanqing Cai

Sep 27, 2018, 11:01:21 AM
to jess.van...@gmail.com, TensorFlow.js Discussion
Again, Nick may know more background about this limitation. 

But if you have normalized probabilities, it shouldn't be too hard to convert them to unnormalized log-probabilities (i.e., logits). Just do log10(probs + epsilon).
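In plain JavaScript terms, that conversion looks roughly like the sketch below (using the natural log, which is what tf.log computes, and a small epsilon to guard against log(0)). Exponentiating and renormalizing the resulting logits recovers the original distribution, which is why a sampler that expects unnormalized logits behaves the same:

```js
// Sketch: convert normalized probabilities into unnormalized
// log-probabilities (logits) so they can be handed to a sampler that
// expects logits. EPS avoids log(0); values are illustrative only.
const EPS = 1e-7;
const probs = [0.7, 0.2, 0.1];
const logits = probs.map(p => Math.log(p + EPS));

// Sanity check: exp-then-normalize (softmax) recovers the original probs.
const exps = logits.map(Math.exp);
const sum = exps.reduce((a, b) => a + b, 0);
const recovered = exps.map(e => e / sum);
console.log(recovered.map(p => p.toFixed(3))); // [ '0.700', '0.200', '0.100' ]
```

In tfjs that would presumably be something like tf.multinomial(tf.log(preds.add(1e-7)), 1), leaving the normalized flag at its default of false — an untested sketch, not verified against tfjs-node.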



Jessica Van Brummelen

Sep 27, 2018, 11:26:27 AM
to TensorFlow.js Discussion, jess.van...@gmail.com
"But if you have normalized probabilities, it shouldn't be too hard to convert them to unnormalized log-probabilities (i.e., logits). Just do log10(probs + epsilon)."

Makes sense. Thanks for your help, Shanqing!

