Though Firebase deployment and auth were simple enough to follow (thanks to your codelab), they apparently are not acceptable in my case, as the Firebase URL would still be public. In Apps Script, the URL is only accessible from within the company. Any suggestions on the approach below for properly loading the weights (.bin) file using Drive and Apps Script?
I was able to load the model.json from Drive and print the summary like so,
google.script.run.withSuccessHandler(async modelJsonDataStr => {
  // Wrap the JSON string in a File so tf.io.browserFiles can read it
  const modelJson = new File([modelJsonDataStr], "model.json", { type: "application/json" });
  const model = await tf.loadLayersModel(tf.io.browserFiles([modelJson]));
  model.summary();
}).withFailureHandler(onFailure).loadModelFromDrive();
and in the g-script file,
function loadModelFromDrive() {
  const f = DriveApp.getFileById("modeljson-file-id-from-drive");
  const modeljson = f.getAs("application/json").getDataAsString();
  return modeljson;
}
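One workaround worth considering is to base64-encode the weight bytes on the server (the Apps Script side could use `Utilities.base64Encode(weightsFile.getBlob().getBytes())`) and decode them in the browser, so only strings cross the `google.script.run` boundary. A minimal sketch of the browser-side decode, assuming the base64 string arrives as `weightsB64`:

```javascript
// Decode a base64 string (e.g. as produced by Apps Script's
// Utilities.base64Encode) back into a Uint8Array of raw bytes.
function base64ToUint8(b64) {
  const binary = atob(b64);            // base64 -> binary string
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);   // each char code is one byte value
  }
  return bytes;
}

// The resulting bytes can then back a File for tf.io.browserFiles:
// new File([base64ToUint8(weightsB64)], "group1-shard1of1.bin")
```

This avoids relying on how Apps Script serializes `Byte[]` across the bridge, at the cost of ~33% larger transfer.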
_______________
But I'm unable to convert the Blob or Byte[] returned by Apps Script to a type that decodeWeights() of tf.loadLayersModel accepts. I tried something like,
google.script.run.withSuccessHandler(modelFiles => {
  const modelJson = new File([modelFiles[0]], "model.json", { type: "application/json" });
  // getBytes() arrives as a plain array of (signed) numbers; wrap it in a
  // Uint8Array so the File is backed by the raw binary data
  const modelWeights = new File([new Uint8Array(modelFiles[1])], "group1-shard1of1.bin");
  tf.loadLayersModel(tf.io.browserFiles([modelJson, modelWeights])).then(model => {
    model.summary();
  });
}).withFailureHandler(onFailure).loadModelFromDrive();
and updated the loadModelFromDrive to return both these files,
function loadModelFromDrive() {
  const f = DriveApp.getFileById("1xboldJQ9rcaYF-Y6JxptrhI5fLFjoHoR");
  const modeljson = f.getAs("application/json").getDataAsString();
  const f2 = DriveApp.getFileById("1WGP-rc6gy0gRdGMFmTyl88LDjFeN7PHs");
  const weights = f2.getBlob().getBytes();  // raw bytes of the .bin weights file
  return [modeljson, weights];
}
tf.io.decodeWeights doesn't like the native JS Blob or Float32Array types. All the dtypes in my model are "float32" only.
Sorry for such a long message. I think this is another interesting way to deploy a model, which sadly is the only possibility in my case :)
Thanks in advance for the help.