Saving the interpreter for multiple model runs (C++)


Privatsphäre

Jan 12, 2023, 2:48:31 AM1/12/23
to TensorFlow Lite
Hi everyone

In my project I want to evaluate a model in a streaming fashion. The idea is to write a class, initialize the model and interpreter in the constructor, store them as members, and then reuse them every time I want to run inference. My problem is that the interpreter is not copyable. Since I can only keep a pointer to the interpreter, the interpreter itself gets destroyed once the constructor finishes and I'm left with a dangling pointer.
Did I miss some fundamental point here, or does the model have to be loaded inside each function that runs inference?
Thanks for your help.

Privatsphäre

Jan 26, 2023, 6:44:04 AM1/26/23
to TensorFlow Lite, Privatsphäre
I solved my problem and learned some things along the way:
1. The interpreter comes as a std::unique_ptr<tflite::Interpreter>. The object a unique pointer owns is destroyed only when the pointer itself is destroyed, so if you move the unique_ptr into a class member, the interpreter outlives the constructor.
2. You not only have to keep the interpreter pointer in your class, but also the resolver and the model, because the interpreter references them.
So if you store those three objects as member variables, you can instantiate them in the constructor and use them in your member functions.
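A minimal sketch of what I mean (the model path and the single-float input/output layout are just assumptions for illustration; adapt them to your model):

```cpp
#include <memory>
#include <stdexcept>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/interpreter_builder.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model_builder.h"

// The model, the op resolver, and the interpreter are all stored as
// members, so they live as long as the object does.
class StreamingModel {
 public:
  explicit StreamingModel(const char* model_path) {
    model_ = tflite::FlatBufferModel::BuildFromFile(model_path);
    if (!model_) throw std::runtime_error("failed to load model");

    // InterpreterBuilder only borrows the model and the resolver, which
    // is why both must be kept alive alongside the interpreter.
    tflite::InterpreterBuilder builder(*model_, resolver_);
    if (builder(&interpreter_) != kTfLiteOk || !interpreter_)
      throw std::runtime_error("failed to build interpreter");
    if (interpreter_->AllocateTensors() != kTfLiteOk)
      throw std::runtime_error("failed to allocate tensors");
  }

  // One streaming inference step; assumes a single float input and output.
  float Infer(float input) {
    *interpreter_->typed_input_tensor<float>(0) = input;
    if (interpreter_->Invoke() != kTfLiteOk)
      throw std::runtime_error("Invoke failed");
    return *interpreter_->typed_output_tensor<float>(0);
  }

 private:
  // Declaration order matters: members are destroyed in reverse order,
  // so the interpreter is torn down before the resolver and the model.
  std::unique_ptr<tflite::FlatBufferModel> model_;
  tflite::ops::builtin::BuiltinOpResolver resolver_;
  std::unique_ptr<tflite::Interpreter> interpreter_;
};
```

You construct the object once and then call Infer() for every incoming sample, without reloading the model.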
Hope this helps someone
