--
You received this message because you are subscribed to the Google Groups "TensorFlow Lite" group.
To unsubscribe from this group and stop receiving emails from it, send an email to tflite+un...@tensorflow.org.
To view this discussion on the web visit https://groups.google.com/a/tensorflow.org/d/msgid/tflite/cccb049a-5099-49da-9852-b45b78b1db98n%40tensorflow.org.
On Wed, Apr 5, 2023 at 4:01 PM Nikolas Stavrou <nikolas....@gmail.com> wrote:
I've been experimenting with the following model-personalization demo app for on-device training:
examples/lite/examples/model_personalization at master · tensorflow/examples (github.com)
It seems that in every cycle of sample gathering followed by training, the trainingSamples from all previous cycles remain in the list used to train the model. In other words, every on-device training run uses every sample gathered so far in the app.
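In plain Python, the accumulation behaviour I mean looks roughly like this (a minimal sketch with made-up names, not the demo's actual code):

```python
# Sketch of a sample buffer that is never cleared between cycles,
# so every training run sees all samples gathered so far.
training_samples = []  # persists across gather/train cycles


def gather(samples):
    # New samples are appended; earlier cycles' samples remain in the list.
    training_samples.extend(samples)


def train():
    # Each call trains on the full accumulated list, not just the new batch.
    return len(training_samples)


gather(["class1_a", "class1_b"])
gather(["class2_a", "class2_b"])
n_first = train()   # trains on all 4 samples gathered so far

gather(["class3_a", "class3_b"])
n_second = train()  # trains on all 6 samples, old classes included
```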
However, I've found that in a class-incremental scenario, inference on the newly added class is completely wrong, which I don't understand, since each training run uses all of our samples.
If anyone wants to replicate this: gather some samples for class 1 and some for class 2, train, and run inference. Then gather samples for class 3 and train again. Inference will now misclassify class 3 as class 1 or 2, and the loss stays well above 1 (around 4.5 if you gather 10 samples per class).
I don't understand why this wrong inference occurs; as far as I can tell, the model shouldn't misclassify the new class.
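To show why I expect it to work, here is a toy stand-in for the scenario in plain NumPy (everything here is hypothetical: a small softmax head instead of the demo's transfer-learning head, and synthetic well-separated embeddings instead of camera images). Retraining on the full accumulated buffer after adding a third class classifies that class correctly on this toy data:

```python
import numpy as np

rng = np.random.default_rng(0)


def train_head(X, y, n_classes=4, lr=0.5, steps=200):
    # Softmax classifier trained by full-batch gradient descent.
    W = np.zeros((X.shape[1], n_classes))
    b = np.zeros(n_classes)
    Y = np.eye(n_classes)[y]  # one-hot targets
    for _ in range(steps):
        logits = X @ W + b
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        grad = (p - Y) / len(X)
        W -= lr * (X.T @ grad)
        b -= lr * grad.sum(axis=0)
    return W, b


# Well-separated toy embeddings: 10 samples for each of 3 classes.
X1 = rng.normal([4, 0, 0, 0], 0.1, (10, 4))
X2 = rng.normal([0, 4, 0, 0], 0.1, (10, 4))
X3 = rng.normal([0, 0, 4, 0], 0.1, (10, 4))

# Second training cycle: the buffer holds everything gathered so far,
# including the newly added class 3.
X = np.vstack([X1, X2, X3])
y = np.array([0] * 10 + [1] * 10 + [2] * 10)
W, b = train_head(X, y)
preds = (X @ W + b).argmax(axis=1)
# Here retraining on the full buffer classifies class 3 correctly,
# which is why the misclassification in the app surprises me.
```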
Any help is greatly appreciated, thanks!
--
Best,
Haoliang
Did anyone manage to take a look at this? I've been trying to figure out what's wrong with the training for the past few days, but I couldn't work it out.
On Thursday, April 6, 2023 at 2:12:30 AM UTC+3 haol...@google.com wrote: