Hi,
I trained a DiagGMM using bob.kaldi, but the result looks very strange.
I used the command: dubm = bob.kaldi.ubm_train(totalFeats[start:end], diag_gmm_file.name, num_threads=32, num_gauss=2048, num_gselect=30, num_iters=2).
The outcome is '<DiagGMM> \n<GCONSTS> [ -72.59447 ]\n<WEIGHTS> [ 1 ]\n<MEANS_INVVARS> [\n -1.511659 -0.1555125 -0.008951003 0.05670118 0.1720264 0.2406914 0.2597786 0.2638524 0.1748993 0.106583 0.1136944 0.1390899 0.1352641 -4.13353 -0.2818888 -0.2965984 -0.160472 -0.05361678 -0.02457095 -0.02159077 -0.06387884 -0.111318 -0.1029774 -0.1627036 -0.1235721 0.008362792 -11.37353 1.006093 0.2833084 -0.6071019 -1.05838 -1.570514 -2.553714 -2.565301 -1.72359 -0.7188698 -1.586562 -1.097332 -0.4600658 ]\n<INV_VARS> [\n 0.5525933 0.01848102 0.02025793 0.01650471 0.02186424 0.01786565 0.0188241 0.01790475 0.01741467 0.02175118 0.02217225 0.01677749 0.0278025 59.74311 1.362482 0.9980228 0.6581228 0.4102016 0.2891341 0.2889878 0.2717075 0.2777747 0.2747622 0.3782768 0.2506975 0.2198423 238.6761 9.48506 5.835298 4.105416 3.175651 2.020387 2.20149 1.828745 1.847537 1.939827 2.304343 1.621249 1.636107 ]\n</DiagGMM> \n'
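For reference, the number of trained components can be counted directly from the serialized model text, since the <WEIGHTS> field holds one entry per Gaussian. A minimal sketch (the model string is abridged here; with the full output above it gives the same count):

```python
import re

# Serialized DiagGMM text as returned above (abridged); the <WEIGHTS> [ ... ]
# block contains one weight per Gaussian component.
model_text = "<DiagGMM> \n<GCONSTS> [ -72.59447 ]\n<WEIGHTS> [ 1 ]\n</DiagGMM> \n"

# Extract the contents of the <WEIGHTS> block and count the entries.
weights = re.search(r"<WEIGHTS> \[(.*?)\]", model_text, re.S).group(1).split()
print(len(weights))  # only 1 component, not the 2048 requested
```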
From my understanding, the model should contain 2048 Gaussian components, but the output above has only one (a single weight, one mean vector, and one variance vector). Does anyone know what is going wrong here?
Thank you.