Using Kaldi PLDA model


mili lali

Jul 21, 2020, 2:22:48 PM
to bob-devel
Hi all,
I want to use a pre-trained Kaldi PLDA model (e.g. https://kaldi-asr.org/models/m7).
Is it possible to use the Kaldi PLDA model in bob.kaldi? I don't want to train the PLDA model with bob; I want to use the Kaldi model instead.
Assume I have two i-vectors (as Python lists, NumPy arrays, or something else in Python; I extract the vectors myself and don't use bob) and I want to compute their PLDA score with the pre-trained Kaldi PLDA model. Do you have any ideas?
best regards

Saeed Sarfjoo

Jul 21, 2020, 5:06:29 PM
to bob-devel
Dear Lali,

bob.kaldi expects the PLDA model in ASCII format. You can use bob.kaldi.io.read_plda to load the PLDA as a dictionary. Then you must create the plda array like:

from bob.kaldi.io import read_plda

plda_dic = read_plda(<plda-path>)
plda = []
plda.append(str(plda_dic['plda_transform']))
plda.append(str(plda_dic['plda_mean']))

You can use ```ivector-copy-plda --binary=false <plda> <plda-ascii>``` from Kaldi to convert the PLDA model to ASCII format.

Best,
Saeed  

mili lali

Jul 22, 2020, 5:07:10 AM
to bob-devel
Dear Saeed
Many thanks.
1- Sorry, which file is this for exactly? I mean, must I load globalmean in ASCII format as well?

2- What about using bob to compute the PLDA score when we have two vectors?
Assume we have two vectors as enroll and eval, e.g. enroll = [1 1 1 1]; eval = [2 2 2 2], and we want to compute the score between them. Can you explain how to do this?

I saw this example:
it is like the Kaldi code and assumes the enroll and test i-vectors are loaded from a file.

But I think this is a better example:
in this line, you load the vectors as NumPy arrays. Is that right?

best regards

Saeed Sarfjoo

Jul 22, 2020, 10:44:22 AM
to bob-devel
Dear Lali,

The explanation was based on the https://gitlab.idiap.ch/bob/bob.kaldi/-/blob/master/bob/kaldi/test/test_ivector.py#L111 sample. Here, you need to load the PLDA and global mean in ASCII format. The enroll and trial vectors are loaded as NumPy arrays.
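A minimal sketch of that setup with your toy vectors (the file names are placeholders, and with a real model the vector dimensions must match the PLDA, as discussed below):

import numpy as np
import bob.kaldi

# ASCII PLDA model and global mean (placeholder paths)
plda = open("plda.ascii").read()
globalmean = open("mean.vec.ascii").read()

# The enrolled model is a string "<spk-id> [ ... ]";
# the trial features are a NumPy array
enrolled = "spk0 [ 1 1 1 1 ]"
test_feats = np.array([2.0, 2.0, 2.0, 2.0])

score = bob.kaldi.plda_score(test_feats, enrolled, plda, globalmean)
print(score)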

Regards

mili lali

Aug 13, 2020, 10:48:19 AM
to bob-devel
Dear Saeed
I used a pre-trained Kaldi PLDA model (e.g. https://kaldi-asr.org/models/m7).
As you mentioned, I converted the PLDA and global mean to ASCII form using ```ivector-copy-plda --binary=false <plda> <plda-ascii>```.

I found that the plda_score input format must be something like this:
enrolled = "spk0 [4.659209  1.27825  3.346075  -1.472408 ]" a string

test_feats
= np.array([-0.33075494 ,-0.34912944 ,0.63107318 ,-1.39356291 ] a NumPy array



and I use this script to compute the PLDA score:

plda_load = open("plda.ascii", 'r').read()
mean_vect = open("mean.vec.ascii", 'r').read()
score = bob.kaldi.plda_score(test_feats, enrolled, plda_load, mean_vect)

but the score is always -1.
I loaded the Kaldi PLDA in test_ivector.py instead of the default PLDA, as above, but the score is still -1.
How can I fix it?
I think I am not applying the transform matrix.
best regards

Saeed Sarfjoo

Aug 16, 2020, 5:16:18 AM
to bob-devel
Dear Lali,

In Kaldi, each PLDA model contains `plda_mean`, `plda_transform`, and `plda_psi`. You can check them in the ASCII format of the PLDA model. The dimension of the enrolled speaker vector must match the dimension of the `plda_mean` vector. In the script you shared, the dimension of the enrolled speaker is 4. When loading a pre-trained PLDA model, you must account for this dimension mismatch.
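To see the expected dimension, you can inspect the loaded model (a small sketch reusing the read_plda call from earlier in this thread; the path is a placeholder):

from bob.kaldi.io import read_plda

# Load the ASCII PLDA model and print its parts; the enrolled and test
# vectors must match the dimension of plda_mean
plda_dic = read_plda("plda.ascii")
print(plda_dic['plda_mean'])
print(plda_dic['plda_transform'])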

Regards,
Saeed 

mili lali

Aug 18, 2020, 10:59:26 PM
to bob-devel
Dear Saeed
Thanks for your reply.
I attach a sample of the vectors and a very simple script that uses bob and the pre-trained Kaldi PLDA model to compute the PLDA score. The vectors have 512 dimensions and were extracted with the pre-trained Kaldi x-vector model.
I used this model to extract the vectors and, as you said, converted the PLDA and mean.vec to ASCII and opened them in the code, but the PLDA score is -1.
Can you help me? What is wrong? How can I fix it?

best regards
xvectors_sample
test_plda_bob.py

Saeed Sarfjoo

Aug 21, 2020, 7:05:46 AM
to bob-devel
Dear Lali,

The Kaldi x-vector model uses an LDA transformation to reduce the dimension of the x-vectors from 512 to 200. In this case, the enroll and test feats in `bob.kaldi.plda_score` must be the transformed features with dim 200.

Best,
Saeed  

mili lali

Aug 21, 2020, 5:08:14 PM
to bob-devel
Dear Saeed,
Sorry for any inconvenience.
The transform matrix dimension is 200×513; how can I apply it to the vectors? The extracted x-vectors have dimension 512.

mili lali

Aug 22, 2020, 1:50:08 PM
to bob-devel
The last column of the transform matrix is the offset; it must be added to the product of the transform matrix and the vector.
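In code (a minimal sketch; the file name is a placeholder and the x-vector is random, for illustration only):

import numpy as np

# Kaldi stores the LDA transform as [A | b]: a 200x513 matrix whose
# last column is the offset b
transform_file = np.loadtxt('transform.mat.ascii')  # shape (200, 513)
transform = transform_file[:, :-1]                  # linear part A, (200, 512)
offset = transform_file[:, -1]                      # offset b, (200,)

xvec = np.random.randn(512)                         # placeholder 512-dim x-vector
reduced = np.matmul(transform, xvec) + offset       # 200-dim transformed vector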
best regards

mili lali

Aug 22, 2020, 2:51:36 PM
to bob-devel
Dear Saeed,
I used the transform matrix and reduced the dimension of the x-vectors from 512 to 200, but I still get a PLDA score equal to -1. Would you mind helping me?
Here is my input to plda_score:
enrolled = sk002438 [-2.54789627e+00 -1.69172817e+00  1.46697910e+00  1.20184582e+00
  9.49332493e-01 -2.30867350e+00 -1.34768028e+00 -1.34134001e+00
  9.23807905e-01  1.10245750e+00 -4.43008033e-01  7.59145847e-01
 -2.35323420e-01 -1.81755603e+00 -4.23935731e-01 -1.68902522e+00
  2.46977616e+00  1.17500991e+00 -1.75817799e+00  2.38570631e-01
 -6.67718138e-01  1.63639274e+00  2.91453668e+00  1.53120914e+00
 -7.90558376e-01 -7.71323462e-01 -3.14103727e+00 -8.89651416e-02
  1.05396928e+00  1.92321549e+00 -1.45113443e-01 -5.45329709e-01
 -2.05854519e+00  2.76661217e+00  1.18002842e+00  2.99245767e+00
 -2.31015379e-01  1.41589249e+00 -2.15400068e-01  1.13959426e+00
  2.24209995e+00 -1.12206722e-01 -6.07572388e-01  8.85497625e-01
  2.67944273e+00  1.07758177e+00  1.41450142e+00  7.38213319e-01
  1.54851159e+00 -1.96954567e+00 -3.58014937e-01  4.62214623e-01
  3.82445261e-01  1.89613504e+00  2.24490882e+00  1.05456955e+00
  7.28349369e-01 -1.24256142e+00  3.70416870e+00  6.87483368e-01
  1.18763205e+00  5.99799927e-01 -4.04588878e-01  1.25418538e+00
 -1.02168433e+00  7.21106858e-01  1.20855345e+00  1.92457137e-01
 -9.31458608e-01  3.94060026e-01  1.18738501e+00  6.44213320e-01
  4.67387311e+00  7.08447771e-02 -2.86914381e-01 -6.62501657e+00
  4.90822483e-01  9.94122211e-01  2.59556132e+00  1.34884606e+00
  5.75357885e-01  1.88478323e-01 -2.19009248e+00  1.63932270e+00
  2.60297901e+00 -4.55971582e-01  1.91419353e+00  3.98512460e+00
  1.52898341e+00 -6.46974715e-01 -4.45602888e+00  5.25466717e-01
 -2.20718275e+00 -1.31090763e-01  1.86305806e-01 -1.29310779e-01
 -4.71685546e+00  5.36982798e-01  6.06081561e-01  1.30334696e+00
 -4.46757281e-01 -1.44792573e+00  2.33895374e-01 -2.15868805e+00
 -1.63323463e+00  1.15442352e+00  4.64179994e+00  2.86205594e+00
  5.02178401e-02  1.24763415e+00  3.24890277e+00 -2.11436345e+00
 -4.86568577e+00 -1.53156501e+00 -1.00600179e+00  7.65497809e-02
 -5.13548732e+00 -1.89018330e+00 -4.73918467e+00  4.19342169e+00
  2.53986777e+00  1.00598299e+00  1.40226424e+00 -7.48726463e-01
 -2.50670915e+00  2.53909621e-01 -5.41647701e+00 -4.15450806e+00
 -1.09574777e+00 -9.23120353e-01  5.59587303e-01  1.93861418e+00
 -3.76394250e-01  1.52402014e+00  5.63678123e+00  3.14859479e+00
 -1.74659951e+00  3.90769867e+00  2.01412450e+00  2.58523287e+00
 -2.82420187e+00 -4.51356963e+00  4.09245947e-01 -3.08143204e+00
 -2.39498022e+00  4.60904912e+00 -1.61617615e+00 -2.87023336e+00
  5.63116948e+00 -4.69380430e+00  6.82683199e-03  1.92411739e-01
  1.55167950e+00 -1.32695601e+00  8.96458929e-02 -2.09703116e+00
 -3.50641285e+00  2.10018232e+00  5.94756201e+00 -3.24881367e+00
 -1.32070789e-01 -4.84707208e+00 -1.91869124e+00 -2.76857401e-01
 -2.79284789e-01 -7.62560879e+00 -1.49193153e+00  5.01588305e-01
 -4.16258476e+00  8.76587334e+00  2.98523513e+00  6.78809055e+00
  4.95563058e+00  4.40534260e-01 -7.12565648e+00 -6.90691559e-01
  1.21932589e+00 -6.54977914e+00  8.18034470e-01 -5.75203157e+00
  3.91972559e+00  5.63001167e+00  1.64146978e+00  1.09654726e+00
  1.58676121e+00 -7.05986069e+00 -3.67048359e-01  7.02736155e+00
  6.61939674e-01 -2.86325951e+00  4.99162249e+00 -3.67112459e+00
 -1.09258809e+00  1.09465276e-01 -2.49123885e+00 -4.05174644e-02
 -5.65250160e+00  9.83903338e+00 -9.88026700e-02 -3.13279726e+00]

test_feats = [-2.54789627e+00 -1.69172817e+00  1.46697910e+00  1.20184582e+00
  9.49332493e-01 -2.30867350e+00 -1.34768028e+00 -1.34134001e+00
  9.23807905e-01  1.10245750e+00 -4.43008033e-01  7.59145847e-01
 -2.35323420e-01 -1.81755603e+00 -4.23935731e-01 -1.68902522e+00
  2.46977616e+00  1.17500991e+00 -1.75817799e+00  2.38570631e-01
 -6.67718138e-01  1.63639274e+00  2.91453668e+00  1.53120914e+00
 -7.90558376e-01 -7.71323462e-01 -3.14103727e+00 -8.89651416e-02
  1.05396928e+00  1.92321549e+00 -1.45113443e-01 -5.45329709e-01
 -2.05854519e+00  2.76661217e+00  1.18002842e+00  2.99245767e+00
 -2.31015379e-01  1.41589249e+00 -2.15400068e-01  1.13959426e+00
  2.24209995e+00 -1.12206722e-01 -6.07572388e-01  8.85497625e-01
  2.67944273e+00  1.07758177e+00  1.41450142e+00  7.38213319e-01
  1.54851159e+00 -1.96954567e+00 -3.58014937e-01  4.62214623e-01
  3.82445261e-01  1.89613504e+00  2.24490882e+00  1.05456955e+00
  7.28349369e-01 -1.24256142e+00  3.70416870e+00  6.87483368e-01
  1.18763205e+00  5.99799927e-01 -4.04588878e-01  1.25418538e+00
 -1.02168433e+00  7.21106858e-01  1.20855345e+00  1.92457137e-01
 -9.31458608e-01  3.94060026e-01  1.18738501e+00  6.44213320e-01
  4.67387311e+00  7.08447771e-02 -2.86914381e-01 -6.62501657e+00
  4.90822483e-01  9.94122211e-01  2.59556132e+00  1.34884606e+00
  5.75357885e-01  1.88478323e-01 -2.19009248e+00  1.63932270e+00
  2.60297901e+00 -4.55971582e-01  1.91419353e+00  3.98512460e+00
  1.52898341e+00 -6.46974715e-01 -4.45602888e+00  5.25466717e-01
 -2.20718275e+00 -1.31090763e-01  1.86305806e-01 -1.29310779e-01
 -4.71685546e+00  5.36982798e-01  6.06081561e-01  1.30334696e+00
 -4.46757281e-01 -1.44792573e+00  2.33895374e-01 -2.15868805e+00
 -1.63323463e+00  1.15442352e+00  4.64179994e+00  2.86205594e+00
  5.02178401e-02  1.24763415e+00  3.24890277e+00 -2.11436345e+00
 -4.86568577e+00 -1.53156501e+00 -1.00600179e+00  7.65497809e-02
 -5.13548732e+00 -1.89018330e+00 -4.73918467e+00  4.19342169e+00
  2.53986777e+00  1.00598299e+00  1.40226424e+00 -7.48726463e-01
 -2.50670915e+00  2.53909621e-01 -5.41647701e+00 -4.15450806e+00
 -1.09574777e+00 -9.23120353e-01  5.59587303e-01  1.93861418e+00
 -3.76394250e-01  1.52402014e+00  5.63678123e+00  3.14859479e+00
 -1.74659951e+00  3.90769867e+00  2.01412450e+00  2.58523287e+00
 -2.82420187e+00 -4.51356963e+00  4.09245947e-01 -3.08143204e+00
 -2.39498022e+00  4.60904912e+00 -1.61617615e+00 -2.87023336e+00
  5.63116948e+00 -4.69380430e+00  6.82683199e-03  1.92411739e-01
  1.55167950e+00 -1.32695601e+00  8.96458929e-02 -2.09703116e+00
 -3.50641285e+00  2.10018232e+00  5.94756201e+00 -3.24881367e+00
 -1.32070789e-01 -4.84707208e+00 -1.91869124e+00 -2.76857401e-01
 -2.79284789e-01 -7.62560879e+00 -1.49193153e+00  5.01588305e-01
 -4.16258476e+00  8.76587334e+00  2.98523513e+00  6.78809055e+00
  4.95563058e+00  4.40534260e-01 -7.12565648e+00 -6.90691559e-01
  1.21932589e+00 -6.54977914e+00  8.18034470e-01 -5.75203157e+00
  3.91972559e+00  5.63001167e+00  1.64146978e+00  1.09654726e+00
  1.58676121e+00 -7.05986069e+00 -3.67048359e-01  7.02736155e+00
  6.61939674e-01 -2.86325951e+00  4.99162249e+00 -3.67112459e+00
 -1.09258809e+00  1.09465276e-01 -2.49123885e+00 -4.05174644e-02
 -5.65250160e+00  9.83903338e+00 -9.88026700e-02 -3.13279726e+00]



enrolled is a string like the above
test_feats is a NumPy array

and I use this to compute the PLDA score:

plda_load = open("model-spk-xvextor/xvectors_train//plda.ascii", 'r').read()
mean_vect = open("model-spk-xvextor/xvectors_train/mean.vec.ascii", 'r').read()

transform_file = np.loadtxt(fname = 'model-spk-xvextor/xvectors_train/transform.mat.ascii')
transform = transform_file[:,:-1]
offset = transform_file[:,-1]
spk = "
sk002438"
enrolled = np.matmul(transform,vector)+offset
enrolled  = np.array_str(enrolled )
enrolled = spk_enroll + " "  + enrolled 

test_feats = np.matmul(transform,xvector_test)+offset
dist = bob.kaldi.plda_score(test_feats, enrolled, plda_load, mean_vect)


Saeed Sarfjoo

Aug 24, 2020, 1:38:38 PM
to bob-...@googlegroups.com
Dear Lali,

`mean_vect` is the global mean with dim 512 and must be subtracted from the x-vectors before the LDA transformation. `globalmean` in bob.kaldi.plda_score must match the x-vector dim after the LDA transformation (200 in your case). I prepared sample code for your case:

import sys
import numpy as np
import bob.kaldi
import tempfile
import os
import ast


def convert_line_to_list(line):
    # Convert a line in the x-vector file to a list
    x = line.split()[2:]
    x = ' '.join([str(elem) for elem in x])
    x = ast.literal_eval(x.strip())
    return x


def test_kaldi_plda():
    xvector_list = "test_data/xvectors_sample"
    xvector = open(xvector_list, "r")
    xvector = xvector.readlines()

    xvector_test = convert_line_to_list(xvector[0])
    xvector_test = np.array(xvector_test)
    vector = convert_line_to_list(xvector[1])
    vector = np.array(vector)
    transform_file = np.loadtxt(fname='test_data/transform.mat.ascii')
    plda_load = open("test_data/plda.ascii", 'r').read()
    mean_vect = np.loadtxt(fname="test_data/mean.vec")
    # Length normalization
    vector = vector / np.sum(np.square(vector))
    # Global mean subtraction
    vector = vector - mean_vect
    # Length normalization
    vector = vector / np.sum(np.square(vector))
    # Length normalization
    xvector_test = xvector_test / np.sum(np.square(xvector_test))
    # Global mean subtraction
    xvector_test = xvector_test - mean_vect
    # Length normalization
    xvector_test = xvector_test / np.sum(np.square(xvector_test))
    transform = transform_file[:, :-1]
    offset = transform_file[:, -1]
    spk_enroll = "spk0"
    enrolled = np.matmul(transform, vector) + offset
    enrolled = np.array_str(enrolled)
    enrolled = spk_enroll + " " + enrolled.replace('\n', '').replace('[', '[ ').replace(']', ' ]')
    test_feats = np.matmul(transform, xvector_test) + offset
    mean_start_idx = plda_load.find('[')
    mean_end_idx = plda_load.find(']')
    plda_mean = plda_load[mean_start_idx:mean_end_idx + 1]

    score = bob.kaldi.plda_score(test_feats, enrolled, plda_load, plda_mean)
    print("PLDA score: " + str(score))


if __name__ == '__main__':
    test_kaldi_plda()


In the code, it is important to keep `spk_enroll = "spk0"`.

Best,
Saeed


--
Seyyed Saeed Sarfjoo, PhD
Postdoc Researcher
Idiap Research Institute
Centre du Parc, Rue Marconi 19,
PO Box 592, CH-1920 Martigny
saeed....@idiap.ch
Tel:  +41 27 721 77 85

mili lali

Aug 25, 2020, 8:40:40 AM
to bob-devel
Wow, Dear Saeed, many thanks!
Really useful code.
But I get a PLDA score equal to 85.78 in both target and impostor trials. Is my data bad, or does the code have some problem?

best regards

mili lali

Aug 26, 2020, 6:08:39 PM
to bob-devel
Dear Saeed 
I checked the results in Kaldi; here are some of the PLDA scores and results. However, I don't know why the results in bob are not good and are always equal to 85!

sk004878_d7218b31-1dd1-4c01-9122-e93ab08b670c sk004878_d7218b31-1dd1-4c01-9122-e93ab08b670c 61.85819
sk004878_e70a2a1d-d19a-4036-be03-fe0ccbbd793b sk004878_e70a2a1d-d19a-4036-be03-fe0ccbbd793b 56.25875
sk004878_ee9acd72-9d53-4759-b5f0-2e58dff2aed8 sk004878_ee9acd72-9d53-4759-b5f0-2e58dff2aed8 58.02261
sk000003_0ea83414-5856-4867-b3c1-39a48809f4c1 sk002657_f88f4dc3-2b61-4bfa-a285-bfc712f2c477 -12.17074
sk002392_4442df4d-c85c-4a7e-9713-db9fca284b8c sk002662_0de06e7a-6846-459e-b1f7-c83323461a2d -25.86587
sk002392_63bd0de9-f503-4ade-92aa-df4db202ec61 sk002662_0e623086-edb1-42be-9486-70d9cbd6b182 -22.9812
sk002392_8c7b751e-2a0f-494d-8ce7-629f8a6eeaa5 sk002662_21d02adc-22c4-4a3d-bad8-f10561d52d4a -37.42211
sk002402_04e355aa-a5d2-4f24-a2c2-a888832433a8 sk002662_35b06c8d-edb8-46c4-b222-d96f4d7cbbe8 -4.655671
sk002402_6088eb99-8abe-47c0-a109-965ee70586b2 sk002662_3dc21347-4ae6-4b67-821c-e48d40bdefe5 -11.55437
LOG (compute-eer[5.5.162~5-ca32c]:main():compute-eer.cc:136) Equal error rate is 0%, at threshold 51.892
minDCF is 0.0000 at threshold 18.1885 (p-target=0.01, c-miss=1,c-fa=1)
minDCF is 0.0000 at threshold 18.1885 (p-target=0.001, c-miss=1,c-fa=1)





Saeed Sarfjoo

Aug 27, 2020, 10:58:16 AM
to bob-...@googlegroups.com
Dear Lali,

The sequence of applying the length normalization and scaling must match the PLDA training setup. The default setup of Kaldi PLDA training is different from bob.kaldi's. In Kaldi, we must apply the length normalization after the LDA transformation and scale the normalized vector by the square root of the LDA dim (200 in your case). I rewrote the code based on the default Kaldi setup:

import sys
import numpy as np
import bob.kaldi
import tempfile
import os
import ast


def convert_line_to_list(line):
    # Convert a line in the x-vector file to a list
    x = line.split()[2:]
    x = ' '.join([str(elem) for elem in x])
    x = ast.literal_eval(x.strip())
    return x


def test_kaldi_plda():
    xvector_list = "test_data/xvectors_sample"
    xvector = open(xvector_list, "r")
    xvector = xvector.readlines()

    xvector_test = convert_line_to_list(xvector[0])
    xvector_test = np.array(xvector_test)
    vector = convert_line_to_list(xvector[1])
    vector = np.array(vector)
    transform_file = np.loadtxt(fname='test_data/transform.mat.ascii')
    plda_load = open("test_data/plda.ascii", 'r').read()
    mean_vect = np.loadtxt(fname="test_data/mean.vec.ascii")

    # Global mean subtraction
    vector = vector - mean_vect

    # Global mean subtraction
    xvector_test = xvector_test - mean_vect

    transform = transform_file[:, :-1]
    offset = transform_file[:, -1]
    spk_enroll = "spk0"
    enrolled = np.matmul(transform, vector) + offset

    # Length normalization: scale to norm sqrt(dim), as Kaldi does
    enrolled = enrolled * np.sqrt(enrolled.shape[0]) / np.sqrt(np.sum(np.square(enrolled)))
    enrolled = np.array_str(enrolled)
    enrolled = spk_enroll + " " + enrolled.replace('\n', '').replace('[', '[ ').replace(']', ' ]')
    test_feats = np.matmul(transform, xvector_test) + offset

    # Zero mean with the transformed dimension (take the dim from
    # test_feats, since enrolled is a string at this point)
    plda_mean = ["0"] * test_feats.shape[0]
    plda_mean = '[ ' + ' '.join(plda_mean) + ' ]'

    score = bob.kaldi.plda_score(test_feats, enrolled, plda_load, plda_mean)
    print("PLDA score: " + str(score))


if __name__ == '__main__':
    test_kaldi_plda()


Here, for the following trials

0ea83414-5856-4867-b3c1-39a48809f4c1 b317e33f-6063-4b12-ad84-8888e78ae49b 
0ea83414-5856-4867-b3c1-39a48809f4c1 f3497145-a0ac-4ab3-bb11-f8bb4c50385b 
b317e33f-6063-4b12-ad84-8888e78ae49b f3497145-a0ac-4ab3-bb11-f8bb4c50385b 

the PLDA scores are -11.30643, -11.31262, and 52.36016, respectively, which match the default Kaldi results produced by the following script:

ivector-plda-scoring --normalize-length=true \
"ivector-copy-plda --smoothing=0.0 test_data/plda - |" \
"ark:ivector-subtract-global-mean test_data/mean.vec ark:test_data/xvectors.ark ark:- | transform-vec test_data/transform.mat ark:- ark:- | ivector-normalize-length ark:- ark:- |" \
"ark:ivector-subtract-global-mean test_data/mean.vec ark:test_data/xvectors.ark ark:- | transform-vec test_data/transform.mat ark:- ark:- | ivector-normalize-length ark:- ark:- |" \
"cat 'test_data/trails' | cut -d\ --fields=1,2 |" test_data/scores_test || exit 1; 


Best,
Saeed

mili lali

Aug 27, 2020, 4:28:26 PM
to bob-devel
Dear Saeed
Many thanks. 