Hi there,
I'm currently working on a graduate student project for which I use the pySMT API.
So first, thanks a lot for your work on pySMT; it really makes working with SMT much easier!
As part of my project, I create a new solver in every loop iteration, together with a set of assertions corresponding to that iteration's parameters.
I then measure, in every iteration, the time the solver takes to find a solution, and I investigate how these time measurements behave across iterations.
Now my first problem is to reliably measure this time.
I currently do it in the following way:
from time import clock

t_start = clock()
solution_found = solver.solve()
t_end = clock()
delta_t = t_end - t_start

where delta_t is my final measure.
Is this the right way to do it or is there some internal time measurement inside pySMT?
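For reference, here is a small sketch of how I would package the measurement using time.perf_counter(), which is a monotonic, high-resolution clock (time.clock() is deprecated and was removed in Python 3.8). The timed_solve helper is just my own wrapper, not part of pySMT:

```python
from time import perf_counter

def timed_solve(solver):
    """Measure the wall-clock time of a single solve() call.

    perf_counter() is monotonic and high-resolution, and is the
    recommended replacement for the deprecated time.clock().
    Returns the solver result together with the elapsed seconds.
    """
    t_start = perf_counter()
    solution_found = solver.solve()
    delta_t = perf_counter() - t_start
    return solution_found, delta_t
```

The wrapper works with any object exposing a solve() method, so it can be reused unchanged across all loop iterations.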
And then I have a second more severe problem:
The time measurements for the individual iterations differ depending on the ordering of the loop!
For example, when I loop over the parameters [2,4,6,10,15,20,30], the time measured for parameter '30' deviates significantly from the time measured for the same parameter when I loop over [30,20,15,10,6,4,2].
Apparently, information from previous iterations is somehow carried over, even though I create a new solver instance in each iteration...
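To check whether the effect comes from state shared inside the Python process (e.g. caches in a solver backend), I was considering running each measurement in a fresh subprocess. A minimal sketch with the standard multiprocessing module, where run_one_measurement is a hypothetical stand-in for one of my loop iterations:

```python
from multiprocessing import Pool
from time import perf_counter

def run_one_measurement(parameter):
    """Hypothetical stand-in for one loop iteration: build the
    assertions for `parameter`, solve, and return the elapsed time.
    The body would be replaced by the real pySMT setup and solve() call."""
    t_start = perf_counter()
    # ... create a fresh solver, add assertions for `parameter`, solve ...
    return parameter, perf_counter() - t_start

if __name__ == "__main__":
    parameters = [2, 4, 6, 10, 15, 20, 30]
    # maxtasksperchild=1 forces a brand-new worker process per task,
    # so no in-process state can leak between measurements.
    with Pool(processes=1, maxtasksperchild=1) as pool:
        for parameter, delta_t in pool.imap(run_one_measurement, parameters):
            print(parameter, delta_t)
```

Would that be a valid way to rule out in-process state, or is there a cleaner way within pySMT itself?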
Do you have any idea where this might come from?
Looking forward to your help! Thanks!
Best,
Matthias