Hi all,
I expanded upon the knapsack example in the documentation and ran into a strange phenomenon that I am not sure is intentional. I am looking for some verification that the results are still guaranteed valid.

I increased the pool of possible knapsack items to 150 and re-solved iteratively until I had collected 100 valid solutions. To force variety between solutions, each iteration adds a constraint against the Y values of the prior solutions. While doing this, I noticed that in some cases the Y values of a given solution, which are supposed to be binary (0 or 1), actually came back fractional. Interestingly, the solutions with fractional Y values still appear to be valid. Even more interestingly, if I throw away any solution whose Y values are not EXACTLY 1 or 0 and keep iterating until I reach 100 solutions, the program runs an order of magnitude longer, BUT the average objective values are slightly higher. That part seems intuitive, but I am wondering whether this is a known, intentional tradeoff between speed and optimality, or a bug of some kind that happens to yield valid but suboptimal results.
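For reference, here is a minimal sketch of the kind of loop I am describing. It is not my exact code: I am using OR-Tools' pywraplp with the SCIP backend purely for illustration (that library/backend choice, the random data, and the tolerance value are my assumptions here, not necessarily what the documentation example uses).

```python
# Sketch: repeatedly solve a 0/1 knapsack, excluding each prior solution
# with a "no-good" cut, and read binary values back with a tolerance.
import random
from ortools.linear_solver import pywraplp

random.seed(0)
NUM_ITEMS = 150
values = [random.randint(1, 50) for _ in range(NUM_ITEMS)]
weights = [random.randint(1, 30) for _ in range(NUM_ITEMS)]
CAPACITY = 400
TOL = 1e-6  # assumed integrality tolerance when reading back Y values

solver = pywraplp.Solver.CreateSolver("SCIP")
y = [solver.BoolVar(f"y_{i}") for i in range(NUM_ITEMS)]
solver.Add(sum(weights[i] * y[i] for i in range(NUM_ITEMS)) <= CAPACITY)
solver.Maximize(sum(values[i] * y[i] for i in range(NUM_ITEMS)))

solutions = []
while len(solutions) < 100:
    status = solver.Solve()
    if status not in (pywraplp.Solver.OPTIMAL, pywraplp.Solver.FEASIBLE):
        break
    # Binary variables can come back as e.g. 0.9999997 or 1e-8 because the
    # solver works to an internal integrality tolerance, so compare with a
    # tolerance (or round) rather than testing for exactly 1.0.
    chosen = [i for i in range(NUM_ITEMS) if y[i].solution_value() > 1 - TOL]
    solutions.append(chosen)
    # No-good cut: forbid this exact 0/1 assignment so the next solve must
    # flip at least one item in or out.
    solver.Add(
        sum(y[i] for i in chosen)
        - sum(y[i] for i in range(NUM_ITEMS) if i not in chosen)
        <= len(chosen) - 1
    )

print(len(solutions), "solutions collected")
```

In my actual run, the difference between the two variants is only in how `chosen` is built: tolerance-based as above, versus keeping only solutions where every selected Y is exactly 1.0.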
Thanks,
Edward