IPC Learning Track Results


Mark "mak" Roberts

Oct 6, 2014, 12:45:36 PM
to ipc2014-...@googlegroups.com
Greetings, Competitors,

At long last, I present the results for the 2014 International Planning Competition Learning Track.  You may recall that the proposed tracks included a Quality subtrack, which followed the previous competition's format, and an Integrated Execution subtrack, which was cancelled due to too few competitors.

Of the 14 planners from eight teams that expressed an interest in competing in the Quality subtrack, 11 planners from seven teams competed in the final evaluation.  Of these 11, five registered as basic solvers, which use only a single algorithm to solve problems; the precise definition of a basic solver is given in the competition rules.

Competitors were judged on six domains chosen from previous competitions: elevators, floortile, nomystery, parking, spanner, and transport.  At the start of a six-week learning stage, competitors were given generators for these domains, a representative set of training problems, and guidelines for the evaluation distributions; errors in the domains and generators were corrected at this point.  After the learning stage was complete, competitors were provided with runs from selected training problems so they could verify that their planner was performing as expected, and any issues found in those runs were corrected before the final results were collected.

For the final evaluation, five problems per domain were randomly generated from those distributions, yielding 30 problem instances.  The planners were run on Amazon's EC2 cloud computing platform with the support of a generous grant from Amazon Web Services; each instance had the equivalent of 2 cores and 3.75 GB of memory and ran Ubuntu 12.04 LTS.  To account for variations in the actual computing resources available on the cloud platform, each planning system was run 30 times on each problem instance, both with and without domain knowledge.
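To make the protocol concrete, here is a minimal Python sketch (the record layout and field names are hypothetical, not taken from the actual evaluation scripts) of reducing the repeated runs to the best plan cost per planner, problem, and knowledge condition, the kind of per-problem summary the awards below rely on:

    def best_costs(run_records):
        """Keep the best (lowest) plan cost per
        (planner, problem, with_knowledge) triple.

        run_records: iterable of dicts with hypothetical keys
        'planner', 'problem', 'with_knowledge' (bool), and
        'cost' (the plan cost, or None if the run found no plan).
        Returns a dict mapping each triple to its best cost, or
        None if no run of that triple produced a plan.
        """
        best = {}
        for run in run_records:
            key = (run["planner"], run["problem"], run["with_knowledge"])
            cost = run["cost"]
            if cost is None:
                best.setdefault(key, None)  # record the attempt, keep any real cost
            elif best.get(key) is None or cost < best[key]:
                best[key] = cost
        return best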

There are three categories of awards, each with a first, second, and third place.

The overall best quality award compares planners on the quality of the best plan they produced for each problem (a sketch of the assumed scoring follows the list).  The awards for best overall quality (out of a possible score of 30) go to:

First Place - MIPlan (Quality: 21.88)
Second Place - Cedalion (Quality: 19.98)
Third Place - SMAC (Quality: 17.45)
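
The announcement does not spell out the scoring formula, but it is presumably the usual IPC quality metric: each problem contributes the ratio of the best cost achieved by any planner to this planner's own best cost (0 if unsolved), so solving every problem with the best-known plan scores 30.  A minimal Python sketch under that assumption:

    def ipc_quality(planner_costs, reference_costs):
        """Assumed IPC-style quality score.

        planner_costs: {problem: this planner's best plan cost,
        or None if unsolved}
        reference_costs: {problem: best cost achieved by any planner}

        Each solved problem contributes reference / cost, a value
        in (0, 1]; unsolved problems contribute 0, so the maximum
        score equals the number of problems (30 here).
        """
        total = 0.0
        for problem, reference in reference_costs.items():
            cost = planner_costs.get(problem)
            if cost:  # skips None and zero-cost entries
                total += reference / cost
        return total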

The best learner award compares planners on the learning delta: the improvement in overall plan quality when learned knowledge was applied versus when it was not (a sketch follows the list).  To keep the no-knowledge baseline fair, any problem solved by seven or more planners was removed from this calculation, leaving 24 problem instances.  The awards for best learner go to:

First Place - Cedalion (Adjusted Quality Delta: 10.40)
Second Place - Eroller (Adjusted Quality Delta: 9.97)
Third Place - SMAC (Adjusted Quality Delta: 9.18)
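
A minimal sketch of that delta calculation, assuming per-problem quality is computed as above and that the seven-planner filter counts solvers in the no-knowledge baseline (the text does not say which condition the count uses):

    def adjusted_quality_delta(quality_with, quality_without,
                               baseline_solver_counts):
        """Learning delta over the filtered problem set (illustrative).

        quality_with / quality_without: {problem: per-problem quality
        in [0, 1]} for one planner, with and without learned knowledge.
        baseline_solver_counts: {problem: number of planners that
        solved the problem without knowledge} (an assumption about
        how the seven-or-more filter was applied).
        """
        # Drop problems solved by seven or more planners, per the text.
        kept = [p for p, n in baseline_solver_counts.items() if n < 7]
        return sum(quality_with.get(p, 0.0) - quality_without.get(p, 0.0)
                   for p in kept)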

Finally, the best basic solver award compares the planning systems that used only a single core algorithm.  The awards for best basic solver go to:

First Place - SMAC (Quality: 17.45)
Second Place - LLama (Quality: 14.30)
Third Place - Eroller (Quality: 12.51)

Congratulations to the winners of the 2014 Learning Track of the IPC!

Further details about the competitors, problems, and final results will be posted soon at: http://www.cs.colostate.edu/~ipc2014/.

mak