there is a new euroc-c2s1-simulator_1.0.16_i386.deb package and a euroc_c2s1_20141027183539.ova vm which may help with debugging "Illegal Instruction" errors. this update is optional and recommended only for teams encountering this error.
please download & install this package within the simservVM or your native setup. (or download & import the new simservVM from euroc_c2s1_20141030165227.ova).
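the install step above is the usual dpkg workflow; a minimal sketch, assuming the .deb has been downloaded into the current directory (only the package filename comes from this announcement, the rest is standard Debian tooling):

```shell
# install the simulator update inside the simservVM or a native setup
sudo dpkg -i euroc-c2s1-simulator_1.0.16_i386.deb

# pull in any missing dependencies dpkg may have reported
sudo apt-get -f install

# confirm which version is now installed
dpkg-query -W euroc-c2s1-simulator
```

the same commands apply to the later package updates below, substituting the respective filename.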
there is a new euroc-c2s1-simulator_1.0.24_i386.deb which specifies a surface material id for the gripper base; this should make the gripper hack more robust against missing Bullet contact points.
please download & install this package within the simservVM or your native setup. (the simservVM from euroc_c2s1_20141030165227.ova does not have this update).
there is a new euroc-c2s1-scenes_1.0.25_i386.deb which adds contact-specifications for additional surface combinations (for cylinder with new rolling friction).
please download & install this package within the simservVM or your native setup. (the simservVM from euroc_c2s1_20141030165227.ova does not have this update).
there is a new euroc-c2s1-simulator_1.0.26_i386.deb. we fixed a printf format string used to log stop conditions (it could have caused segfaults when setting new stop_conditions), gzserver should now be able to write core dumps to /tmp, and we made sure to reset the state of the "gripper hack", which might help avoid objects being detached and then immediately re-attached.
please download & install this package within the simservVM or your native setup. (the new simservVM from euroc_c2s1_20141105140755.ova also includes this update).
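since 1.0.26 gzserver should be able to write core dumps to /tmp; a minimal sketch for checking that the shell which starts the simulator actually permits core dumps (the exact core filename under /tmp depends on your kernel's core_pattern, which is an assumption here):

```shell
# allow unlimited-size core dumps in the shell that starts gzserver
ulimit -c unlimited
ulimit -c   # should print "unlimited"

# the kernel writes core files according to this pattern
cat /proc/sys/kernel/core_pattern

# after a gzserver crash, look for the dump under /tmp
ls -l /tmp/core* 2>/dev/null || true
```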
there is a new euroc-c2s1-simulator_1.0.27_i386.deb. it fixes a possible race condition when closing the logfile. this update includes no other changes and is optional.
we released euroc-c2s1-scenes_1.0.26_i386.deb which includes only the scenes used in the final evaluation. please refer to the email "EUROC Challenge 2 Evaluation Results" from Christoph Borst (Mon, 8 Dec 2014 15:37:17 +0100):
... However, we want to make you aware that a one-to-one comparison with the final evaluation is not easily possible, as one would need to reproduce the exact evaluation setup (host CPU of the simulator & challenger VM, the simulator GPU used, network timing between simulator and challenger VM...). Those small differences should be no threat to a robust challenger VM, which this challenge wants to promote, but they will cause slightly different results on different setups.
We also want to highlight once more that for the final evaluation the process had to run fully automatically without any user interference. This was a strict requirement by the EC so that we could not manipulate results. This means that for the evaluation the simulator and your challenge VM were started, and from then on everything depended on your startup script, which had to autonomously start all tasks. You will only see scores for runs/tasks where we found at least one logfile after the run. If you are missing scores for tasks, this is probably because your solution did not start those tasks. ...
if interested, download and install this package either on your native setup or in your simserv VM.