Hackathon

Kaggle Competition

We will open the competition on Kaggle on Thursday, 13th of August, at 12:00 CEST (noon).
All details (descriptions, data) can be found here: https://kaggle.com/c/summerschool2020.
The deadline for submissions is Wednesday, 2nd of September, at 11:59 CEST (noon).

How to use Kaggle for the Hackathon


With the ever-increasing volume of goods transported worldwide, efficiency in logistics processes is becoming more and more important. Typically, this results in a higher level of automation in inter- and intra-logistics. The Chair of Material Flow and Warehousing at TU Dortmund University hosts an intra-logistics campus as a testbed for evaluating automation solutions. At this testbed, material flows handled by autonomous systems can be emulated and optimized with Machine Learning. As part of the Summer School, a hackathon gives you access to positioning data, lets you apply machine learning to this data, and ultimately lets you control a robot based on your predictions.



Machine Learning Task

For controlling robots in a warehouse scenario, knowledge of their positions is necessary for centralized control of material flows. While commercial positioning solutions such as camera or beacon systems are available off the shelf, they are expensive, require complicated setups, and may not even cover the whole warehouse, e.g. due to shadowing from racks and shelves. The experimental lab at TU Dortmund University features a unique sensor floor, consisting of a regular grid of sensor nodes equipped with, among others, vibration and magnetometer sensors. The time series of these sensor measurements will be provided to the Summer School participants, each measurement accompanied by the exact position of the robot, captured by a commercial positioning system, as the label.

You may use whatever Machine Learning algorithm and language you prefer to train a model that predicts the x/y position in the local coordinate system. The teams with the best models will get access to the real robot transportation system and will control the robots based solely on the sensor measurements and their predictions.
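For orientation, a minimal baseline in Python could look like the following sketch. It uses scikit-learn; the file name, the column layout, and the choice of regressor are our assumptions, so adapt them to the actual data description on Kaggle.

    # Minimal training sketch (assumptions: a CSV with sensor readings as
    # feature columns and the robot's x/y position as label columns).
    import joblib
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    data = pd.read_csv("train.csv")            # hypothetical file name
    X = data.drop(columns=["x", "y"])          # sensor-node measurements
    y = data[["x", "y"]]                       # robot position (labels)

    X_train, X_val, y_train, y_val = train_test_split(
        X, y, test_size=0.2, random_state=0
    )

    # Any regressor works; a random forest is just one reasonable baseline.
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)

    pred = model.predict(X_val)
    print("MAE per axis:", mean_absolute_error(y_val, pred, multioutput="raw_values"))

    joblib.dump(model, "model.joblib")         # reused in the live-event sketches below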


Live Event - Control your robots

On Friday, 4th of September, the teams whose models best predict the robot's position will get the chance to use these predictions to control the robot in the live environment. We will provide you with a simple REST API for querying recent sensor floor data and commanding the robot by pushing directional requests.

You will have to guide the robot through an obstacle course in minimum time with minimum errors.

The final winning team will be invited to TU Dortmund University (travel costs covered) to visit the logistics lab (as soon as this is possible again), present their approach and solution, and jointly write a publication about the whole process with the competition's researchers.


Robot live control

Drive a robot from the start position through the provided waypoints to the end position, using location information from the sensor floor and the model you built in task 1 during the Kaggle competition.

Qualification

Up to 5 teams from the Kaggle competition will be allowed to participate in this task. Both teams and individuals are eligible.

Scoring

The team with the shortest time from the start to the end position wins. To finish, a team must visit all waypoints before arriving at the end position. The jury will decide the final score for each team after evaluating the code developed for this event.

The winning team will be invited to Dortmund to present their summer school hackathon solution to the local researchers at TU Dortmund and to collaborate on a publication about the results of the summer school hackathon.

Duration

Description

The starting point will be random, but at the same distance from the first waypoint for every team, to keep things fair. The robot will always start at a fixed orientation, and there will be no feedback on the orientation after that. The waypoints have fixed positions, but each team has to reach them in a different order. There will be 6 waypoints and no obstacles on the floor.

[Figure: sensor floor, hackathon task 2]

A team will be allowed to restart from their starting position upon request to the event's moderators. There is no extra time penalty, but the elapsed time will include the time needed for the restart. A restart puts the robot back at the start position; waypoints already reached still count, so you only need to drive to the remaining waypoints.

Robot API Documentation

How to control the robot?

The robot is a ROS-based mobile robot that is tracked using a Vicon motion tracking system. We have abstracted the mobility functions into a simple web-based REST API that allows you to control the robot from anywhere in the world. The task's objective is to let you create a navigation stack based on a global localization source, which enables you to localize the robot within the whole area of the hall. The sensor floor provides real-time information that can be accessed through the API, and you can freely move the robot. To make things easier given the short time and the multi-disciplinary knowledge required to complete the task, we have also provided example code that implements the individual parts. You can use the example code as is, or take its parts as building blocks for your own solution.


As mentioned, the API is simple: you have three possible actions.

Forward:

The forward action takes a numeric argument that the server interprets as a distance in meters. The command is passed on to the robot, which then moves accordingly.

Turn:

Since the robot can turn freely in any direction, you are also provided with a turn action. The number here is interpreted by the server as an angle.

For a more complex motion planner, you can use these two basic kinematic abstractions to move the robot wherever you desire inside the hall. Since the robot is localised using a Vicon Motion Capturing system, the travelled distances are tracked precisely, but there is an allowable margin of error due to the physical dynamics of the robot's motors and their interaction with the floor.
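Put together, a minimal client for these two actions could look like the sketch below. The base URL, the endpoint paths, the payload keys, and the assumption that angles are given in degrees are ours, not the official interface; the Postman documentation linked further down is authoritative.

    # Sketch of the two basic motion commands over the REST API.
    # Endpoint paths and payload keys are assumptions -- check the
    # Postman documentation for the actual interface.
    import requests

    BASE_URL = "http://<robot-api-host>"  # placeholder provided by the organizers

    def forward(meters):
        # The server interprets the number as a distance in meters.
        return requests.post(f"{BASE_URL}/forward", json={"distance": meters})

    def turn(angle):
        # Assumption: the angle is given in degrees.
        return requests.post(f"{BASE_URL}/turn", json={"angle": angle})

    # Example: a simple L-shaped move.
    turn(90)
    forward(1.5)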

go_to_relative

If you wish to move the robot around without taking its orientation into consideration, we have created a third action that lets you tell the robot to move a given number of meters relative to its current position.
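A hedged sketch of this third action, again with an assumed endpoint path and assumed payload keys:

    import requests

    BASE_URL = "http://<robot-api-host>"  # placeholder

    def go_to_relative(dx, dy):
        # Move dx/dy meters relative to the robot's current position,
        # independent of its current orientation (assumed payload keys).
        return requests.post(f"{BASE_URL}/go_to_relative", json={"dx": dx, "dy": dy})

    go_to_relative(0.5, -1.0)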

Data: current_values

The sensor floor's real-time data is available through the API as well. A new reading is generated roughly every 6 seconds and is already pre-processed, so you can feed it directly into your machine learning model.
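Combined with the model trained in the Kaggle task, polling could look like the following sketch; the endpoint path and the assumption that the response is a ready-to-use feature vector are ours.

    # Sketch: poll the sensor floor and estimate the robot's position.
    import time
    import joblib
    import requests

    BASE_URL = "http://<robot-api-host>"       # placeholder
    model = joblib.load("model.joblib")        # regressor from the training sketch above

    while True:
        readings = requests.get(f"{BASE_URL}/current_values").json()
        # Assumption: the response is the pre-processed feature vector in
        # the same order as the training data columns.
        x, y = model.predict([readings])[0]
        print(f"estimated position: ({x:.2f}, {y:.2f})")
        time.sleep(6)  # new data arrives roughly every 6 seconds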

waypoint_status

Finally, a GET request is implemented to query the current status of your waypoints. You are asked to follow the provided vector of waypoints in order. The waypoint recorder will only mark a waypoint as positive for your trial if you pass through it as defined. You can access the status of the waypoints through this API at any time, and you can integrate it into your code so that the robot automatically pursues the next waypoint once it has arrived at one.
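For example, a polling helper along these lines could pick the next target; the endpoint path and the response shape (a list of per-waypoint flags in the required order) are assumptions.

    import requests

    BASE_URL = "http://<robot-api-host>"  # placeholder

    def remaining_waypoints():
        status = requests.get(f"{BASE_URL}/waypoint_status").json()
        # Assumption: a list of booleans, one per waypoint, in the required order.
        return [i for i, reached in enumerate(status) if not reached]

    todo = remaining_waypoints()
    if todo:
        print("next waypoint index:", todo[0])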

The whole API is documented in Postman, and you can easily switch to your target language to develop your navigation stack using the sensor floor's real-time data.

https://documenter.getpostman.com/view/1411058/T1LV7i4v

To make it easier to program a sensor-floor-based navigation stack covering localisation and motion planning of the robot, we provide simple Python example code. You are not limited by our creativity: you are free to implement the solution however you wish, in any language you prefer. We chose the web-based REST API interface precisely to allow for this flexibility. The link to the sample code is below, and all of the updated comments are in the file.

competition_example.py