Challenge-2019 on Object Detection in Aerial Images

June 16, 2019, Long Beach, California.

Overview

We propose two detection tasks. Task1 is to detect instances with oriented bounding boxes, and Task2 is to detect instances with horizontal bounding boxes. You can use the provided train/val data to train and validate your detector. The validation data may also be used for training when submitting results on the test set. External data of any form is allowed, but it must be reported at submission time. Fine-tuning models pretrained on ImageNet or COCO is also allowed.

Task1 - Detection with oriented bounding boxes

The purpose of this task is to localize ground object instances with an oriented bounding box. The oriented bounding box follows the same format as the original annotation, {(x_i, y_i), i = 1, 2, 3, 4}.
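Many detectors predict an oriented box as a center, size, and angle rather than four points. The sketch below is not part of the challenge materials, and the (cx, cy, w, h, angle) parameterization is an assumption about your detector's output; it shows one way to expand such a prediction into the four corner points of the annotation format.

```python
import math

def obb_to_corners(cx, cy, w, h, angle_rad):
    """Expand an oriented box (center, size, rotation) into its four corners,
    ordered around the box as (x1, y1), ..., (x4, y4)."""
    cos_a, sin_a = math.cos(angle_rad), math.sin(angle_rad)
    dx, dy = w / 2.0, h / 2.0  # half-extents along the box's own axes
    corners = []
    for sx, sy in [(-dx, -dy), (dx, -dy), (dx, dy), (-dx, dy)]:
        x = cx + sx * cos_a - sy * sin_a
        y = cy + sx * sin_a + sy * cos_a
        corners.append((x, y))
    return corners
```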

Submission Format

You will be asked to submit a zip file (example_Task1.zip) containing your results for all test images. The results are stored in 16 files, "Task1_plane.txt", "Task1_storage-tank.txt", ..., where each file contains all the results for a specific category. Each file is in the following format:
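The exact per-line layout is defined by the provided example file; the sketch below assumes the common DOTA-style convention of one detection per line, "image_id score x1 y1 x2 y2 x3 y3 x4 y4", and should be checked against example_Task1.zip.

```python
import os

def write_results(detections_by_class, out_dir, task="Task1"):
    """Write one result file per category.

    detections_by_class maps a category name (e.g. 'plane') to a list of
    (image_id, score, coords) tuples; for Task1, coords holds the eight
    corner coordinates (x1, y1, ..., x4, y4)."""
    os.makedirs(out_dir, exist_ok=True)
    for category, detections in detections_by_class.items():
        path = os.path.join(out_dir, "{}_{}.txt".format(task, category))
        with open(path, "w") as f:
            for image_id, score, coords in detections:
                numbers = " ".join("{:.1f}".format(c) for c in coords)
                f.write("{} {:.4f} {}\n".format(image_id, score, numbers))
```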

Evaluation Protocol

The evaluation protocol for oriented bounding boxes differs slightly from the original PASCAL VOC protocol: the IoU is computed from the intersection and union areas of the two polygons (ground truth and prediction). The rest follows PASCAL VOC.
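A minimal sketch of that polygon IoU, using shapely for the geometric operations (an implementation choice assumed here, not something the challenge prescribes):

```python
from shapely.geometry import Polygon

def polygon_iou(quad_a, quad_b):
    """Compute IoU between two quadrilaterals given as four (x, y) points."""
    poly_a, poly_b = Polygon(quad_a), Polygon(quad_b)
    if not poly_a.is_valid or not poly_b.is_valid:
        return 0.0  # skip degenerate or self-intersecting quadrilaterals
    inter = poly_a.intersection(poly_b).area
    union = poly_a.area + poly_b.area - inter
    return inter / union if union > 0 else 0.0
```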

Task2 - Detection with horizontal bounding boxes

Detecting objects with horizontal bounding boxes is common in previous object detection contests. The aim of this task is to accurately localize instances with horizontal bounding boxes in (x, y, w, h) format. In this task, the ground truths for training and testing are generated by computing the axis-aligned bounding box over each original annotated (oriented) bounding box, as sketched below.
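A minimal sketch of that conversion, assuming (x, y) denotes the top-left corner of the axis-aligned box:

```python
def corners_to_hbb(corners):
    """Axis-aligned (x, y, w, h) box enclosing the four corner points of an
    oriented annotation, with (x, y) taken as the top-left corner."""
    xs = [p[0] for p in corners]
    ys = [p[1] for p in corners]
    x, y = min(xs), min(ys)
    return x, y, max(xs) - x, max(ys) - y
```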

Submission Format

You will be asked to submit a zip file (example_Task2.zip) containing your results for all test images. The results are stored in 16 files, "Task2_plane.txt", "Task2_storage-tank.txt", ..., where each file contains all the results for a specific category. The format of the results is:
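The writer sketched above for Task1 can be reused here; this usage example assumes each Task2 line carries the horizontal box after the score (whether as x y w h or as corner coordinates should be confirmed against example_Task2.zip).

```python
# Hypothetical image id and box values, purely for illustration.
write_results(
    {"plane": [("P0001", 0.92, (104.0, 57.0, 38.0, 22.0))]},
    out_dir="results",
    task="Task2",
)
```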

Evaluation Protocol

The evaluation protocol for horizontal bounding boxes follows the PASCAL VOC benchmark, which uses mean Average Precision (mAP) as the primary metric.
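For reference, a minimal sketch of a VOC-style AP for one category, computed from its precision-recall curve with all-point interpolation (whether the challenge uses 11-point or all-point interpolation should be checked against the official development kit); mAP is the mean of this value over the categories.

```python
import numpy as np

def voc_ap(recall, precision):
    """Area under the interpolated precision-recall curve for one category.
    recall and precision are arrays accumulated over detections sorted by
    descending score."""
    r = np.concatenate(([0.0], recall, [1.0]))
    p = np.concatenate(([0.0], precision, [0.0]))
    # Make precision monotonically non-increasing from right to left.
    for i in range(len(p) - 2, -1, -1):
        p[i] = max(p[i], p[i + 1])
    # Sum the areas of the steps where recall changes.
    idx = np.where(r[1:] != r[:-1])[0]
    return float(np.sum((r[idx + 1] - r[idx]) * p[idx + 1]))
```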