3. Create submissions

In this notebook we explore the datasets of the AnDi 2 challenge and show how to properly create a submission. We assume that you have already read the paper and are familiar with the basics of the challenge. This notebook has three sections:

Reading videos and trajectories: the public data

First things first: you need to download the public data, available on the competition’s Codalab webpage (link not yet available), in order to perform the predictions. We will showcase here how to do so with the Development phase data. Go to the competition, then Participate > Files, and download the Public Data for Phase #1 Development. Once unzipped, the dataset should have the following file structure:

└─── track_1 (videos)
│   └─── exp_Y
│       └─── videos_fov_X.tiff (video for each FOV)
└─── track_2 (trajectories)
    └─── exp_Y
        └─── traj_fov_X.csv (trajectories for each FOV)

where Y goes from 0 to 9 and X from 0 to 29. This means that we have 10 experiments with 30 FOVs each.
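As a quick sanity check of this layout, you can build the list of file paths you expect to find after unzipping (this is a sketch that only assumes the naming convention described above; it does not touch the file system):

```python
import os

# Expected trajectory files for track_2, following the layout above:
# 10 experiments (exp_0 ... exp_9), each with 30 FOVs (fov_0 ... fov_29)
public_data_path = 'public_data/'
expected_csvs = [
    os.path.join(public_data_path, 'track_2', f'exp_{y}', f'traj_fov_{x}.csv')
    for y in range(10)   # experiments
    for x in range(30)   # FOVs per experiment
]
print(len(expected_csvs))  # 300 files in total
```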

public_data_path = 'public_data/' # make sure the folder has this name or change it

Track 1: videos

Track 1 focuses on videos. Each of the tiffs contains a video mimicking a typical single particle tracking experiment (see the paper for details). Let’s load one of the videos. You can do so with the following function:

from andi_datasets.utils_videos import import_tiff_video
import matplotlib.pyplot as plt
import numpy as np

video = import_tiff_video(public_data_path+'track_1/exp_0/videos_fov_1.tiff')

The first frame contains the labels of the VIP particles. Each labeled pixel marks the initial position of a VIP particle, and its value gives the particle’s index.

VIP particles are the ones you will need to characterize in the single trajectory task (see more on this below).


To access the indices of the VIP particles, take the unique values of the initial frame:

array([ 16,  18,  19,  20,  21,  22,  24,  27,  28,  29, 255], dtype=uint8)

Among these values, 255 is the background and the rest are the indices of the VIP particles.
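As a minimal illustration of separating the VIP indices from the background, here is a sketch using a small synthetic first frame (in practice you would use the first frame of the loaded video):

```python
import numpy as np

# Synthetic first frame for illustration only: background pixels are 255,
# two VIP particles are marked at their initial positions by their indices
frame0 = np.full((32, 32), 255, dtype=np.uint8)
frame0[5, 10] = 16
frame0[20, 7] = 18

labels = np.unique(frame0)            # sorted unique pixel values
vip_indices = labels[labels != 255]   # drop the background value
print(vip_indices)  # [16 18]
```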

We can visualize the videos with the built-in function play_video.

from andi_datasets.utils_videos import play_video

Let’s see what the video looks like! play_video expects an object with shape (num_frames, pixels, pixels, channels). In this case, we need to add an extra axis for the channels (grayscale).
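A minimal sketch of this reshaping step, using a synthetic grayscale array in place of the loaded video:

```python
import numpy as np

# Synthetic grayscale video: 10 frames of 64x64 pixels, no channel axis yet
video = np.random.randint(0, 255, size=(10, 64, 64), dtype=np.uint8)

# play_video expects (num_frames, pixels, pixels, channels),
# so append a trailing channel axis for grayscale
video_with_channel = video[..., np.newaxis]
print(video_with_channel.shape)  # (10, 64, 64, 1)
```

The reshaped array can then be passed to play_video.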

You may need to install pillow (simply run pip install pillow) to play the following video.