Kinect@Home: Crowdsourced RGB-D Data
Rasmus Göransson, Alper Aydemir and Patric Jensfelt
Overview
● What?
● Why?
● How?
● Where?
● When?
What?
● Kinect@Home has collected nearly 1000 RGB-D video recordings from users all around the world.
● We have selected 54 of these recordings to be released as the first dataset.
Why?
● Algorithms are getting better.
● It is time to give them better challenges.
● Recordings made in-house are biased.
● Crowdsourcing provides real-world data with less bias.
How?
● Data collection needs to be simple for the users.
● Users need a reason to collect data, i.e. motivation.
● Upload recordings with a web browser.
● Perform 3D reconstruction in the cloud.
● Present the results within minutes (pipeline sketched below).
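Below is a minimal Python (Flask) sketch of the kind of upload-and-reconstruct-in-the-cloud pipeline the slide describes. The endpoint names, the file field, the in-memory job queue, and the reconstruct() placeholder are all assumptions made for illustration; this is not the actual Kinect@Home implementation.

```python
# Minimal sketch of a browser-upload + cloud-reconstruction pipeline.
# Illustrative only: endpoint names, file fields and reconstruct() are
# assumptions, not the actual Kinect@Home service.
import os
import queue
import threading
import uuid

from flask import Flask, request, jsonify

app = Flask(__name__)
UPLOAD_DIR = "uploads"
os.makedirs(UPLOAD_DIR, exist_ok=True)

jobs = queue.Queue()   # recordings waiting for reconstruction
results = {}           # job id -> status / result URL

@app.route("/upload", methods=["POST"])
def upload():
    """Accept an RGB-D recording from the browser and queue it."""
    recording = request.files["recording"]
    job_id = uuid.uuid4().hex
    path = os.path.join(UPLOAD_DIR, job_id + ".bin")
    recording.save(path)
    results[job_id] = {"status": "queued"}
    jobs.put((job_id, path))
    return jsonify({"job_id": job_id})

@app.route("/status/<job_id>")
def status(job_id):
    """Let the user poll for the reconstruction result."""
    return jsonify(results.get(job_id, {"status": "unknown"}))

def reconstruct(path):
    """Placeholder for a KinectFusion-style reconstruction running server-side."""
    return "/results/" + os.path.basename(path) + ".ply"

def worker():
    """Background worker standing in for the cloud reconstruction step."""
    while True:
        job_id, path = jobs.get()
        results[job_id] = {"status": "processing"}
        mesh_url = reconstruct(path)
        results[job_id] = {"status": "done", "result": mesh_url}

if __name__ == "__main__":
    threading.Thread(target=worker, daemon=True).start()
    app.run()
```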
What about ground truth?
● There is no ground truth...
● ...but the dataset is still useful!
How can it be used?
● By comparing different algorithms.
● Kinfu vs. keypoint-based (SURF) pose estimation (see the sketch after the result slides below).
How can it be used?
[Reconstruction results on the same recordings: Kinfu vs. SURF-based pose estimation]
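Below is a hedged sketch, in Python with OpenCV, of the keypoint-based (SURF) relative pose estimation used as a baseline against Kinfu. File names, camera intrinsics and thresholds are assumptions, and SURF requires opencv-contrib built with the nonfree modules; this is not the evaluation code used for the dataset.

```python
# Sketch of SURF keypoint-based relative pose estimation between two RGB-D
# frames. Illustrative only: file names, intrinsics and thresholds are
# assumptions, not the values used in the Kinect@Home evaluation.
import cv2
import numpy as np

# Approximate Kinect intrinsics (assumption).
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5
K = np.array([[FX, 0, CX], [0, FY, CY], [0, 0, 1]], dtype=np.float64)

def backproject(u, v, depth_m):
    """Lift a pixel with metric depth to a 3D point in the camera frame."""
    return np.array([(u - CX) * depth_m / FX, (v - CY) * depth_m / FY, depth_m])

def relative_pose(rgb_a, depth_a, rgb_b):
    """Estimate the pose of frame B relative to frame A from SURF matches."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp_a, des_a = surf.detectAndCompute(cv2.cvtColor(rgb_a, cv2.COLOR_BGR2GRAY), None)
    kp_b, des_b = surf.detectAndCompute(cv2.cvtColor(rgb_b, cv2.COLOR_BGR2GRAY), None)

    # Ratio-test filtering of nearest-neighbour descriptor matches.
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_a, des_b, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]

    pts3d, pts2d = [], []
    for m in good:
        u, v = map(int, kp_a[m.queryIdx].pt)
        z = depth_a[v, u] * 0.001          # 16-bit depth in mm -> metres
        if z > 0:
            pts3d.append(backproject(u, v, z))
            pts2d.append(kp_b[m.trainIdx].pt)

    # RANSAC PnP: 3D points from frame A, their 2D observations in frame B.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.array(pts3d, dtype=np.float32),
        np.array(pts2d, dtype=np.float32),
        K, None)
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec                          # rotation and translation of B w.r.t. A

if __name__ == "__main__":
    rgb_a = cv2.imread("frame_a_rgb.png")
    rgb_b = cv2.imread("frame_b_rgb.png")
    depth_a = cv2.imread("frame_a_depth.png", cv2.IMREAD_UNCHANGED)
    R, t = relative_pose(rgb_a, depth_a, rgb_b)
    print("Rotation:\n", R, "\nTranslation:\n", t.ravel())
```

Chaining such frame-to-frame estimates gives a trajectory that can be compared against the Kinfu trajectory on the same recording, which is how the two approaches can be contrasted even without ground truth.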
Where?
● Download the dataset with the 54 recordings from: www.kinectathome.com/datasets
When?
● NOW! :)
● This dataset and future releases will be available from www.kinectathome.com/datasets
What about ...?
● Questions, please!