This point cloud dataset was captured by 10 "synchronized" Kinects (collectively named the Kinoptic Studio) installed in the Panoptic Studio.
There is no way to perfectly synchronize multiple Kinects, but we aligned them accurately via hardware modifications for time-stamping.
Each 3D point cloud is generated by merging the depth maps from the multiple Kinects captured within a time interval (±15 ms). This means that fast human motion can cause misalignment.
The Kinect data is captured together with 500+ other RGB cameras, and all of them share the same time reference and 3D world coordinate system. Therefore, the point cloud output can be used together with the output of the RGB cameras (e.g., RGB videos and 3D skeletons).
For more hardware details, please see our tutorial page.
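To illustrate how the depth maps end up in the shared world coordinate system, below is a minimal sketch that back-projects one Kinect depth map with a standard pinhole model. The field names (K, R, t), the assumption of metric depth in meters, and the world-to-camera convention X_cam = R·X_world + t are placeholders for illustration only; the actual calibration format and loading code are provided in our KinopticStudio Toolbox.

```python
import numpy as np

def depth_to_world(depth_m, K, R, t):
    """Back-project a metric depth map (H x W, in meters) into the shared
    world coordinate frame.

    K    : 3x3 depth-camera intrinsics (placeholder name)
    R, t : world-to-camera rotation (3x3) and translation (3,), i.e.
           X_cam = R @ X_world + t (assumed convention; see the toolbox
           for the real calibration format).
    """
    H, W = depth_m.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    z = depth_m.reshape(-1)
    valid = z > 0

    # Pixel coordinates -> 3D points in the depth-camera frame
    x = (u.reshape(-1)[valid] - K[0, 2]) * z[valid] / K[0, 0]
    y = (v.reshape(-1)[valid] - K[1, 2]) * z[valid] / K[1, 1]
    pts_cam = np.stack([x, y, z[valid]], axis=0)      # 3 x N

    # Depth-camera frame -> shared world frame
    pts_world = R.T @ (pts_cam - t.reshape(3, 1))     # 3 x N
    return pts_world.T                                # N x 3
```

Merging the outputs of all 10 Kinects for the same time window (±15 ms) yields one point cloud of the scene, which is what our toolbox produces.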
Data Summary (For Each Sequence)
10 synchronized RGB+D videos
3D point clouds from the 10 RGB+D videos
31 synchronized HD videos from other viewpoints for the same scenes
Calibration parameters for 10 RGB+D cameras and 31 HD cameras
Sync table for all RGB+D and HD videos
Optional: you can also use 480 synchronized VGA videos for the same scenes, if you can tolerate the huge data size.
Data Description
The basic data consists of RGB videos from the 10 Kinects and the 10 corresponding depth files.
You can easily generate point clouds using our KinopticStudio Toolbox on our GitHub page.
Note that the downloaded Kinect videos are not synchronized by themselves; our toolbox uses a sync table to align them. Therefore, you cannot directly use images extracted from the Kinect videos without this alignment.
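As a rough sketch of what this alignment does, for each target timestamp you would pick the Kinect frame closest in time, within the same ±15 ms window used for merging. The per-frame timestamp array below is a hypothetical representation; the real sync table format is documented in the toolbox.

```python
import numpy as np

def nearest_kinect_frame(target_time, kinect_times, max_offset=0.015):
    """Return the index of the Kinect frame closest in time to `target_time`
    (seconds), or None if the best match is farther than `max_offset`
    seconds. `kinect_times` is a 1-D array of per-frame timestamps taken
    from the sync table (hypothetical representation)."""
    idx = int(np.argmin(np.abs(kinect_times - target_time)))
    return idx if abs(kinect_times[idx] - target_time) <= max_offset else None
```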
Note that the frame index of the 3D point clouds produced by our toolbox follows the HD camera frame indexing of the Panoptic Studio. This means that you can use any of the HD videos (which, unlike the Kinect videos, are already synchronized) as the corresponding RGB images.
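Because the point clouds and HD videos share the same world coordinates and frame indexing, you can, for example, look up the RGB color of each point by projecting it into an HD view. The snippet below is a minimal sketch of that projection; it ignores lens distortion and uses the same placeholder calibration fields (K, R, t) as the earlier sketch, so treat it as an illustration rather than the toolbox's actual camera model.

```python
import numpy as np

def project_to_hd(points_world, K, R, t, width=1920, height=1080):
    """Project N x 3 world-coordinate points into one HD camera image.
    Returns pixel coordinates (N x 2) and a mask of points that fall
    inside the image with positive depth. Distortion is omitted here."""
    pts_cam = R @ points_world.T + t.reshape(3, 1)      # 3 x N, camera frame
    z = pts_cam[2]
    uv = (K @ pts_cam)[:2] / np.maximum(z, 1e-9)        # perspective divide
    uv = uv.T                                           # N x 2
    in_view = (
        (z > 0)
        & (uv[:, 0] >= 0) & (uv[:, 0] < width)
        & (uv[:, 1] >= 0) & (uv[:, 1] < height)
    )
    return uv, in_view
```

For a given HD frame index, the point cloud produced by the toolbox can be projected this way into any of the 31 HD cameras to fetch per-point colors or to overlay the cloud on the video.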
Use this script to download all data in this version.
Example Sequences
A "SocialGame" Sequence: 160422_haggling1
A "Range of Motion" Sequence: 171026_pose1
A "Toddler" Sequence: 160906_ian1
A "Toddler" Sequence: 160401_ian1
A "Musical Instrument" Sequence: 171026_cello2
An "Office" Sequence: 170407_office2
A "Musical Instrument" Sequence: 161029_flute1
Reference
@article{Joo_2017_TPAMI,
title={Panoptic Studio: A Massively Multiview System for Social Interaction Capture},
author={Joo, Hanbyul and Simon, Tomas and Li, Xulong and Liu, Hao and Tan, Lei and Gui, Lin and Banerjee, Sean and Godisart, Timothy Scott and Nabbe, Bart and Matthews, Iain and Kanade, Takeo and Nobuhara, Shohei and Sheikh, Yaser},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
year={2017}
}