Synthetic Datasets

Authors: Sebastian Koch, Yurii Piadyk, Markus Worchel, Marc Alexa, Claudio Silva, Denis Zorin, Daniele Panozzo
Issue Date: 2021
Abstract: Images of a real scene taken with a camera commonly differ from synthetic images of a virtual replica of the same scene, despite advances in light transport simulation and calibration. By explicitly co-developing the scanning hardware and rendering pipeline, we are able to achieve a negligible per-pixel difference between the real image taken by the camera and the synthesized image on a geometrically complex calibration object with known material properties. This approach provides an ideal test bed for developing data-driven algorithms in the area of 3D reconstruction, as the synthetic data is indistinguishable from real data and can be generated at large scale. Pixel-wise matching also provides an effective way to quantitatively evaluate data-driven reconstruction algorithms.
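The pixel-wise matching mentioned above can be illustrated with a minimal sketch: comparing a camera image against its rendered counterpart via a mean absolute per-pixel difference. This is a hypothetical illustration of the idea, not the exact metric used by the authors.

```python
import numpy as np

def per_pixel_error(real, synth):
    """Mean absolute per-pixel difference between a camera image and its
    rendered counterpart, both given as HxWxC arrays with values in [0, 1].

    Hypothetical metric sketch; the dataset authors' exact evaluation
    procedure may differ.
    """
    real = np.asarray(real, dtype=np.float64)
    synth = np.asarray(synth, dtype=np.float64)
    if real.shape != synth.shape:
        raise ValueError("images must share the same resolution and channels")
    return float(np.mean(np.abs(real - synth)))
```

With identical images the error is 0.0; the closer a rendering matches the real photograph, the smaller the value.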
Description: Synthetic scans for 1000 different objects from the ABC dataset and 7 textured 3D-printed objects (Dodo, Vessel/Bird, House, Radio, Sculpture/Avocado, Chair, Vase). Each scan includes the rendered images of all patterns (in HDR and LDR format), the decoded correspondences, the ground-truth depth map, the reconstructed depth map, and the reconstructed point cloud. The dataset is split into multiple chunks, each chunk containing the data of 50 objects.
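Since the dataset is distributed as chunked ZIP archives (each covering 50 objects), a small helper can list and unpack the downloaded chunks. This is a hedged sketch using only the Python standard library; the member-file layout inside each archive is an assumption, not documented here.

```python
import zipfile
from pathlib import Path

def list_scan_files(chunk_zip):
    """Return the member file names of one downloaded chunk archive.

    Each chunk bundles the scans of 50 objects; the exact member names
    and directory layout inside the archive are an assumption.
    """
    with zipfile.ZipFile(chunk_zip) as zf:
        return zf.namelist()

def extract_chunks(download_dir, out_dir):
    """Extract every chunk archive in `download_dir` into `out_dir`,
    one subdirectory per chunk (named after the archive)."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for chunk in sorted(Path(download_dir).glob("*.zip")):
        with zipfile.ZipFile(chunk) as zf:
            zf.extractall(out / chunk.stem)
```

After extraction, each chunk's per-object files (rendered pattern images, decoded correspondences, depth maps, point clouds) can be browsed under its own subdirectory.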
Rights: CC BY 4.0 License
Appears in Collections:Hardware Design and Accurate Simulation for Benchmarking of 3D Reconstruction Algorithms

Files in This Item:
File                  Size       Format
abc_objects.zip       25.5 MB    Unknown
abc_scans.zip         41.46 GB   Unknown
textured_objects.zip  508.43 MB  Unknown
