<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:dc="http://purl.org/dc/elements/1.1/" version="2.0">
  <channel>
    <title>FDA Collection: Short Description</title>
    <link>http://hdl.handle.net/2451/62251</link>
    <description>Short Description</description>
    <pubDate>Sun, 05 Apr 2026 13:24:28 GMT</pubDate>
    <dc:date>2026-04-05T13:24:28Z</dc:date>
    <image>
      <title>FDA Collection: Short Description</title>
      <url>http://archive.nyu.edu:8080/jspui/retrieve/128284/pixel_diff.png</url>
      <link>http://hdl.handle.net/2451/62251</link>
    </image>
    <item>
      <title>Benchmarks</title>
      <link>http://hdl.handle.net/2451/63309</link>
      <description>Title: Benchmarks
Authors: Sebastian Koch; Yurii Piadyk; Markus Worchel; Marc Alexa; Claudio Silva; Denis Zorin; Daniele Panozzo
Abstract: Images of a real scene taken with a camera commonly differ from synthetic images of a virtual replica of the same scene, despite advances in light transport simulation and calibration. By explicitly co-developing the scanning hardware and rendering pipeline we are able to achieve a negligible per-pixel difference between the real image taken by the camera and the synthesized image on a geometrically complex calibration object with known material properties. This approach provides an ideal test-bed for developing data-driven algorithms in the area of 3D reconstruction, as the synthetic data is indistinguishable from real data and can be generated at large scale. Pixel-wise matching also provides an effective way to quantitatively evaluate data-driven reconstruction algorithms.
Description: Benchmark data derived from the synthetic scans, used to run the denoising, surface reconstruction, and shape completion benchmarks. In addition to the benchmark data, the collected data used for the different experiments in the paper is also included.</description>
      <pubDate>Fri, 01 Jan 2021 00:00:00 GMT</pubDate>
      <guid isPermaLink="false">http://hdl.handle.net/2451/63309</guid>
      <dc:date>2021-01-01T00:00:00Z</dc:date>
    </item>
    <item>
      <title>Synthetic Datasets</title>
      <link>http://hdl.handle.net/2451/63308</link>
      <description>Title: Synthetic Datasets
Authors: Sebastian Koch; Yurii Piadyk; Markus Worchel; Marc Alexa; Claudio Silva; Denis Zorin; Daniele Panozzo
Abstract: Images of a real scene taken with a camera commonly differ from synthetic images of a virtual replica of the same scene, despite advances in light transport simulation and calibration. By explicitly co-developing the scanning hardware and rendering pipeline we are able to achieve a negligible per-pixel difference between the real image taken by the camera and the synthesized image on a geometrically complex calibration object with known material properties. This approach provides an ideal test-bed for developing data-driven algorithms in the area of 3D reconstruction, as the synthetic data is indistinguishable from real data and can be generated at large scale. Pixel-wise matching also provides an effective way to quantitatively evaluate data-driven reconstruction algorithms.
Description: Synthetic scans of 1000 different objects from the ABC dataset and 7 textured 3D-printed objects (Dodo, Vessel/Bird, House, Radio, Sculpture/Avocado, Chair, Vase). Each scan includes the rendered images of all patterns (in HDR and LDR format), the decoded correspondences, the ground-truth depth map, the reconstructed depth map, and the reconstructed point cloud. The dataset is split into multiple chunks, each containing the data of 50 objects.</description>
      <pubDate>Fri, 01 Jan 2021 00:00:00 GMT</pubDate>
      <guid isPermaLink="false">http://hdl.handle.net/2451/63308</guid>
      <dc:date>2021-01-01T00:00:00Z</dc:date>
    </item>
    <item>
      <title>Calibration Data</title>
      <link>http://hdl.handle.net/2451/63307</link>
      <description>Title: Calibration Data
Authors: Sebastian Koch; Yurii Piadyk; Markus Worchel; Marc Alexa; Claudio Silva; Denis Zorin; Daniele Panozzo
Abstract: Images of a real scene taken with a camera commonly differ from synthetic images of a virtual replica of the same scene, despite advances in light transport simulation and calibration. By explicitly co-developing the scanning hardware and rendering pipeline we are able to achieve a negligible per-pixel difference between the real image taken by the camera and the synthesized image on a geometrically complex calibration object with known material properties. This approach provides an ideal test-bed for developing data-driven algorithms in the area of 3D reconstruction, as the synthetic data is indistinguishable from real data and can be generated at large scale. Pixel-wise matching also provides an effective way to quantitatively evaluate data-driven reconstruction algorithms.
Description: Calibration data for the physical scans, including camera and projector calibrations. A sample scan of the flat calibration board is also included for accuracy testing.</description>
      <pubDate>Fri, 01 Jan 2021 00:00:00 GMT</pubDate>
      <guid isPermaLink="false">http://hdl.handle.net/2451/63307</guid>
      <dc:date>2021-01-01T00:00:00Z</dc:date>
    </item>
    <item>
      <title>Physical Scans</title>
      <link>http://hdl.handle.net/2451/63306</link>
      <description>Title: Physical Scans
Authors: Sebastian Koch; Yurii Piadyk; Markus Worchel; Marc Alexa; Claudio Silva; Denis Zorin; Daniele Panozzo
Abstract: Images of a real scene taken with a camera commonly differ from synthetic images of a virtual replica of the same scene, despite advances in light transport simulation and calibration. By explicitly co-developing the scanning hardware and rendering pipeline we are able to achieve a negligible per-pixel difference between the real image taken by the camera and the synthesized image on a geometrically complex calibration object with known material properties. This approach provides an ideal test-bed for developing data-driven algorithms in the area of 3D reconstruction, as the synthetic data is indistinguishable from real data and can be generated at large scale. Pixel-wise matching also provides an effective way to quantitatively evaluate data-driven reconstruction algorithms.
Description: Physical scans of 3 machined white calibration objects (Pawn, Rook, and Shapes) and 7 color-textured 3D-printed test objects (Dodo, Vessel/Bird, House, Radio, Sculpture/Avocado, Chair, Vase). The calibration objects have also been scanned with the ambient lights off, including an additional scan of the flat plane object for material calibration. Rotating-stage calibration was performed before and after scanning the objects to cross-check for any disturbances. Additional stage calibration was performed for a pair of Pawn scans with matte and glossy material coatings. The collections of patterns used are also included.</description>
      <pubDate>Fri, 01 Jan 2021 00:00:00 GMT</pubDate>
      <guid isPermaLink="false">http://hdl.handle.net/2451/63306</guid>
      <dc:date>2021-01-01T00:00:00Z</dc:date>
    </item>
  </channel>
</rss>