
WarpingFusion: Accurate Multi-View TSDF Fusion with Local Perspective Warp

Jiwoo Kang, Seongmin Lee, Mingyu Jang, Hyunse Yoon, Sanghoon Lee

22 Sep 2021

In this paper, we propose a novel 3D reconstruction framework in which the surface of a target object is reconstructed accurately and robustly from multi-view depth maps. Even when accurate sensor synchronization is used to capture a moving object in multi-view reconstruction, the depth maps from the sensors tend to exhibit spatially-varying perspective warps due to motion blur and rolling-shutter artifacts. Incorporating these misaligned points from the views into the world coordinate system leads to significant artifacts in the reconstructed shape. We address the mismatches with a patch-based depth-to-surface alignment that uses an implicit surface-based distance measurement. The patch-based minimization enables the proposed method to find spatial warps on the depth map quickly and accurately while preserving the global transformation. The proposed framework efficiently optimizes the local alignments against depth occlusions and local variations thanks to the point-to-surface distance based on an implicit representation. In the experiments, the proposed method shows significant improvements over other reconstruction methods, demonstrating the efficiency and benefits of our method in multi-view reconstruction.
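To make the core idea in the abstract concrete, the following is a minimal Python sketch (not the authors' implementation) of aligning one depth-map patch to an implicit surface by minimizing the TSDF values sampled at the patch's 3D points, i.e., a point-to-surface distance. The function names sample_tsdf and align_patch, the axis-angle parameterization, and the use of scipy's least_squares solver are illustrative assumptions, not details from the paper.

# Sketch: patch-wise rigid alignment against a TSDF volume (assumptions noted above).
import numpy as np
from scipy.optimize import least_squares

def sample_tsdf(volume, voxel_size, origin, points):
    """Trilinearly sample a TSDF volume (3D numpy array) at world-space points (N, 3)."""
    idx = (points - origin) / voxel_size            # continuous voxel coordinates
    i0 = np.floor(idx).astype(int)
    f = idx - i0
    vals = np.zeros(len(points))
    for corner in np.ndindex(2, 2, 2):              # accumulate the 8 trilinear corners
        c = np.clip(i0 + np.array(corner), 0, np.array(volume.shape) - 1)
        w = np.prod(np.where(corner, f, 1.0 - f), axis=1)
        vals += w * volume[c[:, 0], c[:, 1], c[:, 2]]
    return vals

def align_patch(volume, voxel_size, origin, patch_points):
    """Estimate a small rigid warp (axis-angle + translation) for one depth patch
    by driving the sampled TSDF values (point-to-surface distances) toward zero."""
    def residuals(x):
        rvec, t = x[:3], x[3:]
        theta = np.linalg.norm(rvec)
        if theta < 1e-12:
            R = np.eye(3)
        else:
            k = rvec / theta
            K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
            R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * K @ K
        warped = patch_points @ R.T + t
        return sample_tsdf(volume, voxel_size, origin, warped)
    # Levenberg-Marquardt over 6 parameters; the patch should contain at least 6 points.
    sol = least_squares(residuals, np.zeros(6), method="lm")
    return sol.x

In practice, one such correction would be solved per depth-map patch so that local perspective warps are absorbed while the global camera transformation stays fixed, which is the behavior the abstract attributes to the patch-based minimization.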
