Live Metric 3D Reconstruction on Mobile Phones
ICCV 2013
Main Contents
1. Target & Related Work
2. Main Features of This System
3. System Overview & Workflow
4. Detail of This System
5. Experiments
6. Conclusion
1. Target & Related Work
The first dense stereo-based system for live interactive 3D reconstruction on mobile phones: it generates dense 3D models with absolute scale on-site while simultaneously providing the user with real-time interactive feedback.

Related work
Wendel et al. [1] rely on a distributed framework with a variant of PTAM on a micro air vehicle. All demanding computations are performed on a separate server machine that provides visual feedback to a tablet computer.
[1] A. Wendel, M. Maurer, G. Graber, T. Pock, and H. Bischof. Dense reconstruction on-the-fly. CVPR 2012. (Graz University of Technology, Austria)

Pan et al. [2] demonstrated an interactive system for 3D reconstruction on a mobile phone.
[2] Q. Pan, C. Arth, E. Rosten, G. Reitmayr, and T. Drummond. Rapid scene reconstruction on mobile phones from panoramic images. ISMAR 2011. (Cambridge University & Graz University of Technology)

Prisacariu et al. [3] presented a shape-from-silhouette framework running in real time on a mobile phone.
[3] V. A. Prisacariu, O. Kaehler, D. Murray, and I. Reid. Simultaneous 3D tracking and reconstruction on a mobile phone. ISMAR 2013. (University of Oxford)

2. Main Features of This System
(1) Initialization: fully automatic; markers or any other special setup are not required.
(2) Metric scale estimation: feature-based tracking and mapping run in real time, while inertial sensing of position and orientation is used to estimate the metric scale of the reconstructed 3D models.
(3) Interactivity: suitable keyframes are selected automatically when the phone is held still, and the intermediate motion is used to calculate scale; visual and auditory feedback enables intuitive and fool-proof operation.
(4) Dense stereo matching: an efficient and accurate multi-resolution scheme for dense stereo matching with GPU acceleration reduces the processing time to interactive speed.

Demo: an example of how this system works.

3. System Overview & Workflow
The system has two main input streams:
(1) camera frames: 640x480 at 15-30 Hz;
(2) inertial sensor data: angular velocity at 200 Hz, linear acceleration at 100 Hz.
The output is a 3D model in metric coordinates in the form of a colored point cloud.
The system consists of three main blocks: inertial tracking, visual pose estimation, and dense 3D modeling.
[System workflow diagram: after initialization, the inertial sensors provide R_B and x_i to the visual tracker; the visual tracker outputs R_v and x_v; once the scale is fixed, the fused position x_f feeds sparse mapping and depth/dense 3D modeling.]
Notation: R_B is the rotation from the current world frame to the body/camera frame; R_v is the rotation refinement from the visual tracker; x_f, x_v, and x_i denote the fused, vision, and inertial position estimates in world coordinates.
4. Detail of This System
4.1 Initialization
4.2 Inertial Sensor
4.3 Visual Tracker
4.4 Sparse Mapping
4.5 Depth 3D Modeling
4.1 Initialization
(1) Two-view initialization: the map is initialized from two keyframes, Keyframe-1 and Keyframe-2, selected when the inertial estimator detects a salient motion with a minimal baseline. The pipeline is (see the sketch below):
- ORB features are extracted and matched;
- RANSAC with the 5-point algorithm estimates the relative pose (R, t);
- the matched points are triangulated.
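As an illustration of this pipeline, here is a minimal Python/OpenCV sketch; the image paths and the intrinsic matrix K are hypothetical placeholders, and OpenCV stands in for the phone-side implementation described in the paper.

```python
# Minimal sketch of two-view initialization: ORB matching, 5-point RANSAC
# essential matrix, pose recovery, and triangulation. K and the image paths
# are illustrative assumptions, not values from the paper.
import cv2
import numpy as np

K = np.array([[520.0, 0, 320.0],
              [0, 520.0, 240.0],
              [0, 0, 1.0]])  # assumed pinhole intrinsics for a 640x480 frame

img1 = cv2.imread("keyframe1.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("keyframe2.png", cv2.IMREAD_GRAYSCALE)

# 1. Extract and match ORB features.
orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des1, des2)

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# 2. RANSAC + 5-point algorithm for the essential matrix, then (R, t).
E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                  prob=0.999, threshold=1.0)
_, R, t, pose_mask = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)

# 3. Triangulate the inlier matches into an initial (up-to-scale) map.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
good = pose_mask.ravel() > 0
X_h = cv2.triangulatePoints(P1, P2, pts1[good].T, pts2[good].T)
X = (X_h[:3] / X_h[3]).T  # Nx3 initial map points
```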
(2) A denser initial map, built over a multi-resolution pyramid (640x480, 320x240, 160x120, 80x60):
- the map is rotated (aligned with the inertially defined world frame);
- FAST corners are extracted, with an 8x8 patch as descriptor;
- candidates are compared by their ZSSD value along a segment of the epipolar line;
- matched points are triangulated;
- the new points are included in the map, followed by bundle adjustment.
A code sketch of the ZSSD epipolar search follows below.
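To make the matching step concrete, here is a hedged sketch of ZSSD patch comparison along a sampled epipolar segment; only the 8x8 patch size comes from the slides, while the sampling of the segment and the search bounds are illustrative assumptions.

```python
# Sketch of ZSSD (zero-mean sum of squared differences) patch matching along
# an epipolar segment between the projections of the d_min and d_max depth
# hypotheses. Sampling and bounds are assumptions for illustration.
import numpy as np

def zssd(patch_a: np.ndarray, patch_b: np.ndarray) -> float:
    """Zero-mean SSD: subtracting each patch's mean gives brightness invariance."""
    a = patch_a.astype(np.float32) - patch_a.mean()
    b = patch_b.astype(np.float32) - patch_b.mean()
    return float(np.sum((a - b) ** 2))

def extract_patch(img, x, y, half=4):
    """8x8 patch centered at integer pixel (x, y)."""
    return img[y - half:y + half, x - half:x + half]

def match_along_epipolar(img1, img2, corner_xy, epipolar_pts):
    """Find the best ZSSD match for a FAST corner from img1 among integer
    points sampled on the corresponding epipolar segment in img2."""
    ref = extract_patch(img1, *corner_xy)
    best_xy, best_cost = None, np.inf
    for (x, y) in epipolar_pts:
        cand = extract_patch(img2, x, y)
        if cand.shape != ref.shape:   # skip points too close to the border
            continue
        cost = zssd(ref, cand)
        if cost < best_cost:
            best_cost, best_xy = cost, (x, y)
    return best_xy, best_cost
```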
[Figure: the epipolar search region in Keyframe-2 between cameras C1 and C2, bounded by the projections of the minimum and maximum depth hypotheses d_min and d_max for a point m.]
[Figure: the world frame axes W_X, W_Y, W_Z are defined from the gravity vector g and the magnetic field m measured in the body frame B; the camera frame axes are cam_X, cam_Y, cam_Z.]
The camera and IMU are considered to be at the same location and with the same orientation. R_B is the rotation from the current world frame to the body/camera frame; R_v is the corresponding visual measurement.

Introduction of the Inertial Sensors
An Inertial Measurement Unit (IMU) combines a gyroscope and an accelerometer.

Accelerometer
An accelerometer is a sensor that measures acceleration along a given axis. Output: the three components of the acceleration along the axes of the coordinate system defined by the device. When a physical body accelerates in a certain direction, it is subject to a force F = ma, in accordance with Newton's second law.

Gyroscope
A gyroscope is a device for measuring or maintaining orientation, based on the principle of angular momentum: when no external torque acts on an object or a closed system of objects, no change of angular momentum can occur. Output: the three components of the angular velocity in the coordinate system defined by the device.
4.2 Inertial Sensor
Pose Prediction with the Inertial Sensor:
[Diagram: the gyroscope's angular velocity w is integrated to the rotation estimate R_B; the accelerometer's body-frame acceleration a_B (with gravity g_B) is integrated via Verlet integration to the inertial position x_i and velocity v_i; a Kalman filter fuses these with the visual tracker's R_v and x_v into the fused estimate x_f once the scale is fixed.]
The estimation of the rotation: R_B is estimated with the Kalman filter prediction and update equations. The prediction integrates the gyroscope's angular velocity, and the update corrects the prediction with the visual tracker's rotation R_v (see the sketch below).
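The filter equations themselves are not reproduced on the slide, so the following is only a hedged sketch of the predict/update cycle, using a complementary-style blend with gain alpha in place of the actual Kalman update.

```python
# Sketch of the rotation predict/update loop. The gain alpha and the
# geodesic-blend update are illustrative stand-ins for the Kalman equations.
import numpy as np

def so3_exp(w):
    """Rodrigues' formula: rotation vector -> rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-9:
        return np.eye(3)
    k = w / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def so3_log(R):
    """Rotation matrix -> rotation vector (inverse of so3_exp)."""
    c = np.clip((np.trace(R) - 1) / 2, -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-9:
        return np.zeros(3)
    v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta / (2 * np.sin(theta)) * v

def predict_rotation(R_B, omega, dt=1.0 / 200.0):
    """Prediction: integrate the gyroscope's angular velocity (200 Hz)."""
    return R_B @ so3_exp(omega * dt)

def update_rotation(R_B, R_v, alpha=0.1):
    """Update: pull the prediction toward the visual rotation R_v.
    alpha = 0 keeps the prediction, alpha = 1 adopts R_v entirely."""
    return R_B @ so3_exp(alpha * so3_log(R_B.T @ R_v))
```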
The estimation of the positions: the subscripts f, v, and i denote the fused, vision, and inertial position estimates, and k is the normalizing factor; a decaying velocity model is used for prediction (a sketch follows below).
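A hedged sketch of the position propagation and fusion: the gravity handling, time step, fusion weight k, and velocity decay factor are illustrative assumptions, not the paper's values.

```python
# Sketch of inertial position propagation with a simplified velocity-Verlet
# step (100 Hz accelerometer) plus a simple fused estimate and a decaying
# velocity model. All constants are illustrative.
import numpy as np

GRAVITY_W = np.array([0.0, 0.0, -9.81])  # gravity in the world frame

def verlet_step(x_i, v_i, a_B, R_B, dt=0.01):
    """One integration step: rotate the body-frame acceleration a_B into the
    world frame (R_B maps world -> body, hence the transpose), remove
    gravity, then integrate position and velocity."""
    a_W = R_B.T @ a_B + GRAVITY_W
    x_new = x_i + v_i * dt + 0.5 * a_W * dt**2
    v_new = v_i + a_W * dt
    return x_new, v_new

def fuse_positions(x_v, x_i, k=0.8):
    """Fused estimate x_f as a normalized blend of the visual and inertial
    positions (k is an assumed weight standing in for the slide's factor)."""
    return k * x_v + (1.0 - k) * x_i

def decay_velocity(v_i, rho=0.9):
    """Decaying velocity model used when no fresh measurement is available."""
    return rho * v_i
```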
Metric scale estimation with the inertial sensors
The scale for visual-inertial fusion is estimated by comparing displacements over the same interval: the displacement estimated by the accelerometer against the displacement estimated by vision.
[Figure: displacement segments between camera positions C1 and C2 in the world frame (W_X, W_Y, W_Z), with corresponding vision/inertial displacement pairs (x_1, y_1), (x_2, y_2), (x_3, y_3).]
To deal with the noise and time-dependent bias of the accelerometer, an event-based outlier-rejection scheme is proposed: displacement pairs whose discrepancy exceeds a threshold are rejected, and the optimal scale is found from the remaining pairs (x_j, y_j); see the sketch below.
As soon as the scale estimation converges, the inertial position can be updated with visual measurements.
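Assuming the optimal scale is the least-squares ratio between paired visual and inertial displacement magnitudes, a minimal sketch of the event-based estimation could look as follows; the threshold value and the exact objective are assumptions, since the slide does not reproduce the formula.

```python
# Sketch of metric scale estimation from paired displacement magnitudes:
# x_j = displacement from vision (up to scale), y_j = metric displacement
# from the integrated accelerometer. Objective and threshold are assumptions.
import numpy as np

def estimate_scale(x, y, threshold=0.1):
    """Estimate s such that s * x_j ~ y_j, rejecting outlier pairs."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    s = np.dot(x, y) / np.dot(x, x)            # least-squares scale
    # Event-based outlier rejection: drop pairs with large residuals, refit.
    residuals = np.abs(s * x - y)
    inliers = residuals <= threshold
    if inliers.sum() >= 2:
        s = np.dot(x[inliers], y[inliers]) / np.dot(x[inliers], x[inliers])
    return s

# Usage: displacements collected over several still-to-still motion events.
x_vis = [0.12, 0.25, 0.18, 0.30]    # visual displacements (arbitrary units)
y_imu = [0.06, 0.125, 0.40, 0.15]   # metric displacements (m); third is an outlier
print(estimate_scale(x_vis, y_imu))  # ~0.5 after rejecting the outlier
```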
4.3 Visual Tracker
The visual tracker refines the pose estimate from the inertial pose estimator and corrects drift. If visual tracking is lost, the image localization module from PTAM is used.
Tracking pipeline (see the sketch below): starting from the inertial pose prior, FAST corners {m_i} are detected in the image and matched against the potentially visible map points {X_i} with ZSSD feature matching; the correspondences {X_i, m_i} are fed to a robust Levenberg-Marquardt absolute pose estimator.
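As a rough stand-in for the final stage of this pipeline, the sketch below uses OpenCV's RANSAC PnP plus Levenberg-Marquardt refinement; the paper's exact robust estimator may differ, and all inputs are placeholders.

```python
# Sketch of the robust L-M absolute pose step using OpenCV as a stand-in.
# X: Nx3 visible map points {X_i}; m: Nx2 matched corners {m_i}; K: intrinsics.
# The inertial pose (rvec0, tvec0) serves as the initial guess.
import cv2
import numpy as np

def refine_pose(X, m, K, rvec0, tvec0):
    X = np.ascontiguousarray(X, dtype=np.float64)
    m = np.ascontiguousarray(m, dtype=np.float64)
    # Robust correspondence filtering with RANSAC, seeded by the inertial prior.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        X, m, K, None, rvec=rvec0, tvec=tvec0,
        useExtrinsicGuess=True, reprojectionError=3.0,
        flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok or inliers is None:
        return rvec0, tvec0  # fall back to the inertial prediction
    # Levenberg-Marquardt refinement on the inlier set.
    rvec, tvec = cv2.solvePnPRefineLM(
        X[inliers.ravel()], m[inliers.ravel()], K, None, rvec, tvec)
    return rvec, tvec
```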
4.4 Sparse Mapping
New keyframes are added when the camera has moved a certain amount, or when the inertial position estimator detects that the phone is held still after a salient motion.
Candidates for new map points: non-maximum-suppressed FAST corners whose Shi-Tomasi score exceeds a certain threshold.
Adding new map points: a mask indicating the already covered regions is created, so that already existing points are not mapped again (a sketch follows below).
[Figure: cameras C1, C2, C3; the image mask of C3, with the masked area marking regions already covered by existing map points.]
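A hedged sketch of the masking and candidate selection: existing map points are projected into the keyframe and a disc around each projection is blocked before corner detection. The mask radius, FAST threshold, and Shi-Tomasi threshold are assumptions.

```python
# Sketch of candidate selection for new map points: project existing map
# points into the keyframe, mask the covered regions, then keep only strong
# non-max-suppressed FAST corners outside the mask. img is grayscale.
import cv2
import numpy as np

def candidate_corners(img, map_points_3d, rvec, tvec, K,
                      mask_radius=8, shi_tomasi_thresh=0.01):
    h, w = img.shape[:2]
    mask = np.full((h, w), 255, np.uint8)

    # Mask out regions already covered by existing map points.
    proj, _ = cv2.projectPoints(map_points_3d, rvec, tvec, K, None)
    for (x, y) in proj.reshape(-1, 2):
        if 0 <= x < w and 0 <= y < h:
            cv2.circle(mask, (int(x), int(y)), mask_radius, 0, -1)

    # Non-max-suppressed FAST corners outside the masked area.
    fast = cv2.FastFeatureDetector_create(threshold=20, nonmaxSuppression=True)
    kps = fast.detect(img, mask)

    # Keep corners whose Shi-Tomasi (min eigenvalue) score is high enough.
    eig = cv2.cornerMinEigenVal(img, blockSize=3)
    return [kp for kp in kps
            if eig[int(kp.pt[1]), int(kp.pt[0])] > shi_tomasi_thresh]
```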
After a keyframe is added, the mapping tasks are scheduled by priority (see the sketch below):
local bundle adjustment > keyframe optimization for dense modeling > global bundle adjustment.
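A minimal sketch of how this priority ordering could be realized with a task queue; the task names mirror the slide, while the queue mechanics are an assumption.

```python
# Sketch of the mapping thread's priority scheduling after a new keyframe.
# Lower value = higher priority; task bodies are stubs.
import heapq

LOCAL_BA, DENSE_KF_OPT, GLOBAL_BA = 0, 1, 2

def on_keyframe_added(queue):
    heapq.heappush(queue, (LOCAL_BA, "local bundle adjustment"))
    heapq.heappush(queue, (DENSE_KF_OPT, "keyframe optimization for dense modeling"))
    heapq.heappush(queue, (GLOBAL_BA, "global bundle adjustment"))

queue = []
on_keyframe_added(queue)
while queue:
    _, task = heapq.heappop(queue)
    print("running:", task)   # placeholder for the actual optimization
```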