For a considerable period, 3D data capture from mobile mapping systems was based entirely or primarily on LiDAR sensors. Meanwhile, camera-based systems can deliver highly efficient and accurate 3D mapping even in complex urban environments, as demonstrated in the city center of Basel using data from several campaigns within the ongoing cooperation between the ifp and the Institute of Geomatics Engineering (IVGI, http://www.fhnw.ch/habg/ivgi), University of Applied Sciences and Arts Northwestern Switzerland (FHNW).
Mobile mapping images are predominantly captured in the direction of travel, so the epipoles are located within the images. Since standard planar rectification approaches fail in this configuration, we additionally implemented the polar rectification technique in the SURE framework. Subsequent dense image matching investigations showed that incorporating not only stereo images captured at the same point in time but also the two previous and the two following images increases the completeness, reliability and accuracy of both the point clouds and the depth maps.
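The rectification issue can be made concrete: planar rectification degenerates when an epipole lies inside (or close to) the image, which is exactly the forward-motion case. The following sketch, a simplification that is not the SURE implementation, recovers the epipole from a fundamental matrix and checks whether polar rectification is required:

```python
import numpy as np

def epipole_from_F(F):
    """Epipole in the second image: the null space of F^T (F^T e' = 0),
    taken from the SVD and normalized to homogeneous form."""
    _, _, Vt = np.linalg.svd(F.T)
    e = Vt[-1]
    return e / e[2]

def needs_polar_rectification(F, width, height):
    """Planar rectification fails when the epipole falls inside the
    image, as with stereo pairs captured along the driving direction."""
    ex, ey, _ = epipole_from_F(F)
    return 0.0 <= ex < width and 0.0 <= ey < height
```

For a forward-motion pair the epipole sits near the image center, so `needs_polar_rectification` returns `True` and the polar scheme (resampling along epipolar lines radiating from the epipole) is selected instead.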
Figure 3 depicts the benefit of exploiting imagery from the back-right as well as from the left stereovision system in addition to forward imagery. While incorporating back-right imagery leads to a significant increase in sidewalk points (Figure 3e), the left stereovision system covers a larger part of the road surface and is also beneficial for lower façade points (Figure 3f).
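The reliability gain from redundant stereo pairs comes from cross-checking per-pixel depth hypotheses against each other. This is a minimal sketch of such a consistency-based fusion (the function name, tolerance and voting scheme are illustrative assumptions, not the actual SURE fusion):

```python
import numpy as np

def fuse_depths(depth_stack, min_views=2, rel_tol=0.02):
    """Fuse per-pixel depth hypotheses from several stereo pairs.

    depth_stack: array (n_views, H, W), NaN where a pair found no match.
    A pixel survives only if at least min_views hypotheses agree with
    the median within rel_tol, trading some completeness for reliability.
    """
    med = np.nanmedian(depth_stack, axis=0)
    # Comparisons against NaN are False, so missing views never vote.
    agree = np.abs(depth_stack - med) <= rel_tol * med
    support = agree.sum(axis=0)
    fused = np.where(support >= min_views, med, np.nan)
    return fused, support
```

Adding the back-right or left stereo pairs raises the per-pixel support count, which is why the fused point clouds become both denser and more reliable.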
In order to provide accurate measurements in a given reference frame, high-quality georeferencing of the captured multi-view image sequences is required. Moreover, sub-pixel accurate orientations of these highly redundant image sequences are needed to optimally perform subsequent steps such as dense multi-image matching. While direct georeferencing of image-based mobile mapping data performs well in open areas, poor GNSS coverage in urban canyons makes these high accuracy requirements difficult to fulfill, even with high-grade inertial navigation equipment.
Thus, we conducted comprehensive investigations to assess the quality of directly georeferenced sensor orientations as well as the improvement that can be expected from image-based georeferencing in a challenging urban environment. Our study repeatedly revealed mean trajectory deviations of up to 80 cm. By performing image-based georeferencing using bundle adjustment for a limited set of cameras and a limited number of ground control points, mean check point residuals could be reduced from approx. 40 cm to 4 cm. Furthermore, we showed that largely automated image-based georeferencing is capable of detecting and compensating for discontinuities in directly georeferenced trajectories.
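The accuracy figures quoted above are typically derived from 3D residuals at independent check points. A minimal sketch of such an evaluation (the function name and the use of mean and RMS norms are assumptions about a standard evaluation, not the exact metric definition used in the study):

```python
import numpy as np

def checkpoint_stats(measured, reference):
    """Mean and RMS of 3D residual norms between georeferenced
    measurements and surveyed check point coordinates (same units)."""
    d = np.asarray(measured, float) - np.asarray(reference, float)
    norms = np.linalg.norm(d, axis=1)
    return norms.mean(), np.sqrt((norms ** 2).mean())
```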
Aiming at further increases in accuracy and robustness, we are currently developing a georeferencing approach that incorporates multi-view stereo image sequences into bundle adjustment, exploiting constraints on the calibrated offsets and rotations between the respective cameras. Further developments will enable large-scale processing of multiple mobile mapping sequences.
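The idea of a relative orientation constraint can be illustrated as an extra residual in the adjustment: since the cameras are rigidly mounted, the relative pose of any simultaneously exposed pair should equal its calibrated offset and rotation. The sketch below shows one plausible form of such a residual (world-to-camera convention x_cam = R x_world + t; function names and the angle-based rotation error are illustrative assumptions, not the method of the cited paper):

```python
import numpy as np

def relative_pose(R_a, t_a, R_b, t_b):
    """Pose of camera b in camera a's frame, for x_cam = R x_world + t."""
    R_ab = R_b @ R_a.T
    t_ab = t_b - R_ab @ t_a
    return R_ab, t_ab

def rig_constraint_residual(R_a, t_a, R_b, t_b, R_calib, t_calib):
    """Residual penalizing deviation of the current relative pose of a
    rigidly mounted camera pair from its calibrated offset/rotation.
    Rotation error is the angle of R_calib^T R_ab via its trace."""
    R_ab, t_ab = relative_pose(R_a, t_a, R_b, t_b)
    R_err = R_calib.T @ R_ab
    cos_angle = np.clip((np.trace(R_err) - 1.0) / 2.0, -1.0, 1.0)
    rot_res = np.arccos(cos_angle)
    trans_res = t_ab - t_calib
    return np.hstack([rot_res, trans_res])
```

In a bundle adjustment, residuals of this form would be stacked alongside the reprojection residuals, so that poorly constrained exposures are tied to their rig partners instead of drifting independently.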
Cavegn, S., Blaser, S., Nebiker, S. & Haala, N.
Robust and Accurate Image-Based Georeferencing Exploiting Relative Orientation Constraints. ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., Vol. IV-2, pp. 57-64, 2018.
Cavegn, S. & Haala, N.
Image-Based Mobile Mapping for 3D Urban Data Capture. Photogrammetric Engineering & Remote Sensing, 82(12), 2016.
Cavegn, S., Nebiker, S. & Haala, N.
A Systematic Comparison of Direct and Image-Based Georeferencing in Challenging Urban Areas. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., Prague, Czech Republic, Vol. XLI-B1, pp. 529-536, 2016.
Cavegn, S., Nebiker, S. & Haala, N.
Ein systematischer Vergleich zwischen direkter und bildbasierter Georeferenzierung von Mobile Mapping-Stereosequenzen in einem anspruchsvollen Stadtgebiet [A Systematic Comparison of Direct and Image-Based Georeferencing of Mobile Mapping Stereo Sequences in a Challenging Urban Area]. DGPF Tagungsband 25/2016, Bern, Switzerland, pp. 113-123.
Cavegn, S., Haala, N., Nebiker, S., Rothermel, M. & Zwölfer, T.
Evaluation of Matching Strategies for Image-Based Mobile Mapping. ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., La Grande Motte, France, Vol. II-3/W5, pp. 361-368, 2015.