Dahlem Center for Machine Learning and Robotics
Fisheye Camera System Calibration for Automotive
Applications
Christian Kühling
Matriculation number: 4481432
kuehling@zedat.fu-berlin.de
Second examiner: Prof. Dr. Raúl Rojas
Advisor: Fritz Ulbrich
Berlin, 05.05.2017
Abstract
In this thesis, the imagery of the fisheye camera system mounted to the autonomous car MadeInGermany is rectified and calibrated. Hereby, distortions caused by fisheye lenses are automatically corrected and a surround view of the vehicle is created. Over the next decade, autonomous cars are expected to radically change mobility as we know it. While intelligent software systems have made astonishing improvements over the past years, the eventual quality of autonomous cars depends on their sensors capturing the environment. Among the most important sensors are cameras for visual input. Hence, it is no surprise that current autonomous prototypes often carry multiple cameras, for example to prevent blind spots. For this specific reason, fisheye lenses with a large field of view are often used. To utilize recordings of these camera systems in computer vision algorithms, a camera calibration is required. It consists of the intrinsic calibration, rectifying possible distortions, and the extrinsic calibration, determining the position and pose of the cameras.

Statement of Academic Integrity

Hereby, I declare that I have composed the presented paper independently on my own and without any other resources than the ones indicated. All thoughts taken directly or indirectly from external sources are properly denoted as such. This paper has neither been previously submitted to another authority nor has it been published yet.
05.05.2017
Christian Kühling
Contents

1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 AutoNOMOS project . . . . . . . . . . . . . . . . . . . . . . . 2
1.3 Goal . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.4 Thesis structure . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.5 Related Work . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
2 Fundamentals . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.1 Hardware setup . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.2 Theoretical fundamentals . . . . . . . . . . . . . . . . . . . . 8
2.2.1 Mathematical notations . . . . . . . . . . . . . . . . . 8
2.2.2 Coordinate systems . . . . . . . . . . . . . . . . . . . 9
2.2.3 Pinhole camera model . . . . . . . . . . . . . . . . . . 10
2.3 Intrinsic camera calibration . . . . . . . . . . . . . . . . . . . 14
2.3.1 Mei's calibration toolbox . . . . . . . . . . . . . . . . 14
2.3.2 Scaramuzza's OcamCalibToolbox . . . . . . . . . . . . 16
2.3.3 OpenCV camera calibration . . . . . . . . . . . . . . . 18
2.4 Extrinsic camera calibration . . . . . . . . . . . . . . . . . . . 21
2.4.1 Approaches . . . . . . . . . . . . . . . . . . . . . . . . 21
2.4.2 CamOdoCal . . . . . . . . . . . . . . . . . . . . . . . . 25
2.5 Used libraries and software . . . . . . . . . . . . . . . . . . . 32
2.5.1 ROS . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
2.5.2 OpenCV . . . . . . . . . . . . . . . . . . . . . . . . . . 32
2.5.3 LibPCAP . . . . . . . . . . . . . . . . . . . . . . . . . 32
2.5.4 LibJPEG-turbo . . . . . . . . . . . . . . . . . . . . . . 33
2.5.5 Docker . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
2.5.6 MATLAB . . . . . . . . . . . . . . . . . . . . . . . . . 33
3 Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . 35
3.1 The camera driver . . . . . . . . . . . . . . . . . . . . . . . . 35
3.2 Decompression of received compressed JPEG images . . . . . 36
3.3 Extrinsic and intrinsic camera calibration script . . . . . . . . 37
3.4 Rectification of distorted fisheye images . . . . . . . . . . . . 42
3.5 Surround view of the MIG . . . . . . . . . . . . . . . . . . . 42
3.6 Overview of the Implementation . . . . . . . . . . . . . . . . 45
4 Results and conclusion . . . . . . . . . . . . . . . . . . . . . . 47
4.1 Results and discussion . . . . . . . . . . . . . . . . . . . . . . 47
4.1.1 Intrinsic parameters . . . . . . . . . . . . . . . . . . . 48
4.1.2 Extrinsic parameters . . . . . . . . . . . . . . . . . . . 50
4.1.3 Performance of the implementation . . . . . . . . . . . 54
4.2 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
4.3 Future work . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
List of Figures
2.1 The installed camera system . . . . . . . . . . . . . . . . . . . 6
2.2 Camera positions . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.3 Technical data of BroadR-Reach SatCAM . . . . . . . . . . . 7
2.5 Overview of the camera positions and their theoretical FOV . 7
2.6 Mathematical Notations . . . . . . . . . . . . . . . . . . . . . 8
2.10 Roll, pitch and yaw [Rol] . . . . . . . . . . . . . . . . . . . . . 13
2.11 Two types of distortion [Dis] . . . . . . . . . . . . . . . . . . . 14
2.13 Images taken with different optics [Hof13] . . . . . . . . . . . 17
2.14 Scaramuzza perspective projection . . . . . . . . . . . . . . . 18
2.15 Calibration patterns . . . . . . . . . . . . . . . . . . . . . . . 20
2.16 Fisheye image rectification [Opec] . . . . . . . . . . . . . . . . 20
2.18 MonoSLAM. Top: in operation. Bottom: SURF features added. [CAD11] . . . . . . . . . . . . . . . . . . . . . . . . 24
2.19 Inlier feature point tracks [HLP13] . . . . . . . . . . . . . . 29
2.20 Inlier feature point correspondences between rectified images 30
3.1 The original fish eye images of each camera . . . . . . . . . . 37
3.2 Visualization of a successfully driven path used by CamOdoCal for extrinsic camera calibration . . . . . . . . . . . . . . 40
3.3 The original fish eye and rectified image of the rear camera . 42
3.4 The original fish eye and undistorted top view image of the front camera . . . . . . . . . . . . . . . . . . . . . . . . . . 43
3.5 Image coordinate system in relation to world coordinate system [TOJG10] . . . . . . . . . . . . . . . . . . . . . . . . . 44
3.7 UML sequence diagram of the ROS implementation . . . . . 46
3.8 Basic structure of the camera calibration script . . . . . . . . 46
4.1 The original fish eye and undistorted image of the front camera with a chessboard calibration pattern . . . . . . . . . . 48
4.2 The original fish eye and undistorted image of the rear camera with a chessboard calibration pattern . . . . . . . . . . . 49
4.3 The original fish eye and undistorted image of the left camera with a chessboard calibration pattern . . . . . . . . . . . 49
4.4 The original fish eye and undistorted image of the right camera with a chessboard calibration pattern . . . . . . . . . . 50
4.5 Calculated translation vector . . . . . . . . . . . . . . . . . . 51
4.6 Measured translation vector . . . . . . . . . . . . . . . . . . . 51
4.7 Calculated roll, pitch and yaw angles . . . . . . . . . . . . . 52
4.8 Measured roll, pitch and yaw angles . . . . . . . . . . . . . . 52
4.9 Rviz screenshots of extrinsic camera parameters using the ROS transformation package . . . . . . . . . . . . . . . . . . 53
4.10 Set of surround view images . . . . . . . . . . . . . . . . . . 54
4.11 Performance of the implemented ROS nodes . . . . . . . . . . 55
Chapter 1

Introduction

1.1 Motivation

There has been a remarkable growth in the use of cameras on autonomous and human-driven vehicles. A 2014 analysis of the market predicts growth at a rate of over 50% Compound Annual Growth Rate until 2018 for Advanced Driver Assistance Systems [Gro]. Cameras offer a rich source of visual information, which can be processed in real-time thanks to recent advances in computing hardware. From the automotive perspective, multiple cameras are recommended for driver assistance applications such as lane detection, traffic light detection, the recognition of other traffic participants, or simply the elimination of blind spots, areas with little or no view.

Image processing applications utilizing multiple cameras on a vehicle require an accurate calibration. Camera calibration is divided into two parts: intrinsic and extrinsic calibration.

An accurate intrinsic camera calibration consists of an optimal set of parameters for a camera projection model that relates 2D image points to 3D scene points. This is especially challenging for fisheye lenses, whose advantage is a large field of view at the cost of strong distortions. The latter should be rectified by using the parameters resulting from the intrinsic calibration. An accurate extrinsic calibration corresponds to accurate camera positions and rotations with respect to a reference frame on the vehicle. This is needed whenever the size of an object has to be measured, or the location or position of an object has to be determined.

Overall, camera calibration is a crucial part of computer vision, being the prerequisite for most available computer vision algorithms.
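The projection model mentioned above can be sketched in a few lines. The following snippet implements the standard pinhole model with two radial distortion terms (the same structure OpenCV's pinhole calibration uses); all numeric parameter values are illustrative assumptions, not calibration results from this thesis.

```python
# Minimal sketch of the pinhole model with two radial distortion terms.
# All parameter values are illustrative assumptions, not calibration results.
FX, FY = 800.0, 800.0    # focal lengths in pixels
CX, CY = 640.0, 360.0    # principal point
K1, K2 = -0.12, 0.03     # radial distortion coefficients

def project(X, Y, Z):
    """Project a 3D point (camera frame, Z > 0) to pixel coordinates."""
    # 1. Perspective division onto the normalized image plane
    x, y = X / Z, Y / Z
    # 2. Radial distortion: scale normalized coordinates by (1 + k1*r^2 + k2*r^4)
    r2 = x * x + y * y
    s = 1.0 + K1 * r2 + K2 * r2 * r2
    # 3. Intrinsics: focal-length scaling plus principal-point offset
    return FX * x * s + CX, FY * y * s + CY
```

A point on the optical axis, e.g. `project(0.0, 0.0, 1.0)`, lands exactly on the principal point; with a negative k1, off-axis points are pulled towards the image centre, the barrel distortion typical of wide-angle lenses.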
[Figure 1.1: sensor setup of the MIG. Not displayed: the fisheye cameras used in this thesis.]

1.2 AutoNOMOS project
In 2006, Prof. Dr. Raúl Rojas and the students of his working group started the AutoNOMOS project [Neu16]. A Dodge Grand Caravan was converted into an autonomous car by installing several sensors and computer hardware. This car was called Spirit of Berlin. It participated in the DARPA (Defense Advanced Research Projects Agency) Grand Urban Challenge 2007, a competition for autonomous vehicles. Since 2007, tests were run while driving autonomously on the area of the former airport Berlin-Tempelhof. This led to public funds granted by the federal ministry of education and research, which resulted in the AutoNOMOS project [Aut]. Two new test vehicles were designed. The first one is called e-Instein, an electrically powered vehicle based on a Mitsubishi i-MiEV. The other car is a Volkswagen Passat Variant 3C equipped with Drive-by-Wire and Steer-by-Wire technology and, overall, more sensors and computer hardware than e-Instein. It is called MadeInGermany, which will be shortened to MIG going forward. Due to exceptional permissions, testing functionalities in real traffic situations in Berlin, Germany was possible. This resulted in various publications on miscellaneous topics like vehicle detection through stereo vision [Neu16], Swarm Behaviour for Path Planning [Rot14] or Radar/Lidar Sensor Fusion for Car-Following on Highways [GWSG11].
In figure [1.1] the following sensors are displayed:

- Hella Aglia INKA cameras: two front-orientated stereo cameras, placed next to the rear mirror.
- Lux laser scanner: for detecting obstacles, placed in the front and rear bumper bar.
- Smart Microwave Sensors GmbH (SMS) radar: at the front bumper, to detect the speed of preceding vehicles.
- Odometer (Applanix POS/LV system): for calculating the travelled distance and the wheel rotations. It is placed at the left rear wheel.
- TRW/Hella radar system: this radar system is placed in the front and rear area. It is installed for measuring the distance between the MIG and surrounding objects.
- Velodyne HDL-64E: to detect obstacles all around the MIG, this LIDAR system is placed on the roof of the vehicle.

Not all installed sensors are displayed in figure [1.1], because the setup changes from time to time: new sensors get installed, tested, deactivated or even removed. The MIG is also equipped with four fisheye cameras, which had not been used for research until now. In this thesis, this camera system is the main component, wherefore this thesis only refers to the MIG and does not discuss e-Instein. A detailed description of the fisheye camera system can be found in section [2.1].

1.3 Goal
The goal of this master's thesis is to calibrate a fisheye camera system for further usage. For this purpose, a camera driver is required in addition to an easy-to-use calibration script for obtaining the intrinsic and extrinsic camera parameters. The verification of the resulting intrinsic parameters will be done by rectifying the current fisheye distortion. Furthermore, a combination of the intrinsic and extrinsic calibrations represents the basis of a surround view. The aim of this work is also to provide the captured images and calibrations within the ROS (Robot Operating System) [2.5.1] framework, for further utilization.

Chapter 2

Fundamentals

This chapter provides the basic knowledge for the further progression of this thesis. It starts with the hardware setup [2.1], covering detailed information about the camera system installed in the MIG. Afterwards, the theoretical
fundamentals [2.2] like coordinate systems, the basics of camera models and finally the essentials of intrinsic [2.3] and extrinsic camera calibration [2.4] are presented. An overview of the used libraries and software [2.5] completes this chapter.
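As a preview of the rectification idea behind the intrinsic calibration, the following sketch maps each pixel of a virtual distortion-free pinhole image back to fisheye-image coordinates under the simple equidistant fisheye model r = f·θ. This is a common idealization, not the exact model of any toolbox discussed in this chapter, and all focal lengths and image-centre coordinates are made-up illustrative values.

```python
import math

# Inverse mapping for fisheye rectification under the equidistant model
# (r = f * theta). All parameter values are illustrative assumptions only.
F_FISH = 300.0            # fisheye focal length in pixels
F_RECT = 500.0            # focal length of the virtual pinhole camera
CXF, CYF = 640.0, 480.0   # fisheye image centre
CXR, CYR = 640.0, 480.0   # rectified image centre

def rectified_to_fisheye(u, v):
    """Return the fisheye-image coordinates to sample for rectified pixel (u, v)."""
    # Ray through the rectified pixel (undistorted pinhole model)
    x = (u - CXR) / F_RECT
    y = (v - CYR) / F_RECT
    r = math.hypot(x, y)
    if r == 0.0:
        return CXF, CYF                  # optical axis maps to the centre
    theta = math.atan(r)                 # angle between ray and optical axis
    rd = F_FISH * theta                  # equidistant fisheye radius
    return CXF + rd * x / r, CYF + rd * y / r
```

A full rectifier would evaluate this inverse mapping for every output pixel and bilinearly sample the fisheye frame; this remap structure is the same one OpenCV's undistortion functions use, although OpenCV's fisheye model adds higher-order polynomial terms in θ.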