Handbook of Machine and Computer Vision
The Guide for Developers and Users
Hardcover, English, 2017
By Alexander Hornberg (University of Applied Sciences of Esslingen, Germany)
3 329 kr
Product information
- Publication date: 2017-04-19
- Dimensions: 158 x 249 x 46 mm
- Weight: 1,837 g
- Format: Hardcover
- Language: English
- Number of pages: 860
- Edition: 2
- Publisher: John Wiley & Sons Inc
- ISBN: 9783527413393
The editor, Alexander Hornberg, worked as a development and software engineer in industry. Since 1997 he has been working in the field of machine vision in an academic environment. He is Professor of Image Processing and Applied Optics at the University of Applied Sciences Esslingen, Germany. All contributions to this work are written by practitioners from leading companies operating in the field of computer vision.
Table of contents

Preface Second Edition xxiii
Preface First Edition xxv
List of Contributors xxvii

1 Processing of Information in the Human Visual System (Frank Schaeffel) 1
1.1 Preface 1
1.2 Design and Structure of the Eye 1
1.3 Optical Aberrations and Consequences for Visual Performance 3
1.4 Chromatic Aberration 10
1.5 Neural Adaptation to Monochromatic Aberrations 11
1.6 Optimizing Retinal Processing with Limited Cell Numbers, Space, and Energy 11
1.7 Adaptation to Different Light Levels 12
1.8 Rod and Cone Responses 14
1.9 Spiking and Coding 16
1.10 Temporal and Spatial Performance 17
1.11 ON/OFF Structure, Division of the Whole Illuminance Amplitude 18
1.12 Consequences of the Rod and Cone Diversity on Retinal Wiring 18
1.13 Motion Sensitivity in the Retina 19
1.14 Visual Information Processing in Higher Centers 20
1.14.1 Morphology 21
1.14.2 Functional Aspects – Receptive Field Structures and Cortical Modules 22
1.15 Effects of Attention 23
1.16 Color Vision, Color Constancy, and Color Contrast 23
1.17 Depth Perception 25
1.18 Adaptation in the Visual System to Color, Spatial, and Temporal Contrast 26
1.19 Conclusions 26
Acknowledgements 28
References 28

2 Introduction to Building a Machine Vision Inspection (Axel Telljohann) 31
2.1 Preface 31
2.2 Specifying a Machine Vision System 32
2.2.1 Task and Benefit 32
2.2.2 Parts 33
2.2.2.1 Different Part Types 33
2.2.3 Part Presentation 33
2.2.4 Performance Requirements 34
2.2.4.1 Accuracy 34
2.2.4.2 Time Performance 34
2.2.5 Information Interfaces 34
2.2.6 Installation Space 35
2.2.7 Environment 35
2.2.8 Checklist 35
2.3 Designing a Machine Vision System 36
2.3.1 Camera Type 36
2.3.2 Field of View 37
2.3.3 Resolution 38
2.3.3.1 Camera Sensor Resolution 38
2.3.3.2 Spatial Resolution 38
2.3.3.3 Measurement Accuracy 38
2.3.3.4 Calculation of Resolution 39
2.3.3.5 Resolution for a Line Scan Camera 39
2.3.4 Choice of Camera, Frame Grabber, and Hardware Platform 40
2.3.4.1 Camera Model 40
2.3.4.2 Frame Grabber 40
2.3.4.3 Pixel Rate 40
2.3.4.4 Hardware Platform 41
2.3.5 Lens Design 41
2.3.5.1 Focal Length 42
2.3.5.2 Lens Flange Focal Distance 43
2.3.5.3 Extension Tubes 43
2.3.5.4 Lens Diameter and Sensor Size 43
2.3.5.5 Sensor Resolution and Lens Quality 43
2.3.6 Choice of Illumination 44
2.3.6.1 Concept: Maximize Contrast 44
2.3.6.2 Illumination Setups 44
2.3.6.3 Light Sources 45
2.3.6.4 Approach to the Optimum Setup 45
2.3.6.5 Interfering Lighting 46
2.3.7 Mechanical Design 46
2.3.8 Electrical Design 46
2.3.9 Software 46
2.3.9.1 Software Library 47
2.3.9.2 Software Structure 47
2.3.9.3 General Topics 48
2.4 Costs 48
2.5 Words on Project Realization 49
2.5.1 Development and Installation 49
2.5.2 Test Run and Acceptance Test 49
2.5.3 Training and Documentation 50
2.6 Examples 50
2.6.1 Diameter Inspection of Rivets 50
2.6.1.1 Task 50
2.6.1.2 Specification 51
2.6.1.3 Design 51
2.6.2 Tubing Inspection 55
2.6.2.1 Task 55
2.6.2.2 Specification 55
2.6.2.3 Design 56

3 Lighting in Machine Vision (Irmgard Jahr) 63
3.1 Introduction 63
3.1.1 Prologue 63
3.1.2 The Involvement of Lighting in the Complex Machine Vision Solution 63
3.2 Demands on Machine Vision Lighting 67
3.3 Light Used in Machine Vision 70
3.3.1 What is Light? Axioms of Light 70
3.3.2 Light and Light Perception 73
3.3.3 Light Sources for Machine Vision 76
3.3.3.1 Incandescent Lamps/Halogen Lamps 77
3.3.3.2 Metal Vapor Lamps 78
3.3.3.3 Xenon Lamps 79
3.3.3.4 Fluorescent Lamps 81
3.3.3.5 LEDs (Light Emitting Diodes) 82
3.3.3.6 Lasers 85
3.3.4 The Light Sources in Comparison 86
3.3.5 Considerations for Light Sources: Lifetime, Aging, Drift 86
3.3.5.1 Lifetime 86
3.3.5.2 Aging and Drift 88
3.4 Interaction of Test Object and Light 91
3.4.1 Risk Factor Test Object 91
3.4.1.1 What Does the Test Object Do With the Incoming Light? 92
3.4.1.2 Reflection/Reflectance/Scattering 92
3.4.1.3 Total Reflection 95
3.4.1.4 Transmission/Transmittance 96
3.4.1.5 Absorption/Absorbance 97
3.4.1.6 Diffraction 99
3.4.1.7 Refraction 100
3.4.2 Light Color and Part Color 101
3.4.2.1 Visible Light (VIS) – Monochromatic Light 101
3.4.2.2 Visible Light (VIS) – White Light 103
3.4.2.3 Infrared Light (IR) 104
3.4.2.4 Ultraviolet (UV) Light 106
3.4.2.5 Polarized Light 107
3.5 Basic Rules and Laws of Light Distribution 109
3.5.1 Basic Physical Quantities of Light 110
3.5.2 The Photometric Inverse Square Law 111
3.5.3 The Constancy of Luminance 113
3.5.4 What Light Arrives at the Sensor – Light Transmission Through the Lens 114
3.5.5 Light Distribution of Lighting Components 115
3.5.6 Contrast 118
3.5.7 Exposure 120
3.6 Light Filters 121
3.6.1 Characteristic Values of Light Filters 121
3.6.2 Influences of Light Filters on the Optical Path 123
3.6.3 Types of Light Filters 124
3.6.4 Anti-Reflective Coatings (AR) 126
3.6.5 Light Filters for Machine Vision 127
3.6.5.1 UV Blocking Filter 127
3.6.5.2 Daylight Suppression Filter 128
3.6.5.3 IR Suppression Filter 128
3.6.5.4 Neutral Filter/Neutral Density Filter/Gray Filter 129
3.6.5.5 Polarization Filter 130
3.6.5.6 Color Filters 130
3.6.5.7 Filter Combinations 131
3.7 Lighting Techniques and Their Use 131
3.7.1 How to Find a Suitable Lighting? 131
3.7.2 Planning the Lighting Solution – Influence Factors 133
3.7.3 Lighting Systematics 135
3.7.3.1 Directional Properties of the Light 135
3.7.3.2 Arrangement of the Lighting 138
3.7.3.3 Properties of the Illuminated Field 138
3.7.4 The Lighting Techniques in Detail 140
3.7.4.1 Diffuse Bright Field Incident Light (No. 1, Table 3.14) 140
3.7.4.2 Directed Bright Field Incident Light (No. 2, Table 3.14) 142
3.7.4.3 Telecentric Bright Field Incident Light (No. 3, Table 3.14) 143
3.7.4.4 Structured Bright Field Incident Light (No. 4, Table 3.14) 145
3.7.4.5 Diffuse Directed Partial Bright Field Incident Light (Nos. 1 and 2, Table 3.14) 148
3.7.4.6 Diffuse/Directed Dark Field Incident Light (Nos. 5 and 6, Table 3.14) 152
3.7.4.7 The Limits of the Incident Lighting 154
3.7.4.8 Diffuse Bright Field Transmitted Lighting (No. 7, Table 3.14) 155
3.7.4.9 Directed Bright Field Transmitted Lighting (No. 8, Table 3.14) 157
3.7.4.10 Telecentric Bright Field Transmitted Lighting (No. 9, Table 3.14) 158
3.7.4.11 Diffuse/Directed Transmitted Dark Field Lighting (Nos. 10 and 11, Table 3.14) 161
3.7.5 Combined Lighting Techniques 162
3.8 Lighting Control 163
3.8.1 Reasons for Light Control – The Environmental Industrial Conditions 164
3.8.2 Electrical Control 164
3.8.2.1 Stable Operation 164
3.8.2.2 Brightness Control 166
3.8.2.3 Temporal Control: Static-Pulse-Flash 167
3.8.2.4 Some Considerations for the Use of Flash Light 168
3.8.2.5 Temporal and Local Control: Adaptive Lighting 171
3.8.3 Geometrical Control 173
3.8.3.1 Lighting from Large Distances 173
3.8.3.2 Light Deflection 175
3.8.4 Suppression of Ambient and Extraneous Light – Measures for a Stable Lighting 175
3.9 Lighting Perspectives for the Future 176
References 177

4 Optical Systems in Machine Vision (Karl Lenhardt) 179
4.1 A Look at the Foundations of Geometrical Optics 179
4.1.1 From Electrodynamics to Light Rays 179
4.1.2 Basic Laws of Geometrical Optics 181
4.2 Gaussian Optics 183
4.2.1 Reflection and Refraction at the Boundary between Two Media 183
4.2.2 Linearizing the Law of Refraction – The Paraxial Approximation 185
4.2.3 Basic Optical Conventions 186
4.2.3.1 Definitions for Image Orientations 186
4.2.3.2 Definition of the Magnification Ratio β 186
4.2.3.3 Real and Virtual Objects and Images 187
4.2.3.4 Tilt Rule for the Evaluation of Image Orientations by Reflection 188
4.2.4 Cardinal Elements of a Lens in Gaussian Optics 189
4.2.4.1 Focal Lengths f and f′ 192
4.2.4.2 Convention 192
4.2.5 Thin Lens Approximation 193
4.2.6 Beam-Converging and Beam-Diverging Lenses 193
4.2.7 Graphical Image Constructions 195
4.2.7.1 Beam-Converging Lenses 195
4.2.7.2 Beam-Diverging Lenses 195
4.2.8 Imaging Equations and Their Related Coordinate Systems 195
4.2.8.1 Reciprocity Equation 196
4.2.8.2 Newton's Equations 197
4.2.8.3 General Imaging Equation 198
4.2.8.4 Axial Magnification Ratio 200
4.2.9 Overlapping of Object and Image Space 200
4.2.10 Focal Length, Lateral Magnification, and the Field of View 200
4.2.11 Systems of Lenses 202
4.2.12 Consequences of the Finite Extension of Ray Pencils 205
4.2.12.1 Effects of Limitations of the Ray Pencils 205
4.2.12.2 Several Limiting Openings 207
4.2.12.3 Characterizing the Limits of Ray Pencils 210
4.2.12.4 Relation to the Linear Camera Model 212
4.2.13 Geometrical Depth of Field and Depth of Focus 214
4.2.13.1 Depth of Field as a Function of the Object Distance p 215
4.2.13.2 Depth of Field as a Function of β 216
4.2.13.3 Hyperfocal Distance 217
4.2.13.4 Permissible Size for the Circle of Confusion d′ 218
4.2.14 Laws of Central Projection – Telecentric System 219
4.2.14.1 Introduction to the Laws of Perspective 219
4.2.14.2 Central Projection from Infinity – Telecentric Perspective 228
4.3 Wave Nature of Light 235
4.3.1 Introduction 235
4.3.2 Rayleigh–Sommerfeld Diffraction Integral 236
4.3.3 Further Approximations to the Huygens–Fresnel Principle 238
4.3.3.1 Fresnel's Approximation 239
4.3.4 Impulse Response of an Aberration-Free Optical System 241
4.3.4.1 Case of Circular Aperture, Object Point on the Optical Axis 243
4.3.5 Intensity Distribution in the Neighborhood of the Geometrical Focus 244
4.3.5.1 Special Cases 246
4.3.6 Extension of the Point Spread Function in a Defocused Image Plane 248
4.3.7 Consequences for the Depth of Field Considerations 249
4.3.7.1 Diffraction and Permissible Circle of Confusion 249
4.3.7.2 Extension of the Point Spread Function at the Limits of the Depth of Focus 250
4.3.7.3 Useful Effective f-Number 251
4.4 Information Theoretical Treatment of Image Transfer and Storage 252
4.4.1 Physical Systems as Linear Invariant Filters 252
4.4.1.1 Invariant Linear Systems 255
4.4.1.2 Note to the Representation of Harmonic Waves 259
4.4.2 Optical Transfer Function (OTF) and the Meaning of Spatial Frequency 260
4.4.2.1 Note on the Relation Between the Elementary Functions in the Two Representation Domains 261
4.4.3 Extension to the Two-Dimensional Case 261
4.4.3.1 Interpretation of Spatial Frequency Components (r, s) 261
4.4.3.2 Reduction to One-Dimensional Representations 262
4.4.4 Impulse Response and MTF for Semiconductor Imaging Devices 265
4.4.5 Transmission Chain 267
4.4.6 Aliasing Effect and the Space-Variant Nature of Aliasing 267
4.4.6.1 Space-Variant Nature of Aliasing 274
4.5 Criteria for Image Quality 277
4.5.1 Gaussian Data 277
4.5.2 Overview on Aberrations of the Third Order 277
4.5.2.1 Monochromatic Aberrations of the Third Order (Seidel Aberrations) 278
4.5.2.2 Chromatic Aberrations 278
4.5.3 Image Quality in the Space Domain: PSF, LSF, ESF, and Distortion 278
4.5.3.1 Distortion 280
4.5.4 Image Quality in the Spatial Frequency Domain: MTF 281
4.5.4.1 Parameters that Influence the Modulation Transfer Function 282
4.5.5 Other Image Quality Parameters 283
4.5.5.1 Relative Illumination (Relative Irradiance) 283
4.5.5.2 Deviation from Telecentricity (for Telecentric Lenses only) 284
4.5.6 Manufacturing Tolerances and Image Quality 284
4.5.6.1 Measurement Errors due to Mechanical Inaccuracies of the Camera System 285
4.6 Practical Aspects: How to Specify Optics According to the Application Requirements? 285
4.6.1 Example for the Calculation of an Imaging Constellation 287
References 289

5 Camera Calibration (Robert Godding) 291
5.1 Introduction 291
5.2 Terminology 292
5.2.1 Camera, Camera System 292
5.2.2 Coordinate Systems 292
5.2.3 Interior Orientation and Calibration 293
5.2.4 Exterior and Relative Orientation 293
5.2.5 System Calibration 293
5.3 Physical Effects 293
5.3.1 Optical System 293
5.3.2 Camera and Sensor Stability 294
5.3.3 Signal Processing and Transfer 294
5.4 Mathematical Calibration Model 295
5.4.1 Central Projection 295
5.4.2 Camera Model 295
5.4.3 Focal Length and Principal Point 297
5.4.4 Distortion and Affinity 297
5.4.5 Radial Symmetrical Distortion 297
5.4.6 Radial Asymmetrical and Tangential Distortion 299
5.4.7 Affinity and Nonorthogonality 299
5.4.8 Variant Camera Parameters 299
5.4.9 Sensor Flatness 301
5.4.10 Other Parameters 301
5.5 Calibration and Orientation Techniques 302
5.5.1 In the Laboratory 302
5.5.2 Using Bundle Adjustment to Determine Camera Parameters 302
5.5.2.1 Calibration Based Exclusively on Image Information 302
5.5.2.2 Calibration and Orientation with Additional Object Information 304
5.5.2.3 Extended System Calibration 307
5.5.3 Other Techniques 307
5.6 Verification of Calibration Results 308
5.7 Applications 309
5.7.1 Applications with Simultaneous Calibration 309
5.7.2 Applications with Precalibrated Cameras 311
5.7.2.1 Tube Measurement within a Measurement Cell 311
5.7.2.2 Online Measurements in the Field of Car Safety 312
5.7.2.3 High Resolution 3D Scanning with White Light Scanners 312
5.7.2.4 Other Applications 313
References 314

6 Camera Systems in Machine Vision (Horst Mattfeldt) 317
6.1 Camera Technology 317
6.1.1 History in Brief 317
6.1.2 Machine Vision versus Closed Circuit TeleVision (CCTV) 317
6.2 Sensor Technologies 319
6.2.1 Spatial Differentiation: 1D and 2D 319
6.2.2 CCD Technology 320
6.2.2.1 Interline Transfer 321
6.2.2.2 Progressive Scan Interline Transfer 321
6.2.2.3 Interlaced Scan Readout 322
6.2.2.4 Enhancing Frame Rate by Multitap Sensors 324
6.2.2.5 SONY HAD Technology 325
6.2.2.6 SONY SuperHAD (II) and ExViewHAD (II) Technology 325
6.2.2.7 CCD Image Artifacts 326
6.2.2.8 Blooming 326
6.2.2.9 Smear 326
6.2.3 CMOS Image Sensor 328
6.2.3.1 Advantages of CMOS Sensor 328
6.2.3.2 CMOS Sensor Shutter Concepts 331
6.2.3.3 Performance Comparison of CMOS versus CCD 336
6.2.3.4 Integration Complexity of CCD versus CMOS Camera Technology 336
6.2.3.5 CMOS Sensor Sensitivity Enhancements 337
6.2.4 MATRIX VISION Available Cameras 338
6.2.4.1 Why So Many Different Models? How to Choose Among These? 338
6.2.4.2 Resolution and Video Standards 338
6.2.4.3 Sensor Sizes and Dimensions 344
6.3 Block Diagrams and Their Description 344
6.3.1 Block Diagram of SONY Progressive Scan Analog Camera 345
6.3.1.1 CCD Read Out Clocks 345
6.3.1.2 CCD Binning Mode 345
6.3.1.3 Spectral Sensitivity 348
6.3.1.4 Analog Signal Processing 348
6.3.1.5 Camera and Frame Grabber 350
6.3.2 Block Diagram of Color Camera with Digital Image Processing 350
6.3.2.1 Bayer™ Complementary Color Filter Array 351
6.3.2.2 Complementary Color Filters Spectral Sensitivity 351
6.3.2.3 Generation of Color Signals 351
6.4 mvBlueCOUGAR-X Line of Cameras 354
6.4.1 Black and White Digital Camera mvBlueCOUGAR-X Camera Series 355
6.4.1.1 Gray Level Sensor and Processing 355
6.4.2 Color Camera mvBlueCOUGAR-X Family 356
6.4.2.1 Analog Processing 356
6.4.2.2 Analog Front End (AFE) 357
6.4.2.3 A/D Conversion 357
6.4.2.4 One-Chip Color Processing 359
6.4.2.5 Inputting Time Stamp Data into Data Stream 361
6.4.2.6 Statistics Engine for White Balance and Auto Features 361
6.4.2.7 Image Memory 361
6.4.2.8 Lookup Table (LUT) and Gamma Function 362
6.4.2.9 Shading Correction 365
6.4.2.10 Reducing Noise by Adaptive Recursive Frame Averaging 366
6.4.2.11 Color Interpolation 367
6.4.2.12 Color Correction 368
6.4.2.13 RGB → YUV Conversion 370
6.4.3 Controlling Image Capture 371
6.4.4 Acquisition and Trigger Modes 371
6.4.4.1 Sequencer 374
6.4.4.2 Latency and Jitter Aspects 375
6.4.4.3 Action Commands 375
6.4.4.4 Scheduled Action Command 377
6.4.5 Data Transmission 377
6.4.5.1 GigE Vision and GVSP 378
6.4.5.2 USB3 Vision 380
6.4.6 Pixel Data 380
6.4.7 Camera Connection 381
6.4.8 Operating the Camera 381
6.4.9 HiRose Jack Pin Assignment 382
6.4.10 Sensor Frame Rates and Bandwidth 382
6.5 Configuration of a GigE Vision Camera 384
6.6 Qualifying Cameras and Noise Measurement (Dr. Gert Ferrano, MV) 386
6.6.1 Explanation of the Most Important Measurements 388
6.6.1.1 Linearity Curve 388
6.6.1.2 Photon Transfer Curve 388
6.7 Camera Noise (by Henning Haider, AVT; updated by the author) 391
6.7.1 Photon Noise 391
6.7.2 Dark Current Noise 391
6.7.3 Fixed Pattern Noise (FPN) 392
6.7.4 Photo Response Non Uniformity (PRNU) 392
6.7.5 Reset Noise 392
6.7.6 1/f Noise (Amplifier Noise) 392
6.7.7 Quantization Noise 392
6.7.8 Noise Floor 393
6.7.9 Dynamic Range 393
6.7.10 Signal to Noise Ratio 393
6.7.11 Example 1: SONY IMX-174 Sensor (mvBlueFOX3-2024) 394
6.7.12 Example 2: CMOSIS CMV2000 (mvBlueCOUGAR-X104) 394
6.8 Useful Links and Literature 394
6.9 Digital Interfaces 395

7 Smart Camera and Vision Systems Design (Howard D. Gray and Nate Holmes) 399
7.1 Introduction to Vision System Design 399
7.2 Definitions 400
7.3 Smart Cameras 403
7.3.1 Applications 403
7.3.2 Component Parts 404
7.3.2.1 Processors 404
7.3.2.2 FPGA Processing 406
7.3.2.3 Memory and Storage 407
7.3.2.4 Operating Systems 408
7.3.2.5 Image Sensors 409
7.3.2.6 Inputs and Outputs 410
7.3.2.7 Other Interfaces 412
7.3.2.8 Timers and Counters 413
7.3.3 Programming and Configuring 413
7.3.3.1 Scripting 413
7.3.3.2 High-Level Languages 414
7.3.3.3 Third-Party Tools 416
7.3.4 Environment 416
7.3.4.1 Power Dissipation 416
7.3.4.2 Ingress Protection 417
7.4 Vision Sensors 418
7.4.1 Applications 419
7.4.2 Component Parts 420
7.4.3 Programming and Configuring 420
7.4.4 Environment 421
7.5 Embedded Vision Systems 421
7.5.1 Applications 424
7.5.1.1 Multi-Camera Applications 424
7.5.1.2 Closed Loop Control Applications 424
7.5.2 Component Parts 425
7.5.3 Programming and Configuring 425
7.5.4 Environment 425
7.6 Conclusion 425
References 426
Further Reading 429

8 Camera Computer Interfaces (Nate Holmes) 431
8.1 Overview 431
8.2 Camera Buses 432
8.2.1 Software Standards 433
8.2.1.1 GenICam 433
8.2.1.2 IIDC2 434
8.2.2 Analog Camera Buses (Legacy) 435
8.2.2.1 Analog Video Signal 436
8.2.2.2 Interlaced Video 436
8.2.2.3 Progressive Scan Video 436
8.2.2.4 Timing Signals 437
8.2.2.5 Analog Image Acquisition 437
8.2.2.6 S-Video 438
8.2.2.7 RGB 438
8.2.2.8 Analog Connectors 439
8.2.3 Parallel Digital Camera Buses (Legacy) 439
8.2.3.1 Digital Video Transmission 439
8.2.3.2 Taps 440
8.2.3.3 Differential Signaling 441
8.2.3.4 Line Scan 441
8.2.3.5 Parallel Digital Connectors 441
8.2.4 IEEE 1394 (FireWire) (Legacy) 442
8.2.4.1 IEEE 1394 for Machine Vision 445
8.2.5 Camera Link 449
8.2.5.1 Camera Link Signals 450
8.2.5.2 Camera Link Connectors 451
8.2.6 Camera Link HS 451
8.2.7 CoaXPress 452
8.2.8 USB (USB3 Vision) 452
8.2.8.1 USB for Machine Vision 454
8.2.9 Gigabit Ethernet (GigE Vision) 455
8.2.9.1 Gigabit Ethernet for Machine Vision 456
8.2.9.2 GigE Vision Device Discovery 456
8.2.9.3 GigE Vision Control Protocol (GVCP) 456
8.2.9.4 GenICam 457
8.2.9.5 GigE Vision Stream Protocol (GVSP) 457
8.2.9.6 Packet Loss and Resends 457
8.2.10 Future Standards Development 458
8.3 Choosing a Camera Bus 459
8.3.1 Bandwidth 459
8.3.2 Resolution 459
8.3.3 Frame Rate 460
8.3.4 Cables 460
8.3.5 Line Scan 460
8.3.6 Reliability 460
8.3.7 Summary of Camera Bus Specifications 461
8.3.8 Sample Use Cases 461
8.3.8.1 Manufacturing Inspection 461
8.3.8.2 LCD Inspection 462
8.3.8.3 Security 463
8.4 Computer Buses 463
8.4.1 ISA/EISA 463
8.4.2 PCI/CompactPCI/PXI 464
8.4.3 PCI-X 466
8.4.4 PCI Express/CompactPCI Express/PXI Express 467
8.4.5 Throughput 469
8.4.6 Prevalence and Lifetime 471
8.4.6.1 Cost 471
8.5 Choosing a Computer Bus 471
8.5.1 Determine Throughput Requirements 471
8.5.2 Applying the Throughput Requirements 473
8.6 Driver Software 473
8.6.1 Application Programming Interface 475
8.6.2 Supported Platforms 477
8.6.3 Performance 477
8.6.4 Utility Functions 478
8.6.5 Acquisition Mode 479
8.6.5.1 Snap 479
8.6.5.2 Grab 479
8.6.5.3 Sequence 480
8.6.5.4 Ring 481
8.6.6 Image Representation 482
8.6.6.1 Image Representation in Memory 482
8.6.7 Bayer Color Encoding 485
8.6.7.1 Image Representation on Disk 487
8.6.8 Image Display 487
8.6.8.1 Understanding Display Modes 488
8.6.8.2 Palettes 489
8.6.8.3 Nondestructive Overlays 490
8.7 Features of a Machine Vision System 491
8.7.1 Image Reconstruction 491
8.7.2 Timing and Triggering 492
8.7.3 Memory Handling 494
8.7.4 Additional Features 496
8.7.4.1 Look-Up Tables 497
8.7.4.2 Region of Interest 499
8.7.4.3 Color Space Conversion 499
8.7.4.4 Shading Correction 501
8.8 Summary 501
References 502

9 Machine Vision Algorithms (Carsten Steger) 505
9.1 Fundamental Data Structures 505
9.1.1 Images 505
9.1.2 Regions 506
9.1.3 Subpixel-Precise Contours 508
9.2 Image Enhancement 509
9.2.1 Gray Value Transformations 509
9.2.2 Radiometric Calibration 512
9.2.3 Image Smoothing 517
9.2.4 Fourier Transform 528
9.3 Geometric Transformations 532
9.3.1 Affine Transformations 532
9.3.2 Projective Transformations 533
9.3.3 Image Transformations 534
9.3.4 Polar Transformations 538
9.4 Image Segmentation 540
9.4.1 Thresholding 540
9.4.2 Extraction of Connected Components 548
9.4.3 Subpixel-Precise Thresholding 550
9.5 Feature Extraction 552
9.5.1 Region Features 552
9.5.2 Gray Value Features 556
9.5.3 Contour Features 559
9.6 Morphology 560
9.6.1 Region Morphology 561
9.6.2 Gray Value Morphology 575
9.7 Edge Extraction 579
9.7.1 Definition of Edges in One and Two Dimensions 579
9.7.2 1D Edge Extraction 583
9.7.3 2D Edge Extraction 589
9.7.4 Accuracy of Edges 596
9.8 Segmentation and Fitting of Geometric Primitives 602
9.8.1 Fitting Lines 603
9.8.2 Fitting Circles 607
9.8.3 Fitting Ellipses 608
9.8.4 Segmentation of Contours into Lines, Circles, and Ellipses 609
9.9 Camera Calibration 613
9.9.1 Camera Models for Area Scan Cameras 614
9.9.2 Camera Model for Line Scan Cameras 618
9.9.3 Calibration Process 622
9.9.4 World Coordinates from Single Images 626
9.9.5 Accuracy of the Camera Parameters 629
9.10 Stereo Reconstruction 631
9.10.1 Stereo Geometry 632
9.10.2 Stereo Matching 639
9.11 Template Matching 643
9.11.1 Gray-Value-Based Template Matching 644
9.11.2 Matching Using Image Pyramids 649
9.11.3 Subpixel-Accurate Gray-Value-Based Matching 652
9.11.4 Template Matching with Rotations and Scalings 653
9.11.5 Robust Template Matching 654
9.12 Optical Character Recognition 672
9.12.1 Character Segmentation 672
9.12.2 Feature Extraction 674
9.12.3 Classification 676
References 690

10 Machine Vision in Manufacturing (Peter Waszkewitz) 699
10.1 Introduction 699
10.1.1 The Machine Vision Market 699
10.2 Application Categories 701
10.2.1 Types of Tasks 701
10.2.2 Types of Production 703
10.2.2.1 Discrete Unit Production Versus Continuous Flow 703
10.2.2.2 Job-Shop Production Versus Mass Production 704
10.2.3 Types of Evaluations 704
10.2.4 Value-Adding Machine Vision 705
10.3 System Categories 706
10.3.1 Common Types of Systems 707
10.3.2 Sensors 707
10.3.3 Vision Sensors 708
10.3.4 Compact Systems 709
10.3.5 Vision Controllers 710
10.3.6 PC-Based Systems 710
10.3.6.1 Library-Based Systems 711
10.3.6.2 Application-Package-Based Systems 712
10.3.6.3 Library-Based Application Packages 713
10.3.7 Excursion: Embedded Image Processing 713
10.3.8 Summary 714
10.4 Integration and Interfaces 715
10.4.1 Standardization 715
10.4.2 Interfaces 716
10.5 Mechanical Interfaces 716
10.5.1 Dimensions and Fixation 717
10.5.2 Working Distances 718
10.5.3 Position Tolerances 718
10.5.4 Forced Constraints 719
10.5.5 Additional Sensor Requirements 719
10.5.6 Additional Motion Requirements 720
10.5.7 Environmental Conditions 721
10.5.8 Reproducibility 722
10.5.9 Gauge Capability 723
10.6 Electrical Interfaces 725
10.6.1 Wiring and Movement 726
10.6.2 Power Supply 726
10.6.3 Internal Data Connections 727
10.6.4 External Data Connections 729
10.7 Information Interfaces 729
10.7.1 Interfaces and Standardization 730
10.7.2 Traceability 730
10.7.3 Types of Data and Data Transport 731
10.7.4 Control Signals 731
10.7.5 Result and Parameter Data 732
10.7.6 Mass Data 733
10.7.7 Digital I/O 733
10.7.8 Field Bus 733
10.7.9 Serial Interfaces 734
10.7.10 Network 734
10.7.10.1 Standard Ethernet–TCP/IP 734
10.7.10.2 OPC UA and Industry 4.0 735
10.7.10.3 Ethernet-Based Field Bus/Real-Time Ethernet 735
10.7.11 Files 736
10.7.12 Time and Integrity Considerations 736
10.8 Temporal Interfaces 738
10.8.1 Discrete Motion Production 738
10.8.2 Continuous Motion Production 740
10.8.3 Line-Scan Processing 743
10.9 Human–Machine Interfaces 745
10.9.1 Interfaces for Engineering Vision Systems 746
10.9.2 Runtime Interface 747
10.9.2.1 Using the PLC HMI for Machine Vision 749
10.9.3 Remote Maintenance 750
10.9.3.1 Safety Precaution: No Movements 751
10.9.4 Offline Setup 751
10.10 3D Systems 753
10.10.1 Dimensionality and Representation 753
10.10.1.1 Dimensionality 753
10.10.1.2 2.5D and 3D 754
10.10.1.3 Point Clouds and Registration 755
10.10.1.4 Representation 757
10.10.2 3D Data Acquisition 757
10.10.2.1 Passive Methods 758
10.10.2.2 Active Methods 759
10.10.3 Applications 764
10.10.3.1 Identification 765
10.10.3.2 Completeness Check 765
10.10.3.3 Object and Pose Recognition 766
10.10.3.4 Shape and Dimension Applications 767
10.10.3.5 Surface Inspection 769
10.10.3.6 Robotics 770
10.10.4 Conclusion 771
10.11 Industrial Case Studies 772
10.11.1 Glue Check Under UV Light 772
10.11.1.1 Task 772
10.11.1.2 Solution 773
10.11.1.3 Equipment 773
10.11.1.4 Algorithms 774
10.11.1.5 Key Points 774
10.11.2 Completeness Check 774
10.11.2.1 Task 774
10.11.2.2 Solution 774
10.11.2.3 Key Point: Mechanical Setup 775
10.11.2.4 Equipment 775
10.11.2.5 Algorithms 775
10.11.3 Multiple Position and Completeness Check 776
10.11.3.1 Task 776
10.11.3.2 Solution 776
10.11.3.3 Key Point: Cycle Time 778
10.11.3.4 Equipment 778
10.11.3.5 Algorithms 779
10.11.4 Pin-Type Verification 779
10.11.4.1 Task 779
10.11.4.2 Solution 779
10.11.4.3 Key Point: Self-Test 781
10.11.4.4 Equipment 781
10.11.4.5 Algorithms 781
10.11.5 Robot Guidance 781
10.11.5.1 Task 781
10.11.5.2 Solution 782
10.11.5.3 Key Point: Calibration 782
10.11.5.4 Key Point: Communication 783
10.11.5.5 Equipment 784
10.11.5.6 Algorithms 784
10.11.6 Type and Result Data Management 784
10.11.6.1 Task 784
10.11.6.2 Solution 785
10.11.6.3 Key Point: Type Data 785
10.11.6.4 Key Point: Result Data 785
10.11.6.5 Equipment 786
10.11.7 Dimensional Check for Process Control 786
10.11.7.1 Task 786
10.11.7.2 Solution 787
10.11.7.3 Equipment 787
10.11.7.4 Algorithms 788
10.11.8 Ceramic Surface Check 788
10.11.8.1 Task 788
10.11.8.2 Solution 788
10.11.8.3 Equipment 789
10.12 Constraints and Conditions 789
10.12.1 Inspection Task Requirements 789
10.12.2 Circumstantial Requirements 790
10.12.2.1 Cost 791
10.12.2.2 Automation Environment 791
10.12.2.3 Organizational Environment 792
10.12.3 Refinements 793
10.12.4 Limits and Prospects 794
References 796
Appendix 801
Index 805
"[The editor] has compiled a wealth of information that addresses topics at both practical and theoretical levels. The book covers areas such as general system-design principles, lighting, optics, camera systems, computer interfaces, and algorithms. Section nine, 'Machine Vision in Manufacturing,' ... provides a solid introduction to what vision systems can do and the problems they can solve. Designers will ... find much useful information in the section 'Lighting in Machine Vision.' This chapter provides just the type of explanations and illustrations that help engineers properly plan and assemble light sources for an application. If you still perceive lighting techniques to be black magic, turn to this chapter for advice. I cannot think of another source that offers as much practical information about lighting. This reference book provides many helpful diagrams and photographs that illustrate how algorithms work, the results of lighting components in various ways, and how camera systems operate." (UBM Tech 2006)