Mathematics for Engineers
Hardcover, English, 2008
2 799 kr
Special-order item. Ships within 7-10 business days
Free shipping for members on orders of at least 249 kr.

This book offers comprehensive coverage of all the mathematical tools needed by engineers in the field of processing and transport of all forms of information, data and images, as well as in many other engineering disciplines. It provides the essential theories, equations and results in probability theory and statistics that form the basis for the presentation of signal processing, information theory, traffic and queueing theory, and reliability. The mathematical foundations of simulation are also covered. The book's accessible style will enable students, engineers and researchers new to this area to advance their knowledge of communication and other engineering technologies; it will also serve as a useful reference guide for anyone wishing to explore this field further.
Product information
- Publication date: 2008-12-02
- Dimensions: 158 x 236 x 33 mm
- Weight: 794 g
- Format: Hardcover
- Language: English
- Number of pages: 448
- Publisher: ISTE Ltd and John Wiley & Sons Inc
- ISBN: 9781848210554
Georges Fiche has worked at Alcatel-Lucent for more than 20 years, where he has acted as Technical Coordinator for Performance Standardization and as Performance Manager for the design of Alcatel products. Gerard Hebuterne was a member of the research lab at France Telecom for 20 years. He is currently a professor and is responsible for the "Network" department at INT (a French school of higher education).
Table of contents

- Preface 15
- Chapter 1. Probability Theory 19
- 1.1. Definition and properties of events 19
- 1.1.1. The concept of an event 19
- 1.1.2. Complementary events 21
- 1.1.2.1. Basic properties 21
- 1.1.3. Properties of operations on events 21
- 1.1.3.1. Commutativity 21
- 1.1.3.2. Associativity 21
- 1.1.3.3. Distributivity 21
- 1.1.3.4. Difference 21
- 1.1.3.5. De Morgan’s rules 22
- 1.2. Probability 23
- 1.2.1. Definition 23
- 1.2.2. Basic theorems and results 23
- 1.2.2.1. Addition theorem 23
- 1.2.2.2. Conditional probability 24
- 1.2.2.3. Multiplication theorem 25
- 1.2.2.4. The posterior probability theorem 26
- 1.3. Random variable 27
- 1.3.1. Definition 27
- 1.3.2. Probability functions of a random variable 27
- 1.3.2.1. Notations 27
- 1.3.2.2. Cumulative distribution function 27
- 1.3.2.3. Probability density function 27
- 1.3.3. Moments of a random variable 28
- 1.3.3.1. Moments about the origin 29
- 1.3.3.2. Central moments 29
- 1.3.3.3. Mean and variance 29
- 1.3.3.4. Example applications 31
- 1.3.4. Couples of random variables 32
- 1.3.4.1. Definition 32
- 1.3.4.2. Joint probability 32
- 1.3.4.3. Marginal probability of couples of random variables 33
- 1.3.4.4. Conditional probability of a couple of random variables 34
- 1.3.4.5. Functions of a couple of random variables 34
- 1.3.4.6. Sum of independent random variables 36
- 1.3.4.7. Moments of the sum of independent random variables 37
- 1.3.4.8. Practical interest 39
- 1.4. Convolution 40
- 1.4.1. Definition 40
- 1.4.2. Properties of the convolution operation 41
- 1.4.2.1. The convolution is commutative 41
- 1.4.2.2. Convolution of exponential distributions 41
- 1.4.2.3. Convolution of normal (Gaussian) distributions 41
- 1.5. Laplace transform 42
- 1.5.1. Definition 43
- 1.5.2. Properties 43
- 1.5.2.1. Fundamental property 43
- 1.5.2.2. Differentiation property 43
- 1.5.2.3. Integration property 44
- 1.5.2.4. Some common transforms 44
- 1.6. Characteristic function, generating function, z-transform 47
- 1.6.1. Characteristic function 47
- 1.6.1.1. Definition 47
- 1.6.1.2. Inversion formula 48
- 1.6.1.3. The concept of event indicator and the Heaviside function 48
- 1.6.1.4. Calculating the inverse function and residues 49
- 1.6.1.5. The residue theorem 50
- 1.6.1.6. Asymptotic formula 51
- 1.6.1.7. Moments 52
- 1.6.1.8. Some common transforms 53
- 1.6.2. Generating functions, z-transforms 54
- 1.6.2.1. Definition 54
- 1.6.2.2. Moments 54
- 1.6.2.3. Some common transforms 55
- 1.6.3. Convolution 56
- Chapter 2. Probability Laws 57
- 2.1. Uniform (discrete) distribution 58
- 2.2. The binomial law 58
- 2.3. Multinomial distribution 60
- 2.4. Geometric distribution 61
- 2.5. Hypergeometric distribution 62
- 2.6. The Poisson law 63
- 2.7. Continuous uniform distribution 65
- 2.8. Normal (Gaussian) distribution 66
- 2.9. Chi-2 distribution 70
- 2.10. Student distribution 71
- 2.11. Lognormal distribution 72
- 2.12. Exponential and related distributions 72
- 2.12.1. Exponential distribution 72
- 2.12.2. Erlang-k distribution 73
- 2.12.3. Hyperexponential distribution 75
- 2.12.4. Generalizing: Coxian distribution 77
- 2.12.5. Gamma distribution 77
- 2.12.6. Weibull distribution 78
- 2.13. Logistic distribution 79
- 2.14. Pareto distribution 81
- 2.15. A summary of the main results 82
- 2.15.1. Discrete distributions 82
- 2.15.2. Continuous distributions 83
- Chapter 3. Statistics 87
- 3.1. Descriptive statistics 88
- 3.1.1. Data representation 88
- 3.1.2. Statistical parameters 90
- 3.1.2.1. Fractiles 90
- 3.1.2.2. Sample mean 91
- 3.1.2.3. Sample variance 91
- 3.1.2.4. Moments 91
- 3.1.2.5. Mode 91
- 3.1.2.6. Other characterizations 92
- 3.2. Correlation and regression 92
- 3.2.1. Correlation coefficient 93
- 3.2.2. The regression curve 94
- 3.2.2.1. The least squares method 94
- 3.3. Sampling and estimation techniques 96
- 3.4. Estimation 97
- 3.4.1. Point estimation 98
- 3.4.1.1. Average 99
- 3.4.1.2. Variance 99
- 3.4.1.3. Estimating the mean and variance of a normal distribution 100
- 3.4.1.4. Example: estimating the average lifetime of equipment 101
- 3.4.2. Estimating confidence intervals 102
- 3.4.2.1. Example 1: estimating the mean of a normal distribution 104
- 3.4.2.2. Example 2: Chi-2 distribution in reliability 104
- 3.4.2.3. Estimating proportion 106
- 3.4.2.4. Estimating the parameter of a Poisson distribution 108
- 3.5. Hypothesis testing 108
- 3.5.1. Example: testing the mean value of a normal distribution 108
- 3.5.2. Chi-2 test: uniformity of a random generator 110
- 3.5.3. Correlation test 111
- Chapter 4. Signal Theory 113
- 4.1. Concept of signal and signal processing 113
- 4.2. Linear time-invariant systems and filtering 115
- 4.2.1. Linear time-invariant systems 115
- 4.2.2. Impulse response and convolution function of an LTI system 115
- 4.2.3. Filtering function 116
- 4.3. Fourier transform and spectral representation 117
- 4.3.1. Decomposing a periodic signal using Fourier series 118
- 4.3.2. Fourier transform of an arbitrary signal 119
- 4.3.3. Dirac delta function and its Fourier transform 121
- 4.3.4. Properties of Fourier transforms 124
- 4.3.4.1. Time and frequency shifts 124
- 4.3.4.2. Convolution product and filtering 125
- 4.3.4.3. Product of functions and transform convolution 125
- 4.3.4.4. Product of functions and modulation 126
- 4.3.4.5. Energy conservation and Parseval’s theorem 129
- 4.4. Sampling 130
- 4.4.1. Sampling function 130
- 4.4.2. Shannon sampling theorem 131
- 4.5. Quantization and coding 132
- 4.5.1. Quantization noise 133
- 4.5.2. Coding power 135
- 4.6. Discrete LTI system 136
- 4.7. Transforms for digital signal processing 136
- 4.7.1. The z-transform 136
- 4.7.1.1. Definition 137
- 4.7.1.2. Time translation 137
- 4.7.1.3. Discrete convolution 138
- 4.7.1.4. Inversion 139
- 4.7.2. Fourier transform of a discrete signal 140
- 4.7.3. Discrete Fourier transform 141
- 4.7.3.1. Definition 141
- 4.7.3.2. Properties 143
- 4.7.4. Cosine transform 144
- 4.7.5. The fast Fourier transform (FFT) 145
- 4.7.5.1. Cooley-Tukey FFT algorithm 145
- 4.8. Filter design and synthesis 146
- 4.8.1. Definitions and principles 147
- 4.8.1.1. Principle 147
- 4.8.1.2. Causality and stability 148
- 4.8.2. Finite impulse response (FIR) filters 148
- 4.8.2.1. Design methodology 149
- 4.8.2.2. FIR filter synthesis 152
- 4.8.2.3. Low-pass, high-pass and band-pass filters 154
- 4.8.3. Infinite impulse response (IIR) filters 154
- 4.8.3.1. Filter design from models of the s plane 156
- 4.8.3.2. Butterworth model 158
- 4.8.3.3. Chebychev model 160
- 4.8.3.4. Synthesis of IIR filters 161
- 4.8.3.5. Low-pass, high-pass and band-pass filters 162
- 4.8.4. Non-linear filtering 163
- 4.8.4.1. Median filtering at rank N 163
- 4.8.5. Filter banks and multirate systems 164
- 4.8.5.1. Sub- and up-sampling 165
- 4.8.5.2. Multirate filtering and polyphase bank 168
- 4.8.5.3. Signal decomposition and reconstruction 169
- 4.8.5.4. Half-band filters 171
- 4.8.5.5. Quadrature mirror filters 172
- 4.9. Spectral analysis and random signals 173
- 4.9.1. Statistical characterization of random signals 173
- 4.9.1.1. Average 174
- 4.9.1.2. Autocorrelation function 174
- 4.9.1.3. Power spectral density 176
- 4.9.2. Filtering a random signal 178
- 4.9.2.1. Spectral density of a filtered signal 178
- 4.9.2.2. Filtering white noise 180
- 4.9.3. Sampled random signals 180
- 4.9.3.1. Autocorrelation function 181
- 4.9.3.2. Cross-correlation function 181
- 4.9.3.3. Power spectral density 181
- 4.9.4. Filtering a sampled random signal 182
- 4.9.4.1. Spectral density of the filtered signal 183
- 4.9.4.2. Correlation between input and output signals (cross-correlation) 183
- 4.9.4.3. Colored noise 183
- 4.9.5. Spectral estimation 184
- 4.9.5.1. Estimating the autocorrelation function 184
- 4.9.5.2. Non-parametric spectral estimation with periodogram 185
- 4.9.5.3. Parametric spectral estimation: ARMA, AR, MA 187
- 4.9.5.4. Toeplitz matrix and Levinson algorithm 189
- 4.10. Linear prediction, coding and speech synthesis, adaptive filtering 192
- 4.10.1. Linear prediction 192
- 4.10.1.1. Variance minimization 193
- 4.10.1.2. Example of speech processing: coding and synthesis 194
- 4.10.2. Adaptive filtering 195
- 4.10.2.1. The least squares method with exponential forgetting 196
- 4.10.2.2. The stochastic gradient descent algorithm 197
- 4.10.2.3. Example: echo cancelation 198
- Chapter 5. Information and Coding Theory 201
- 5.1. Information theory 201
- 5.1.1. The basic diagram of a telecommunication system 202
- 5.2. Information measurement 202
- 5.2.1. Algebraic definition of information 203
- 5.2.2. Probabilistic definition of information 203
- 5.2.3. Self-information 204
- 5.2.4. Unit of information 204
- 5.2.5. Conditional information 205
- 5.2.6. Mutual information 205
- 5.3. Entropy 206
- 5.3.1. Entropy of a memoryless source 206
- 5.3.2. Entropy of a binary source 206
- 5.3.3. Maximum entropy 207
- 5.3.4. Joint entropy of random variables 208
- 5.3.5. Average conditional entropy 208
- 5.3.6. Additivity and joint entropy 209
- 5.3.7. Average mutual information 210
- 5.3.8. Conditional average mutual information 210
- 5.3.9. Extension to the continuous case 210
- 5.4. Source modeling 211
- 5.4.1. Concept of sources with and without memory 211
- 5.4.2. Discrete Markov source in discrete time 212
- 5.4.3. The main types of source 215
- 5.4.3.1. The binary source 215
- 5.4.3.2. Text, or alphabetic source 216
- 5.4.4. Information rate of the source 217
- 5.5. Source coding 217
- 5.5.1. Code efficiency 217
- 5.5.2. Redundancy of a code 219
- 5.5.3. Instantaneous and uniquely decipherable codes 220
- 5.6. Shannon’s first theorem 220
- 5.6.1. Optimal coding 220
- 5.6.2. Shannon’s first theorem 222
- 5.7. Coding and data compression 224
- 5.7.1. Huffman coding algorithm 224
- 5.7.2. Retrieval algorithms and decision trees 226
- 5.7.3. (Reversible) data compression 227
- 5.7.3.1. The Huffman method 228
- 5.7.3.2. Fano method 229
- 5.7.3.3. Shannon’s method 230
- 5.7.3.4. Arithmetic coding 230
- 5.7.3.5. Adaptive and dictionary methods 233
- 5.7.4. Image compression 234
- 5.7.4.1. Describing an image: luminance and chrominance 234
- 5.7.4.2. Image redundancy 236
- 5.7.4.3. The discrete cosine transform (DCT) 237
- 5.7.4.4. Quantization and coding 242
- 5.7.4.5. Recent methods: wavelets 244
- 5.7.4.6. JPEG2000 250
- 5.8. Channel modeling 252
- 5.8.1. Definition of the channel 252
- 5.8.2. Channel capacity 253
- 5.8.3. Binary symmetric channel 254
- 5.9. Shannon’s second theorem 254
- 5.9.1. The noisy-channel coding theorem (Shannon’s second theorem) 255
- 5.10. Error-detecting and error-correcting codes 256
- 5.10.1. Algebraic coding 257
- 5.10.1.1. Principles 257
- 5.10.1.2. Hamming distance 258
- 5.10.1.3. Detection and correction capability 260
- 5.10.1.4. Additional definitions and properties 261
- 5.10.1.5. Linear block codes, group codes 262
- 5.10.1.6. Cyclic codes 267
- 5.10.2. Convolutional codes 272
- 5.10.2.1. D-transform 273
- 5.10.2.2. Graphical representation, graphs and trellis 273
- 5.10.2.3. Viterbi’s algorithm 275
- 5.10.3. Combined codes and turbo codes 276
- 5.10.3.1. Interleaving 277
- 5.10.3.2. Product codes 278
- 5.10.3.3. Concatenation 278
- 5.10.3.4. Parallelization 278
- 5.10.3.5. Turbo codes 278
- 5.11. Cryptology 282
- 5.11.1. Encryption 282
- 5.11.1.1. Symmetric-key encryption 282
- 5.11.1.2. Public-key encryption 284
- 5.11.2. Digital signature 286
- 5.11.3. Signature and hashing 286
- Chapter 6. Traffic and Queueing Theory 289
- 6.1. Traffic concepts 289
- 6.1.1. The Erlang concept 290
- 6.1.2. Traffic modeling 291
- 6.2. The concept of processes 292
- 6.2.1. Arrival process 292
- 6.2.1.1. Renewal process 292
- 6.2.1.2. Poisson arrivals 293
- 6.2.1.3. The use of the Poisson process 296
- 6.2.2. Service process 296
- 6.2.2.1. Exponential distribution 297
- 6.2.2.2. Residual service time 297
- 6.2.2.3. Erlang distribution 299
- 6.2.2.4. Hyperexponential distribution 299
- 6.2.3. General arrival and service processes 300
- 6.3. Markov and birth/death processes 301
- 6.3.1. State concept 302
- 6.3.2. Markov chains 302
- 6.3.3. Birth and death processes 303
- 6.4. Queueing models 306
- 6.4.1. Introduction 306
- 6.4.2. A general result: the Little formula 307
- 6.4.3. PASTA property (Poisson arrivals see time averages) 309
- 6.4.4. The elementary queue: the M/M/1 system 310
- 6.4.4.1. Resolution of the state equations 310
- 6.4.4.2. Using generating functions 312
- 6.4.4.3. Waiting time distribution 313
- 6.4.5. The M/M/R/R model (Erlang model) 317
- 6.4.6. The M/M/R queue and the Erlang-C formula 318
- 6.4.7. The M/M/∞ queue and the Poisson law 321
- 6.4.8. The M(n)/M/R/R queue and the Engset formula 322
- 6.4.9. Models with limited capacity 325
- 6.5. More complex queues 325
- 6.5.1. Multi-bitrate Erlang model 325
- 6.5.2. The embedded Markov chain 326
- 6.5.3. The number of clients in a system 327
- 6.5.4. Waiting times: Pollaczek formulae 330
- 6.5.4.1. Introduction: calculation of residual service time 330
- 6.5.4.2. The Pollaczek-Khintchine formula 332
- 6.5.4.3. Example 1: the M/M/1 queue 333
- 6.5.4.4. Example 2: the M/D/1 queue 333
- 6.5.4.5. Generalization: Takacs’ formula 333
- 6.5.5. The Beneš method: application to the M/D/1 system 333
- 6.6. The G/G/1 queue 334
- 6.6.1. Pollaczek method 335
- 6.6.2. Application to the stochastic relation of the queue to one server (GI/G/1 queue) 337
- 6.6.3. Resolution of the integral equation 339
- 6.6.3.1. Application to the M/G/1 queue 339
- 6.6.3.2. Application to the G/M/1 queue 343
- 6.6.4. Other applications and extension of the Pollaczek method 346
- 6.7. Queues with priorities 347
- 6.7.1. Work conserving system 348
- 6.7.2. The HoL discipline 350
- 6.8. Using approximate methods 351
- 6.8.1. Reattempts 352
- 6.8.2. Peakedness factor method 353
- 6.8.3. Approximate formulae for the G/G/R system 358
- 6.9. Appendix: Pollaczek transform 359
- Chapter 7. Reliability Theory 363
- 7.1. Definition of reliability 363
- 7.2. Failure rate and bathtub curve 364
- 7.3. Reliability functions 365
- 7.4. System reliability 366
- 7.4.1. Reliability of non-repairable systems 366
- 7.4.1.1. Reliability of the series configuration 367
- 7.4.1.2. Reliability of the parallel configuration 368
- 7.4.1.3. Reliability of the series-parallel configuration 369
- 7.4.1.4. Reliability of the parallel-series configuration 370
- 7.4.1.5. Complex configurations 370
- 7.4.1.6. Non-repairable redundant configurations 371
- 7.4.2. Reliability and availability of repairable systems 373
- 7.4.2.1. State equations 373
- 7.4.2.2. Reliability of redundant repairable systems 375
- 7.4.2.3. Imperfect structures 380
- 7.4.3. Using Laplace transform 382
- 7.4.4. Use of matrices 385
- 7.4.4.1. Exact resolution by inversion 387
- 7.4.4.2. Approximate solutions 389
- 7.5. Software reliability 390
- 7.5.1. Reliability growth model, early-life period 391
- 7.5.2. Useful-life period model 392
- 7.6. Spare parts calculation 395
- 7.6.1. Definitions 395
- 7.6.2. Periodical restocking 396
- 7.6.3. Continuous restocking 396
- Chapter 8. Simulation 399
- 8.1. Roulette simulation 400
- 8.2. Discrete-event simulation 402
- 8.3. Measurements and accuracy 404
- 8.3.1. Measurements 404
- 8.3.2. Accuracy 404
- 8.4. Random numbers 407
- 8.4.1. Generation according to a distribution 407
- 8.4.2. Generating pseudo-random variables 408
- Appendix. Mathematical Refresher 413
- A.1. The function of the complex variable: definition and theorems 413
- A.2. Usual z-transforms 415
- A.3. Series expansions (real functions) 415
- A.4. Series expansion of a function of the complex variable 418
- A.5. Algebraic structures 419
- A.6. Polynomials over the binary finite field 422
- A.7. Matrices 424
- Bibliography 427
- Index 431