Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily with random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background.

The book has the following features:

- Several appendices include related material on integration, important inequalities and identities, frequency-domain transforms, and linear algebra. These topics have been included so that the book is relatively self-contained. One appendix contains an extensive summary of 33 random variables and their properties, such as moments, characteristic functions, and entropy.
- Unlike most books on probability, numerous figures have been included to clarify and expand upon important points. Over 600 illustrations and MATLAB plots have been designed to reinforce the material and illustrate the various characterizations and properties of random quantities.
- Sufficient statistics are covered in detail, as is their connection to parameter estimation techniques. These include classical and Bayesian estimation and several optimality criteria: mean-square error, mean-absolute error, maximum likelihood, method of moments, and least squares.
- The last four chapters provide an introduction to several topics usually studied in subsequent engineering courses: communication systems and information theory; optimal filtering (Wiener and Kalman); adaptive filtering (FIR and IIR); and antenna beamforming, channel equalization, and direction finding.
This material is available electronically at the companion website.

Probability, Random Variables, and Random Processes is the only textbook on probability for engineers that includes relevant background material, provides extensive summaries of key results, and extends various statistical techniques to a range of applications in signal processing.
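To give a flavor of the estimation criteria the blurb mentions, here is a minimal sketch comparing two of them, the method of moments and maximum likelihood, on a simple problem: estimating the upper endpoint of a uniform distribution. The book itself uses MATLAB; this sketch is in Python, and the function name `estimate_uniform_theta` is illustrative, not taken from the book.

```python
import random
import statistics

def estimate_uniform_theta(samples):
    """Estimate theta for Uniform(0, theta) data by two criteria."""
    # Method of moments: E[X] = theta/2, so match the sample mean.
    mom = 2 * statistics.fmean(samples)
    # Maximum likelihood: the likelihood is maximized by the sample maximum.
    mle = max(samples)
    return mom, mle

random.seed(0)
theta = 10.0
data = [random.uniform(0, theta) for _ in range(10_000)]
mom, mle = estimate_uniform_theta(data)
print(f"method of moments:  {mom:.3f}")   # close to theta
print(f"maximum likelihood: {mle:.3f}")   # slightly below theta
```

Note that the two criteria give different estimators here: the method-of-moments estimate can exceed the largest observation, while the maximum-likelihood estimate never can.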
JOHN J. SHYNK, PhD, is Professor of Electrical and Computer Engineering at the University of California, Santa Barbara. He was a Member of Technical Staff at Bell Laboratories, and received degrees in systems engineering, electrical engineering, and statistics from Boston University and Stanford University.
CONTENTS

PREFACE
NOTATION

1 Overview and Background
  1.1 Introduction
    1.1.1 Signals, Signal Processing, and Communications
    1.1.2 Probability, Random Variables, and Random Vectors
    1.1.3 Random Sequences and Random Processes
    1.1.4 Delta Functions
  1.2 Deterministic Signals and Systems
    1.2.1 Continuous Time
    1.2.2 Discrete Time
    1.2.3 Discrete-Time Filters
    1.2.4 State-Space Realizations
  1.3 Statistical Signal Processing with MATLAB®
    1.3.1 Random Number Generation
    1.3.2 Filtering
  Problems
  Further Reading

PART I Probability, Random Variables, and Expectation

2 Probability Theory
  2.1 Introduction
  2.2 Sets and Sample Spaces
  2.3 Set Operations
  2.4 Events and Fields
  2.5 Summary of a Random Experiment
  2.6 Measure Theory
  2.7 Axioms of Probability
  2.8 Basic Probability Results
  2.9 Conditional Probability
  2.10 Independence
  2.11 Bayes' Formula
  2.12 Total Probability
  2.13 Discrete Sample Spaces
  2.14 Continuous Sample Spaces
  2.15 Nonmeasurable Subsets of R
  Problems
  Further Reading

3 Random Variables
  3.1 Introduction
  3.2 Functions and Mappings
  3.3 Distribution Function
  3.4 Probability Mass Function
  3.5 Probability Density Function
  3.6 Mixed Distributions
  3.7 Parametric Models for Random Variables
  3.8 Continuous Random Variables
    3.8.1 Gaussian Random Variable (Normal)
    3.8.2 Log-Normal Random Variable
    3.8.3 Inverse Gaussian Random Variable (Wald)
    3.8.4 Exponential Random Variable (One-Sided)
    3.8.5 Laplace Random Variable (Double-Sided Exponential)
    3.8.6 Cauchy Random Variable
    3.8.7 Continuous Uniform Random Variable
    3.8.8 Triangular Random Variable
    3.8.9 Rayleigh Random Variable
    3.8.10 Rice Random Variable
    3.8.11 Gamma Random Variable (Erlang for r ∈ N)
    3.8.12 Beta Random Variable (Arcsine for α = β = 1/2, Power Function for β = 1)
    3.8.13 Pareto Random Variable
    3.8.14 Weibull Random Variable
    3.8.15 Logistic Random Variable (Sigmoid for {μ = 0, α = 1})
    3.8.16 Chi Random Variable (Maxwell–Boltzmann, Half-Normal)
    3.8.17 Chi-Square Random Variable
    3.8.18 F-Distribution
    3.8.19 Student's t Distribution
    3.8.20 Extreme Value Distribution (Type I: Gumbel)
  3.9 Discrete Random Variables
    3.9.1 Bernoulli Random Variable
    3.9.2 Binomial Random Variable
    3.9.3 Geometric Random Variable (with Support Z+ or N)
    3.9.4 Negative Binomial Random Variable (Pascal)
    3.9.5 Poisson Random Variable
    3.9.6 Hypergeometric Random Variable
    3.9.7 Discrete Uniform Random Variable
    3.9.8 Logarithmic Random Variable (Log-Series)
    3.9.9 Zeta Random Variable (Zipf)
  Problems
  Further Reading

4 Multiple Random Variables
  4.1 Introduction
  4.2 Random Variable Approximations
    4.2.1 Binomial Approximation of Hypergeometric
    4.2.2 Poisson Approximation of Binomial
    4.2.3 Gaussian Approximations
    4.2.4 Gaussian Approximation of Binomial
    4.2.5 Gaussian Approximation of Poisson
    4.2.6 Gaussian Approximation of Hypergeometric
  4.3 Joint and Marginal Distributions
  4.4 Independent Random Variables
  4.5 Conditional Distribution
  4.6 Random Vectors
    4.6.1 Bivariate Uniform Distribution
    4.6.2 Multivariate Gaussian Distribution
    4.6.3 Multivariate Student's t Distribution
    4.6.4 Multinomial Distribution
    4.6.5 Multivariate Hypergeometric Distribution
    4.6.6 Bivariate Exponential Distributions
  4.7 Generating Dependent Random Variables
  4.8 Random Variable Transformations
    4.8.1 Transformations of Discrete Random Variables
    4.8.2 Transformations of Continuous Random Variables
  4.9 Important Functions of Two Random Variables
    4.9.1 Sum: Z = X + Y
    4.9.2 Difference: Z = X − Y
    4.9.3 Product: Z = XY
    4.9.4 Quotient (Ratio): Z = X/Y
  4.10 Transformations of Random Variable Families
    4.10.1 Gaussian Transformations
    4.10.2 Exponential Transformations
    4.10.3 Chi-Square Transformations
  4.11 Transformations of Random Vectors
  4.12 Sample Mean X̄ and Sample Variance S²
  4.13 Minimum, Maximum, and Order Statistics
  4.14 Mixtures
  Problems
  Further Reading

5 Expectation and Moments
  5.1 Introduction
  5.2 Expectation and Integration
  5.3 Indicator Random Variable
  5.4 Simple Random Variable
  5.5 Expectation for Discrete Sample Spaces
  5.6 Expectation for Continuous Sample Spaces
  5.7 Summary of Expectation
  5.8 Functional View of the Mean
  5.9 Properties of Expectation
  5.10 Expectation of a Function
  5.11 Characteristic Function
  5.12 Conditional Expectation
  5.13 Properties of Conditional Expectation
  5.14 Location Parameters: Mean, Median, and Mode
  5.15 Variance, Covariance, and Correlation
  5.16 Functional View of the Variance
  5.17 Expectation and the Indicator Function
  5.18 Correlation Coefficients
  5.19 Orthogonality
  5.20 Correlation and Covariance Matrices
  5.21 Higher Order Moments and Cumulants
  5.22 Functional View of Skewness
  5.23 Functional View of Kurtosis
  5.24 Generating Functions
  5.25 Fourth-Order Gaussian Moment
  5.26 Expectations of Nonlinear Transformations
  Problems
  Further Reading

PART II Random Processes, Systems, and Parameter Estimation

6 Random Processes
  6.1 Introduction
  6.2 Characterizations of a Random Process
  6.3 Consistency and Extension
  6.4 Types of Random Processes
  6.5 Stationarity
  6.6 Independent and Identically Distributed
  6.7 Independent Increments
  6.8 Martingales
  6.9 Markov Sequence
  6.10 Markov Process
  6.11 Random Sequences
    6.11.1 Bernoulli Sequence
    6.11.2 Bernoulli Scheme
    6.11.3 Independent Sequences
    6.11.4 Bernoulli Random Walk
    6.11.5 Binomial Counting Sequence
  6.12 Random Processes
    6.12.1 Poisson Counting Process
    6.12.2 Random Telegraph Signal
    6.12.3 Wiener Process
    6.12.4 Gaussian Process
    6.12.5 Pulse Amplitude Modulation
    6.12.6 Random Sine Signals
  Problems
  Further Reading

7 Stochastic Convergence, Calculus, and Decompositions
  7.1 Introduction
  7.2 Stochastic Convergence
  7.3 Laws of Large Numbers
  7.4 Central Limit Theorem
  7.5 Stochastic Continuity
  7.6 Derivatives and Integrals
  7.7 Differential Equations
  7.8 Difference Equations
  7.9 Innovations and Mean-Square Predictability
  7.10 Doob–Meyer Decomposition
  7.11 Karhunen–Loève Expansion
  Problems
  Further Reading

8 Systems, Noise, and Spectrum Estimation
  8.1 Introduction
  8.2 Correlation Revisited
  8.3 Ergodicity
  8.4 Eigenfunctions of RXX(τ)
  8.5 Power Spectral Density
  8.6 Power Spectral Distribution
  8.7 Cross-Power Spectral Density
  8.8 Systems with Random Inputs
    8.8.1 Nonlinear Systems
    8.8.2 Linear Systems
  8.9 Passband Signals
  8.10 White Noise
  8.11 Bandwidth
  8.12 Spectrum Estimation
    8.12.1 Periodogram
    8.12.2 Smoothed Periodogram
    8.12.3 Modified Periodogram
  8.13 Parametric Models
    8.13.1 Autoregressive Model
    8.13.2 Moving-Average Model
    8.13.3 Autoregressive Moving-Average Model
  8.14 System Identification
  Problems
  Further Reading

9 Sufficient Statistics and Parameter Estimation
  9.1 Introduction
  9.2 Statistics
  9.3 Sufficient Statistics
  9.4 Minimal Sufficient Statistic
  9.5 Exponential Families
  9.6 Location-Scale Families
  9.7 Complete Statistic
  9.8 Rao–Blackwell Theorem
  9.9 Lehmann–Scheffé Theorem
  9.10 Bayes Estimation
  9.11 Mean-Square-Error Estimation
  9.12 Mean-Absolute-Error Estimation
  9.13 Orthogonality Condition
  9.14 Properties of Estimators
    9.14.1 Unbiased
    9.14.2 Consistent
    9.14.3 Efficient
  9.15 Maximum A Posteriori Estimation
  9.16 Maximum Likelihood Estimation
  9.17 Likelihood Ratio Test
  9.18 Expectation–Maximization Algorithm
  9.19 Method of Moments
  9.20 Least-Squares Estimation
  9.21 Properties of LS Estimators
    9.21.1 Minimum ξWLS
    9.21.2 Uniqueness
    9.21.3 Orthogonality
    9.21.4 Unbiased
    9.21.5 Covariance Matrix
    9.21.6 Efficient: Achieves CRLB
    9.21.7 BLU Estimator
  9.22 Best Linear Unbiased Estimation
  9.23 Properties of BLU Estimators
  Problems
  Further Reading
  A Note on Part III of the Book

APPENDICES

Introduction to Appendices

A Summaries of Univariate Parametric Distributions
  A.1 Notation
  A.2 Further Reading
  A.3 Continuous Random Variables
    A.3.1 Beta (Arcsine for α = β = 1/2, Power Function for β = 1)
    A.3.2 Cauchy
    A.3.3 Chi
    A.3.4 Chi-Square
    A.3.5 Exponential (Shifted by c)
    A.3.6 Extreme Value (Type I: Gumbel)
    A.3.7 F-Distribution
    A.3.8 Gamma (Erlang for r ∈ N with Γ(r) = (r − 1)!)
    A.3.9 Gaussian (Normal)
    A.3.10 Half-Normal (Folded Normal)
    A.3.11 Inverse Gaussian (Wald)
    A.3.12 Laplace (Double-Sided Exponential)
    A.3.13 Logistic (Sigmoid for {μ = 0, α = 1})
    A.3.14 Log-Normal
    A.3.15 Maxwell–Boltzmann
    A.3.16 Pareto
    A.3.17 Rayleigh
    A.3.18 Rice
    A.3.19 Student's t Distribution
    A.3.20 Triangular
    A.3.21 Uniform (Continuous)
    A.3.22 Weibull
  A.4 Discrete Random Variables
    A.4.1 Bernoulli (with Support {0, 1})
    A.4.2 Bernoulli (Symmetric with Support {−1, 1})
    A.4.3 Binomial
    A.4.4 Geometric (with Support Z+)
    A.4.5 Geometric (Shifted with Support N)
    A.4.6 Hypergeometric
    A.4.7 Logarithmic (Log-Series)
    A.4.8 Negative Binomial (Pascal)
    A.4.9 Poisson
    A.4.10 Uniform (Discrete)
    A.4.11 Zeta (Zipf)

B Functions and Properties
  B.1 Continuity and Bounded Variation
  B.2 Supremum and Infimum
  B.3 Order Notation
  B.4 Floor and Ceiling Functions
  B.5 Convex and Concave Functions
  B.6 Even and Odd Functions
  B.7 Signum Function
  B.8 Dirac Delta Function
  B.9 Kronecker Delta Function
  B.10 Unit-Step Functions
  B.11 Rectangle Functions
  B.12 Triangle and Ramp Functions
  B.13 Indicator Functions
  B.14 Sinc Function
  B.15 Logarithm Functions
  B.16 Gamma Functions
  B.17 Beta Functions
  B.18 Bessel Functions
  B.19 Q-Function and Error Functions
  B.20 Marcum Q-Function
  B.21 Zeta Function
  B.22 Rising and Falling Factorials
  B.23 Laguerre Polynomials
  B.24 Hypergeometric Functions
  B.25 Bernoulli Numbers
  B.26 Harmonic Numbers
  B.27 Euler–Mascheroni Constant
  B.28 Dirichlet Function
  Further Reading

C Frequency-Domain Transforms and Properties
  C.1 Laplace Transform
  C.2 Continuous-Time Fourier Transform
  C.3 z-Transform
  C.4 Discrete-Time Fourier Transform
  Further Reading

D Integration and Integrals
  D.1 Review of Riemann Integral
  D.2 Riemann–Stieltjes Integral
  D.3 Lebesgue Integral
  D.4 Pdf Integrals
  D.5 Indefinite and Definite Integrals
  D.6 Integral Formulas
  D.7 Double Integrals of Special Functions
  Further Reading

E Identities and Infinite Series
  E.1 Zero and Infinity
  E.2 Minimum and Maximum
  E.3 Trigonometric Identities
  E.4 Stirling's Formula
  E.5 Taylor Series
  E.6 Series Expansions and Closed-Form Sums
  E.7 Vandermonde's Identity
  E.8 Pmf Sums and Functional Forms
  E.9 Completing the Square
  E.10 Summation by Parts
  Further Reading

F Inequalities and Bounds for Expectations
  F.1 Cauchy–Schwarz and Hölder Inequalities
  F.2 Triangle and Minkowski Inequalities
  F.3 Bienaymé, Chebyshev, and Markov Inequalities
  F.4 Chernoff's Inequality
  F.5 Jensen's Inequality
  F.6 Cramér–Rao Inequality
  Further Reading

G Matrix and Vector Properties
  G.1 Basic Properties
  G.2 Four Fundamental Subspaces
  G.3 Eigendecomposition
  G.4 LU, LDU, and Cholesky Decompositions
  G.5 Jacobian Matrix and the Jacobian
  G.6 Kronecker and Schur Products
  G.7 Properties of Trace and Determinant
  G.8 Matrix Inversion Lemma
  G.9 Cauchy–Schwarz Inequality
  G.10 Differentiation
  G.11 Complex Differentiation
  Further Reading

GLOSSARY
REFERENCES
INDEX

PART III Applications in Signal Processing and Communications (chapters available at the web site www.wiley.com/go/randomprocesses)

10 Communication Systems and Information Theory
  10.1 Introduction
  10.2 Transmitter
    10.2.1 Sampling and Quantization
    10.2.2 Channel Coding
    10.2.3 Symbols and Pulse Shaping
    10.2.4 Modulation
  10.3 Transmission Channel
  10.4 Receiver
    10.4.1 Receive Filter
    10.4.2 Demodulation
    10.4.3 Gram–Schmidt Orthogonalization
    10.4.4 Maximum Likelihood Detection
    10.4.5 Matched Filter Receiver
    10.4.6 Probability of Error
  10.5 Information Theory
    10.5.1 Mutual Information and Entropy
    10.5.2 Properties of Mutual Information and Entropy
    10.5.3 Continuous Distributions: Differential Entropy
    10.5.4 Channel Capacity
    10.5.5 AWGN Channel
  Problems
  Further Reading

11 Optimal Filtering
  11.1 Introduction
  11.2 Optimal Linear Filtering
  11.3 Optimal Filter Applications
    11.3.1 System Identification
    11.3.2 Inverse Modeling
    11.3.3 Noise Cancellation
    11.3.4 Linear Prediction
  11.4 Noncausal Wiener Filter
  11.5 Causal Wiener Filter
  11.6 Prewhitening Filter
  11.7 FIR Wiener Filter
  11.8 Kalman Filter
    11.8.1 Evolution of the Mean and Covariance
    11.8.2 State Prediction
    11.8.3 State Filtering
  11.9 Steady-State Kalman Filter
  11.10 Linear Predictive Coding
  11.11 Lattice Prediction-Error Filter
  11.12 Levinson–Durbin Algorithm
  11.13 Least-Squares Filtering
  11.14 Recursive Least-Squares
  Problems
  Further Reading

12 Adaptive Filtering
  12.1 Introduction
  12.2 MSE Properties
  12.3 Steepest Descent
  12.4 Newton's Method
  12.5 LMS Algorithm
    12.5.1 Convergence in the Mean
    12.5.2 Convergence in the Mean-Square
    12.5.3 Misadjustment
  12.6 Modified LMS Algorithms
    12.6.1 Sign-Error LMS Algorithm
    12.6.2 Sign-Data LMS Algorithm
    12.6.3 Sign-Sign LMS Algorithm
    12.6.4 LMF Algorithm
    12.6.5 Complex LMS Algorithm
    12.6.6 "Leaky" LMS Algorithm
    12.6.7 Normalized LMS Algorithm
    12.6.8 Perceptron
    12.6.9 Convergence of Modified LMS Algorithms
  12.7 Adaptive IIR Filtering
    12.7.1 Output-Error Formulation
    12.7.2 Output-Error IIR Filter Algorithm
    12.7.3 Equation-Error Formulation
    12.7.4 Equation-Error Bias
  Problems
  Further Reading

13 Equalization, Beamforming, and Direction Finding
  13.1 Introduction
  13.2 Channel Equalization
  13.3 Optimal Bussgang Algorithm
  13.4 Blind Equalizer Algorithms
    13.4.1 Sato's Algorithm
    13.4.2 Constant Modulus Algorithm
  13.5 CMA Performance Surface
  13.6 Antenna Arrays
  13.7 Beampatterns
  13.8 Optimal Beamforming
    13.8.1 Known Look Direction
    13.8.2 Multiple Constraint Beamforming
    13.8.3 Training Signal
    13.8.4 Maximum Likelihood
    13.8.5 Maximum SNR and SINR
  13.9 Adaptive Beamforming
    13.9.1 LMS Beamforming
    13.9.2 Constant Modulus Array
    13.9.3 Decision-Directed Mode
    13.9.4 Multistage CM Array
    13.9.5 Output SINR and SNR
  13.10 Direction Finding
    13.10.1 Beamforming Approaches
    13.10.2 MUSIC Algorithm
  Problems
  Further Reading