Views: 7161 | Replies: 139
hylpy (Expert Advisor, Well-Known Author)
[Resource] COMPUTATIONAL STATISTICS (Computational Statistics), latest English edition, by G. H. Givens [USA]
COMPUTATIONAL STATISTICS, Second Edition
Geof H. Givens and Jennifer A. Hoeting
Wiley, 2013 (latest English edition)

CONTENTS

PREFACE xv
ACKNOWLEDGMENTS xvii

1 REVIEW 1
  1.1 Mathematical Notation 1
  1.2 Taylor's Theorem and Mathematical Limit Theory 2
  1.3 Statistical Notation and Probability Distributions 4
  1.4 Likelihood Inference 9
  1.5 Bayesian Inference 11
  1.6 Statistical Limit Theory 13
  1.7 Markov Chains 14
  1.8 Computing 17

PART I OPTIMIZATION

2 OPTIMIZATION AND SOLVING NONLINEAR EQUATIONS 21
  2.1 Univariate Problems 22
    2.1.1 Newton's Method 26
      2.1.1.1 Convergence Order 29
    2.1.2 Fisher Scoring 30
    2.1.3 Secant Method 30
    2.1.4 Fixed-Point Iteration 32
      2.1.4.1 Scaling 33
  2.2 Multivariate Problems 34
    2.2.1 Newton's Method and Fisher Scoring 34
      2.2.1.1 Iteratively Reweighted Least Squares 36
    2.2.2 Newton-Like Methods 39
      2.2.2.1 Ascent Algorithms 39
      2.2.2.2 Discrete Newton and Fixed-Point Methods 41
      2.2.2.3 Quasi-Newton Methods 41
    2.2.3 Gauss–Newton Method 44
    2.2.4 Nelder–Mead Algorithm 45
    2.2.5 Nonlinear Gauss–Seidel Iteration 52
  Problems 54

3 COMBINATORIAL OPTIMIZATION 59
  3.1 Hard Problems and NP-Completeness 59
    3.1.1 Examples 61
    3.1.2 Need for Heuristics 64
  3.2 Local Search 65
  3.3 Simulated Annealing 68
    3.3.1 Practical Issues 70
      3.3.1.1 Neighborhoods and Proposals 70
      3.3.1.2 Cooling Schedule and Convergence 71
    3.3.2 Enhancements 74
  3.4 Genetic Algorithms 75
    3.4.1 Definitions and the Canonical Algorithm 75
      3.4.1.1 Basic Definitions 75
      3.4.1.2 Selection Mechanisms and Genetic Operators 76
      3.4.1.3 Allele Alphabets and Genotypic Representation 78
      3.4.1.4 Initialization, Termination, and Parameter Values 79
    3.4.2 Variations 80
      3.4.2.1 Fitness 80
      3.4.2.2 Selection Mechanisms and Updating Generations 81
      3.4.2.3 Genetic Operators and Permutation Chromosomes 82
    3.4.3 Initialization and Parameter Values 84
    3.4.4 Convergence 84
  3.5 Tabu Algorithms 85
    3.5.1 Basic Definitions 86
    3.5.2 The Tabu List 87
    3.5.3 Aspiration Criteria 88
    3.5.4 Diversification 89
    3.5.5 Intensification 90
    3.5.6 Comprehensive Tabu Algorithm 91
  Problems 92

4 EM OPTIMIZATION METHODS 97
  4.1 Missing Data, Marginalization, and Notation 97
  4.2 The EM Algorithm 98
    4.2.1 Convergence 102
    4.2.2 Usage in Exponential Families 105
    4.2.3 Variance Estimation 106
      4.2.3.1 Louis's Method 106
      4.2.3.2 SEM Algorithm 108
      4.2.3.3 Bootstrapping 110
      4.2.3.4 Empirical Information 110
      4.2.3.5 Numerical Differentiation 111
  4.3 EM Variants 111
    4.3.1 Improving the E Step 111
      4.3.1.1 Monte Carlo EM 111
    4.3.2 Improving the M Step 112
      4.3.2.1 ECM Algorithm 113
      4.3.2.2 EM Gradient Algorithm 116
    4.3.3 Acceleration Methods 118
      4.3.3.1 Aitken Acceleration 118
      4.3.3.2 Quasi-Newton Acceleration 119
  Problems 121

PART II INTEGRATION AND SIMULATION

5 NUMERICAL INTEGRATION 129
  5.1 Newton–Côtes Quadrature 129
    5.1.1 Riemann Rule 130
    5.1.2 Trapezoidal Rule 134
    5.1.3 Simpson's Rule 136
    5.1.4 General kth-Degree Rule 138
  5.2 Romberg Integration 139
  5.3 Gaussian Quadrature 142
    5.3.1 Orthogonal Polynomials 143
    5.3.2 The Gaussian Quadrature Rule 143
  5.4 Frequently Encountered Problems 146
    5.4.1 Range of Integration 146
    5.4.2 Integrands with Singularities or Other Extreme Behavior 146
    5.4.3 Multiple Integrals 147
    5.4.4 Adaptive Quadrature 147
    5.4.5 Software for Exact Integration 148
  Problems 148

6 SIMULATION AND MONTE CARLO INTEGRATION 151
  6.1 Introduction to the Monte Carlo Method 151
  6.2 Exact Simulation 152
    6.2.1 Generating from Standard Parametric Families 153
    6.2.2 Inverse Cumulative Distribution Function 153
    6.2.3 Rejection Sampling 155
      6.2.3.1 Squeezed Rejection Sampling 158
      6.2.3.2 Adaptive Rejection Sampling 159
  6.3 Approximate Simulation 163
    6.3.1 Sampling Importance Resampling Algorithm 163
      6.3.1.1 Adaptive Importance, Bridge, and Path Sampling 167
    6.3.2 Sequential Monte Carlo 168
      6.3.2.1 Sequential Importance Sampling for Markov Processes 169
      6.3.2.2 General Sequential Importance Sampling 170
      6.3.2.3 Weight Degeneracy, Rejuvenation, and Effective Sample Size 171
      6.3.2.4 Sequential Importance Sampling for Hidden Markov Models 175
      6.3.2.5 Particle Filters 179
  6.4 Variance Reduction Techniques 180
    6.4.1 Importance Sampling 180
    6.4.2 Antithetic Sampling 186
    6.4.3 Control Variates 189
    6.4.4 Rao–Blackwellization 193
  Problems 195

7 MARKOV CHAIN MONTE CARLO 201
  7.1 Metropolis–Hastings Algorithm 202
    7.1.1 Independence Chains 204
    7.1.2 Random Walk Chains 206
  7.2 Gibbs Sampling 209
    7.2.1 Basic Gibbs Sampler 209
    7.2.2 Properties of the Gibbs Sampler 214
    7.2.3 Update Ordering 216
    7.2.4 Blocking 216
    7.2.5 Hybrid Gibbs Sampling 216
    7.2.6 Griddy–Gibbs Sampler 218
  7.3 Implementation 218
    7.3.1 Ensuring Good Mixing and Convergence 219
      7.3.1.1 Simple Graphical Diagnostics 219
      7.3.1.2 Burn-in and Run Length 220
      7.3.1.3 Choice of Proposal 222
      7.3.1.4 Reparameterization 223
      7.3.1.5 Comparing Chains: Effective Sample Size 224
      7.3.1.6 Number of Chains 225
    7.3.2 Practical Implementation Advice 226
    7.3.3 Using the Results 226
  Problems 230

8 ADVANCED TOPICS IN MCMC 237
  8.1 Adaptive MCMC 237
    8.1.1 Adaptive Random Walk Metropolis-within-Gibbs Algorithm 238
    8.1.2 General Adaptive Metropolis-within-Gibbs Algorithm 240
    8.1.3 Adaptive Metropolis Algorithm 247
  8.2 Reversible Jump MCMC 250
    8.2.1 RJMCMC for Variable Selection in Regression 253
  8.3 Auxiliary Variable Methods 256
    8.3.1 Simulated Tempering 257
    8.3.2 Slice Sampler 258
  8.4 Other Metropolis–Hastings Algorithms 260
    8.4.1 Hit-and-Run Algorithm 260
    8.4.2 Multiple-Try Metropolis–Hastings Algorithm 261
    8.4.3 Langevin Metropolis–Hastings Algorithm 262
  8.5 Perfect Sampling 264
    8.5.1 Coupling from the Past 264
      8.5.1.1 Stochastic Monotonicity and Sandwiching 267
  8.6 Markov Chain Maximum Likelihood 268
  8.7 Example: MCMC for Markov Random Fields 269
    8.7.1 Gibbs Sampling for Markov Random Fields 270
    8.7.2 Auxiliary Variable Methods for Markov Random Fields 274
    8.7.3 Perfect Sampling for Markov Random Fields 277
  Problems 279

PART III BOOTSTRAPPING

9 BOOTSTRAPPING 287
  9.1 The Bootstrap Principle 287
  9.2 Basic Methods 288
    9.2.1 Nonparametric Bootstrap 288
    9.2.2 Parametric Bootstrap 289
    9.2.3 Bootstrapping Regression 290
    9.2.4 Bootstrap Bias Correction 291
  9.3 Bootstrap Inference 292
    9.3.1 Percentile Method 292
      9.3.1.1 Justification for the Percentile Method 293
    9.3.2 Pivoting 294
      9.3.2.1 Accelerated Bias-Corrected Percentile Method, BCa 294
      9.3.2.2 The Bootstrap t 296
      9.3.2.3 Empirical Variance Stabilization 298
      9.3.2.4 Nested Bootstrap and Prepivoting 299
    9.3.3 Hypothesis Testing 301
  9.4 Reducing Monte Carlo Error 302
    9.4.1 Balanced Bootstrap 302
    9.4.2 Antithetic Bootstrap 302
  9.5 Bootstrapping Dependent Data 303
    9.5.1 Model-Based Approach 304
    9.5.2 Block Bootstrap 304
      9.5.2.1 Nonmoving Block Bootstrap 304
      9.5.2.2 Moving Block Bootstrap 306
      9.5.2.3 Blocks-of-Blocks Bootstrapping 307
      9.5.2.4 Centering and Studentizing 309
      9.5.2.5 Block Size 311
  9.6 Bootstrap Performance 315
    9.6.1 Independent Data Case 315
    9.6.2 Dependent Data Case 316
  9.7 Other Uses of the Bootstrap 316
  9.8 Permutation Tests 317
  Problems 319

PART IV DENSITY ESTIMATION AND SMOOTHING

10 NONPARAMETRIC DENSITY ESTIMATION 325
  10.1 Measures of Performance 326
  10.2 Kernel Density Estimation 327
    10.2.1 Choice of Bandwidth 329
      10.2.1.1 Cross-Validation 332
      10.2.1.2 Plug-in Methods 335
      10.2.1.3 Maximal Smoothing Principle 338
    10.2.2 Choice of Kernel 339
      10.2.2.1 Epanechnikov Kernel 339
      10.2.2.2 Canonical Kernels and Rescalings 340
  10.3 Nonkernel Methods 341
    10.3.1 Logspline 341
  10.4 Multivariate Methods 345
    10.4.1 The Nature of the Problem 345
    10.4.2 Multivariate Kernel Estimators 346
    10.4.3 Adaptive Kernels and Nearest Neighbors 348
      10.4.3.1 Nearest Neighbor Approaches 349
      10.4.3.2 Variable-Kernel Approaches and Transformations 350
    10.4.4 Exploratory Projection Pursuit 353
  Problems 359

11 BIVARIATE SMOOTHING 363
  11.1 Predictor–Response Data 363
  11.2 Linear Smoothers 365
    11.2.1 Constant-Span Running Mean 366
      11.2.1.1 Effect of Span 368
      11.2.1.2 Span Selection for Linear Smoothers 369
    11.2.2 Running Lines and Running Polynomials 372
    11.2.3 Kernel Smoothers 374
    11.2.4 Local Regression Smoothing 374
    11.2.5 Spline Smoothing 376
      11.2.5.1 Choice of Penalty 377
  11.3 Comparison of Linear Smoothers 377
  11.4 Nonlinear Smoothers 379
    11.4.1 Loess 379
    11.4.2 Supersmoother 381
  11.5 Confidence Bands 384
  11.6 General Bivariate Data 388
  Problems 389

12 MULTIVARIATE SMOOTHING 393
  12.1 Predictor–Response Data 393
    12.1.1 Additive Models 394
    12.1.2 Generalized Additive Models 397
    12.1.3 Other Methods Related to Additive Models 399
      12.1.3.1 Projection Pursuit Regression 399
      12.1.3.2 Neural Networks 402
      12.1.3.3 Alternating Conditional Expectations 403
      12.1.3.4 Additivity and Variance Stabilization 404
    12.1.4 Tree-Based Methods 405
      12.1.4.1 Recursive Partitioning Regression Trees 406
      12.1.4.2 Tree Pruning 409
      12.1.4.3 Classification Trees 411
      12.1.4.4 Other Issues for Tree-Based Methods 412
  12.2 General Multivariate Data 413
    12.2.1 Principal Curves 413
      12.2.1.1 Definition and Motivation 413
      12.2.1.2 Estimation 415
      12.2.1.3 Span Selection 416
  Problems 416

DATA ACKNOWLEDGMENTS 421
REFERENCES 423
INDEX 457

[ Last edited by hylpy on 2015-5-12 at 12:09 ]
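As a small taste of the book's opening optimization material (Section 2.1.1 in the contents above), here is a minimal sketch of Newton's method for solving a univariate nonlinear equation. This is illustrative only, not code from the book; the test function, derivative, and tolerance settings are my own assumptions.

```python
import math

def newton(f, fprime, x0, tol=1e-10, max_iter=100):
    """Newton's method for f(x) = 0: iterate x <- x - f(x)/f'(x)
    until the step size falls below tol (illustrative sketch)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton's method did not converge")

# Example: solve cos(x) = x via the root of f(x) = cos(x) - x.
root = newton(lambda x: math.cos(x) - x,
              lambda x: -math.sin(x) - 1.0,
              x0=1.0)
# root ≈ 0.739085, the fixed point of cos
```

The same update generalizes to the multivariate and statistical settings the book covers (Fisher scoring replaces the observed second derivative with the expected information).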
» Attachments in this post
Attachment 1: 计算统计(第2版)【最新英文版】-美G·H·吉文斯-Wiley出版-2013.pdf
Uploaded 2015-05-12 12:03:13, 4.78 MB
» Quick replies
2015-05-12 13:29: Five stars. Bump, thanks for sharing!