1.4 The Notion of Information in Statistics
1.5 Statistical Decision Theory
Chapter 2 Point Estimation
2.1 Optimal Unbiased Estimators
2.2 Variance-Invariant Estimation
2.3 Methods for Construction and Improvement of Estimators
2.3.1 Maximum Likelihood Method
2.3.2 Least Squares Method
2.3.3 Minimum Chi-Squared Method
2.3.5 Jackknife Estimators
2.3.6 Estimators Based on Order Statistics
2.3.6.1 Order and Rank Statistics
2.4 Properties of Estimators
2.4.2 Asymptotic Properties
Chapter 3 Statistical Tests and Confidence Estimations
3.1 Basic Ideas of Test Theory
3.2 The Neyman–Pearson Lemma
3.3 Tests for Composite Alternative Hypotheses and One-Parametric Distribution Families
3.3.1 Distributions with Monotone Likelihood Ratio and Uniformly Most Powerful Tests for One-Sided Hypotheses
3.3.2 UMPU Tests for Two-Sided Alternative Hypotheses
3.4 Tests for Multi-Parametric Distribution Families
3.4.2 The Two-Sample Problem: Properties of Various Tests and Robustness
3.4.2.1 Comparison of Two Expectations
3.4.3 Comparison of Two Variances
3.4.4 Table for Sample Sizes
3.5 Confidence Estimation
3.5.1 One-Sided Confidence Intervals in One-Parametric Distribution Families
3.5.2 Two-Sided Confidence Intervals in One-Parametric and Confidence Intervals in Multi-Parametric Distribution Families
3.5.3 Table for Sample Sizes
3.6.2 Wald's Sequential Likelihood Ratio Test for One-Parametric Exponential Families
3.6.3 Test about Mean Values for Unknown Variances
3.6.4 Approximate Tests for the Two-Sample Problem
3.6.5 Sequential Triangular Tests
3.6.6 A Sequential Triangular Test for the Correlation Coefficient
3.7 Remarks about Interpretation
Chapter 4 Linear Models – General Theory
4.1 Linear Models with Fixed Effects
4.1.1 Least Squares Method
4.1.2 Maximum Likelihood Method
4.1.3 Tests of Hypotheses
4.1.4 Construction of Confidence Regions
4.1.5 Special Linear Models
4.1.6 The Generalised Least Squares Method (GLSM)
4.2 Linear Models with Random Effects: Mixed Models
4.2.1 Best Linear Unbiased Prediction (BLUP)
4.2.2 Estimation of Variance Components
Chapter 5 Analysis of Variance (ANOVA) – Fixed Effects Models (Model I of Analysis of Variance)
5.2 Analysis of Variance with One Factor (Simple or One-Way Analysis of Variance)
5.2.1 The Model and the Analysis
5.2.2 Planning the Size of an Experiment
5.2.2.1 General Description for All Sections of This Chapter
5.2.2.2 The Experimental Size for the One-Way Classification
5.3 Two-Way Analysis of Variance
5.3.1 Cross-Classification (A × B)
5.3.1.1 Parameter Estimation
5.3.1.2 Testing Hypotheses
5.3.2 Nested Classification (A ≻ B)
5.4 Three-Way Classification
5.4.1 Complete Cross-Classification (A × B × C)
5.4.2 Nested Classification (C ≺ B ≺ A)
5.4.3 Mixed Classification
5.4.3.1 Cross-Classification between Two Factors Where One of Them Is Subordinated to a Third Factor ((B ≺ A) × C)
5.4.3.2 Cross-Classification of Two Factors in Which a Third Factor Is Nested (C ≺ (A × B))
Chapter 6 Analysis of Variance: Estimation of Variance Components (Model II of the Analysis of Variance)
6.1 Introduction: Linear Models with Random Effects
6.2 One-Way Classification
6.2.1 Estimation of Variance Components
6.2.1.1 Analysis of Variance Method
6.2.1.2 Estimators in the Case of Normally Distributed Y
6.2.1.4 Matrix Norm Minimising Quadratic Estimation
6.2.1.5 Comparison of Several Estimators
6.2.2 Tests of Hypotheses and Confidence Intervals
6.2.3 Variances and Properties of the Estimators of the Variance Components
6.3 Estimators of Variance Components in the Two-Way and Three-Way Classification
6.3.1 General Description for Equal and Unequal Subclass Numbers
6.3.2 Two-Way Cross-Classification
6.3.3 Two-Way Nested Classification
6.3.4 Three-Way Cross-Classification with Equal Subclass Numbers
6.3.5 Three-Way Nested Classification
6.3.6 Three-Way Mixed Classification
Chapter 7 Analysis of Variance – Models with Finite Level Populations and Mixed Models
7.1 Introduction: Models with Finite Level Populations
7.2 Rules for the Derivation of SS, df, MS and E(MS) in Balanced ANOVA Models
7.3 Variance Component Estimators in Mixed Models
7.3.1 An Example for the Balanced Case
7.3.2 The Unbalanced Case
7.4 Tests for Fixed Effects and Variance Components
7.5 Variance Component Estimation and Tests of Hypotheses in Special Mixed Models
7.5.1 Two-Way Cross-Classification
7.5.2 Two-Way Nested Classification (B ≺ A)
7.5.2.1 Levels of A Random
7.5.2.2 Levels of B Random
7.5.3 Three-Way Cross-Classification
7.5.4 Three-Way Nested Classification
7.5.5 Three-Way Mixed Classification
Chapter 8 Regression Analysis – Linear Models with Non-random Regressors (Model I of Regression Analysis) and with Random Regressor...
8.2.1 Least Squares Method
8.2.2 Optimal Experimental Design
8.5 Models with Random Regressors
8.5.2 Experimental Designs
8.7 Concluding Remarks about Models of Regression Analysis
Chapter 9 Regression Analysis – Intrinsically Non-linear Model I
9.1 Estimating by the Least Squares Method
9.1.1 Gauss–Newton Method
9.1.2 Internal Regression
9.1.3 Determining Initial Values for Iteration Methods
9.2 Geometrical Properties
9.2.1 Expectation Surface and Tangent Plane
9.3 Asymptotic Properties and the Bias of LS Estimators
9.4 Confidence Estimations and Tests
9.4.2 Tests and Confidence Estimations Based on the Asymptotic Covariance Matrix
9.4.3 Simulation Experiments to Check Asymptotic Tests and Confidence Estimations
9.5 Optimal Experimental Design
9.6 Special Regression Functions
9.6.1 Exponential Regression
9.6.1.2 Confidence Estimations and Tests
9.6.1.3 Results of Simulation Experiments
9.6.1.4 Experimental Designs
9.6.2 The Bertalanffy Function
9.6.3 The Logistic (Three-Parametric Hyperbolic Tangent) Function
9.6.4 The Gompertz Function
9.6.5 The Hyperbolic Tangent Function with Four Parameters
9.6.6 The Arc Tangent Function with Four Parameters
9.6.7 The Richards Function
9.6.8 Summarising the Results of the Preceding Sections
9.6.9 Problems of Model Choice
Chapter 10 Analysis of Covariance (ANCOVA)
10.2 General Model I–I of the Analysis of Covariance
10.3 Special Models of the Analysis of Covariance for the Simple Classification
10.3.1 One Covariable with Constant γ
10.3.2 A Covariable with Regression Coefficients γi Depending on the Levels of the Classification Factor
10.3.3 A Numerical Example
Chapter 11 Multiple Decision Problems
11.1 Selection Procedures
11.1.2 Indifference Zone Formulation for Expectations
11.1.2.1 Selection of Populations with Normal Distribution
11.1.2.2 Approximate Solutions for Non-normal Distributions and t = 1
11.1.3 Selection of a Subset Containing the Best Population with Given Probability
11.1.3.1 Selection of the Normal Distribution with the Largest Expectation
11.1.3.2 Selection of the Normal Distribution with Smallest Variance
11.2 Multiple Comparisons
11.2.1 Confidence Intervals for All Contrasts: Scheffé's Method
11.2.2 Confidence Intervals for Given Contrasts: Bonferroni's and Dunn's Method
11.2.3 Confidence Intervals for All Contrasts for nᵢ = n: Tukey's Method
11.2.4 Confidence Intervals for All Contrasts: Generalised Tukey's Method
11.2.5 Confidence Intervals for the Differences of Treatments with a Control: Dunnett's Method
11.2.6 Multiple Comparisons and Confidence Intervals
11.2.7 Which Multiple Comparison Procedure Shall Be Used?
Chapter 12 Experimental Designs
12.2.1 Completely Balanced Incomplete Block Designs (BIBD)
12.2.2 Construction Methods of BIBD
12.2.3 Partially Balanced Incomplete Block Designs
12.5 Programs for Construction of Experimental Designs
Appendix B: Abbreviations
Appendix C: Probability and Density Functions
Solutions and Hints for Exercises
Index