Variance estimation of some EU-SILC based indicators at regional level

Jean Monnet Lecture in Pisa – 2016

Ralf Münnich
Trier University, Faculty IV
Chair of Economic and Social Statistics

Pisa, 19th May 2016 | Ralf Münnich | 1 (78)
Variance Estimation Methods

1. Introduction to variance estimation
2. Linearization methods
3. Resampling methods
4. Further topics in variance estimation
1.1 An introductory example
1.2 Surveys and quality measurement
1.3 Some first examples of surveys
Unemployment in Saarland

Unemployed      Women               Men                 Σ
                τ       E(τ̂)        τ       E(τ̂)        τ       E(τ̂)
14 – 24          2.387   2.387       4.172   4.172       6.559   6.558
25 – 44          7.248   7.238       9.504   9.505      16.752  16.743
45 – 64          4.686   4.684      10.588  10.598      15.274  15.282
65 +               128     128           0       0         128     128
Σ               14.449  14.436      24.264  24.275      38.713  38.711

- True values in Saarland
- Estimates from the Microcensus
- Is the quality of the cell estimates identical?
How can we measure quality?

- Main emphasis is put on sample surveys
- Statistical items of interest:
  - Sampling distribution
    - Theoretical distribution
    - Approximate distribution
    - Simulated distribution
  - Focus on properties
    - Large sample properties
    - Small sample properties
- View of Official Statistics
  - Quality and the Code of Practice
European Statistics Code of Practice

15 principles on institutional environment (6), statistical processes (4), and statistical output (5):

- Relevance of the statistical concept: end-user, user needs, hierarchical structure and contents
- Accuracy and reliability:
  - Sampling errors: standard error, CI coverage
  - Non-sampling errors: nonresponse, coverage error, measurement errors
- Timeliness and punctuality: time and duration from data acquisition until publication
- Coherence and comparability: preliminary and final statistics, annual and intermediate statistics (regions, domains, time)
- Accessibility and clarity: data, analysis and method reports

http://ec.europa.eu/eurostat/quality
Evaluation of samples and surveys

- Practicability
- Costs of a survey
- Accuracy of results
  - Standard errors
  - Confidence interval coverage
  - Disparity of sub-populations
- Robustness of results

In order to adequately evaluate the estimates from samples, appropriate evaluation criteria have to be considered.
Why do we need variance estimation?

Most accuracy measures are based on variances or variance estimates!

- Measures for point estimators:
  - Bias, variance, MSE
  - CV, relative root MSE
  - Bias ratio, confidence interval coverage
  - Design effect, effective sample size
- Problems with measures:
  - Theoretical measures are problematic
  - Estimates from the sample (e.g. bias)
  - Availability in simulation studies
  - Does large sample theory help much?
  - Small sample properties

Do we need special measures for variance estimators or variance estimates?
Example: Men in Hamburg

τ:          805258.00     N:          1669690
E(τ̂):       805339.10     V(τ̂):       1.29e+008
E(V̂(τ̂)):    1.29e+008     V(V̂(τ̂)):    6.72e+014
Bias Est:   81.10         MSE Est:    1.29e+008
Bias Var:   -3.78e+005    MSE Var:    6.72e+014
Skew Est:   0.0747        Curt Est:   3.0209
Skew Var:   1.8046        Curt Var:   6.9973
CI (90%):   90.16 (4.1; 5.7)
CI (95%):   94.79 (2.0; 3.2)
Total estimate separated by house size class (GGK)

         Persons    Sampling units
GGK1      468293     173
GGK2      651740     446
GGK3      439745     414
GGK4        9940      10
GGK5       99970      75
total    1669690    1118
Distribution of men in HAM (per SU)

[Figure: histograms of the number of men per sampling unit for HAM Total and the house size classes HAM GGK 1 to HAM GGK 5.]
Unemployed women, 25 – 44

[Figure: Raking estimator and variance estimator, confidence interval coverage at 95% and 90%; NR rates: 1: 5%, 2: 10%, 3: 25%, 4: 40%.]
Unemployed women, 25 – 44, distribution of point and variance estimator (25% NR)

[Figure: two density plots, one for the point estimator (x-axis "Unemployed", roughly 4000 to 11000) and one for the variance estimator (x-axis "Variance", roughly 600000 to 1000000).]
Unemployed women, 65 +

[Figure: Raking estimator and variance estimator, confidence interval coverage at 95% and 90%; NR rates: 1: 5%, 2: 10%, 3: 25%, 4: 40%.]
Unemployed women, 65 +, distribution of point and variance estimator (25% NR)

[Figure: two density plots, one for the point estimator (x-axis "Unemployed", 0 to 800) and one for the variance estimator (x-axis "Variance", 0 to 80000).]
Direct variance estimators for single stage sampling

Stratified random sampling: Since E(s_q^2) = \frac{N}{N-1} \sigma_q^2 = S_q^2,

\hat{V}(\hat{\mu}_{StrRSWOR}) = \sum_{q=1}^{L} \gamma_q^2 \cdot \frac{s_q^2}{n_q} \cdot \frac{N_q - n_q}{N_q}

is an unbiased estimate for V(\hat{\mu}_{StrRSWOR}).

Cluster sampling: Similarly,

\hat{V}(\hat{\mu}_{SIC}) = \frac{L^2}{N^2} \cdot \frac{s_e^2}{l} \cdot \frac{L - l}{L}

where

s_e^2 = \frac{1}{l-1} \sum_{r=1}^{l} \left( N_r^{sel} \mu_r^{sel} - \frac{N \cdot \hat{\mu}_{SIC}}{L} \right)^2 .
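The stratified estimator above can be sketched numerically. This is a minimal illustration under the assumption γ_q = N_q/N (the usual stratum weight); the function name and data layout are hypothetical, not from the lecture:

```python
def strat_var_wor(strata, N):
    """Variance estimate of the stratified mean under SRSWOR per stratum.

    strata: list of (N_q, sample_q) pairs, with N_q the stratum size and
    sample_q the observed values; gamma_q = N_q / N is assumed as the
    stratum weight (an assumption matching the usual notation).
    """
    v = 0.0
    for N_q, sample in strata:
        n_q = len(sample)
        mean_q = sum(sample) / n_q
        s2_q = sum((y - mean_q) ** 2 for y in sample) / (n_q - 1)  # s_q^2
        v += (N_q / N) ** 2 * (s2_q / n_q) * ((N_q - n_q) / N_q)   # with fpc
    return v
```

Note that a fully enumerated stratum (n_q = N_q) contributes nothing, since its finite population correction vanishes.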
Direct variance estimator for two stage sampling

- Direct variance estimator:

\hat{V}(\hat{\tau}_{2St}) = L^2 \cdot \frac{L - l}{L} \cdot \frac{s_e^2}{l} + \frac{L}{l} \sum_{q=1}^{l} N_q^2 \cdot \frac{N_q - n_q}{N_q} \cdot \frac{s_q^2}{n_q}

with

s_e^2 = \frac{1}{l-1} \sum_{q=1}^{l} \left( \hat{\tau}_q - \frac{\hat{\tau}}{L} \right)^2 , \qquad s_q^2 = \frac{1}{n_q - 1} \sum_{i=1}^{n_q} \left( y_{qi} - \bar{y}_q \right)^2

cf. Lohr (1999), p. 147.

- The estimator is unbiased, but the first and second term do not estimate the variance at the respective stage (cf. Särndal et al. 1992, p. 139 f., Lohr 1999, p. 210):

E\left( L^2 \cdot \frac{L - l}{L} \cdot \frac{s_e^2}{l} \right) = L^2 \cdot \left( 1 - \frac{l}{L} \right) \cdot \frac{\sigma_e^2}{l} + \frac{L}{l} \cdot \left( 1 - \frac{l}{L} \right) \sum_{q=1}^{L} V(\hat{\tau}_q)
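The two-stage formula can be sketched in the same spirit; a hedged illustration assuming SRSWOR within PSUs, with a hypothetical data layout:

```python
def twostage_var_est(psu_data, L, tau_hat):
    """Direct variance estimate for a two-stage total estimate (sketch).

    psu_data: list of (tau_hat_q, N_q, sample_q) for the l sampled PSUs,
    with tau_hat_q the estimated PSU total, N_q the number of SSUs in the
    PSU, and sample_q the observed SSU values (SRSWOR within PSUs assumed).
    L: number of PSUs in the population; tau_hat: overall total estimate.
    """
    l = len(psu_data)
    # between-PSU component: s_e^2 around tau_hat / L
    s2_e = sum((t - tau_hat / L) ** 2 for t, _, _ in psu_data) / (l - 1)
    v = L ** 2 * ((L - l) / L) * s2_e / l
    # within-PSU component with finite population correction per PSU
    within = 0.0
    for _, N_q, sample in psu_data:
        n_q = len(sample)
        ybar = sum(sample) / n_q
        s2_q = sum((y - ybar) ** 2 for y in sample) / (n_q - 1)
        within += N_q ** 2 * ((N_q - n_q) / N_q) * s2_q / n_q
    return v + (L / l) * within
```

With a census at both stages the estimate is zero, as both finite population corrections vanish.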
Aim of the lecture

- Learn more about the statistic \hat{\pi}
  - Distribution of the estimator
  - Derivation of more information from this
- How do V(\hat{\pi}) and \hat{V}(\hat{\pi}) correspond to each other?
- Is the characteristic equation for variance estimation all we need to know?

E\left( \hat{V}(\hat{\pi}) \right) = V(\hat{\pi}) \qquad (1)

- How do we measure the characteristic equation?
- What influences the reliability of the characteristic equation?
- Can we properly simulate this?

By the way:
- Do we have more than one choice?
- Can we apply this to more sophisticated estimators?
- How does the statistical production process influence the findings, e.g. w.r.t. non-sampling errors?
Experimental study: Sampling design

- Two stage sampling with stratification at the first stage, 25 strata
- 1st stage: drawing 4 PSUs in each stratum (each stratum contains 8 PSUs on average, altogether 200 PSUs)
- 2nd stage: proportional allocation of the sample size (1,000 USUs) to the PSUs (each PSU contains 500 USUs on average, altogether 100,000 USUs)
Experimental study: Scenarios

- Scenario 1: units within PSUs are heterogeneous with respect to the variable of interest, Y ~ LN(10, 1.5²); PSUs are of equal size
- Scenario 2: units within PSUs are homogeneous with respect to the variable of interest; PSUs are of equal size
- Scenario 3: units within PSUs are heterogeneous with respect to the variable of interest, Y ~ LN(10, 1.5²); PSUs are of unequal size
Variance estimates for the total

[Figure: boxplots of the relative bias of the variance estimation for Scenarios 1–3, by PSU size and PSU mean, comparing the estimators BRR, Dir, Dir1, Gr2, Jack, Jack1, Jackeuro, Jackeuro1, MCboot, MCboot1, Resc, and RescWoR; the relative bias axis runs from 0 to 40.]
2.1 Unequal probability sampling
2.2 Linearization of nonlinear statistics

Second order inclusion probabilities

In the case of unequal probability sampling designs, we also need the second order inclusion probabilities for variance estimation:

Second order inclusion probability: The probability that both elements i and j are drawn in the sample is denoted by

\pi_{ij} = \sum_{S \in \mathcal{S}_{i,j}} P(S) ,

and is called the second order inclusion probability.

From this definition, we can conclude that \pi_{ii} = \pi_i holds.
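For a small design the definition can be checked by brute force, summing P(S) over all samples containing both i and j. A sketch for SRSWOR (my own illustration, not from the lecture), where all C(N, n) samples are equally likely:

```python
from itertools import combinations

def inclusion_probs(N, n):
    """First and second order inclusion probabilities under SRSWOR,
    computed by enumerating all C(N, n) equally likely samples."""
    samples = list(combinations(range(N), n))
    p = 1.0 / len(samples)
    pi = [sum(p for S in samples if i in S) for i in range(N)]
    pij = {(i, j): sum(p for S in samples if i in S and j in S)
           for i in range(N) for j in range(N)}
    return pi, pij
```

This reproduces the closed forms π_i = n/N and π_ij = n(n−1)/(N(N−1)) for i ≠ j, and confirms π_ii = π_i.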
Horvitz-Thompson estimator

For estimating a total τ_Y of a variable of interest Y (πps, WOR) we take

\hat{\tau}_{HT} = \sum_{i \in S} \frac{y_i}{\pi_i} = \sum_{i \in S} d_i \cdot y_i ,

where d_i = 1/π_i denote the design weights (the reciprocals of the first order inclusion probabilities).

In order to estimate means, one should take the Hájek estimator

\hat{\mu} = \frac{\sum_{i \in S} d_i \cdot y_i}{\sum_{i \in S} d_i}

even if the population size N is known.
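Both estimators are one-liners once the design weights are in hand; a minimal sketch (hypothetical function names):

```python
def ht_total(y, pi):
    """Horvitz-Thompson total: sum of d_i * y_i with d_i = 1 / pi_i."""
    return sum(yi / p for yi, p in zip(y, pi))

def hajek_mean(y, pi):
    """Hajek estimator of the mean: design-weighted mean with weights d_i."""
    d = [1.0 / p for p in pi]
    return sum(di * yi for di, yi in zip(d, y)) / sum(d)
```

Under equal inclusion probabilities π_i = n/N, the HT total is N times the sample mean and the Hájek mean reduces to the plain sample mean.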
Properties of the HT estimator

- The HT estimator belongs to the class of homogeneous, linear unbiased estimators
- The HT estimator is admissible; the HH estimator is not necessarily admissible (improved HH via Rao-Blackwellization)
- Negative variance estimates are possible in some designs!

Cassel, C.; Särndal, C.-E.; Wretman, J.H. (1977): Foundations of Inference in Survey Sampling. Wiley.
Gabler, S. (1990): Minimax Solutions in Sampling from Finite Populations. Lecture Notes in Statistics, 64. Springer.
Hedayat, A.S.; Sinha, B.K. (1991): Design and Inference in Finite Population Sampling. Wiley.
Variance of the HT estimator

Assuming positive second order inclusion probabilities, we obtain

V(\hat{\tau}) = \sum_{i \in U} \pi_i (1 - \pi_i) \cdot \left( \frac{y_i}{\pi_i} \right)^2 + 2 \sum_{\substack{i,j \in U \\ i < j}} (\pi_{ij} - \pi_i \cdot \pi_j) \cdot \frac{y_i}{\pi_i} \cdot \frac{y_j}{\pi_j}

as the variance of the Horvitz-Thompson estimator \hat{\tau}.

An unbiased estimator of this variance is:

\hat{V}_{HT}(\hat{\tau}) = \sum_{i \in S} (1 - \pi_i) \cdot \left( \frac{y_i}{\pi_i} \right)^2 + 2 \sum_{\substack{i,j \in S \\ i < j}} \left( 1 - \frac{\pi_i \cdot \pi_j}{\pi_{ij}} \right) \cdot \frac{y_i}{\pi_i} \cdot \frac{y_j}{\pi_j}
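The unbiased HT variance estimator translates directly into a double loop; a sketch with a hypothetical dict-based layout for the π_ij:

```python
def ht_var_est(y, pi, pij):
    """Unbiased variance estimator of the HT total (sketch); pij is a
    dict mapping index pairs (i, j), i < j, to second order inclusion
    probabilities."""
    n = len(y)
    v = sum((1 - pi[i]) * (y[i] / pi[i]) ** 2 for i in range(n))
    for i in range(n):
        for j in range(i + 1, n):
            v += 2 * (1 - pi[i] * pi[j] / pij[(i, j)]) \
                   * (y[i] / pi[i]) * (y[j] / pi[j])
    return v
```

For SRSWOR with N = 5, n = 2 (π_i = 0.4, π_ij = 0.1) and y = (1, 3), this reproduces the familiar N²(1 − n/N)s²/n = 15.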
Sen-Yates-Grundy variance estimator

Alternatively, for designs with fixed sample sizes, we can use the Sen-Yates-Grundy variance:

V_{SYG}(\hat{\tau}) = -\frac{1}{2} \sum_{\substack{i,j \in U \\ i \neq j}} (\pi_{ij} - \pi_i \cdot \pi_j) \cdot \left( \frac{y_i}{\pi_i} - \frac{y_j}{\pi_j} \right)^2 = \sum_{\substack{i,j \in U \\ i < j}} (\pi_i \cdot \pi_j - \pi_{ij}) \cdot \left( \frac{y_i}{\pi_i} - \frac{y_j}{\pi_j} \right)^2

As an unbiased estimator, one can apply:

\hat{V}_{SYG}(\hat{\tau}) = \sum_{\substack{i,j \in S \\ i < j}} \frac{\pi_i \cdot \pi_j - \pi_{ij}}{\pi_{ij}} \cdot \left( \frac{y_i}{\pi_i} - \frac{y_j}{\pi_j} \right)^2
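The SYG estimator has the same shape; a sketch with the same hypothetical layout as above. A useful sanity check: when all y_i/π_i are equal, every squared difference and hence the estimate is zero.

```python
def syg_var_est(y, pi, pij):
    """Sen-Yates-Grundy variance estimator for fixed-size designs (sketch);
    pij maps index pairs (i, j), i < j, to second order inclusion probs."""
    n = len(y)
    v = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            v += ((pi[i] * pi[j] - pij[(i, j)]) / pij[(i, j)]) \
                 * (y[i] / pi[i] - y[j] / pi[j]) ** 2
    return v
```

On the SRSWOR example with n = 2 it agrees with the HT variance estimator, as both reduce to the same expression for samples of size two.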
Example for approximations

- In the presence of a sampling design with maximum entropy, the following general approximation of the variance results:

V_{approx}(\hat{\tau}) = \sum_{i \in U} \frac{b_i}{\pi_i^2} \cdot (y_i - y_i^*)^2 , \qquad y_i^* = \pi_i \cdot \frac{\sum_{j \in U} b_j \cdot y_j / \pi_j}{\sum_{j \in U} b_j}

- Hájek approximation:

b_i^{Hajek} = \frac{\pi_i \cdot (1 - \pi_i) \cdot N}{N - 1}

Cf. Matei and Tillé (2005) or Hulliger et al. (2011)
Linearization of non-linear statistics

Suppose we have d different study variables and we want to estimate a parameter θ of the finite population U of size N which has the following form:

\theta = f(\tau) , \qquad (2)

where \tau = (\tau_1, \ldots, \tau_k, \ldots, \tau_d) and \tau_k = \sum_{i \in U} y_{ki}, with y_{ki} the observation of the k-th study variable for the i-th element in U.

Then we substitute the unknown parameter vector τ in (2) by its estimate \hat{\tau} = (\hat{\tau}_1, \ldots, \hat{\tau}_k, \ldots, \hat{\tau}_d), which yields

\hat{\theta} = f(\hat{\tau}) ,

with \hat{\tau}_k = \sum_{i \in s} y_{ki} w_i the estimated total of the k-th study variable, where w_i is the survey weight of the i-th element in s. Further, it is assumed that \hat{\tau}_k is a consistent estimator of \tau_k.
In case the function f is continuously differentiable up to order two at each point in the open set S containing τ and \hat{\tau}, we can use a Taylor expansion

\hat{\theta} - \theta = \sum_{k=1}^{d} \left[ \frac{\partial f(p_1, \ldots, p_d)}{\partial p_k} \right]_{p = \tau} (\hat{\tau}_k - \tau_k) + R(\hat{\tau}, \tau) , \qquad (3)

where

R(\hat{\tau}, \tau) = \frac{1}{2!} \sum_{k=1}^{d} \sum_{l=1}^{d} \left[ \frac{\partial^2 f(p_1, \ldots, p_d)}{\partial p_k \, \partial p_l} \right]_{p = \ddot{\tau}} (\hat{\tau}_k - \tau_k)(\hat{\tau}_l - \tau_l)

and \ddot{\tau} lies in the interior of the line segment L(\tau, \hat{\tau}) joining τ and \hat{\tau}.

For the remainder term R we have R = O_p(r_n^2), where r_n \to 0 as n \to \infty. Further, we have \hat{\theta} - \theta = O_p(r_n). Thus, in most applications it is common practice to regard R as negligible in (3) for sample sizes large enough.
This justifies the use of the following approximation:

\hat{\theta} - \theta \approx \sum_{k=1}^{d} \left[ \frac{\partial f(p_1, \ldots, p_d)}{\partial p_k} \right]_{p = \tau} (\hat{\tau}_k - \tau_k) . \qquad (4)

Note that in expression (4) only the linear part of the Taylor series is kept. Now we can use (4) to derive an approximation of the mean square error (MSE) of \hat{\theta}, which is given by

MSE(\hat{\theta}) \approx V\left( \sum_{k=1}^{d} a_k \hat{\tau}_k \right) = \sum_{k=1}^{d} a_k^2 V(\hat{\tau}_k) + 2 \sum_{\substack{k,l = 1 \\ k < l}}^{d} a_k a_l \, Cov(\hat{\tau}_k, \hat{\tau}_l) , \qquad (5)

where a_k = \left[ \frac{\partial f(p_1, \ldots, p_d)}{\partial p_k} \right]_{p = \tau}.
Because MSE(\hat{\theta}) = V(\hat{\theta}) + Bias(\hat{\theta})^2, where Bias(\hat{\theta}) = E(\hat{\theta}) - \theta, we can approximate the variance of \hat{\theta} by MSE(\hat{\theta}), since Bias(\hat{\theta})^2 is of smaller order than V(\hat{\theta}) for unbiased or at least consistent estimators. Thus, we can use (5) as an approximation of the design variance of \hat{\theta}.

The methodology holds for smooth functions. For non-differentiable functions (e.g. quantiles), influence functions can be applied (cf. Deville, 1999). Estimating equations may also lead to linearized variables (cf. Binder and Patak, 1994).

Finally, only the linearized variables have to be derived in order to apply the linear methodology to non-linear statistics (e.g. poverty indicators).
To estimate (5), we could simply substitute the variances and covariances with their corresponding estimates. This, however, may become impractical if d, the number of estimated totals in \hat{\tau}, becomes large. To avoid this problem, Woodruff (1971) suggested the following:

MSE(\hat{\theta}) \approx V\left( \sum_{k=1}^{d} a_k \hat{\tau}_k \right) \approx V\left( \sum_{k=1}^{d} a_k \sum_{i=1}^{n} w_i y_{ki} \right) \approx V\left( \sum_{i=1}^{n} w_i \sum_{k=1}^{d} a_k y_{ki} \right) \approx V\left( \sum_{i=1}^{n} w_i z_i \right) ,

where z_i = \sum_{k=1}^{d} a_k y_{ki}, so that

MSE(\hat{\theta}) \approx V\left( \sum_{i \in s} w_i z_i \right) . \qquad (6)
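Woodruff's collapse of d study variables into one linearized variable is a one-line transformation. A sketch for the ratio θ = τ₁/τ₂, where a₁ = 1/τ₂ and a₂ = −τ₁/τ₂² evaluated at the estimated totals (illustrative data and names, not from the lecture):

```python
def woodruff_z(rows, a):
    """Collapse the d study variables of each unit into one linearized
    variable z_i = sum_k a_k * y_ki (Woodruff, 1971)."""
    return [sum(ak * yk for ak, yk in zip(a, row)) for row in rows]

# Ratio example theta = tau1 / tau2 with illustrative data:
rows = [(2.0, 1.0), (4.0, 1.0), (6.0, 2.0)]   # (y_1i, y_2i) per unit
w = [1.0, 1.0, 1.0]                           # survey weights
tau1 = sum(r[0] * wi for r, wi in zip(rows, w))
tau2 = sum(r[1] * wi for r, wi in zip(rows, w))
z = woodruff_z(rows, (1 / tau2, -tau1 / tau2 ** 2))
```

Evaluated at the estimated totals, the weighted sum of the z_i is exactly zero; the design variance of Σ w_i z_i then serves as the variance approximation of the ratio.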
CLAN: Functions of totals

Andersson and Nordberg introduced easy-to-compute SAS macros in order to produce linearized values for functions of totals:

Let θ = τ₁ ∘ τ₂ be a function of totals with ∘ ∈ {+, −, ·, /}. Then

Operator | z-transformation
+        | z_k = y_{1k} + y_{2k}
−        | z_k = y_{1k} − y_{2k}
·        | z_k = θ · (y_{1k}/t_1 + y_{2k}/t_2)
/        | z_k = θ · (y_{1k}/t_1 − y_{2k}/t_2)

The proof follows from applying Woodruff's method. Any function built from the above operators on totals can thus be developed recursively, which can be integrated in software (cf. Andersson and Nordberg, 1994).
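The table above can be sketched as a small dispatcher; this is my own Python rendering of the rules, not the original SAS macros:

```python
def clan_z(op, y1, y2, t1, t2):
    """CLAN-style z-transformation for theta = t1 op t2, applied unit-wise
    to the study variables y1, y2 with estimated totals t1, t2."""
    if op == '+':
        return [a + b for a, b in zip(y1, y2)]
    if op == '-':
        return [a - b for a, b in zip(y1, y2)]
    if op == '*':
        theta = t1 * t2
        return [theta * (a / t1 + b / t2) for a, b in zip(y1, y2)]
    if op == '/':
        theta = t1 / t2
        return [theta * (a / t1 - b / t2) for a, b in zip(y1, y2)]
    raise ValueError("unknown operator: " + op)
```

For '/', the z-values of a sample whose totals equal t₁ and t₂ sum to zero; for '·', they sum to 2·t₁·t₂, both direct consequences of the rules.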
Evidence-based Policy Decision Based on Indicators

- Indicators are seen as true values
- In general, indicators are simply survey variables
- No modelling is used to
  - improve quality and accuracy of indicators
  - disaggregate values towards domains and areas
- Reading point estimator tables naively may lead to misinterpretations
  - Change (Münnich and Zins, 2011)
  - Benchmarking (change in European policy)
- How accurate are estimates for indicators (ARPR, RMPG, GINI, and QSR)?
- This leads to applying the adequate variance estimation methods
Linearization and Resampling Methods

The statistics in question (the Laeken indicators) are highly non-linear.

- Resampling methods, Kovačević and Yung (1997)
  - Balanced repeated replication
  - Jackknife
  - Bootstrap
- Linearization methods
  - Taylor's method
  - Woodruff linearization, Woodruff (1971) or Andersson and Nordberg (1994)
  - Estimating equations, Kovačević and Binder (1997)
  - Influence functions, Deville (1999)
  - Demnati and Rao (2004)
Application to poverty and inequality indicators
Using the linearized values for the statistics ARPR, GINI, and QSR to
approximate their variances. With calibrated weights w_i, the z_i are the
residuals of the regression of the linearized values on the auxiliary
variables used in the calibration (cf. Deville, 1999).

Indicator   Source
ARPR        Deville (1999)
GINI        Kovačević and Binder (1997)
QSR         Hulliger and Münnich (2007)
RMPG        Osier (2009)

For CI estimation, empirical likelihood methods may be preferable
(cf. Berger and De La Riva Torres, 2015).
Example: Quintile share ratio
Starting from the quintiles of the groups R and P we get

  μ̂_R = Σ_i w_i · (y_i − y_i · 1(y_i ≤ ŷ_0.8)) / Σ_i w_i · (1 − 0.8)

  μ̂_P = Σ_i w_i · y_i · 1(y_i ≤ ŷ_0.2) / Σ_i w_i · 0.2 .

Next, we define the QSR as a function of four totals:

  QSR-hat = μ̂_R / μ̂_P = (τ̂_1/τ̂_2) / (τ̂_3/τ̂_4)

In order to apply linearized variance estimation, we have to derive the
linearized variables.
  u_1i = y_i − (y_i − ŷ_0.8) · 1(y_i ≤ ŷ_0.8) − 0.8 · ŷ_0.8
  u_2i = 0.2
  u_3i = (y_i − ŷ_0.2) · 1(y_i ≤ ŷ_0.2) + 0.2 · ŷ_0.2
  u_4i = 0.2

  u_5i = (u_1i − (τ̂_1/τ̂_2) · u_2i) · 1/(N̂ · 0.2) ,  with τ̂_1/τ̂_2 = μ̂_R
  u_6i = (u_3i − (τ̂_3/τ̂_4) · u_4i) · 1/(N̂ · 0.2) ,  with τ̂_3/τ̂_4 = μ̂_P

where τ̂_2 = τ̂_4 = N̂ · 0.2. Finally, we get

  z_i = (u_5i − QSR-hat · u_6i) · τ̂_4/τ̂_3 .

Note: The linearization of the quintiles is subject to the methods using
influence functions (see Deville, 1999).
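The u-variables above can be assembled in a few lines. The sketch below is my own illustration, assuming simple design weights and one common inverse-CDF convention for the weighted quantiles ŷ_0.2 and ŷ_0.8; the function name `qsr_linearized` is hypothetical.

```python
import numpy as np

def qsr_linearized(y, w):
    """Linearized variable z_i of the quintile share ratio via the
    u-variables (sketch; weighted quantiles by the inverse-CDF rule)."""
    order = np.argsort(y)
    cdf = np.cumsum(w[order]) / np.sum(w)
    q20 = y[order][np.searchsorted(cdf, 0.2)]
    q80 = y[order][np.searchsorted(cdf, 0.8)]
    N = np.sum(w)
    b80 = (y <= q80).astype(float)   # 1(y_i <= y_0.8)
    b20 = (y <= q20).astype(float)   # 1(y_i <= y_0.2)
    u1 = y - (y - q80) * b80 - 0.8 * q80
    u3 = (y - q20) * b20 + 0.2 * q20
    t1 = np.sum(w * (y - y * b80))   # income above the 0.8 quantile
    t3 = np.sum(w * y * b20)         # income below the 0.2 quantile
    mu_R, mu_P = t1 / (N * 0.2), t3 / (N * 0.2)
    qsr = mu_R / mu_P
    u5 = (u1 - mu_R * 0.2) / (N * 0.2)
    u6 = (u3 - mu_P * 0.2) / (N * 0.2)
    z = (u5 - qsr * u6) * (N * 0.2) / t3
    return qsr, z

rng = np.random.default_rng(7)
y = rng.lognormal(10, 0.6, size=500)
w = np.full(500, 20.0)
qsr, z = qsr_linearized(y, w)
```

The z_i can then be plugged into any standard variance formula for a total, e.g. the calibration-residual approach mentioned earlier.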
Resampling methods
• Idea: repeatedly draw (sub-)samples from the sample in order to build the
  sampling distribution of the statistic of interest
• Estimate the variance as the variability of the estimates from the resamples

Methods of interest
• Random groups
• Balanced repeated replication (balanced half samples)
• Jackknife techniques
• Bootstrap techniques

Some remarks:
• If it works, one does not need second-order inclusion probabilities for the
  estimate
• May be computationally expensive
• What influences the quality of these estimates?
Random groups
• Mahalanobis (1939)
• Aim: estimate the variance of the statistic θ̂
• Random partition of the sample into R groups (independently)
• θ̂_(r) denotes the estimate of θ on the r-th subsample
• Random group point estimate:

    θ̂_RG = (1/R) · Σ_{r=1}^{R} θ̂_(r)

• Random group variance estimate:

    V̂(θ̂_RG) = (1/R) · (1/(R−1)) · Σ_{r=1}^{R} (θ̂_(r) − θ̂_RG)²

• Random selection versus random partition!
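The two formulas above translate directly into code. This is a minimal sketch for an SRS sample (the function name `random_group_variance` is my own); note that with R equal-sized groups and a linear statistic, the random-group point estimate coincides with the full-sample estimate.

```python
import numpy as np

def random_group_variance(sample, R, stat, rng):
    """Random groups: randomly partition the sample into R groups,
    evaluate the statistic on each, and estimate the variance from
    the variability of the R group estimates."""
    idx = rng.permutation(len(sample))
    groups = np.array_split(sample[idx], R)
    est = np.array([stat(g) for g in groups])
    theta_rg = est.mean()                                # point estimate
    v_rg = np.sum((est - theta_rg) ** 2) / (R * (R - 1)) # variance estimate
    return theta_rg, v_rg

rng = np.random.default_rng(0)
sample = rng.normal(50.0, 10.0, size=1000)
theta, v = random_group_variance(sample, R=10, stat=np.mean, rng=rng)
```

For θ̂ = the sample mean with equal-sized groups, θ̂_RG equals the overall mean exactly, and v estimates s²/n.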
Balanced repeated replication
• Originally we have two observations per stratum
• Random partitioning of the observations into two groups (half samples)
• θ̂_r is the estimate from the r-th selection of half samples over the H strata
• Instead of recalculating all possible R = 2^H replications, we use a
  balanced selection via Hadamard matrices
• We obtain:

    θ̂_BRR = (1/R) · Σ_{r=1}^{R} θ̂_r   and   V̂_BRR(θ̂) = (1/R) · Σ_{r=1}^{R} (θ̂_r − θ̂)²

• May lead to highly variable variance estimates, especially when H is small
  (cf. Davison and Sardy, 2004); repetition of the random grouping may be
  useful (cf. Rao and Shao, 1996)
• Use special weighting techniques for improvements
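The Hadamard-based selection of half samples can be sketched as follows, assuming the classical 2-PSUs-per-stratum setting; the Sylvester construction used here only yields orders 2^k, and `brr_variance` is my own illustrative name (it uses the unweighted half-sample estimate, not the refined weighting schemes mentioned above).

```python
import numpy as np

def hadamard(k):
    """Sylvester Hadamard matrix of order 2**k."""
    H = np.array([[1.0]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

def brr_variance(strata, stat):
    """BRR for two observations per stratum: each Hadamard row selects
    one half sample (first unit if +1, second if -1)."""
    H = len(strata)
    k = int(np.ceil(np.log2(H + 1)))     # smallest 2**k > H
    signs = hadamard(k)[:, :H]           # R x H sign matrix, R = 2**k
    full = stat(np.concatenate(strata))
    reps = np.array([
        stat(np.concatenate([s[:1] if sg > 0 else s[1:]
                             for s, sg in zip(strata, row)]))
        for row in signs])
    return full, np.mean((reps - full) ** 2)

rng = np.random.default_rng(3)
strata = [rng.normal(h, 1.0, size=2) for h in range(8)]  # 2 obs per stratum
est, v = brr_variance(strata, np.mean)
```

With H = 8 strata this uses R = 16 balanced replicates instead of all 2^8 = 256 possible half-sample selections.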
The Jackknife
Originally, the jackknife method was introduced for estimating the bias of a
statistic (Quenouille, 1949).
Let θ̂(Y_1, …, Y_n) be the statistic of interest for estimating the parameter
θ. Then

  θ̂_{−i} = θ̂(Y_1, …, Y_{i−1}, Y_{i+1}, …, Y_n)

is the corresponding statistic omitting the observation Y_i, which is
therefore based on n − 1 observations. Finally, the delete-1 jackknife (d1JK)
bias estimate for θ̂ is

  B̂_d1JK(θ̂) = (n − 1) · ( (1/n) · Σ_{i∈S} θ̂_{−i} − θ̂ )

(cf. Shao and Tu, 1995).
The jackknife (continued)
From the bias estimate, the jackknife point estimate follows immediately:

  θ̂_d1JK = θ̂ − B̂_d1JK(θ̂) = n · θ̂ − ((n − 1)/n) · Σ_{i∈S} θ̂_{−i} ,

which is a delete-1 jackknife bias-corrected estimate. Under mild smoothness
conditions, the bias of this estimator is of order n^{−2}.
Jackknife variance estimation
Tukey (1958) defined the so-called jackknife pseudo-values
θ̂_i* := n · θ̂ − (n − 1) · θ̂_{−i}, which are treated under the assumption of
stochastic independence and approximately equal variances of the θ̂_i*.
Finally,

  V̂_d1JK(θ̂) = 1/(n(n−1)) · Σ_{i∈S} (θ̂_i* − θ̄*)²
            = ((n−1)/n) · Σ_{i∈S} ( θ̂_{−i} − (1/n) · Σ_{j∈S} θ̂_{−j} )² ,

where θ̄* denotes the mean of the pseudo-values.

Problem: What are θ̂_i* and V̂_d1JK(θ̂) for θ̂ = ȳ?
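The closing question has a clean numerical answer: for θ̂ = ȳ the pseudo-values reduce to θ̂_i* = y_i and the jackknife variance equals s²/n exactly. A minimal sketch (`jackknife_variance` is my own name):

```python
import numpy as np

def jackknife_variance(y, stat):
    """Delete-1 jackknife variance via the leave-one-out form above."""
    n = len(y)
    loo = np.array([stat(np.delete(y, i)) for i in range(n)])
    return (n - 1) / n * np.sum((loo - loo.mean()) ** 2)

rng = np.random.default_rng(42)
y = rng.normal(size=50)
v_jk = jackknife_variance(y, np.mean)
v_exact = y.var(ddof=1) / len(y)   # for the mean: s^2 / n
# v_jk == v_exact up to floating-point error
```

This identity is a standard correctness check when implementing the jackknife.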
Advantages and disadvantages of the jackknife
• Very good for smooth statistics
• Biased for the estimation of the median
• Needs special weights in stratified random sampling (missing independence of
  the jackknife resamples):

    V̂_d1JK,strat(θ̂) = Σ_{h=1}^{H} ((1 − f_h) · (n_h − 1) / n_h) · Σ_{i=1}^{n_h} (θ̂_{h,−i} − θ̂_h)²

  where −i indicates that unit i is left out.
• Specialized procedures are needed for (really) complex designs
  (cf. Rao, Berger, and others)
• Huge effort in case of large sample sizes n:
  – grouped jackknife (m groups; cf. Kott and the R package EVER)
  – delete-d jackknife (m replicates with d sample observations eliminated
    simultaneously; m ≤ (n choose d))
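The stratified formula can be sketched as below. This is an illustration under my own conventions: θ̂_{h,−i} recomputes the statistic on the full sample with unit i of stratum h removed, and θ̂_h is taken as the mean of those leave-one-out estimates (one common choice); `jk_strat_variance` is a hypothetical name.

```python
import numpy as np

def jk_strat_variance(strata, fpc, stat):
    """Stratified delete-1 jackknife:
    V = sum_h (1-f_h)(n_h-1)/n_h * sum_i (theta_{h,-i} - theta_h)^2."""
    v = 0.0
    for h, (y_h, f_h) in enumerate(zip(strata, fpc)):
        n_h = len(y_h)
        loo = np.array([
            stat(np.concatenate([np.delete(s, i) if g == h else s
                                 for g, s in enumerate(strata)]))
            for i in range(n_h)])
        v += (1 - f_h) * (n_h - 1) / n_h * np.sum((loo - loo.mean()) ** 2)
    return v

rng = np.random.default_rng(5)
strata = [rng.normal(h, 1.0, size=20) for h in range(3)]
v = jk_strat_variance(strata, fpc=[0.1, 0.1, 0.1], stat=np.mean)
```

The factor (1 − f_h) is the finite population correction per stratum; dropping it recovers the with-replacement form.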
Bootstrap resampling
• Theoretical bootstrap
• Monte-Carlo bootstrap: random selection of size n (SRS) yields

    V̂_Boot,MC = (1/(B−1)) · Σ_{i=1}^{B} ( θ̂*_{n,i} − (1/B) · Σ_{j=1}^{B} θ̂*_{n,j} )²

• Special adaptations are needed in complex surveys
• Insufficient estimates in WOR sampling and for higher sampling fractions
Monte-Carlo bootstrap
Efron (1982):
1. Estimate F̂ as the empirical distribution function (non-parametric maximum
   likelihood estimation);
2. Draw bootstrap samples X_1*, …, X_n* iid ~ F̂ of size n;
3. Compute the bootstrap estimate τ̂*_{n,i} = τ̂(X_1*, …, X_n*);
4. Repeat 1. to 3. B times (B arbitrarily large) and finally compute the
   variance

    V̂_Boot,MC = (1/(B−1)) · Σ_{i=1}^{B} ( τ̂*_{n,i} − (1/B) · Σ_{j=1}^{B} τ̂*_{n,j} )²
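Efron's four steps fit in a few lines: drawing n indices with replacement is exactly sampling from the empirical distribution function F̂. The function name `bootstrap_variance` is my own.

```python
import numpy as np

def bootstrap_variance(y, stat, B, rng):
    """Monte-Carlo bootstrap: resample n observations with replacement
    from the empirical distribution, recompute the statistic B times,
    and take the empirical variance of the B replicates."""
    n = len(y)
    reps = np.array([stat(y[rng.integers(0, n, size=n)])
                     for _ in range(B)])
    return np.sum((reps - reps.mean()) ** 2) / (B - 1)

rng = np.random.default_rng(2016)
y = rng.exponential(2.0, size=200)
v_boot = bootstrap_variance(y, np.median, B=1000, rng=rng)
```

Note the choice of the median here: unlike the jackknife, the bootstrap remains usable for such non-smooth statistics.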
Properties of the Monte-Carlo bootstrap
The Monte-Carlo bootstrap variance estimates converge, by the law of large
numbers, to the true (theoretical) bootstrap variance (cf. Shao and Tu, 1995,
p. 11):

  V̂_Boot,MC → V_Boot  (a.s.) .

Analogously, one can derive the bootstrap bias of the estimator by

  B̂_Boot,MC = (1/B) · Σ_{i=1}^{B} τ̂*_{n,i} − τ̂ .
Bootstrap confidence intervals
• Via variance estimation:

    [ τ̂ − √(V̂_Boot,MC(τ̂)) · z_{1−α/2} ;  τ̂ − √(V̂_Boot,MC(τ̂)) · z_{α/2} ]

• Via bootstrap resamples:

    z_1* = (τ̂_1* − τ̂) / √(V̂_Boot,MC(τ̂_1*)) ,  …,  z_B* = (τ̂_B* − τ̂) / √(V̂_Boot,MC(τ̂_B*))

  From this empirical distribution, one can calculate the α/2 and (1 − α/2)
  quantiles z*_{B·α/2} and z*_{B·(1−α/2)}, respectively, yielding

    [ τ̂ − √(V̂_Boot,MC(τ̂)) · z*_{B·(1−α/2)} ;  τ̂ − √(V̂_Boot,MC(τ̂)) · z*_{B·α/2} ]

  This is referred to as the studentized bootstrap confidence interval.
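The studentized interval needs a standard error for every resample, which in practice means a (small) inner bootstrap. The sketch below illustrates this double-bootstrap structure under my own naming (`boot_var`, `studentized_ci`); inner replication is kept deliberately small for speed.

```python
import numpy as np

def boot_var(y, stat, B, rng):
    """Plain Monte-Carlo bootstrap variance of stat(y)."""
    n = len(y)
    reps = np.array([stat(y[rng.integers(0, n, size=n)]) for _ in range(B)])
    return reps.var(ddof=1)

def studentized_ci(y, stat, alpha, B, B_inner, rng):
    """Studentized bootstrap CI: each outer resample b yields
    z_b* = (tau_b* - tau_hat)/sqrt(V_b*), with V_b* from an inner
    bootstrap on that resample."""
    n = len(y)
    tau = stat(y)
    se = np.sqrt(boot_var(y, stat, B, rng))
    z = []
    for _ in range(B):
        res = y[rng.integers(0, n, size=n)]
        se_b = np.sqrt(boot_var(res, stat, B_inner, rng))
        z.append((stat(res) - tau) / se_b)
    lo, hi = np.quantile(z, [alpha / 2, 1 - alpha / 2])
    return tau - se * hi, tau - se * lo   # note the reversed quantiles

rng = np.random.default_rng(9)
y = rng.normal(10.0, 2.0, size=100)
low, high = studentized_ci(y, np.mean, alpha=0.05, B=200, B_inner=50, rng=rng)
```

The reversal of the quantiles in the last line mirrors the bracket on the slide: the upper z* quantile defines the lower interval endpoint.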
Rescaling bootstrap
• Rescaling bootstrap: In case of multistage sampling, only the first stage is
  considered. l* (which must be chosen) instead of l PSUs are drawn with
  replacement (see Rao, Wu and Yue, 1992; Rust, 1996). The weights are
  adjusted by

    w*_{qi} = [ 1 − (l*/(l−1))^{1/2} + (l*/(l−1))^{1/2} · (l/l*) · r_q ] · w_{qi} ,

  where r_q is the number of times PSU q is drawn.
• Rescaling bootstrap without replacement: From the l units of the sample,
  l* = ⌊l/2⌋ units are drawn without replacement (see Chipperfield and
  Preston, 2007). In case of single-stage sampling, the weights are adjusted
  by

    w*_i = ( 1 − λ + λ · (n/n*) · δ_i ) · w_i ,  with  λ = √( n* · (1 − f) / (n − n*) ) ,

  where δ_i is 1 when element i is chosen and 0 otherwise. For multistage
  designs (cf. Preston, 2009), the weights are adjusted at each stage G by
  adding the term

    −λ_G · ( Π_{g=1}^{G−1} (n_g/n_g*) · δ_g ) + λ_G · ( Π_{g=1}^{G−1} (n_g/n_g*) · δ_g ) · (n_G/n_G*) · δ_G

  with  λ_G = √( n_G* · ( Π_{g=1}^{G−1} f_g ) · (1 − f_G) / (n_G − n_G*) ) .
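The Rao–Wu–Yue weight adjustment for one replicate can be sketched as follows (my own illustration, function name `rwy_weights`). A useful check: with the common choice l* = l − 1 the leading term 1 − √(l*/(l−1)) vanishes, and the replicate weights sum exactly to the original weight total.

```python
import numpy as np

def rwy_weights(w, psu_ids, l_star, rng):
    """One Rao-Wu-Yue rescaling-bootstrap replicate: draw l* of the l
    PSUs with replacement and rescale the design weights with
    1 - sqrt(l*/(l-1)) + sqrt(l*/(l-1)) * (l/l*) * r_q,
    where r_q counts how often PSU q was drawn."""
    psus = np.unique(psu_ids)
    l = len(psus)
    draw = rng.choice(psus, size=l_star, replace=True)
    r = np.array([np.sum(draw == q) for q in psus])
    c = np.sqrt(l_star / (l - 1))
    mult = 1 - c + c * (l / l_star) * r
    return w * mult[np.searchsorted(psus, psu_ids)]

rng = np.random.default_rng(11)
psu_ids = np.repeat(np.arange(10), 5)   # 10 PSUs, 5 units each
w = np.full(50, 100.0)
w_star = rwy_weights(w, psu_ids, l_star=9, rng=rng)  # l* = l - 1
```

Repeating this B times and taking the variance of the replicate estimates gives the rescaling-bootstrap variance estimator.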
Comparison (cf. Bruch et al., 2011)

Method                     Statistic                   Stratification                     Unequal probability sampling   Sampling   FPC
BRR (Basic Model)          Smooth and non-smooth       Only when 2 elements per stratum   Not considered                 WR         Not considered
BRR (Group)                Smooth and non-smooth       Wolter (2007, p. 113)              Berger (2007)                  WoR        Considered
Delete-1 Jackknife         Only for smooth statistics  Required                           Not considered                 WR/WoR     Possible
Delete-d Jackknife         Smooth and non-smooth       Appropriate                        Not considered                 WR/WoR     Possible
Delete-a-Group Jackknife   Smooth and non-smooth       Appropriate                        Not considered                 WR/WoR     Possible
Monte Carlo Bootstrap      Smooth and non-smooth       Appropriate                        Not considered                 WR         Not considered
Rescaling Bootstrap        Smooth and non-smooth       Appropriate                        Not considered                 WR         Not considered
Rescaling Bootstrap WoR    Smooth and non-smooth       Appropriate                        Not considered                 WoR        Considered

Note: The ordinary Monte Carlo bootstrap may lead to biased variance
estimates.
Characteristics of the AMELI universe
[Figure: simulation results comparing the variance estimation methods rbrr,
boot, approx4, and naive for the indicators ARPR, RMPG, QSR, GINI, and MEAN
across several AMELI sampling scenarios (1.2, 1.4a, 1.5a, 2.6, 2.7); the
plotted points are not recoverable from the transcript.]
●
●
0.5
+
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●●●
●
●
●
●
●●●
●●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●●
●●●●
●●●
●
●
●
●●●
●● ●
+●●●●●●●●●●
+
●
GINI
+
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●●●
●
●
●
●
●● ●
●●
●
●
●
●
●
●
●●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●● ● ● ●
●
● ● ●● ●● ●
+●●●●●●●●●●
+●●●●
●●
+
●●●
+
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●●●
●
●
●
●
●● ●
●●
●
●
●
●
●
●
●●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●● ● ● ●
●
● ● ●● ●● ●
+●●●●●●●●●●
+●●●●
●●
●●
●
+
●
●●
●●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●●
●
●●
●
●●
+
●
●
●●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●●
●
●
●●●
●
●
●
●●
●●
●
●
●
+
●
●
●
+
●●
●
●
●●
●
●
●●
●
●●
●●
+
●
●
●
+
●
●●
●
●●
●
●
●●
●
●●
●●
Pisa, 19th May
2016 | Ralf Münnich
| 52 (78)
ARPR
RMPG
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●
●●
●●
●
●
●
●
●
●
●
●
●●
●●●
● ●●
●
●
●
● ●
+
●
●●
●
●●
●
●
●●
●●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●●●●
●●●
●
●
●
●
●
●
●
●●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
● ● ●
+
●
●
●
●
+
+
●
●
●●
●
●
●●
●
●
●
●●
●
●●
●
●
●
●●
●
●●
●●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●●
●
●
●
●●
●
●
●●●
● ●●
+
●
●●
●
●●
●●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
● ●● ●
+
●
●●
●
●●
●
●●●
●●
●
●●
●
●●
●
●
●●
●
●●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●●●
●● ●●
●
●
●●●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●●
●
● ●
+
●
●●
●
●
●●
●
●●
●●
●
●
●●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●●●●
●
●●
●
●●
●
●
●●
●
●●
●
●●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●●
●
●●
●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●●●
●●
●
●
●
●
●●
●
●
●●
●
●
●●
●
●
●
●
●
●●
●
●
●●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●●
●
●
●
●
●●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●●●●
●●
+
●
●
●
●
●●
●
●●
●
●
●●
●●
●
●●●
●●
●
●
●●
●
●
●
●●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●●
●●
●
●●
●
●
●●
●
●
●
●
●●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●●
●
●●●●● ●
+
●
●●
●●
●
●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●●●
●●
+
● ●
+
+
0.5
RMPG
ARPR
RMPG
2.6
QSR
GINI
MEAN
ARPR
RMPG
1.5a
QSR
GINI
MEAN
ARPR
RMPG
1.4a
QSR
GINI
MEAN
ARPR
RMPG
1.2
QSR
GINI
MEAN
GINI
MEAN
MEAN
+
●●
●
●●
●
●
●●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●●
rbrr
Variance Estimation Methods
●
●
●
+
●
●
●●
●●
●
●●
●
●●
●●
●
●
●●●
●
●
●●
●
●
●
●●
●
●
●●
●
●
●
●●
●
●
●
●
●●
●●
●●
●●
●
+
●● ●
+
●
GINI
●
●●
●
●
●
●●
●
●●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●●●
●
●●●
●● ●
●●
ARPR
Pisa, 19th May 2016 | Ralf Münnich | 53 (78)
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
●● ● ●
●
●
●
●●
●
●
●●
●
●●
●●
●
●
●●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●●●●
3.1 Random groups
3.2 Balanced repeated replication
3.3 Jackknife methods
3.5 The bootstrap
boot
+
+
+
−0.5
+●
●
●
●●
●
●●
●
●
●●
●●
●
●
●●●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●
●●
●
●
●●
●
●●
●
●●
●
●
●
●
●
●●
●
●●
●●
●
●●
●
●●●●●
●
●
●
●
●
●
●
●
●●●●
●
●
●
●
●●●●●●
●
●●
●
●●
●
●●●
●●
●
●●
●
●●
●
●
●●
●
●●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●●●
●● ●●
●
●
●●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●●
●●
●
●●
●●
●
● ●
+
●
● ●
●●
●
MEAN
●
●●
●
●●
●
●●
●●
●
●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●●●
●
●
●
●●
●
●
●●
●
●●
●
●●
●●
●
●●
●
●●●
●●●●●●
+
2.7
QSR
approx4
+
●
●●
●
●●
●●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
● ●● ●
+
●
●●
●●
●
●●
●
●
●
●●
●
●
●
● ●●
0.5
●
+
+
●
●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●●●●● ●●
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
●
●
●●
●●●
●
●●
●
●
●
●
●●
●
●
●
●●
●
●
●●
●
●●
●
●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●● ●●
●
●●
●
1.2
QSR
RMPG
●
●●
●
●●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
● ●●
● ●● ●
GINI
●
●●
●
●●
●
●●
●
●●
●
●
●
●●
●
●
●
●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
● ●
●
●
+
1.4a
Variance
QSR Estimation Methods
GINI
+
●
●
●
●
●
●
●
●●
●
●●●
+
+
+
+
●
● ●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●●
●
●●
●
●
●
●●
●
● ●
●●
●●
●●
●●
●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●●
●
●
●
●●●●
●
●●
● ●●
+
+●●●●●●●●●●●●●●●●
●●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●●
●
●
●
●●
●
●
●●
●
●
●
●●
●●
●●
●
●
●●●●●●
●
●●●●●●●
●●
●
●●
●
●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●●
●
●
●●●●● ● ●●
●
●●
●● ● ●●● ● ●●
1.5a
QSR
RMPG
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●● ●
+
MEAN
●●●
●●●
●
●●
●
●●
●
●
●
●
●●
●
●
●●
●
●
●●
●
● ●
●
●●
●
●
●
●
●● ●
+
+
ARPR
●
+
ARPR
+
●
+
+
●
●●
●
●●
●
●
●●
●
●
●
●●
●
●
●●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●●●●●●
●
●
●●
●
●
●
●●
●●
●
●●
●
●●
●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●
●
●●
●
●●
●●
●
●
●●
●
●●
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
●
●
●
●●●
●●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
●●
●
●●
●
●
●●●
●
●●
●●
●
●
●
●
●●
MEAN
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●●
●
●
●●●
●
●
●
●
●●
●
●
●
●●
●
●
●
●●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●●
●●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●●●
●
●
●●
●
●●●●
●
●
●
●
●
●
●●
●
●●
●
●
●●●
●
+
+
●●
●●
●
●
●
0.5
+
●
●●
●
●●
●
●
●●
●
●
●
●●
●
●
●●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●●●●●●
●
●
●●
●
●
●
●●
●●
●
●●
●
●●
●
●
●●
●
●●
●
●●
●
●●
●
●●
●
●
●
●●
●
●●
●●
●
●
●●
●
●●
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
●
●
●
●●●
●●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
●●
●
●●
●
●
●●●
●
●●
naive
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●●
●●●
●
●
●●+
●●●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●●●●
●
●●
●
●● ●●
●●
●●
●●
●
●●● ●
●●
+●●●●●●●●●●●●●●●●●●●●●●●●
●
●●
●
●
●●
●
●●
●●
●
●
●
●
●●
●
●●
●
●●
●
●●
●
●
●●
●
●
●●
●
●
●●
●
●
●
●
●
●
●●
●
●
●●
●
●●
●
●
●
●●
●
●●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●●●●
●●
●
●
●
●
●
●
●
●
●
●●
●
●●
●●
●●
●●
●
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●●
●●
●
●●
●●●●●
●● ●● ●●●●● ●●● ●
−0.5
●
●●
●
●
●●
●
●
●●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
Notes
2.6
QSR
+
+
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
+
GINI
+
RMPG
●
●●
●
●●
●●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●●●●
●
●
●●
●●
●
●●
●
●●
●●
●
●
●●●
●
●
●●
●
●
●
●●
●
●
●●
●
●
●
●●
●
●
●
●
●●
●●
●●
●●
●
+
−0.5
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
●
●
●●
●
●●
●
●
●
●●
●
●
●
●
●●●
●
●●
●
●
●●
●
●
●●
● ●●●● ●● ●
+
+
●● ●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●●●●
● ●●●
●
−0.5
+
+
●●
●
●●
●
●
●●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●●
+
2.7
QSR
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●●
●●
●
●
●●
●●
●
●
●●
●
●
●
●
●
●●
●
●
●●
●
●
●●
●
●●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●
●
●
●●
● ●●●●●●
●
●●●● ● ●●
●
●
●●
●●
●
●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●●●
●●
0.5
+
●●●
●
+
●
●
●
GINI
0.5
+
+
●
●●
●
●
●●
●
●
●
●
●●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●●
●
●●●●● ●
Variance Estimation Methods
RMPG
+
MEAN
+
+
●
●●
●
●
●
●●
●
●●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●●●
●
●●●
●● ●
●●
●
●
●
●●
●
●
●
●●
●
●
●
●●
●
●
●
●●
●
●●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●
●
●
●
●●
●
●●●●
●
●
●
●
●●●
●●
●
●●
● ●
● ●
●●
●
●
●
●
●●
●
●
●
●●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●●
●●
●
●
●●
●
● ●●
●
● ● ●●●●
●●●●
+
●
●
●
●●
●
●
●●
●
●●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●●●
●
●
●
●●
●
●
●●
●●●
●
●
●
●● ●●●● ●
●
●
●● ●
●●● ● ●
●● ●
●
+
●
3.1 Random groups
3.2 Balanced repeated replication
3.3 Jackknife methods
3.5 The bootstrap
−0.5
1.
2.
3.
4.
●
●
●
●
●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●●●●
●●
+
−0.5
Pisa, 19th May 2016 | Ralf Münnich | 52 (78)
+
●
●
●
●
●●
●
●
●●
●
●
●●
●
●
●
●
●
●●
●
●
●●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●●
●
●
●
●
●●●
●
●
+
●
●●
●●
●
●●
●
●
●
●●
●
●
●
● ●●
0.5
+
+
●
●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●●●●● ●●
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
●
●●
●
●●
●
●
●●
●
●●
●
●●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●●
●
●●
●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●●●
●●
1.2
QSR
+
●
●●
●
●●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
● ●●
● ●● ●
GINI
+
● ●
●
RMPG
+
●
●●
●
●
●●
●
●
●●
●
●
●
●●
●
●●
●
●
●
●●
●
●●
●●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●●
●
●
●
●●
●
●
●●●
● ●●
+
●
●●
●
●
●
●
rbrr
boot
approx4
naive
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
●● ● ●
●
●
+
−0.5
rbrr
boot
approx4
naive
+
●
●●
●
●●
●
●
●●
●
●●
●●
+
●
●
●
+●
●
●
●●
●
●●
●
●
●●
●●
●
●
●●●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●
●●
●
●
●●
●
●●
●
●●
●
●
●
●
●
●●
●
●●
●●
●
●●
●
●●●●●
●
●
●
●
●
●
●
●
●●●●
●
●
●
●
●●●●●●
●●
●
●
●●
●
●
●●
●
●●
●●
●
●●
●
●●
●●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●●●●
Notes
MEAN
●
●●
●
●●
●
●●
●●
●
●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●●●
●
●
●
●●
●
●
●●
●
●●
●
●●
●●
●
●●
●
●●●
●●●●●●
+
+
●
●
●
●
+
+
●
+
●●
●
●
●
●
●●
rbrr
boot
approx4
naive
●
●
●●
●●●
●
●●
●
●
●
●
●●
●
●
●
●●
●
●
●●
●
●●
●
●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●● ●●
●
●●
●
●
●
●
●
rbrr
boot
approx4
naive
●
●●
●
●●
●
●●
●
●●
●
●
●
●●
●
●
●
●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
+
●
●
●
+
rbrr
boot
approx4
naive
GINI
+
●
+
ARPR
1.
2.
3.
4.
● ●
+
●●●
●
●
●●
●
●●
●●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●●
●
●●
●
●●
●
●
●●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●●
●
●
●●●
●
●
●
●●
●●
●
●
●
ARPR
rbrr
boot
approx4
naive
+
+
●●●
●
●●
●
●●
●
●
●
●
●●
●
●
●●
●
●
●●
●
● ●
●●
●●
●
●
●
rbrr
boot
approx4
naive
1.5a
QSR
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●
●
●●
●
●
●●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
MEAN
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●●●●
● ●●●
●
●
●●
●
●●
●
●
●●
●
●●
●
●●
●●
●
●●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●●
●
●
●●●●
●●
●
●
●
+
+
●
−0.5
●
●●
●
●
●
●●
●
●●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●●
●●
●
●●
●
●●
●
●●●
●
●
●●●
●●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●
●
●●
●
●
●
●
●
●
0.5
Notes
ARPR
1.
2.
3.
4.
2.6
QSR
RMPG
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
3.1 Random groups
3.2 Balanced repeated replication
3.3 Jackknife methods
3.5 The bootstrap
Notes
RMPG
1.5a
QSR
GINI
MEAN
ARPR
RMPG
1.4a
QSR
GINI
MEAN
ARPR
RMPG
1.2
QSR
GINI
MEAN
approx4
boot
Pisa, 19th May 2016 | Ralf Münnich | 53 (78)
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
rbrr
Variance Estimation Methods
3.1 Random groups
3.2 Balanced repeated replication
3.3 Jackknife methods
3.5 The bootstrap
Notes
ARPR
RMPG
2.7
QSR
GINI
MEAN
ARPR
RMPG
2.6
QSR
GINI
MEAN
ARPR
RMPG
1.5a
QSR
GINI
MEAN
1.4a
QSR
GINI
Variance Estimation Methods
MEAN
1.2
QSR
MEAN
ARPR
Pisa, 19th May 2016
ARPR
1.
2.
3.
4.
MEAN
ARPR
naive
1.
2.
3.
4.
GINI
| RMPG
Ralf Münnich | 53 (78)
RMPG
Introduction to variance estimation
naive
Linearization methods
Resampling methods
Further topics in variance estimation
GINI
Random groups
approx4 3.1
boot
rbrr
3.2 Balanced repeated replication
3.3 Jackknife methods
3.5 The bootstrap
Notes
Coverage Rates (in %) of indicator estimates

Direct/appr.   ARPR     RMPG     QSR      GINI     MEAN
1.2            95.070   94.640   94.620   94.440   94.850
1.4a           94.700   94.790   95.260   95.090   95.070
1.5a           94.950   94.550   94.850   95.140   95.320
2.6            89.340   92.930   83.880   84.230   78.720
2.7            90.640   92.650   83.690   85.550   79.960

Bootstrap      ARPR     RMPG     QSR      GINI     MEAN
1.2            95.100   94.410   94.280   94.240   94.620
1.4a           94.910   94.750   95.180   94.770   95.260
1.5a           94.810   94.600   94.220   94.660   95.090
2.6            87.850   92.390   82.210   81.890   77.630
2.7            93.070   94.940   88.260   90.070   90.340

Pisa, 19th May 2016 | Ralf Münnich | 54 (78)
1.
2.
3.
4.
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
3.1 Random groups
3.2 Balanced repeated replication
3.3 Jackknife methods
3.5 The bootstrap
Notes
Experimental study: Sampling design
I Two-stage sampling with stratification at the first stage, 25 strata
I 1. Stage: Drawing 4 PSU in each stratum (each stratum contains 8 PSU on average, altogether 200 PSU)
I 2. Stage: Proportional allocation of the sample size (1,000 USU) to the PSU (each PSU contains 500 USU on average, altogether 100,000 USU)
Pisa, 19th May 2016 | Ralf Münnich | 55 (78)
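The design above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the study code; PSU sizes are taken as exactly 8 PSU per stratum and 500 USU per PSU, so the proportional allocation yields 10 USU in each of the 100 sampled PSU.

```python
import random

random.seed(1)

# Hypothetical population: 25 strata x 8 PSU, each PSU holding 500 USU ids
population = {
    h: {p: list(range(500)) for p in range(8)}
    for h in range(25)
}

def draw_sample(population, n_psu=4, n_usu_total=1000):
    """Two-stage sample: SRS of PSU per stratum (stage 1), then proportional
    allocation of the total USU sample size to the drawn PSU (stage 2)."""
    sample = {}
    for h, psus in population.items():
        drawn = random.sample(sorted(psus), n_psu)              # stage 1
        sizes = {p: len(psus[p]) for p in drawn}
        total = sum(sizes.values())
        for p in drawn:
            # stratum share of the 1,000 USU, split proportionally to PSU size
            m = round(n_usu_total / len(population) * sizes[p] / total)
            sample[(h, p)] = random.sample(psus[p], m)          # stage 2
    return sample

s = draw_sample(population)
```

With equal PSU sizes this reproduces the slide's numbers: 100 sampled PSU and 1,000 sampled USU in total.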
1.
2.
3.
4.
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
Variance Estimation Methods
3.1 Random groups
3.2 Balanced repeated replication
3.3 Jackknife methods
3.5 The bootstrap
Notes
Experimental study: Scenarios
I Scenario 1: Units within PSU are heterogeneous with respect to the variable of interest Y ∼ LN(10, 1.5²), PSU are of equal size
I Scenario 2: Units within PSU are homogeneous with respect to the variable of interest, PSU are of equal size
I Scenario 3: Units within PSU are heterogeneous with respect to the variable of interest Y ∼ LN(10, 1.5²), PSU are of unequal size
Pisa, 19th May 2016 | Ralf Münnich | 56 (78)
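The first two scenarios can be mimicked as follows. The exact within-PSU dependence structure is not specified on the slide, so the homogeneous case is modelled here as one lognormal draw per PSU plus small unit-level noise; this is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def scenario1(n_psu=200, psu_size=500):
    """Heterogeneous PSU: every unit drawn i.i.d. from LN(10, 1.5^2)."""
    return rng.lognormal(mean=10.0, sigma=1.5, size=(n_psu, psu_size))

def scenario2(n_psu=200, psu_size=500):
    """Homogeneous PSU (sketch): one lognormal draw per PSU, plus small
    unit-level noise, so units within a PSU are alike."""
    psu_level = rng.lognormal(10.0, 1.5, size=(n_psu, 1))
    noise = rng.normal(1.0, 0.01, size=(n_psu, psu_size))
    return psu_level * noise
```

The within-PSU variation is then negligible in Scenario 2 and substantial in Scenario 1, which is what drives the behaviour of the variance estimators in the following plots.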
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
3.1 Random groups
3.2 Balanced repeated replication
3.3 Jackknife methods
3.5 The bootstrap
Notes
Variance Estimates for the Total
[Figure: boxplots for the total under Scenario 1, Scenario 2 and Scenario 3, comparing the methods BRR, Dir, Dir1, Gr2, Jack, Jack1, Jackeuro, Jackeuro1, MCboot, MCboot1, Resc and RescWoR; x-axis: Relative Bias of the Variance Estimation, 0 to 40]
Pisa, 19th May 2016 | Ralf Münnich | 57 (78)
Variance Estimation Methods
1.
2.
3.
4.
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
3.1 Random groups
3.2 Balanced repeated replication
3.3 Jackknife methods
3.5 The bootstrap
Notes
Variance Estimates for the ARPR
[Figure: boxplots for the ARPR under Scenario 1, Scenario 2 and Scenario 3, comparing the methods BRR, Gr2, Jack, Jack1, Jackeuro, Jackeuro1, Lin, MCboot, MCboot1, Resc and RescWoR; x-axis: Relative Bias of the Variance Estimation, 0 to 40]
Pisa, 19th May 2016 | Ralf Münnich | 58 (78)
1.
2.
3.
4.
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
Variance Estimation Methods
3.1 Random groups
3.2 Balanced repeated replication
3.3 Jackknife methods
3.5 The bootstrap
Notes
Replication weights
I Doing resampling methods by adjusting the weights
I Advantage: partial anonymization; only the design weights are required (may not be fully true)
I BRR: Adjusting the weights by
$$w_{h,i}^{(r)} := \begin{cases} w_{hi} \cdot \left[ 1 + \left( \dfrac{(n_h - m_h) \cdot (1 - f_h)}{m_h} \right)^{1/2} \right], & \delta_{rh} = 1, \\[2ex] w_{hi} \cdot \left[ 1 - \left( \dfrac{m_h \cdot (1 - f_h)}{n_h - m_h} \right)^{1/2} \right], & \delta_{rh} = -1, \end{cases}$$
where $\delta_{rh}$ indicates if the first or second group in stratum h in replication r is chosen and $m_h = \lfloor n_h/2 \rfloor$ (cf. Davison and Sardy, 2004)
I Delete-1-Jackknife: The weights of the deleted unit are 0; all others are computed by $\frac{n_h}{n_h - 1} \cdot w_{hi}$
I Monte-Carlo bootstrap: Computing the weights by $w_{hi} \cdot c_{hi}$, where $c_{hi}$ indicates how often unit i in stratum h is drawn with replacement
Pisa, 19th May 2016 | Ralf Münnich | 59 (78)
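A minimal sketch of the first two weight adjustments (hypothetical helper names; the BRR factors follow the displayed formula with $m_h = \lfloor n_h/2 \rfloor$):

```python
import math

def brr_factors(n_h, f_h):
    """Weight-adjustment factors for one BRR replicate in a stratum of
    size n_h with sampling fraction f_h, with m_h = floor(n_h / 2)."""
    m_h = n_h // 2
    up = 1 + math.sqrt((n_h - m_h) * (1 - f_h) / m_h)        # delta_rh = +1
    down = 1 - math.sqrt(m_h * (1 - f_h) / (n_h - m_h))      # delta_rh = -1
    return up, down

def brr_replicate_weights(weights, in_first_group, f_h):
    """Units of the chosen half-sample get the 'up' factor, the rest the
    'down' factor; the stratum weight total is preserved by construction."""
    up, down = brr_factors(len(weights), f_h)
    return [w * (up if g else down) for w, g in zip(weights, in_first_group)]

def jackknife_weights(weights, deleted):
    """Delete-1-jackknife: deleted unit gets 0, all others n_h/(n_h-1) * w."""
    n_h = len(weights)
    return [0.0 if i == deleted else w * n_h / (n_h - 1)
            for i, w in enumerate(weights)]
```

For n_h = 4 and f_h = 0 the factors are 2 and 0, so a BRR replicate simply doubles the chosen half-sample, and both adjustments keep the stratum weight sum unchanged.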
1.
2.
3.
4.
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
Variance Estimation Methods
4.1 MSE estimation for small area methods
4.1 Variance estimation in the presence of nonresponse
Notes
Jackknife method for EBP I
Jiang et al. (2002) and Chattopadhyay et al. (1999) propose to use the jackknife for the estimation of the MSE of an empirical best predictor. Here, reduced to an area-level EBP, they show that the MSE of $\hat\theta^{EBP}$ can be decomposed into:
$$\mathrm{MSE}\big[\hat\theta^{EBP}\big] = \mathrm{MSE}\big[\hat\theta^{BP}\big] + \mathrm{E}\big[\big(\hat\theta^{EBP} - \hat\theta^{BP}\big)^2\big] \qquad (7)$$
Pisa, 19th May 2016 | Ralf Münnich | 60 (78)
Variance Estimation Methods
1.
2.
3.
4.
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
4.1 MSE estimation for small area methods
4.1 Variance estimation in the presence of nonresponse
Notes
Jackknife method for EBP II
Further, they propose to estimate $\mathrm{MSE}\big[\hat\theta^{BP}\big]$ by
$$\widehat{\mathrm{MSE}}_{Jack}\big[\hat\theta^{BP}\big] = \widehat{\mathrm{MSE}}\big[\hat\theta^{BP}\big] - \frac{D-1}{D} \sum_{j=1}^{D} \Big( \widehat{\mathrm{MSE}}\big[\hat\theta^{BP}_{-j}\big] - \widehat{\mathrm{MSE}}\big[\hat\theta^{BP}\big] \Big), \qquad (8)$$
and $\mathrm{E}\big[\big(\hat\theta^{EBP} - \hat\theta^{BP}\big)^2\big]$ by
$$\widehat{\mathrm{E}}\big[\big(\hat\theta^{EBP} - \hat\theta^{BP}\big)^2\big] = \frac{D-1}{D} \sum_{j=1}^{D} \big(\hat\theta^{EBP}_{-j} - \hat\theta^{EBP}\big)^2. \qquad (9)$$
The $\hat\theta^{EBP}_{-j}$ is the $\hat\theta^{EBP}$ when omitting area j in the estimation of the model parameters of the underlying model (Chattopadhyay et al., 1999), and $\widehat{\mathrm{MSE}}\big[\hat\theta^{BP}\big]$ depends on the BP at hand.
Pisa, 19th May 2016 | Ralf Münnich | 61 (78)
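Equations (7) to (9) translate directly into code. The sketch below assumes the leave-one-area-out quantities $\hat\theta^{EBP}_{-j}$ and $\widehat{\mathrm{MSE}}\big[\hat\theta^{BP}_{-j}\big]$ have already been computed; the function name and inputs are hypothetical.

```python
def jackknife_mse_ebp(mse_bp, mse_bp_del, ebp, ebp_del):
    """Jackknife MSE estimate of an area-level EBP over D areas.

    mse_bp     : MSE-hat[theta^BP] using all areas
    mse_bp_del : list of MSE-hat[theta^BP_{-j}], j = 1..D
    ebp        : theta^EBP using all areas
    ebp_del    : list of theta^EBP_{-j}, j = 1..D
    """
    D = len(mse_bp_del)
    c = (D - 1) / D
    # eq. (8): bias-corrected estimate of MSE[theta^BP]
    mse_bp_jack = mse_bp - c * sum(m - mse_bp for m in mse_bp_del)
    # eq. (9): jackknife estimate of E[(theta^EBP - theta^BP)^2]
    e2_jack = c * sum((t - ebp) ** 2 for t in ebp_del)
    # eq. (7): the two components add up to the MSE of the EBP
    return mse_bp_jack + e2_jack
```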
1.
2.
3.
4.
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
Variance Estimation Methods
4.1 MSE estimation for small area methods
4.1 Variance estimation in the presence of nonresponse
Notes
The Fay-Herriot estimator I
Fay and Herriot (1979) proposed the so-called Fay-Herriot estimator (FH) for the estimation of the mean population income in a small area setting.
I Covariates only available at aggregate level.
I Covariates are true population parameters, e.g. population means $\bar X$.
I Direct estimates $\hat\mu_{d,direct}$ are used as dependent variable.
I Only one observation per area.
The model they use may be expressed as
$$\hat\mu_{d,direct} = X_d \beta + u_d + e_d, \qquad u_d \sim N(0, \sigma_u^2) \ \text{ and } \ e_d \sim N(0, \sigma_{e,d}^2).$$
Pisa, 19th May 2016 | Ralf Münnich | 62 (78)
1.
2.
3.
4.
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
Variance Estimation Methods
4.1 MSE estimation for small area methods
4.1 Variance estimation in the presence of nonresponse
Notes
The Fay-Herriot estimator II
The FH is the prediction from this mixed model and is given by
$$\hat\mu_{d,FH} = X_d \hat\beta + \hat u_d, \qquad (10)$$
$$\hat u_d = \frac{\hat\sigma_u^2}{\hat\sigma_u^2 + \sigma_{e,d}^2} \, \big(\hat\mu_{d,direct} - X_d \hat\beta\big).$$
I $\hat\sigma_u^2$ and $\hat\beta$ are estimates
I $\sigma_{e,d}^2$, d = 1..D are assumed to be known
Pisa, 19th May 2016 | Ralf Münnich | 63 (78)
Variance Estimation Methods
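A minimal numpy sketch of (10), assuming $\hat\sigma_u^2$ is supplied (in practice it has to be estimated, e.g. by ML or moment methods); the function name is hypothetical.

```python
import numpy as np

def fay_herriot(mu_direct, X, sigma_e2, sigma_u2):
    """Fay-Herriot predictions for D areas.

    mu_direct : (D,) direct estimates
    X         : (D, p) area-level covariates
    sigma_e2  : (D,) known sampling variances sigma^2_{e,d}
    sigma_u2  : model variance estimate (taken as given in this sketch)
    """
    V = sigma_u2 + sigma_e2                              # total variance per area
    XtW = X.T / V                                        # X' V^{-1}
    beta = np.linalg.solve(XtW @ X, XtW @ mu_direct)     # GLS estimate of beta
    gamma = sigma_u2 / V                                 # shrinkage factor per area
    u_hat = gamma * (mu_direct - X @ beta)
    return X @ beta + u_hat                              # eq. (10)
```

When the sampling variances vanish, the shrinkage factors are 1 and the FH predictions coincide with the direct estimates; the smaller $\sigma_{e,d}^2$ relative to $\hat\sigma_u^2$, the closer the prediction stays to the direct estimate.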
1.
2.
3.
4.
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
4.1 MSE estimation for small area methods
4.1 Variance estimation in the presence of nonresponse
Notes
The Fay-Herriot PB MSE Estimator I
There exist different analytical approximations to the MSE of the FH depending on the estimation method for $\sigma_u^2$ (cf. Datta et al., 2005).
Recalling the parametric bootstrap method for the FH
$$\mathrm{MSE}^*_{d,EST} = \mathrm{E}^*\big[(\psi_d^* - \hat\psi^*_{d,FH})^2\big].$$
Now the right-hand side is written as a function of the distribution of $y \,|\, X, Z$:
$$\mathrm{MSE}^*_{d,EST} = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} (\psi_d - \hat\psi_{d,EST})^2 \, f_{y|X,Z}(u_1, \dots, u_D, e_1, \dots, e_D) \, du_1 \cdots du_D \, de_1 \cdots de_D.$$
Pisa, 19th May 2016 | Ralf Münnich | 64 (78)
1.
2.
3.
4.
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
Variance Estimation Methods
4.1 MSE estimation for small area methods
4.1 Variance estimation in the presence of nonresponse
Notes
The Fay-Herriot PB MSE Estimator II
Simplifying the equation, one can write $h(u) := (\psi_d - \hat\psi_{d,FH})^2$ and $f_{u,e} := f_{y|X,Z}$.
Then the MSE estimate obtains the form
$$\mathrm{MSE}^*_{d,EST} = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} h(u) \, f_{u,e}(u_1, \dots, u_D, e_1, \dots, e_D) \, du_1 \cdots du_D \, de_1 \cdots de_D.$$
I The multivariate normal probability density function $f_{u,e}$ does not have a closed-form integral
⇒ The equation above generally will not be tractable analytically.
Pisa, 19th May 2016 | Ralf Münnich | 65 (78)
1.
2.
3.
4.
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
Variance Estimation Methods
4.1 MSE estimation for small area methods
4.1 Variance estimation in the presence of nonresponse
The Fay-Herriot PB MSE estimator III
- Two possible approaches:
  - Numerical approximation (curse of dimensionality; cf. Donoho, 2000)
  - Monte-Carlo approximation (classical parametric bootstrap)
- It follows that the parametric bootstrap may be written as a special case of a Monte-Carlo integration problem.
- Thus, methods to improve estimates gained by Monte-Carlo integration may also be helpful in estimating the parametric bootstrap MSE.
- Cf. Burgard (2015), who also proposed variance reduction techniques for resampling in small area estimation.
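As a sketch of how the Monte-Carlo (parametric bootstrap) approximation works in practice, the following Python code fits a basic Fay-Herriot model and bootstraps the MSE. Everything here is illustrative: the intercept-only model with known sampling variances, the crude moment estimator for $\sigma_u^2$, and the function names `fh_fit` and `pb_mse` are assumptions, not the specific estimators of Datta et al. (2005) or Burgard (2015).

```python
import numpy as np

rng = np.random.default_rng(42)

def fh_fit(y, X, psi_e):
    """Rough Fay-Herriot fit: moment estimate of sigma_u^2, GLS for beta,
    and the EBLUP. A sketch, not a production estimator."""
    D, p = X.shape
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta_ols
    # simple moment estimator, truncated at zero
    sigma_u2 = max((resid @ resid - psi_e.sum()) / (D - p), 0.0)
    V = sigma_u2 + psi_e
    W = 1.0 / V
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * y))
    gamma = sigma_u2 / V
    theta = gamma * y + (1 - gamma) * (X @ beta)  # EBLUP
    return beta, sigma_u2, theta

def pb_mse(y, X, psi_e, B=500):
    """Monte-Carlo (parametric bootstrap) approximation of the FH MSE."""
    beta, sigma_u2, _ = fh_fit(y, X, psi_e)
    D = len(y)
    mse = np.zeros(D)
    for _ in range(B):
        u = rng.normal(0.0, np.sqrt(sigma_u2), D)
        e = rng.normal(0.0, np.sqrt(psi_e), D)
        psi_star = X @ beta + u      # bootstrap "true" area parameters
        y_star = psi_star + e        # bootstrap direct estimates
        _, _, theta_star = fh_fit(y_star, X, psi_e)
        mse += (theta_star - psi_star) ** 2
    return mse / B

# toy data: D areas, intercept-only model, known sampling variances
D = 20
X = np.ones((D, 1))
psi_e = rng.uniform(0.5, 2.0, D)
y = 10 + rng.normal(0, 1, D) + rng.normal(0.0, np.sqrt(psi_e))
mse_hat = pb_mse(y, X, psi_e, B=200)
print(mse_hat.shape)  # one MSE estimate per area
```

Increasing `B` reduces the Monte-Carlo error of the approximation, which is exactly where the variance reduction techniques mentioned above become useful.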
Missing Data - Everybody has them, nobody wants them
Missingness may be either
- MCAR (missing completely at random),
- MAR (missing at random), or
- MNAR (missing not at random).
Cf. Little and Rubin (1987, 2002).
Methods to handle missing data
- Procedures based on the available cases only, i.e., only those cases that are completely recorded for the variables of interest
- Weighting procedures, such as Horvitz-Thompson-type estimators or raking estimators, that adjust for nonresponse
- Single imputation together with a correction of the variance estimates to account for imputation uncertainty
- Multiple imputation (MI) according to Rubin (1978, 1987) combined with standard complete-data analysis
- Model-based corrections of parameter estimates, such as the expectation-maximization (EM) algorithm
Resampling under non-response
- Balanced Repeated Replication: the method is applied as before, but the imputed values are adjusted in each resample:
  $y_{\mathrm{imp},i} + E(y_{\mathrm{imp},i})^* - E(y_{\mathrm{imp},i})$,
  where $E(y_{\mathrm{imp},i})$ is the expectation of the imputed value with respect to the original imputation procedure and $E(y_{\mathrm{imp},i})^*$ denotes the same, except that the imputation is applied in the resample (cf. Shao, Chen, and Chen, 1998).
- Jackknife: the same adjustment of the imputed values $y_{\mathrm{imp},i}$ as for the BRR is applied in each resample: $y_{\mathrm{imp},i} + E(y_{\mathrm{imp},i})^* - E(y_{\mathrm{imp},i})$ (cf. Rao and Shao, 1992; Chen and Shao, 2001; Shao and Tu, 1995). Newer developments with specialised routines are due to Berger and Rao. For the jackknife in the case of nearest-neighbour imputation, cf. Chen and Shao (2001).
- Bootstrap: the same imputation as in the original data is applied in each bootstrap sample (cf. Shao and Sitter, 1996).
The computational burden may be very high!
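The bootstrap approach (repeating the imputation within every bootstrap sample) can be sketched as follows. This is a minimal illustration under deliberately simple assumptions: respondent-mean imputation, MCAR nonresponse, and hypothetical function names; it is not the exact procedure of Shao and Sitter (1996).

```python
import numpy as np

rng = np.random.default_rng(7)

def impute_mean(y, observed):
    """Fill missing values with the respondent mean (a deliberately
    simple imputation model for illustration)."""
    y_imp = y.copy()
    y_imp[~observed] = y[observed].mean()
    return y_imp

def bootstrap_var_with_reimputation(y, observed, B=1000):
    """Resample (y_i, r_i) pairs and repeat the imputation within
    every bootstrap sample before computing the point estimate."""
    n = len(y)
    means = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, n, n)
        yb, rb = y[idx], observed[idx]
        if not rb.any():          # degenerate resample: redraw once
            idx = rng.integers(0, n, n)
            yb, rb = y[idx], observed[idx]
        means[b] = impute_mean(yb, rb).mean()
    return means.var(ddof=1)

# toy data with roughly 30% MCAR nonresponse
n = 200
y = rng.normal(50, 10, n)
observed = rng.random(n) > 0.3
v_boot = bootstrap_var_with_reimputation(y, observed)
print(round(v_boot, 3))
```

The key point the code illustrates is that the imputation step sits *inside* the bootstrap loop, so the resulting variance reflects the imputation uncertainty as well.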
Variance estimation under multiple imputation
- Multiple imputation (Rubin, 1987): estimates $\hat{\theta}^{(j)}$ and $\widehat{\mathrm{var}}(\hat{\theta}^{(j)})$, $j = 1, \dots, m$
- Multiple imputation point estimate: $\hat{\theta}_{MI} = \frac{1}{m} \sum_{j=1}^{m} \hat{\theta}^{(j)}$
- Multiple imputation variance estimate:
  $$T = W + \left(1 + \frac{1}{m}\right) B$$
  with
  within-imputation variance $W = \frac{1}{m} \sum_{j=1}^{m} \widehat{\mathrm{var}}(\hat{\theta}^{(j)})$ and
  between-imputation variance $B = \frac{1}{m-1} \sum_{j=1}^{m} (\hat{\theta}^{(j)} - \hat{\theta}_{MI})^2$
- Problem: the imputation has to be proper in Rubin's sense.
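Rubin's combining rules translate directly into code. The sketch below assumes a scalar estimand; `rubin_combine` is a hypothetical helper name.

```python
import numpy as np

def rubin_combine(theta_hats, var_hats):
    """Combine m completed-data estimates and their variances with
    Rubin's rules: T = W + (1 + 1/m) * B."""
    theta_hats = np.asarray(theta_hats, dtype=float)
    var_hats = np.asarray(var_hats, dtype=float)
    m = len(theta_hats)
    theta_mi = theta_hats.mean()            # MI point estimate
    W = var_hats.mean()                     # within-imputation variance
    B = theta_hats.var(ddof=1)              # between-imputation variance
    T = W + (1 + 1 / m) * B
    return theta_mi, T

# m = 5 completed-data analyses of the same parameter
theta_mi, T = rubin_combine([10.2, 9.8, 10.5, 10.1, 9.9],
                            [0.30, 0.28, 0.33, 0.31, 0.29])
print(theta_mi, T)  # 10.1 and 0.392
```

The term $(1 + 1/m)B$ shows why a purely single-imputation variance ($W$ alone) understates the uncertainty: it ignores the between-imputation spread entirely.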
Kish’s definition of the design effect
- The design effect measures the inflation of the variance of an estimator due to deviations from SRS:
  $$\mathrm{deff}_{\mathrm{Kish}} = \frac{s_m^2 / m}{s^2 / n}$$
- More generally:
  $$\mathrm{deff} = \frac{V_{cd}(\hat{\theta})}{V_{SRS}(\hat{\theta})},$$
  where $cd$ denotes the complex design, i.e. the variance under the complex sampling design.
- In practice, the design effect may be used as prior information (design stage) or has to be estimated from the sample (posterior evaluation).
Estimating design effects
- The estimated design effect is
  $$\widehat{\mathrm{deff}} = \frac{\widehat{V}_{cd}(\hat{\theta})}{\widehat{V}_{SRS}(\hat{\theta})}$$
- Find suitable estimators for both
  - the numerator (classical variance estimation) and
  - the denominator (non-trivial!)
Estimation of the denominator
- Problem: how to compute $\widehat{V}_{SRS}(\hat{\theta})$, because the sample was obtained using the complex sampling design
- One possibility is to treat the data as if they had been drawn using SRS (an adjustment is needed)
- In the case of unequal probability sampling, the denominator is computed by
  $$\widehat{V}_{SRS}(\hat{\theta}) = N^2 \, \frac{s^2}{n} \, \frac{N - n}{N}$$
  with
  $$s^2 = \frac{1}{\sum_{j \in S} \pi_j^{-1} - 1} \sum_{i \in S} \frac{1}{\pi_i} \left( y_i - \frac{\sum_{i \in S} y_i / \pi_i}{\sum_{j \in S} \pi_j^{-1}} \right)^2$$
- Alternatively: make use of the parametric bootstrap (cf. Bruch, Münnich, and Seger, in submission)
Cf. Gabler et al. (2008)
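A minimal sketch of this deff estimation, assuming a Horvitz-Thompson total and the $\pi$-weighted variance $s^2$; the names `vhat_srs_total` and `deff_hat` are hypothetical, and `v_cd` would come from a classical complex-design variance estimator.

```python
import numpy as np

def vhat_srs_total(y, pi, N):
    """SRS variance for the HT total: the deff denominator, treating the
    complex sample as if it were an SRS draw, with a pi-weighted s^2."""
    y, pi = np.asarray(y, float), np.asarray(pi, float)
    n = len(y)
    w = 1.0 / pi
    ybar_w = (w * y).sum() / w.sum()                   # Hajek-type mean
    s2 = (w * (y - ybar_w) ** 2).sum() / (w.sum() - 1.0)
    return N**2 * (s2 / n) * (N - n) / N

def deff_hat(v_cd, y, pi, N):
    """Estimated design effect: complex-design variance over SRS variance."""
    return v_cd / vhat_srs_total(y, pi, N)

# toy check with equal inclusion probabilities pi_i = n / N
y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
pi = np.full(5, 5 / 100)
v_srs = vhat_srs_total(y, pi, 100)
print(v_srs)                     # about 3838.38
d = deff_hat(2 * v_srs, y, pi, 100)
print(d)                         # 2.0 by construction
```

With equal probabilities the weighted $s^2$ reduces to an ordinary population-variance estimate, which makes the toy check easy to verify by hand.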
Some further comments on design effects
- Use deff to determine the effective sample size: $n_{\mathrm{eff}} = n / \mathrm{deff}$
- In practice, approximate priors for deff might be used
- The European Social Survey is a good example of the use of design effects:
  - prior determination of comparable sample sizes in a European sample
  - posterior evaluation of the effects
- Model-based design effect estimates are used for two- and multi-stage sampling (ESS context, see above), cf. Ganninger (2010)
- Meanwhile, different effects are separated, i.e. effects of clustering, of unequal probabilities, and of interviewers
Generalized Variance Functions (GVF)
- Instead of directly computing variance estimates, a model is evaluated at the survey estimates. The model parameters are estimated from past data or from a small subset of the items of interest.
- Useful in the case of a large number of publication estimates
- Main reasons:
  - Variance estimation can be computationally intensive, especially when many basic estimates are considered; the same holds for the publication of all variance estimates.
  - This becomes even more demanding when various combinations of results (e.g. ratios) are of interest.
  - With a GVF, the variance is estimated simultaneously for groups of statistics rather than statistic by statistic. The question is whether this leads to more stable estimates.
- Cf. Wolter (2007)
Models for variance functionals
A model has to be chosen that defines the relationship between the variance (or the relative variance) of the estimator of interest $\hat{X}$ and its expectation $X = E(\hat{X})$.
- The relative variance is defined by $V^2 = V(\hat{X}) / X^2$
- Examples of models for the relative variance:
  1) $V^2 = \alpha + \beta / X$ (often used, e.g. in the Current Population Survey)
  2) $V^2 = \alpha + \beta / X + \gamma / X^2$
  3) $V^2 = (\alpha + \beta / X)^{-1}$
  4) $V^2 = (\alpha + \beta / X + \gamma X^2)^{-1}$
  5) $\log V^2 = \alpha - \beta \log(X)$
  where $\alpha$, $\beta$ and $\gamma$ are unknown parameters that are to be estimated (cf. Wolter, 2007).
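Model 1) can be fitted by ordinary least squares after regressing observed relative variances on $1/X$. The sketch below uses simulated data; `fit_gvf` and the parameter values are illustrative, and in practice a weighted fit might be preferred (cf. Wolter, 2007).

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_gvf(X_est, relvar):
    """Fit the GVF model  V^2 = alpha + beta / X  by least squares,
    regressing observed relative variances on 1/X."""
    A = np.column_stack([np.ones_like(X_est), 1.0 / X_est])
    (alpha, beta), *_ = np.linalg.lstsq(A, relvar, rcond=None)
    return alpha, beta

def predict_gvf(alpha, beta, X_est):
    """Evaluate the fitted GVF at new survey estimates."""
    return alpha + beta / X_est

# simulated "training" statistics generated from the model itself
# (true alpha = 0.001, true beta = 50)
X_est = rng.uniform(1e3, 1e5, 40)
relvar = 0.001 + 50.0 / X_est + rng.normal(0, 1e-4, 40)
alpha, beta = fit_gvf(X_est, relvar)
print(alpha, beta)
```

Once fitted, `predict_gvf` yields a variance approximation for any publication estimate in the group without a separate direct variance computation, which is exactly the production advantage described above.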
Some final comments on GVF
- GVF are highly appreciated in statistical production, e.g. in surveys with many variables
- Gershunskaya and Lahiri (2005) developed a small area statistics GVF for the monthly estimates of employment levels: http://www.bls.gov/osmr/pdf/st050260.pdf
- Eltinge (2003) used GVF under non-response: http://www.amstat.org/sections/SRMS/Proceedings/y2003/Files/JSM2003-000787.pdf
- See also Valliant et al. (2013)
Pisa, 19th May 2016 | Ralf Münnich | 77 (78)
1.
2.
3.
4.
Introduction to variance estimation
Linearization methods
Resampling methods
Further topics in variance estimation
Variance Estimation Methods
4.1 MSE estimation for small area methods
4.1 Variance estimation in the presence of nonresponse
Final comments
- Due to the growth of computing power, resampling methods are applied more and more
- The integration of further aspects into variance estimation, e.g. non-response or statistical modelling, becomes more and more attractive
- GVF are still important for statistical production but need further development
- Variance estimation for change can be treated via ratios or differences of (dependent) estimates, but with changing universes
- Variance estimation alone does not necessarily yield the best confidence intervals; always consider peculiarities in the data!
- Variance estimation can be seen:
  - a priori (design stage)
  - a posteriori (quality measurement)