Scalable algorithms for Markov process parameter inference

Darren Wilkinson (@darrenjw, tinyurl.com/darrenjw)
School of Mathematics & Statistics, Newcastle University, UK
MCMC and particle methods, ICMS, Edinburgh, 5th–8th September, 2016
Overview
Stochastic reaction networks, stochastic simulation and partially observed Markov process (POMP) models
Likelihood-free (PMMH) pMCMC — exact fully Bayesian joint inference for the model state and static parameters
(Adaptive) ABC(–SMC)
Functional programming approaches for scalable scientific and statistical computing
Example — genetic auto-regulation
[Figure: reaction network diagram for the genetic auto-regulation model, involving the gene (g/DNA), RNA (r), protein P, dimer P2 and RNAP.]
[Figure: simulated realisation of the auto-regulatory network; time courses of the Rna, P and P2 molecule counts over 5000 time units.]
Multi-scale simulation issues
Although not a huge model, this is actually rather challenging, for both simulation and inference
Some species have very low copy number (predominantly zero or one), making the diffusion approximation unappealing, and some have fairly high copy number (several hundred)
The high and low copy-number species are tightly coupled, leading to very discrete, bursty stochastic behaviour even in the high copy-number species
There are also two pairs of fast binding/unbinding reactions, making the problem difficult for time-stepping methods such as τ-leaping
A rather sophisticated hybrid multi-scale algorithm would be required to simulate this process very accurately and very fast... (a sketch of the baseline exact simulator follows below)
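In keeping with the functional-programming theme of the talk, here is a minimal Scala sketch of the baseline exact simulator that the approximate schemes above (the diffusion approximation, τ-leaping, hybrid methods) are trying to speed up: Gillespie's direct method for a generic reaction network. The Reaction type, the hazard functions and the immigration-death demo are invented for this sketch and are not the auto-regulation model itself.

import scala.util.Random

// Sketch: Gillespie's direct method for a generic reaction network.
// A reaction is a net change vector plus a hazard (propensity) function
// of the current vector of species counts. All names here are illustrative.
final case class Reaction(change: Vector[Int], hazard: Vector[Int] => Double)

object GillespieSketch {
  val rng = new Random(42)

  // Simulate the network forward from state x0 at time t0 until time tMax.
  def simulate(x0: Vector[Int], reactions: Seq[Reaction],
               t0: Double, tMax: Double): Vector[Int] = {
    var x = x0
    var t = t0
    while (t < tMax) {
      val hazards = reactions.map(_.hazard(x))
      val h0 = hazards.sum
      if (h0 <= 0.0) return x                       // nothing can fire any more
      t += -math.log(rng.nextDouble()) / h0         // exponential waiting time
      if (t < tMax) {
        // pick a reaction with probability proportional to its hazard
        val u = rng.nextDouble() * h0
        var i = 0
        var acc = hazards(0)
        while (i < hazards.length - 1 && acc < u) { i += 1; acc += hazards(i) }
        x = x.zip(reactions(i).change).map { case (xi, d) => xi + d }
      }
    }
    x
  }
}

object GillespieDemo extends App {
  // Illustrative immigration-death system (not the auto-regulation model):
  // 0 -> X at rate 10.0, X -> 0 at rate 0.1 per molecule.
  val reactions = Seq(
    Reaction(Vector(1), _ => 10.0),
    Reaction(Vector(-1), x => 0.1 * x(0)))
  println(GillespieSketch.simulate(Vector(0), reactions, 0.0, 100.0))
}

Because the direct method resolves every individual reaction event, fast binding/unbinding pairs of the kind mentioned above make it slow, which is what motivates the hybrid multi-scale schemes.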
Modularity: decoupling models from simulation algorithms
For forward modelling, there are clear and considerable benefits to separating the representation of the model from the algorithm used to simulate realisations from it:
There are numerous exact and approximate simulation algorithms, as well as algorithms for static model analysis — by decoupling the models from the algorithms, it is easy to apply any algorithm to any model of interest — improvements in simulation algorithms automatically apply to all models of interest
Modifying a model representation will typically be much easier than modifying an algorithm to simulate that model
When assembling large models from smaller model components, it is often more straightforward to compose model representations than associated simulation algorithms
Few disadvantages: limit to flexibility of representation, slight inefficiencies, more sophisticated programming required (a sketch of such a decoupling follows below)
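A rough Scala sketch of this decoupling, purely illustrative (the names Model, tauLeap and euler are invented here, not an existing library API): the model is plain data, and each simulation algorithm is a function from a model to a transition kernel, so any algorithm, exact or approximate, can be applied to any model expressed this way.

import scala.util.Random

// Sketch of decoupling a model representation from its simulation algorithms.
// The model is plain data; every algorithm turns a model into a transition
// kernel that advances a state over a time interval. All names are invented
// for this sketch.
final case class Model(
  stoich: Vector[Vector[Int]],                 // net change per reaction
  hazards: Vector[Double] => Vector[Double])   // reaction hazards given the state

object Kernels {
  private val rng = new Random(1)

  // Knuth's Poisson sampler (adequate for the small means used in this sketch).
  private def poisson(mean: Double): Int = {
    val l = math.exp(-mean); var k = 0; var p = 1.0
    while ({ p *= rng.nextDouble(); p > l }) k += 1
    k
  }

  // Approximate stochastic kernel: Poisson tau-leaping with fixed leap size tau.
  def tauLeap(m: Model, tau: Double)(x0: Vector[Double], t: Double): Vector[Double] = {
    var x = x0; var s = 0.0
    while (s < t) {
      val counts = m.hazards(x).map(h => poisson(h * tau).toDouble)
      x = x.indices.toVector.map { j =>
        math.max(0.0, x(j) + m.stoich.indices.map(i => counts(i) * m.stoich(i)(j)).sum)
      }
      s += tau
    }
    x
  }

  // Deterministic kernel: Euler solution of the reaction-rate equations.
  def euler(m: Model, dt: Double)(x0: Vector[Double], t: Double): Vector[Double] = {
    var x = x0; var s = 0.0
    while (s < t) {
      val h = m.hazards(x)
      x = x.indices.toVector.map { j =>
        x(j) + dt * m.stoich.indices.map(i => h(i) * m.stoich(i)(j)).sum
      }
      s += dt
    }
    x
  }
}

object DecouplingDemo extends App {
  // Immigration-death model: 0 -> X at rate 10.0, X -> 0 at rate 0.1 per molecule.
  val m = Model(Vector(Vector(1), Vector(-1)), x => Vector(10.0, 0.1 * x(0)))
  // The same model value is handed to two quite different algorithms:
  println(Kernels.tauLeap(m, 0.1)(Vector(0.0), 50.0))
  println(Kernels.euler(m, 0.01)(Vector(0.0), 50.0))
}

An improvement to either kernel, or an entirely new one, immediately becomes available to every model expressed as a Model value, which is exactly the benefit described above.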
Parameter inference
The auto-regulatory network model contains 5 species and 8 reactions
Each reaction has an associated rate constant — these 8 rate constants may be subject to uncertainty
The initial state of the model (5 species levels) may also be uncertain/unknown
There could also be uncertainty about the structure of the reaction network itself — e.g. presence/absence of particular reactions — this can be embedded into the parameter inference problem, but is often considered separately, and is not the subject of this talk
We will focus here on using time course data on some aspect of one (or more) realisations of the underlying stochastic process in order to make inferences for any unknown parameters of the model
Partial, noisy data on the auto-reg model
[Figure: true species counts at 50 time points and noisy data on two species; molecule count (0–200) plotted against time (0–500).]
Classes of Bayesian Monte Carlo algorithms
In this context there are 3 main classes of MC algorithms:
ABC algorithms (likelihood-free): completely general (in principle) “global” Approximate Bayesian Computation algorithms, so they just require a forward simulator, and don’t rely on (e.g.) the Markov property, but are typically very inefficient and approximate (a minimal rejection-sampling sketch is given below)
POMP algorithms (likelihood-free): typically “local” (particle) MCMC-based algorithms for Partially Observed Markov Processes, again only requiring a forward simulator, but using the Markov property of the process for improved computational efficiency and “exactness”
Likelihood-based MCMC algorithms: more efficient (exact) MCMC algorithms for POMP models, working directly with the model representation, not using a forward simulator, and requiring the evaluation of likelihoods associated with the sample paths of the stochastic process
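As a concrete (if toy) illustration of the first class, here is a minimal ABC rejection sketch in Scala: it needs nothing but a prior sampler, a forward simulator and a distance, all supplied by the user. Everything in the demo (the exponential-rate example, the uniform prior, the tolerance) is invented for illustration, and the scheme is plain rejection ABC rather than the adaptive ABC-SMC mentioned in the overview.

import scala.util.Random

// Minimal ABC rejection sketch: keep a prior draw theta whenever data simulated
// from theta lies within tolerance eps of the observed data. The prior sampler,
// forward simulator and distance are generic placeholders.
object AbcRejection {
  def sample[T, D](
      priorDraw: () => T,           // draw theta from the prior
      simulate: T => D,             // forward-simulate a data set given theta
      distance: (D, D) => Double,   // discrepancy between simulated and observed data
      observed: D,
      eps: Double,
      n: Int): Vector[T] =
    Vector.fill(n)(priorDraw())
      .filter(theta => distance(simulate(theta), observed) < eps)
}

object AbcDemo extends App {
  // Toy example: infer the rate of exponential waiting times from their sample mean.
  val rng = new Random(2)
  def simMean(rate: Double): Double =
    Vector.fill(100)(-math.log(rng.nextDouble()) / rate).sum / 100.0
  val observed = simMean(2.0)                      // "data" generated with rate 2
  val accepted = AbcRejection.sample[Double, Double](
    priorDraw = () => rng.nextDouble() * 10.0,     // Uniform(0, 10) prior on the rate
    simulate = simMean,
    distance = (a, b) => math.abs(a - b),
    observed = observed,
    eps = 0.05,
    n = 100000)
  println(s"accepted ${accepted.size} draws; posterior mean roughly ${accepted.sum / accepted.size}")
}

Note that nothing about the latent process is evaluated, only simulated, which is what makes the approach completely general and also what makes it inefficient.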
Modularity and model decoupling for inference
Decoupling the model from the inference algorithm is just as important as separation of the model from a forward simulation algorithm
The key characteristic of likelihood-free (or “plug-and-play”) algorithms is that they separate inference algorithms from the forward simulator completely — this strong decoupling has many advantages, with the main disadvantage being the relative inefficiency of the inferential algorithms
The likelihood-free algorithms rely heavily on forward simulation, so can immediately benefit from improvements in exact and approximate simulation technology
There is no reason why efficient likelihood-based MCMC algorithms can’t also be decoupled from the model representation, but doing so for a reasonably large and flexible class of models seems to be beyond the programming skills of most statisticians...
Partially observed Markov process (POMP) models
Continuous-time Markov process: X = {Xs | s ≥ 0} (for now, we suppress dependence on the parameters, θ)
Think about integer-time observations (extension to arbitrary times is trivial): for t ∈ N, Xt = {Xs | t − 1 < s ≤ t}
Sample-path likelihoods such as π(xt | xt−1), for the complete path segment xt, can often (but not always) be computed (though they are often computationally difficult), whereas the corresponding discrete-time transition densities, for the value of the process at the integer observation times, are typically intractable
Partial observations: Y = {yt | t = 1, 2, . . . , T}, where yt | Xt = xt ∼ π(yt | xt), t = 1, . . . , T, and we assume that π(yt | xt) can be evaluated directly (simple measurement error model)
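The structure just described maps directly onto code. Below is a Scala sketch of a POMP model in which the latent transition kernel is only ever simulated from, while the measurement density can be both simulated from and evaluated; the PompModel name, the AR(1)-style latent process and the Gaussian observation model in the demo are all illustrative assumptions, not something taken from the talk.

import scala.util.Random

// Sketch of a POMP model: a latent Markov process advanced by a transition
// kernel between integer observation times, observed through a measurement
// density pi(y_t | x_t) that we can both sample from and evaluate.
object Pomp {
  val rng = new Random(3)

  final case class PompModel[X, Y](
    step: (X, Double) => X,         // advance the hidden state over a time interval
    simObs: X => Y,                 // draw y_t from pi(y_t | x_t)
    obsDensity: (Y, X) => Double)   // evaluate pi(y_t | x_t) (assumed tractable)

  // Simulate hidden states and observations at times 1..n starting from x0.
  def simulate[X, Y](m: PompModel[X, Y], x0: X, n: Int): Vector[(X, Y)] = {
    val states = Iterator.iterate(x0)(x => m.step(x, 1.0)).drop(1).take(n).toVector
    states.map(x => (x, m.simObs(x)))
  }
}

object PompDemo extends App {
  import Pomp._
  // Illustrative latent process: an AR(1)-style recursion with Gaussian noise;
  // observations are the state plus Gaussian measurement error with sd 1.
  def gaussian(mean: Double, sd: Double) = mean + sd * rng.nextGaussian()
  val m = PompModel[Double, Double](
    step = (x, dt) => 0.9 * x + gaussian(0.0, math.sqrt(dt)),
    simObs = x => gaussian(x, 1.0),
    obsDensity = (y, x) => math.exp(-0.5 * (y - x) * (y - x)) / math.sqrt(2 * math.Pi))
  Pomp.simulate(m, 0.0, 10).foreach { case (x, y) => println(f"x = $x%.2f, y = $y%.2f") }
}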
Bayesian inference for latent process models
Vector of model parameters, θ, the object of inference
Prior probability distribution on θ, denoted π(θ)
Conditional on θ, we can simulate realisations of the stochastic process X, with probability model π(x|θ), which may be intractable
Observational data Y, determined from x and θ by the probability model π(Y|x, θ) — for “exact” algorithms we typically require that this model is tractable, but for ABC we only need to be able to simulate from it
Joint model: π(θ, x, Y) = π(θ)π(x|θ)π(Y|x, θ)
Posterior distribution: π(θ, x|Y) ∝ π(θ, x, Y)
If using Monte Carlo methods, it is easy to marginalise out x from samples from the posterior to get samples from the parameter posterior π(θ|Y)
Likelihood-free PMMH pMCMC
Particle Markov chain Monte Carlo (pMCMC) methods are a powerful tool for parameter inference in POMP models
In the variant known as particle marginal Metropolis-Hastings (PMMH), a (random walk) MH MCMC algorithm is used to explore parameter space, but at each iteration a (bootstrap) particle filter (SMC algorithm) is run to calculate the terms required in the acceptance probability
The “magic” of pMCMC is that, despite the fact that the particle filters are “approximate”, pMCMC algorithms nevertheless have the “exact” posterior distribution of interest (either π(θ|Y) or π(θ, x|Y)) as their target
If a sophisticated particle filter is used, pMCMC can be a reasonably efficient likelihood-based MCMC method — however, when a simple “bootstrap” particle filter is used, the entire process is “likelihood-free”, but still “exact”
Particle MCMC (pMCMC)
pMCMC is the only obvious practical option for constructing a global likelihood-free MCMC algorithm for POMP models which is exact (Andrieu et al., 2010)
Start by considering a basic marginal MH MCMC scheme with target π(θ|Y) and proposal f(θ⋆|θ) — the acceptance probability is min{1, A}, where

    A = [π(θ⋆)/π(θ)] × [f(θ|θ⋆)/f(θ⋆|θ)] × [π(Y|θ⋆)/π(Y|θ)]

We can’t evaluate the final terms, but if we had a way to construct a Monte Carlo estimate of the likelihood, π̂(Y|θ), we could just plug this in and hope for the best:

    A = [π(θ⋆)/π(θ)] × [f(θ|θ⋆)/f(θ⋆|θ)] × [π̂(Y|θ⋆)/π̂(Y|θ)]
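The plug-in step above is easy to write down. Here is a Scala sketch of a pseudo-marginal random-walk Metropolis loop in which the prior, the noisy likelihood estimator and the proposal scale are user-supplied placeholders; the one subtlety, reflected in the code, is that the likelihood estimate attached to the current state is stored and reused rather than recomputed, and everything is done on the log scale.

import scala.util.Random

// Sketch of a pseudo-marginal random-walk Metropolis sampler: wherever the
// exact likelihood pi(Y|theta) would appear in the acceptance ratio, a noisy
// unbiased estimate (supplied as logLikHat, on the log scale) is used instead.
// The estimate for the current state is carried along with the state.
object PseudoMarginalMH {
  val rng = new Random(4)

  def run(
      logPrior: Vector[Double] => Double,
      logLikHat: Vector[Double] => Double,   // noisy estimate of log pi(Y|theta)
      theta0: Vector[Double],
      proposalSd: Double,
      iters: Int): Vector[Vector[Double]] = {
    var theta = theta0
    var ll = logLikHat(theta0)               // estimate attached to the current state
    val out = Vector.newBuilder[Vector[Double]]
    for (_ <- 1 to iters) {
      val prop = theta.map(t => t + proposalSd * rng.nextGaussian())  // symmetric RW proposal
      val llProp = logLikHat(prop)
      val logA = logPrior(prop) + llProp - logPrior(theta) - ll       // log acceptance ratio
      if (math.log(rng.nextDouble()) < logA) { theta = prop; ll = llProp }
      out += theta
    }
    out.result()
  }
}

With a particle-filter estimate of the likelihood supplied as logLikHat (a sketch of such a filter appears a little further on), this loop becomes a simple PMMH scheme.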
“Exact approximate” MCMC (the pseudo-marginal approach)
Remarkably, provided only that E[π̂(Y|θ)] = π(Y|θ), the stationary distribution of the Markov chain will be exactly correct (Beaumont, 2003; Andrieu & Roberts, 2009)
Putting W = π̂(Y|θ)/π(Y|θ) and augmenting the state space of the chain to include W, we find that the target of the chain must be proportional to π(θ)π̂(Y|θ)π(w|θ) ∝ π(θ|Y) w π(w|θ), and the above “unbiasedness” property implies that E(W|θ) = 1, so that integrating the target over w leaves the marginal for θ exactly π(θ|Y)
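Spelling out the marginalisation step in the slide's own notation (a short LaTeX derivation of the claim above):

\begin{align*}
\pi(\theta, w \mid Y) &\propto \pi(\theta)\,\hat{\pi}(Y \mid \theta)\,\pi(w \mid \theta)
  = \pi(\theta)\,\pi(Y \mid \theta)\,w\,\pi(w \mid \theta)
  \propto \pi(\theta \mid Y)\,w\,\pi(w \mid \theta), \\
\int_0^\infty \pi(\theta \mid Y)\,w\,\pi(w \mid \theta)\,\mathrm{d}w
  &= \pi(\theta \mid Y)\,\mathrm{E}(W \mid \theta)
  = \pi(\theta \mid Y),
\end{align*}

using the definition W = π̂(Y|θ)/π(Y|θ) in the first line and E(W|θ) = 1 in the second, so the θ-marginal of the augmented chain is exactly the posterior, as claimed.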
Particle marginal Metropolis-Hastings (PMMH)
Likelihood estimates constructed via importance sampling typically have this “unbiasedness” property, as do estimates constructed using a particle filter
If a particle filter is used to construct the Monte Carlo estimate of the likelihood to plug in to the acceptance probability, we get (a simple version of) the particle marginal Metropolis-Hastings (PMMH) pMCMC algorithm
The full PMMH algorithm also uses the particle filter to construct a proposal for x, and has target π(θ, x|Y) — not just π(θ|Y)
The (bootstrap) particle filter relies only on the ability to forward simulate from the process, and hence the entire procedure is “likelihood-free” (a sketch of such a filter is given below)
See Golightly and Wilkinson (2011) for further details
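To make the forward-simulation-only point concrete, here is a Scala sketch of a bootstrap particle filter that returns an estimate of log π(Y|θ): particles are propagated purely by simulating the process forward, weighted by the tractable measurement density, and resampled at each observation time. The step, obsDensity and init arguments are generic placeholders (they play the same roles as in the POMP sketch earlier), and this is a bare-bones filter with none of the refinements a careful implementation would use.

import scala.util.Random

// Sketch of a bootstrap particle filter returning an estimate of
// log pi(y_1:T | theta). Only forward simulation of the hidden process is
// required; no transition densities are evaluated.
object BootstrapPF {
  val rng = new Random(5)

  def logLikEstimate[X](
      step: X => X,                        // forward-simulate one inter-observation interval
      obsDensity: (Double, X) => Double,   // measurement density pi(y_t | x_t)
      init: () => X,                       // draw from the initial state distribution
      ys: Vector[Double],                  // observed time series y_1, ..., y_T
      n: Int): Double = {
    var particles = Vector.fill(n)(init())
    var logLik = 0.0
    for (y <- ys) {
      val propagated = particles.map(step)            // propagate by simulation only
      val weights = propagated.map(x => obsDensity(y, x))
      val wSum = weights.sum
      logLik += math.log(wSum / n)                    // this observation's contribution
      // multinomial resampling, proportional to the weights
      val cum = weights.scanLeft(0.0)(_ + _).tail
      particles = Vector.fill(n) {
        val u = rng.nextDouble() * wSum
        propagated(cum.indexWhere(_ >= u))
      }
    }
    logLik
  }
}

Averaging the unnormalised weights at each time point makes the (exponentiated) estimate unbiased for π(Y|θ), so plugging this estimator in as the logLikHat of the pseudo-marginal loop sketched above yields a simple likelihood-free PMMH algorithm.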
PMMH inference results
[Figure: PMMH output for the rate constants kTrans, kDiss, kRDeg and kPDeg: trace plots against iteration, autocorrelation functions and marginal posterior density estimates.]
PMMH inference results
[Figure: PMMH output for the log rate constants log(kTrans), log(kDiss), log(kRDeg) and log(kPDeg): trace plots against iteration, autocorrelation functions and marginal posterior density estimates.]
PMMH inference results
[Figure: scatter plot of the PMMH posterior samples for the log rate constants (one axis labelled log(kDiss)).]
●
●
●●● ●
●
● ●●
●
●●
● ●●●●● ● ● ●●●
● ● ●●
●
●● ● ● ●
●
●●
●● ●
● ●
●●
● ●
● ●●
●●
●
●
●
● ●
● ●● ●
● ● ● ● ●● ●
●
● ● ●
●
●
●
●
●
●● ●● ●●●●
●
●
●
● ●● ●
●
● ● ●●
● ● ● ●●
●●
●●
●
●
● ●
● ● ● ● ● ● ●●● ● ●
●●
●
●●
●
●
●
● ●● ● ●
●
● ●● ●
●
●●
●
●
●
●
●●●
●
● ●●
● ●
● ● ●●
● ●
● ●● ●● ●
● ●
●
●●
●●
●
●●
● ●●
●●
● ● ●●
● ●● ●
●● ●● ●
● ●
●
●●●
● ●●●
●
● ● ●●
●
●● ● ● ● ● ● ●● ●●● ●●
●● ●
●
●●
● ●●
●● ●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●●
● ●●
● ●●
●● ● ●●
● ●
●
●●
●
● ●● ● ●●
●
●
● ● ●● ●
●
● ●
●● ● ●
●
●
●
●●
●
● ●
●●●●● ● ●
●●● ● ●●● ●
●
●
● ●●●
●
●
● ●●
●
● ●
●
●
●●
●
●
● ●●
● ●
●●●
● ●●
●
● ● ●● ●
● ●● ●●
● ● ●●
●
●●
●
●●●●
●
●
● ● ●●
●
●
●
●
●●
●● ●●
● ● ● ● ● ●●●● ●
● ●●
● ● ●●● ●
●●
●
●
●
● ● ●● ●
●●
●● ●
●●
●
●●● ● ●
● ●● ●●●●
●
●
●
●
●● ●●
●● ● ●
●
●●
●
●●
●●
●
● ●
● ●
●
●●
●
●●
●
●●
●●
● ● ● ●● ● ● ● ●
●● ● ●
●●
●
●
● ●● ● ● ●
●● ● ●
●●
●
●
●
● ●
●
●
●
● ●●
●
●
●●
●
● ● ●●● ● ● ●
● ●●
●
●
● ●●
● ●●
●●
●●
● ● ●●
●
● ● ●●
●●●●
●● ● ●
●
● ●●
●● ●
● ●
●
●
● ● ● ●●
●●
●● ●●●
●
●
●●●●
●●
●
●●
●● ●
●●
●●
● ● ●●
●
●
● ● ●● ●●
●
● ●
●
●● ●●
●● ● ●
● ●● ● ●●●
● ●●● ● ●
●
●●
● ●
●
●● ●
●●
●●
●
●
●● ●●● ● ●
● ●
●●● ●
●
● ●
●
●●
●●
●● ● ●● ●●
●●
●
● ●● ●●● ●
● ● ●
●
●●
●●●
● ●●
● ●●●
●●
●
●●
●
●
● ●● ●
●
●
●●●
● ● ● ●
●
●
●●
●
● ●●
●
●
●● ● ● ●●●●
●●
●
●
●●
● ●● ●●
● ● ● ●● ●●●●●
●
● ●
●
●
●
●●●●●
●
●
● ● ●●
●
● ●●● ● ●
●●
● ●
●
● ●
●
●
●
●
● ● ● ● ●
●● ●
●● ●
● ●
●
●
●
●
●●
●●
●
●
●
●
● ● ● ● ●● ●●
●
●●
●●● ●
● ●● ●
●
● ●
● ●
●
●●
●●
●
●●●
●
●●●●● ●●● ● ● ●
● ●
●●
●
●
●●●
●
●
● ● ●
●●● ●
●
●
● ●
● ● ●●
●●
●● ●
●● ● ●
● ●
● ●●
● ●● ●
● ●●
●
●
●
●
●
● ●●●● ●●● ●● ●● ●
● ●●
●
●● ●
●
● ● ●
●
● ● ●
● ● ●
●● ●
●●● ● ●
● ●
●
●
● ● ●●
● ●●
● ●●● ●
●
●
●●
●●
●●
● ●●● ●● ● ●● ● ● ●
●● ●
●
● ●
● ●
● ●● ●● ●●
●●
● ●●
● ●
●
●
●
●
●
●●●
● ●●●●●●
●
●
●
●
●
●
●
●●
●●
●
●
●●
●●
● ● ●●
●●●● ● ●
●●
●●
●●
●●
●
●● ●
●
●
●
●
●
●
● ● ●●● ●●●
●●●
●
●● ●●●
●● ●
●
●●
●
● ●
●
●●● ●
●
●●
●
●●
●
● ●
●
●
● ●
● ●
●● ● ●●● ●
●
● ●
●
●
●●
● ● ●●●●●●●
●●
●● ●
●
●
●
● ●●●●
●●●
●
●● ●
●●
● ●●
● ●
●● ●● ●
● ● ●
●
●
●
● ● ●
●●●
●●
●
● ●●●
●
● ●
● ●
●
● ● ●● ●
●
● ● ●● ●
●
●
●
●
●
●
●●
● ● ●●
●●
● ●
●●
●
●
● ● ●
●
●● ● ●●
●
●
●
●●● ●●●
●
●
●●
● ● ● ● ● ●
●
●
●● ●
●
●
●●
●
● ●●
●
●
●
●
● ●
●●
● ● ● ●●●
●
●● ● ●
●
●●● ●●● ● ●●●
●● ● ● ●●●
●●●● ●●●
●
●
●●
●
●●●●
●
●
●●
●●●● ●
●
● ●
●● ●
●
● ●
●
●●●
●
● ●●
● ●
●
●
●
●
●
●
●
●● ●●
● ●●● ●●●●
● ●●
● ●● ●
●
●
●
● ●●
● ●● ●
●● ●●
●
●
●●
● ●
●
●
● ● ●● ●●
● ●
● ●● ● ●
●
●●
●
●
●
● ●
● ●
● ●
●
●
● ●●
●
●● ●
● ●●
● ●●
● ●●
●●
● ●●
●
●
● ●
● ● ●● ●
●●●●●
●● ●
●
●
●
●●●●
●
●●
●
●●
●●
●● ●●
● ●
●● ● ●
●
● ●●
●●●
●
●●
●
●
●
●
●●
● ●● ●● ●●●● ● ●●●
●●● ● ●●
●●
●
●
● ●●● ●●●●
●
● ● ●
● ●
● ●● ●●●●
●
●
●
●
●
●
●● ●
●
●
●
● ● ●● ●●
●● ●● ●
● ●●●●● ●
●●
●●
●
●
●
●●● ●
●
●
●
●●●●
●●
●●●
●
● ● ●
●
●
●
●
●
●●
●
●
●
●
●●●
●● ● ●
●●●●
● ●● ●●
● ●●
●
●
● ●
●
●
●●
●
● ● ●
●● ●● ●●●●●
● ● ● ●
●
●● ●●
●
●
●● ●
●
● ●
●
●
●● ● ●
●
●
●
●●●●
● ●
●● ● ●● ●● ●
●●
● ●
●● ● ●
●
●
●● ●
● ●●
●
● ●●
● ● ● ●
● ●
●
●● ●
●
●
● ●●
● ● ● ●
● ●● ● ●
●
● ●● ● ●
●●
●
●●
●●● ●●● ●●
● ●●
●
● ●●●
● ●
●●
●● ●
●
●
●
● ●●●
●●
●●
● ● ●● ●● ●
●●● ●
●●
●
●
●
● ●● ●
●
●
●
●
●
●
● ● ●●● ●
●●
●
●● ●
●
●
●●
●●
● ●●●● ●●● ●●
● ●●
●
● ● ●● ●
●
● ● ● ●●●● ●●● ●
●
● ●
●
●
●
● ●●● ● ● ●
● ●
● ●
●●●
●
●
●● ●●
● ●
●●●●
● ●●●● ●●●● ●
●
● ●●●●●● ●
● ● ●● ●
●● ●●●● ●
●
●
● ● ● ● ●●●
●
● ●● ●●●
●
●
●
●
●
●●
●●●
●
●
●
● ●● ●
●
●
●
●
● ● ●
●
●●
●
●
●
●●
●
●
●
●
●●
●●● ●● ●●
●●
●●
●
●
●
●
●
● ●●
●●
●
●
●
● ● ● ● ●●●●●
● ●
●
● ●
● ●●● ●● ●
●●●
●● ●● ●● ●
●●
●
●
●
●
● ●
● ● ●● ●● ●●● ●
●
● ●
● ●
●
●
●●
●
●
●
●
● ● ● ●●●● ● ●
●● ●
● ● ●
●
●
●●
●
● ●
● ●
●
●●
● ●●●●
●
●
●●
●● ● ●
●● ●
●●● ●●● ●● ●●
●
● ●●●● ●●
●
●●
●● ●
● ●
●●●
● ● ●● ●● ●
●
●
●
●
●
●
●
●
●● ● ●
●
●
●
●
●●
● ●●
●●
● ●
●
●●
● ●●●
●●● ● ●
●
●● ●●● ● ●●
●
●
●
●●
● ● ● ●● ●● ●
●
●
●
●
●
●
●
●
● ●
●
●● ●
●
●●
●
●● ●
●
●
●● ● ●●
●
● ●
●
●●
●
●
●
● ● ●
●
●●
●
● ●
●
●●
●
●
●
●
●● ●
●●
● ●● ●
●●●
●
● ● ●● ●●
● ● ● ●●● ●●● ●●
●
●●
●● ●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
● ●
● ●
●
●
●
●
●
● ●● ●
● ●
● ●
● ●● ● ● ●
●
●●●
● ●●
●●
●● ●● ●
●●●
●
●
● ●
●
● ●
● ●
●
●●
●
●
● ●●
● ● ● ●● ● ●
●
●
● ● ●●● ●
● ● ●
● ●
●●●
●●
●●●●● ● ● ● ● ●●● ●
●● ● ● ● ●
●●
●
●
●
●
●
● ●●
●
●
●
●● ●
●●●
●
●●
● ●
●
●●● ●
●
●
● ● ● ●●●
●● ● ●● ● ●● ●● ● ●
● ●● ● ●●
●
●●
●
● ●●
● ● ●●
●
●● ● ●
●● ●●● ●●●● ●
● ●● ●●●
●
●●
●●
●
●
● ● ●
●
●
●
●
●
●
●●
●●●
●
● ●
● ●●
●
● ●● ●
● ● ●●
●
●
● ●
●
● ● ●● ● ●
●● ●● ● ●●
●
● ● ●●●
●●
●
●
● ●
●
●
●● ● ●
● ●
● ●
●
● ●● ●●● ●● ● ●
●
●
●●●● ● ●
● ●● ●●● ● ●
● ● ●● ●● ● ●●
●
●
●
●●●● ●● ●
●
●
● ● ●●●●●
●
● ●●●
●
●● ●● ● ●●● ●●
●
●● ●●
●
●●●
●● ● ● ●
●
●
●
●
●● ●
●
●
●●
●
● ● ● ●
● ●● ● ● ●
● ●●
●
●
●
●●
●
●
●
●
●●
● ●●●●
●
●
●
●●
●
●●
●●● ● ● ●
●
●●
● ● ●
● ●● ● ●● ●
● ●● ●●● ● ● ● ●
●
●
●
●
●
● ●
●● ●
●● ● ●●
● ●
● ●● ●
●
● ●
●
●
● ●
●● ●
●●
●●●
●
●● ●● ●●●
●●
●
● ●● ●● ●
● ●
●
●
●
●● ●●
●
●
●● ●
●●
●
● ● ●
●
●●●
●●
●
●● ●
●● ●●
●● ●
●
●
●
●
●● ●●
● ●●
●●●
● ●●
●
●
●
●
●●
●
●
●
● ●● ● ● ● ●● ●●● ●●●● ●● ●
●
● ●● ●●●● ● ● ● ●●
●
● ●●
●
●●●●
●
● ●● ●
● ● ●●
●●
●
●
●
●
●
● ●● ● ●
●●
●● ●●
● ●● ●●●
●● ●●●
●●
●●
●
●
● ● ●●● ●●
●● ●
●
●
●
●
●
●
● ● ● ●● ●● ● ●●
● ●
●●
●
●●
●● ● ●
●● ●
● ● ●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
● ●
●
●● ● ●●● ●●
●●
● ●
●
● ●●
●●
● ● ● ●
● ●●
●
● ●
●
●
●
●
● ● ● ● ●●
●
●●
●●●
● ●
●
● ●● ● ●
●
●
● ●●
●●
●● ●
●●
●
●
● ●●
●
●
●
● ●● ●●●●
● ●
● ●
●
●
●
●●
●
●●●● ● ●●
●
● ●
● ●
●
●
●
● ●●●●
●● ●
●●●● ●●
●
●
● ●●●●
●
●● ●
●● ● ● ●
●
●
●
●●
●● ● ●●● ●
●●
●
●
●● ●
●
●
●● ●● ●●
●
● ●● ●●
●●
●
● ● ●●
●
●●
●●
● ●●
●
● ●●
●●
●●
●●●
●
●●
● ●●
●
●●
● ●
●●
●
●
●
●
●
●
●
● ●●
●
●
●●
●●●
●
●●●● ●
●
●●
●
● ●● ●
●
● ●●
● ●● ● ● ●●● ● ●●● ●
●
● ● ● ●● ● ●
● ●
● ●
●
●
●● ●
●
● ●
●
●●● ● ●
●
●● ● ●● ●● ●
●●
● ● ● ● ●●
●
●
●
●●
●
●
●● ● ●
●●
●
●● ● ● ●●
●● ●
● ●●
●● ●
●
●
● ● ● ●●●● ●
●
●● ● ● ●
●
●
●
●
●● ●
●●
●●
●●●
●
●
●●● ●●● ●●● ●
●
● ● ●
●
● ●● ●● ●
●
●●● ●
●
●
● ●
●
●●● ●
●●
●
●
●
●
●
●
● ●
●●
● ●
●
●
●
●
●
●
●
●
● ●
● ●
● ● ● ● ●●
●
● ●●●
●
●
●● ●
●
●●●
●
●
● ●
●
● ●●
●
●
●●● ●●
● ● ●
●
●● ● ●● ●●
●● ●● ● ● ●●
●
●●
● ● ● ●●●
● ●
●●●●●
●
●
●
●● ●●
●●
● ●●
●
●●
●
●
●
●
●●●
●
● ●● ● ●● ● ● ●
● ●● ●●
●●
●
●
●●
●
●
● ● ● ●● ● ●
●
●●
●● ● ● ● ●●
● ● ●●●
● ●
●● ● ●
●●
●
● ●
● ●●● ● ●
●
●
● ●
●
●
●
●
●
●
●
●
●
●●
● ●
●● ●●
●● ● ●
●
●
●●
● ●
●
●
● ●
●
●● ● ●
●
●
● ● ●
●
●● ●
●● ●
●
●●
●●
●
● ● ●
●
● ●
●
● ●● ●
● ● ● ● ●
●●
●
●
●
●
● ● ● ●●
●
●●
●
●
●
●
● ●●
●
●
●●● ●
●
●
●
●
●
●
● ●●
●
● ●●●
●
● ●
●
●
●
●
● ●
● ● ●●
●
● ● ●●●
●● ●●
● ● ● ●
●●
●
● ●
●
●●
●
●●
●
●
●
●●
●
●
● ●
● ●● ●
●● ●
●● ●
●
●
●
●
●●
●● ●
●
●
●●
●
●● ● ●
●
● ●●
●
●
●
●●
● ●
● ●●●
●● ● ● ●
●
●
●
●
●
●
●
●
● ●●●
●
●
● ●●
● ●
●
● ●●● ● ●
●●
●
●
●
● ● ● ●
● ●●
●
●
●●
●
●
●
●
●
● ●
●
●
● ● ●● ●
●
● ● ●●
●
●
●
●
●●
●● ●
●
●●
● ● ●●●
●
●●● ● ●
●
●●
●
●
●
● ●●
●
●●●
●
●
●
●
●
● ●
●
●
● ●
● ●
●●
● ● ●
●
● ●● ●
● ●
●
●
● ●
●
●●●
●●
● ●
● ●
●
●
●
●
●
● ●●
●
● ●
●
●●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●●
●
● ●
●
●
● ●
● ●
● ●
● ●
●
●
●●
●
● ● ●
●
●
●
● ●
●
●
●
●
●
●
● ●●
●
● ●●
●
●
●
●
● ●●●
●●
●
●
●
●
●●
●
●
●●
●
● ●
●
●
●
●●
●●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ● ●●
●
● ●●
● ●
●
●
● ●
●
●●
●
●
●
●●
●●
●
●
● ●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●●●
●
●● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
log(kTrans)
● ●
●●
●
●●
●
● ●
● ● ●● ●●
●
●
●
●●
●●●● ●
●●
●
●
● ● ●
●
●●●
●●
● ●● ●
●
●
●●●
● ● ●●
● ●● ●
●●
●
●●● ●
●●
●●
● ●●●● ●
● ●
●
●●
● ●
●●
●●
● ● ●
●● ● ●●●●●
●●
●●● ●● ●●●●●
●
●
●
● ●
●●●
●●
●
●
●
● ●
●
●
●
●
●●●● ●
●
● ●
●
●
●
●●●
●
●●●
●● ●
●
● ● ●● ● ● ● ●
●●●
● ●
● ●
●
●●
●
●● ●
●
●
●
●
●
● ●●●
●
●
● ●●
● ● ●
● ●● ●● ●
●
●
●● ●
●● ●● ● ● ● ●
● ●● ● ●● ●
●
●
●
●●
●
● ●
●
●
●
● ●
●
●● ● ●
●
●
●
●●●●
●
●
●● ● ●
● ● ●●●● ● ●● ●
●
●●●
●●
●
● ● ● ●●● ●● ● ●
●
● ●
● ●●
●
●
●
● ●●
●
●
●
●
●
●
●● ● ●
●
● ●
● ●● ●
●●●●
●
● ●● ●
● ●
●● ● ●● ●
●● ●
●
● ● ●●●
●
● ●
● ● ● ●●
●● ● ●●● ● ●
●●●● ●
●●
●● ● ● ● ●
● ●
●
●
●
●● ●●
●
●
●
●
●●● ●● ●
●
●●●
● ●
●
●
●
● ● ●
●
●
●
●
●
●
●●
●
●●
●
●
● ●●● ●
●●
●●●●●
● ●●
●● ●● ●
●
●
●
●
●
● ●
●●
●● ●●
●
● ●
●
●
●●
●●
● ●
●
●
●
●● ● ● ●●●
● ●●
●
●
●● ●
● ●●
●
●●● ●● ●●●
● ●
●
● ● ● ●● ●
●●●●
● ●● ●●
●
●●
●
●●
● ●
●
●●
●
●●●● ●●●
●● ●●●●
●
●
●
●●
● ● ● ● ●●
●● ● ●● ●
●
●● ●
●
●
● ● ●● ●
●
● ●
●
●
●
● ●●●
●
●● ●
● ● ●
●
●
● ● ●
● ● ●
●
●
●
●● ● ●
●
●
●●
● ●
●
●●●●
●● ●●
●
●● ●
●●
● ●● ●
●
● ●●
●
● ●
●
●
●
●●
●
●●●
●
●● ● ● ●
● ● ●●
●
● ●
●
● ● ●● ● ● ●● ●
● ●●●
● ●●
● ● ●
●●
●●
●●●
●●
●
●
●●●
● ●
●
●
●
● ● ●●
●●●●
●
● ●● ●
●●
●
●●
●
●
● ●
●
●●● ● ●●● ●●●● ● ●● ●
● ● ●● ● ● ● ●
●●●
● ●●●
●●
●●
●● ●
● ● ●●●
●● ●
●
●
●
●●
● ●
●
●
●●
●
●● ●
●
● ●●
●
● ●
●●
●●●
●
●
●● ●
●●
●●
●
● ● ●●●●
●● ●
● ●
● ●
● ●
●●●
●●
●
●
●●
● ●●●● ●● ● ●● ●● ●
●
● ●
●●
●
●●● ●
●● ● ● ●
● ●●● ●●
●
●
●
● ●
● ●
●●●● ●● ● ● ●
● ● ●● ●
●
●
● ●●● ●● ●
●●●●
● ● ● ●●●●●● ●●● ● ● ●● ●●● ●
●●
●
● ●●●●●●● ●
●
●
● ●
●
● ● ● ●●
●
●
●●● ●●● ●●
● ●● ●●●●
●
●
●
●
●
●
●
●
●
●●
●●
●● ● ● ● ●
● ●●●
●●● ●● ● ●● ●
●
● ●
● ●●●
●
● ●●
● ● ●
●
●
● ●●
●
●● ● ●● ●●● ●
● ● ●
●
● ● ● ●● ● ●● ●
● ●●
●
●
●
●● ●
●
● ●●●
●
●●
● ●
●
●
● ●●●● ●●
●●● ●
●●
●●
●● ●
● ●
●
● ●● ●
● ●
● ●● ●
●
● ●●
●●
●
● ●
●
● ●●
●
●
●
●
●●
● ●
●
●●
●
●
●●
● ●
● ● ●
● ●
●●
●
● ● ● ● ● ● ●●●● ● ● ●●
● ● ● ●●● ● ●
●
●●
●●
●
● ● ●
●
● ●●
●●● ●
●
● ● ●● ●●
●
●●
●●● ●●
●●
●●●● ●● ●● ●
● ●
●●● ● ●●
● ●●● ●● ●
●
●● ●
● ●
●● ●
●
● ●●●●● ●●
● ● ●●
●●●●●
● ●
●
● ● ● ● ●●●
●
● ●●
● ●
●
● ●●●● ● ●●
●
●● ●
●
●
● ●●
●●● ● ● ● ●
●
●●●●
●●● ●
●●
●
● ● ● ● ● ● ● ●● ● ●● ●●
●●
●●
● ● ●●
● ●●
●● ●
● ● ●●
●
●●
●●
● ● ● ●●
● ● ●
●
●
● ●●
●●
●
● ●
● ●● ●● ●
●
●
●
●●
● ●
●
●
● ● ●
● ●
●● ●● ● ●
●
●●
●● ●●
● ●●
●
●
●●
● ●●
●
●
● ●●
● ● ●
●
●
●
● ●
● ●●
●●
●
●●●● ● ●
●
●●●●
● ●
● ● ●
●
●●●●
● ● ●●
●
●
● ●●
● ●●● ●●
●●
●● ● ●●
●● ●
●
●
●
● ●●
●
● ●● ●●●●● ●● ●
● ●●
●
● ●● ●● ●
● ●● ● ●● ●
●●
● ● ● ●● ● ●
●●
●
●● ●
●
● ● ● ● ●●
●
● ● ● ● ● ● ●●● ● ●
●
●●
●
● ●● ● ● ●●
● ●
●●
●● ●
●●
●
●●●●●●●
●
●
●●
●●●
●
●
●
●
● ●
●
●
● ● ●●●● ● ●
●● ●
●●
●
●●
●●
●
● ● ●●
● ●●
●
●●● ● ● ●●
● ●
●
●●
●●
●
● ●
●●
●●
●
●●●● ● ● ● ●● ●● ● ●●
● ●
●● ●●● ●●
●
●
●●
●
●
● ●
●
●
●
● ●●
● ●● ● ●
●●
● ● ●●
● ●
●●●
●
●
●
● ●●● ●
● ●
●●●
●●●
●
●
●
● ● ● ●● ●
●
●● ● ●●
●● ●
● ●●●●●
● ●●
● ●● ●●● ●● ●●●●
●●
● ●
●
●●
●
● ●●
●
●
●
●
●
● ● ● ● ●●●
●● ●● ●
●
● ● ●
● ●●
●● ●●
●
●●
●
●● ● ●
●●
●
● ●
●
● ●
●
●● ● ●
●
●● ●●
● ●●
●●
●●●●
●
●●
●●
● ●● ●
● ●●
●
●
● ●●● ●● ●●●●●
●
● ● ● ●●●●●
●
●
●●
●
●
●●● ● ●● ● ● ●● ● ●
●
●
●
●
●
● ●
●
●
● ●
● ●●
● ● ●●
● ●
●
●
●
●●
● ●
● ●●●
● ●
●●
●●●
●●●
●
● ●
● ●●
● ●●
●●●●●
●●
●
●●●
● ●● ●● ●
●●● ●● ●
●●
●
●●
● ● ●
● ● ● ●●
●●●
●
●
●
● ●
●
●● ● ●●● ● ●
●
● ●
●
● ●● ●
● ● ● ● ●● ●●
●
● ●
●●
●
●
●
● ●
● ● ● ●● ●●● ●●
●
●
●● ●●● ● ●
● ●●●●●●●
●
● ●
●
● ●
● ● ● ● ●●
●
●
● ●● ●●●
●
● ● ●
● ● ●●
● ●
●
●●
●
●
●
● ● ●●●
● ● ●●●●●● ●
●
●●● ●
●●
●
● ●●
●
●● ●
●
●
●●●
●● ● ●
●●
●●
●
●
● ●
●● ●
●
● ●
●
●
●
●● ●
● ●●
●
●●
●
●●
●
●● ● ● ●
●
● ● ●
● ●
● ● ●●
●
●
●● ●
●
● ●
●
● ●
● ● ●
●
●●
●● ● ●
●
● ●
●
●● ●
●
●
●●
●
●●● ●
● ●●
●● ● ●
●
●
● ● ●●●
●●
● ●●● ●●●
●●● ●
● ●●
●●
●● ●
●
● ●●
●●
● ●● ●● ● ●●
●●
● ●
●●
●
●
●●●
●●●●
●
●●
●
●● ●●●●●● ● ●● ●
●
●
●
●●●
●
●
● ● ●●●
●
●
●
●
●● ●
●●
● ●
● ●● ● ●
●
● ● ●●● ● ● ● ●● ●● ●
●
●
● ●● ● ●
● ●●
● ●●
● ●● ●
●●
●
●
●●●●●
●
●
● ●●
●
● ●
●
●●
●● ●
●
●
● ●●●
●
● ●
● ● ● ● ●● ●
● ● ●●●
●
● ● ● ●● ● ● ●
● ●●●
● ●●● ●●
●
● ● ● ●● ● ● ●
●● ●●●●● ●●
●●●● ●●
●
●●
● ●
●
●
●
●●● ●● ● ●
●● ●●●●
●
●●
●●●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●● ●
●
●●
●
●●
●●
●
●
●
● ●●
●●
● ● ●● ● ●
● ● ● ● ●●
●● ● ● ●
● ●
●
●
● ●●
●
●● ●
●●
●
●● ●●
●
●●
●●● ●● ●
●● ●●
●●
●●● ●●● ●
● ●●
●● ●
●
●● ●
●
●● ● ●
●●
● ●
●
●
● ● ● ● ● ●●
● ● ●● ● ● ●
●● ●
●
●
●
●●●
●●●
● ●
●
●
● ●
● ●
●● ●
●
●
● ● ● ●●
●
●
●
●●●
●●
●
●
●●●
●
●
●
●●
●
●
●
●
●
●
● ●
● ●●
● ●
●
●
●
●
● ●
●
●
●●●●
● ● ● ●
●● ● ●● ●
●●
●●
●●● ● ●
● ●●
●
●
●
●●
●
●
● ● ●● ●
●
●●●
●
●
●
● ●● ● ●● ●●
●●
● ● ●● ●
●●●
●●
●
●●
● ●
●
●
●●● ● ● ●
● ● ●
●
●●● ● ●
● ● ●● ●● ● ● ●
●
● ●● ● ● ●●
●● ●●
●● ●
●●● ●●
●
●
●●● ● ● ●
●
●
●●
●●
●
●●
● ●
●●● ●
●●
● ●
●
● ● ● ●● ● ● ●
● ●
● ● ● ●●
●
●
●●
●●
●●
●
●
● ●●
●
●●
●●
●
●
● ●
●
●
●●
●
●
●
● ●
●●
●● ●●●
● ●● ● ● ● ● ● ●
● ●
●
●
●
●●
● ●
●
●
●
●● ● ● ● ● ● ●
●
● ●
● ●●
● ● ● ●●
●
●
● ● ●● ●● ● ● ●
●
●
●
●
●
● ●●
● ●●
●●
●
●
●● ● ●●● ● ●●
●
●●●
●●●
●
●
●● ●
●●● ● ● ● ● ●●
●
●
● ●
● ●
● ●
●●● ●
●●
● ●
● ● ●●
● ● ●●
●●
● ●●
●
●●
●●●● ●
●● ●
● ●● ● ●
●●
●
● ●
●● ●
●
●
●
●● ● ●●● ●●●
●● ●
● ●
● ● ● ●●
●
● ●● ● ● ●●
●
● ● ●●
●
● ● ●
●
●●
●●●
● ●● ● ● ●
●
●
●
●
●
●
●
●
●
● ● ● ●
●
●
●●● ● ●
●
●
● ● ● ●●
●
●●● ● ●●●●● ● ● ●●●● ● ●●●
● ●●
● ● ●
●
●● ●●
●
●●
●●
●
● ●
●
●●
●
● ●
●
●●
●
●●●
● ● ●●●●
●
●
●
●● ●
●
● ●●● ●
●
●
●
● ●●
● ●
●
● ●●
●● ●●
●
●
●●
●
●
●
● ●
●●
●
●
●●
●●
●● ●
●
●
●
●
● ● ● ●● ●
● ●
● ● ●
●● ●●●●●● ●
●
●
●
●
● ●
●
●
●●
●
●●● ●●●
●●
● ●●
●● ● ●
● ●
●●● ●● ●●● ● ●
●
● ● ●●
● ● ●●
●● ● ●●
●● ●●
●●
●
● ●●●
●
●●●
●●
●●
● ●●
● ●●● ●
●
● ●●● ●● ●●
●
●● ●
●●
●●
●● ● ●
● ●
●
●●
● ● ● ● ● ●●
●
●
●
●● ● ● ● ●
●●● ● ●●
● ●●●
●●
●
● ●●●
●● ●
● ●
●
●
●
●
●
●
●
●● ●● ● ●
●
● ●
●
●●●
●
●●
●
●●
●
● ●●●
● ●●● ●●
●
● ●
●●●●
● ●
● ●●● ●
● ● ●● ●●
●●●
● ●● ● ● ● ●
●● ●
●●●
●●
● ●
●
● ●
●● ● ● ●●
● ●
●
●● ●● ● ●● ●
● ● ●●
●●●
● ●●
● ● ●
●
● ●●
●
●
●
●
● ●●
●
●
● ● ●● ●●●●●
●
●●● ●
●
●●
●
●● ●
● ● ●
●
● ●
●
●
●
● ●●●● ●●
●● ● ●● ●
● ● ●
● ●
● ● ●
●
●●
●
●
●
●
● ●
●● ●●●
●●●
●
●● ●
●●
●
●
● ● ●●
●●
●● ●
● ●● ●
●
●●
●● ● ● ●●
● ●
● ●● ●
●● ●
●
●
●
●
● ●● ●●
●
●
●
●●
●●●●
●
● ●
●
●●● ●
●● ●●
● ●●
●●
●●
●
● ●
●
●
●
●
● ● ●
●
● ●
● ●● ●● ●
●
●
●●
●●●
●●
● ●●
●
●
●● ● ●
●
●
●●
●●● ● ● ● ● ●● ●
●
●●
●
●
●
●●● ●●● ●
●
● ●
● ●●
●●●●● ●
● ●
● ● ●
●
● ● ●
●●
●
●
●
● ● ●
●
●
● ● ●●● ●
● ●● ● ●
●● ●● ●● ● ●
●
● ●
●● ●●● ●
● ●
● ● ●●
●
●
● ● ●● ●
●
●
●
●●
● ●●
● ●
● ●●●
● ●
●
●●
●
●
●
●
●●
●
●
●●
●●●
●
● ● ● ●●
●●
●
●
●
●●
●●
●●
●
●● ● ●
●● ● ●
●● ●
●
●
●
●
●● ●● ● ●
●●●
●
●● ● ●
●
●●
●
● ● ● ● ●
●
● ●
●●
●
●
●
●●●
●
●
●
●●
● ●
●●
●
●
●
● ●● ●
●●● ● ●
●
●
● ●●
●
●
● ●
●
●
● ● ● ● ●●
●
●
●
●
● ●
● ● ● ●
●● ●
● ● ●●
●● ●
●
●
●
● ●●
●
●
●
●●
● ●● ●
●
●● ●
● ●
●●
● ● ● ●
●
● ●
● ●
● ● ●
●
●
●●●●
●● ● ●
●●
●
●
●
●●
●
●
● ●
●●
● ●
● ●
●
●
●
●
●
● ● ● ●●
●
● ●● ●
●
● ●● ● ●
● ●●
●
● ●●●●●●
●
●
● ●
● ● ●
●
●●
●
●
●
●
●● ●
● ● ● ●●
●
● ●
●
●
●
●●
● ●
●
●
●
●
●●
●
●●
● ●
● ●●
●
●● ● ●
● ●
●
●● ●
● ●
●
●
● ●
●
● ●
●
●
● ●
●●
● ●● ● ● ● ● ● ● ●●
●
●
● ●
● ●●
●●
●
●
● ●
●
●
● ● ● ●
●
● ●●
● ●
●
●●
●
●
●
●
●
●
● ●
●●
● ●
●
●
●
● ●
●●
●
● ●●
●
●
●
●
●
● ●
● ●
●
●
●
●
●
●●●●
●
●
●
●
●
● ●●
●
●● ●
●
●●
●●
●
●
●
●●
●
●●
●
●
● ●●● ●
● ●
●
●
●
●●
●● ●
●● ●
●
● ●
●●●
●
●
●●●
●
● ● ●
●●
●●
●
●
●
●
●
●● ●
●
●●
● ●
●● ● ● ●
●
●
●●
●
● ●●●
●
●
●
●
●
●
●
●
● ● ●● ●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
● ●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●●
4.0
●
●
●
●
●
3.5
−6.0
●
●
●
●
●
3.0
●
●
2.5
●
●
●
●
●
● ● ●● ● ● ●
● ● ● ●
● ● ●● ●
●
●
●● ●●● ●
●
●
●
● ● ●
●
●●
●●
●●
●
●●
●●
●●●
●
●
●
●
●
● ● ●●
●● ●
●
●
●
● ●●● ●
●● ●
●
●●●● ● ● ●●●●●●
●
●● ●
● ●●
●● ●
●
●●
●● ●●●●●●● ● ● ● ● ●●
●
●●
●● ● ● ●
●
●
●
● ● ●●●● ● ●
● ●●
●●
●● ●
●●● ● ● ●●● ● ● ● ●
● ●●● ● ● ● ●
●●● ●●● ● ●●
● ● ● ●
● ●
●●● ●
● ●●
●
●●
●
●
● ●
● ● ● ● ●● ●
● ●
● ●
●
●
●●●
● ●●●● ●●
● ● ●●● ●●●●●● ● ●●
● ●●
●
●●● ●● ●
● ●
●
●
●
●●
●
●
● ● ●●● ●●●
●
●
●
●● ●
●
●
●●
● ● ●
●
● ●● ● ● ● ●●●●
●●●●
●
●
●
● ●● ●
● ●●● ●●
●
●
●●
●
●●●
●●
●
●●
● ● ●●
●● ●
●
●
●
●●
●
●
● ● ●
●● ●
●
●
●
●
● ●
●●●
●
●●
●●
● ●
●● ●● ●
● ●● ● ●
● ●
●
●●●●
●● ●● ●● ●
●●
●
●●
●
● ●●
●
● ●
●
●●
●
● ●
● ●
●
●
● ●
● ● ●●
●●●
●●● ●● ●●●
●
●●
●
●● ● ● ●● ●●
● ● ● ● ●● ● ●
●
● ●
●● ●
●● ●●● ●●
●●
●●●
●● ●
●
● ● ● ● ● ●
●●
●
●● ●●●
●●● ●●
●
●
● ●
● ●●●
●●
●
●
●●
● ●● ●
●●
●● ● ● ●
●●●●● ● ● ●
●● ●●●●● ●
● ●● ●
●
●
● ●
● ●●
●
● ●●
●●●●●
● ●
●
●● ● ●●●
● ●●
●● ● ●●● ●●●●● ●●
● ● ● ●●
●●
●●
● ●
●●
● ●
●
● ● ● ●
●
●● ● ●●
● ● ●
●
● ●
●
●●
●●
● ● ● ● ●●●
●● ● ●
●
●●
●● ●
●
●●
● ●
●
●
● ● ●●
●●
●
● ●●
●●● ●
● ●●
● ●●●
●
●
●
●
● ● ●●
●●
●
● ●● ●
●
●● ●
●
● ●
●
●
●
●
● ● ● ●
●
●●●●● ●● ● ●●●
●
●●● ● ● ●● ●
●●
● ● ●●
●
●● ●
●● ●●●● ●●●
● ● ●
●● ●
●●
● ● ● ●
●
●
●●●
●
●● ●
●
● ● ●●
●●●
●●● ●
● ● ●
●●
●●
●●
●●
●● ●
●● ●
●
● ●●●● ●
●●● ● ●●
●
● ● ●
●
● ● ●● ●
●●●●
●
●
●
● ● ●
● ●
●●
●● ●
● ●
●●
●●
●● ●●●
●
● ● ●
● ●
●
●●
●
●
●
● ●●●● ● ●●●
●●●●
●
●
●●● ●
●● ● ●● ●● ●●
●
●
●
● ●
●●
● ● ●
● ●● ●
●●● ● ●●●
●
●
● ●●
● ● ●● ● ●●
●●●●●
●●●
●●
●
●●●●
● ●● ● ●● ● ● ●●
●
●
● ●
● ●
●
●●● ●
●
● ● ● ●
● ●●
●
●●
●
●●
●● ●
● ●
●
●●
●
●
● ● ● ●● ●
●●●●●
●
●●
●●
●●
●●● ●
●● ● ●
●●
● ●●● ●
● ● ● ● ● ●
●●● ●
●
● ●●
● ●● ● ● ● ●● ● ● ●
●●
●●
●
●●●●●●
● ●
●
●
● ●
●●
●●
●●
●●
●
●
● ●
● ●●●
● ●
● ●●
●● ●●
●
●
●
● ●●
●
●
●● ●
●●●●●●
●
●
● ●●
● ● ● ● ●●
●
● ●●●
●
● ●● ● ●
●
●● ●
●
●
●●
●● ●●● ●● ●●
●
●
● ●●
●
●
●
●
● ●
●●● ●●●●●●● ●
● ●
●● ●
●● ●
● ●●
●●●
●●
●● ●●
●● ● ●●● ● ●
●●
●●
●● ● ●
●
● ●●●●
●●● ●
●
● ●
●● ●● ● ● ●
●
● ●●
●●●
●
● ●● ●●●
●
●
●
●
●● ● ●
● ● ●●
●
●
●
● ● ●●● ● ● ●●● ●
●
● ●
● ●
●●
●●●
●
● ●●●
●●● ●● ●
●
● ●
●● ● ●●● ●
●●
● ●
● ●
●●●
●
●●
● ●
●
●
● ● ● ●●●
● ●● ●
●
●●
●
●
● ●●●
●
●●
● ● ●●●●● ●●● ●
●
●●
●● ● ●
●
●
●
●
● ●
●
●
●
●●
● ● ●
●●●● ●●
● ● ●●
● ●●
●
●●● ●
●●● ● ●●
● ● ● ● ●
●
● ●
● ● ● ●● ●
●
● ●● ●
● ●
●
● ●
●●● ● ●●●
●
●
●
● ●
● ●●
●●
●● ●
●
●
● ● ●●● ● ●●● ●●
●●
●●
● ●● ●
● ●
●●
●●●
●●
● ●
●●●
●
●
●
● ●
●
● ● ● ● ●● ●●
● ●●●
●●
●
● ● ●
● ●● ●●●
●●●
●● ●● ●● ●
●
●
● ●●●
● ●●
●●
●
●●
●
●
● ●● ●●
● ●
●●
●●
●
● ● ● ●●
●●●
●
●
●
●●
●
●
●
●
●
●
● ●
●
●●
●
●
●●
●
●●● ● ●
● ● ●●●
● ●●● ● ● ●●
●●●
● ●
●
● ● ●
●
●● ●● ● ●
● ● ● ●●●
●
●
●●
●
●
●
●● ●●● ● ●●
●
● ●●
● ●●
●
●● ●●
●● ●●
●● ●●
●●●● ●
●
●●●●●●● ●
●● ●
● ● ●● ●
●●
●●
●●●
●
●
●●
●●● ●
● ●
● ● ● ●●
● ● ●● ●
●
●●
● ●
●
● ●● ● ● ●●● ●●● ● ●●● ●●●●● ●●
●
● ●●
●
●●●
●
● ● ●●
●
●
●
●
●●
● ●
● ●●● ●
● ●●
●● ● ● ● ●●
●●
●● ●
● ●
●● ●
●
●● ●
●
●
●
●●
●● ●●
●
●
●● ●
●
●● ●● ●
●●
●
●
●
●● ●
●● ● ● ● ●
●
●
●
●
● ● ● ● ●● ●
●
●●●●
●
●
●
●●●
●●
● ●● ●
●
●●
●●
●
●
● ● ●●● ●●
●
●●
● ●●
●● ● ●●●
● ●● ●
●
●
●●
●●
●● ●
●
● ●
●● ●●
● ●
●
● ●
●
●● ● ●
● ●
●●● ● ● ●●●
●●
●●●● ●
●● ●
●●
● ●
●
●
●
●●
● ● ● ● ●●
●
●● ● ● ● ● ● ●
● ●● ●●
●
●● ●
● ●●●
●
● ● ● ●●●
●
●
●
● ●●
●
●
●● ●●
●
●
●●
● ●
●
● ●
● ● ●●●
●●
●
●● ●●
● ●
●●
● ●●
●●
●
●●●●●
●
●● ●
● ● ●●●●
●
● ●●●
● ●●
● ●
●
●
●
● ●
●
●
●●
●● ● ●
●●
●
● ●●● ●●
●●● ●
●
● ● ●●● ●● ●●
●
● ●●
●●● ●● ●●
●● ●
●
● ●
●●
●●●
●●
●
●
●
● ●●●●
●● ●●
●●
●
●
● ●
●●●● ● ●●●●●
●
●● ●
●
●●●
●●
●
● ●
●●
●●● ●
●●●
● ●
●
● ● ●●● ●●
●●●
● ●● ●
●●● ●●●
● ● ●● ● ● ●● ● ● ●●
●
●
●●●● ●
● ●
●●
●●
●
● ●● ● ●●●●
● ●
●●
●●● ●
● ●
●
● ●●● ●
● ●● ● ●● ●
●
●●
●● ●
●●●
●● ●
●
● ●●●
●
●● ●●
●
●
● ● ● ●●
●●
● ●
● ●
●
● ●●
●●
●
● ● ● ● ● ● ● ●●●●●● ● ● ●●●
●
●
●●
●● ● ● ● ●●
●
●
●
●●
● ●
●
● ● ●●
● ●
●
●
●
●● ●
●
●
●● ●
●
●
●
●●● ●
●
●
●
●●
● ● ● ●● ●● ● ●
● ●●● ● ●
● ●●●●
●
● ●
●
●
● ●●
●
●
● ●
●●● ● ● ●●
●
●
●
● ● ●
●
●
● ●● ● ● ●● ●
● ●● ● ●●
●●● ●●
● ●● ●
●
●
●
● ● ● ●●● ● ●●● ●●● ●●
●
●
●● ●●
● ● ● ●●
●●
● ●● ●●● ●●
●
●
●
●●● ● ●● ● ● ● ●
● ●
●● ● ●● ● ● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
● ●● ● ●
●●
●● ● ●● ●●●
●
● ●● ●
●●●
●
●●
● ●
●● ●●●
● ●●● ●
● ●●
● ●
●●
●
● ●● ●●
●
● ●●
●
●
● ●
●
●
●
●
●
●
● ●
●●● ● ●●
●
●●●
●
● ● ●● ● ●●
● ●●
●● ● ●
● ●
●● ●
●●
●●
● ● ●●
●
●
● ● ●● ●●
●● ● ●● ●●
●●●
● ●
●●
●
● ● ●●
●● ●● ● ●●●
●
●
●●●
●
● ●● ●● ●●
●●
●● ●
● ●● ●●
●●
●●
● ●●●● ●
●●
● ● ●
● ● ●● ● ●
● ●● ● ● ●● ●●
●
●
●
●●
● ●
●
●
●
●
●
● ●
● ●●●
●● ● ● ● ● ●●●●●
● ●
● ● ●●●●
●
●
● ●●
● ●
●
●
●
●●
●
● ● ●●
● ●● ●
● ● ●
●
● ●● ●●
●●
●
●● ● ●
●
● ●●
●
●●
●●
●● ● ● ●
●
●●
●● ● ●●● ●
●
● ● ● ●●
●● ● ● ●
●
● ●
●
● ●
●
●
●
●●●
● ●● ●●●● ●
● ●
● ● ● ●●
●
● ●● ● ●●
●● ●●● ● ● ● ●
●●● ●
●● ●
●●
● ●
● ●●
●
●●
●● ● ●
●●
● ●●● ●
●
●
●
●● ●
●●
●
● ●
●
●
● ●● ● ● ●
●●● ●● ●●●● ● ●
●●
●●
● ●● ●●
●
●
●
● ●● ●
● ●
●●
●
●
● ●
●●
● ● ●
● ●●●
● ●
●
● ●
●
●● ●● ●
●● ● ●● ●●
● ● ●●
●
●
●
●
●
●
●
●
●
● ●●● ●●● ●●
● ●
●● ●
●● ●●●
●
● ●●● ● ● ●●● ●●
●
●
●● ●
● ●●● ● ●
●
●● ●
●
●●
●
●
●●
●
● ● ●●●● ●●
●●
●
●
●
●
● ●● ●
●
●
●
●
●
●● ●
●
● ●●
●
●●
●
●
●
●●
●●
● ●● ● ●● ● ●●●●●●● ●● ●
●
●
●●
●
●
●
● ●●●
● ● ●●●
●●
●
●●●
●
●●
●●
●
●
●
● ●● ●
●
●
●●
●● ●
●
● ●
●
● ●● ●
●
●● ●
●● ●●●
●
●
●●●
●● ● ● ●●
● ●● ●
●●●
●● ● ●● ● ●●●
●
●
●
●
●
●
●●
●
●
●
●
●
● ● ● ●● ●
●
●
●
●
●
● ● ●
● ●●
●
●
●
●
●
●
●●
● ●●
●● ● ●
●
● ●
●●
● ●
●●
●● ●
●●●
● ● ●
●●●●●
●
●
●
●● ●●
●● ●
●
●
●●
●
●
●
●●
●● ●
●● ●
●●
●
●
● ●
●
● ●●
●
●●
●
●
●
●●
● ● ●●●
●
●
●
●
●
● ●
●
●● ●●
● ● ●
●●
●
● ●
●● ●
●●
● ●● ●
●
●● ●
● ●
●
●●
● ●●● ● ●● ●
● ●●
● ●● ● ●
●●●● ● ● ● ● ● ●● ●● ● ●
●
●
● ●●
●●●
●●
●●● ●
●
●
●
●●
●
●
●
●
●●
●
●
●
● ●●
● ●● ●
●
●
●●
● ●●●
●●
●
● ●
●● ●
●● ●
●●●
●
● ●
●
●
●
● ●● ●
● ●
●
●● ●●●
●●
● ●
●
● ●
● ●
●
●●
●● ● ● ● ●
●● ● ●
●
●
●
●● ●● ●
●●
●●
●
●
●
●●
●● ● ●
●●●● ●
●
●● ● ● ●●
●● ●
●●● ● ●
●
●●
●
●●
●
●●
● ● ●●
●
●
●
●
●●● ●● ● ●
●
●
●●
●
●●●
●
●
●
● ●● ● ●
●
●
●
●●
● ●● ●● ● ●
●● ●
● ●●
●
●●● ●●
●● ●
●
●●● ● ●●● ●
●
● ●●
●● ●●
●
●
●
●
●
●
● ●
●
● ● ●
●
●
● ● ● ● ● ●
●●
● ●
● ●
●
● ●
●
● ● ●●●
●●
●
●
●
●
●
● ●
●
● ● ● ● ● ●●
●
●
●
●● ● ●●
●●
●
●
●● ●
●● ●●
●
●●
● ● ● ●●
●● ●● ●
● ●●●
●●● ●
●
● ● ●●
● ●
● ● ●● ●●● ● ●
●●
●●
●●
●●
●
●
●●
●●
● ● ●● ● ● ● ●
● ●● ● ●●
●● ●
●●●● ●● ●
●●
●● ●
●
●● ● ●●
● ●● ●
●
● ●●●
●● ● ●●
●
● ●●
●
●
●●
●
●
●●
●
●
●
●●
●
● ●
●●
● ●
●●
● ●●● ●
● ●
●
●
● ●
●
●
● ●
●
●
● ●●●
● ●●
●
●●
●
● ●
● ● ●
● ●●
●
●
●
●
●
●
●●● ● ●●●
●
●● ● ●
●
●
●
●
●● ● ●● ● ●● ●
● ● ● ●●● ●
●
●
●●
● ● ●● ● ●
●
●● ● ●
●
●
●
● ●
●● ●● ●●
●
●
●
●
● ●
●
●● ●
●
●
●● ● ●● ●
●
● ●
● ●
●●
● ●
●
●
●●
●
●
● ●
●
●
● ●
●●
● ● ●
● ●●●● ●●●●
●
●
●
●
●●
●
●
●
●●
●
●● ●
●
●
● ● ●
●
● ●
● ●● ● ● ●●● ● ●
●●
●
●
● ● ●
●●
● ●
●
●
●
●
● ●
● ●
●● ●
●
●
●
●
●
● ●
●
●
●
●●
●
●
● ●
●● ●
● ●
●
● ●● ●
●●●●● ●● ●●● ●
●
● ●
●
●
●
●
●●● ● ●
●●●
●●
●
●
●
● ●
●● ●● ●
●
● ● ●●
●● ● ● ● ●
●
●
● ●
●
●
●
●●●●●● ● ●
●●
● ●
● ● ●
●
●● ● ●
● ● ●
●
●
●●
●
●●
●
●
●
● ●
● ●● ●
●
●
●
●
●
●●
●
●
●●
● ●
●
●
●
●
●
●●
● ●●
●
●
●
●
●
●
●
●● ● ●
●
●
●
● ●
●●
● ●
●●
●●
●
● ●●
●●
●
●
● ●
● ●●
●
● ●
●
●
●
●
●
●
●
●
●●
●
●
●
●
● ● ●●
● ● ●
●
●
●
●●
●
●
● ●
●
●
●
● ●
●
●
●
●
●
● ●●
●
●
●
●
● ●●
●
●
●
●
● ●
●●
● ●
●●
●
●
●
●
●
●
● ●
●
● ●
●
●
●
●
●
●
● ●
●
● ●
●
●
●
● ●
●
● ●
●
●
●
●
●
●
●
●
●
● ●
● ●
●● ● ● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
● ● ●
●
●
●
●●
●
og
0.8
2.0
0.6
●
0.0
●
0.4
●
●
●
●
log(kRDeg)
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
−1.0
●
0.2
●
●
●
●
●
●
●
●
−2.0
0.0
−3.0
AR Pmmh
−0.2
●
●
●
●
●
−4.0
● ●
●
●
●
●
●
●
●
● ●
●●
●
●
●
2.0
Da
●
●
● ●
●
●
●
●
●
●
●●
●
●
●●
●
●
●
2.5
●
●●
●
●
● ●●
●● ●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●●
●
●
●
●
●● ● ●
●
●
● ● ●●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
−5.0
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●●
●
● ●
●
●
●
●
●●
● ●
●
●● ● ●
●
● ●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●
● ●
●
●
●
●
● ● ●
●
●
●
● ●
●
●
●
●
●
● ●
●
●
● ●
●
●
●
●●●
●
● ●
● ●
● ●
●●
●
●
●
●
●
●
● ●
● ●
●
●
●
●●
●
● ●
●
●
●●
● ●
●
●
●●
●
●● ●● ●
●
●●
● ● ●●
● ●
●
●
●
●
●● ● ●
●
●
●●
●
●
●
●●
●
●
●
●
●
●●
● ● ●
●
● ● ●
●
●
●
●
●
● ●
●
●
●
●
●●
●
● ●
●
●
●
●●●
●
●
●
●
●● ● ●
●
●
● ●
●
● ●
●
●
●
●● ●
●
●
●
●●
●
●
●
●
●
●● ●
●
●
●
● ●
●
●●
● ●●●
●
●
●●
●
●
● ●
●
● ●
●
● ●
●
●
●
●
● ●
● ●
●
● ●
●
● ●●
●
●●
●
● ●
●
●
●
● ●
●
● ●
● ●
●
● ● ●
●
● ●●
●
●
●
●
●
●
●●
●
●●●
●
●●
● ●●
●●
●
●
● ●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
● ●
● ●
●
●
●●
● ● ●●
●
●
●
● ●
●
●●
●
●
●
●● ●
●
● ●
●
●
●
●
● ●
●
●
● ● ●
●
●●
● ●●
●●
● ●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●● ●
●
● ●
●●
●●
●
●
●
● ●
● ●
●●● ●
●
●
●
●
●
●
●
●
●●●
●
●
●
●●
●
●
●
●
●
●
●
● ● ●
●●●● ●
●
●
●
●
● ●
●
●
● ●
●
●
●●●
●
●●
● ●
● ●
●
● ●
●
●
● ●●
●●
●
●
●
● ● ●● ●
● ●● ● ●
●
●
●
● ●● ●
●
●
●
●
●● ● ●
●
●●
●
●●
●
●● ● ● ●
● ●●
●
●
● ● ●
●
●
● ● ●
●
●● ● ●
●
●●● ●
●●
●
●●
● ●
●
● ● ●
●●●● ●
● ● ●●
●
●
●
●
● ● ●
●
●●
● ●
● ●●
●
● ●●●
●
●
●
● ●
●
● ● ●● ● ●
● ●
●
●
●
●
●
●
●● ●
● ● ●●●●
●● ● ●
●
●●
●●
● ●
● ●
●●
●●
●
●
● ●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
● ●
●
● ●
●
●
●
●
●
●
●
●
●
●●
●●
●
●●●
● ●
●
●● ●
●
●●
●●●
●
●●
● ●
●
●
● ●
● ● ●
●
●●
● ●●
●●
●
●
●●
●
●●●
●
●
●
●
●
●
●
●
●●
●●
●●
●
● ●
●●
●
●
● ● ●●
● ●
●●●
●
●● ● ●
● ●
●
●
● ●
●●● ●
●●● ●
● ●● ● ●
● ●
●
●●●
●
●
●●● ● ●
●
●
●●●
●
●●
● ● ● ●●● ●
●
●
●●●●●
●●
●
●
●●
●● ●
● ● ●
●
●● ● ●●
●●●
● ●● ●●
●
● ●
●●
● ●● ●
●
●
● ●●
●
●●● ●
● ●●
●
●
● ●
● ●
●●
●
● ● ● ●
● ● ●●
●●● ●
●
● ●
●●
●
●
●● ● ●●●
●●
●
●
●
●
●
●●
● ●●●● ● ● ●
●
● ● ●
●
●
● ●●●
●
● ●
● ●
●
●●
●●
●
●
●●
●
●
● ● ● ●●
●●
●●
● ● ● ●●●
● ● ● ●●
● ●
● ● ●● ●
● ●●● ● ●●
●●●●
●
● ● ●● ●
● ●
●
●
●
●
●
●
● ● ●●
● ●
●
●
●●
●● ● ●●
●
● ●●
●●
●● ● ●
● ● ●
●● ●●
● ● ●● ●
● ●
●
●
●●● ●
● ●
●
●
●
● ● ● ●●●
● ●● ●
●●
●●
● ●
● ● ●●
●● ●
● ● ●
●● ● ●
●
● ●
●
●
●
●
● ●●
● ●●
● ●
●●● ●
●
●
●
●
●
●●● ●
●
●
● ●
● ● ●●
●
●● ●●● ●
● ●
●● ●
● ● ● ●●●
●
● ● ●
●
●
●
●
● ●● ● ●
● ● ●●
●● ●
●
●
●
●
●●
●● ●
●
●
●
●
●●
● ●
● ●
●● ●
●
●●
● ● ●●
●●
●●●
●
●
● ●
● ●● ● ●
● ●●
● ●● ●● ●
●
● ●
●● ●
● ●
●
●
●●
●●●
●● ●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●● ● ●●
●
●●
●
●
● ●
●●● ●
● ●● ●
●● ●●
●
●
●●
●
●●●
● ●●● ● ● ●
●●
●●
●● ●
● ● ● ●●
●
● ●
● ●●● ●
●
●
●
● ● ●
●
● ● ●●●● ● ● ● ●
●
●
●
● ●
● ●
●
● ● ●●● ●● ●
● ● ●
●
●
●
●●
●●
●
● ● ●●
●
● ●●● ● ●
●● ●
●●
●
●●●
● ● ●●●●● ●
●
●
●
●● ●
●
●
●● ●
●
●
●●
●
●
● ●●
● ●
●
●
●
●
●
●
● ●●
●
●● ● ●●● ●●
● ● ●● ●
●
● ●
●●● ● ●
●
● ●●●
● ●● ●
●●
●● ●● ● ● ●● ●
●
● ●
● ●●
●●
● ● ● ●●
●
●●
●
● ● ●●
●●
●●
●●
● ●
●● ●
●
●●
● ●
●●
● ●●●
●
● ●● ● ●● ● ● ●●
●
●● ● ●
●
●●
●
●
●●● ● ●
●
● ●
● ●
●● ●
●
●
● ● ●
●●●
●●
● ●
● ● ●● ●
●
●
● ●
● ●● ● ● ●
●●
●
●
● ●●
●●
●●●●●
●● ●●
●
●
●
●
● ●● ●
●●● ●
●
●
●● ●
●
●
●
● ●● ●
●
●
●
●
●
●
●
● ● ●● ●
● ●●
●
●
●●
●
●●
● ●
●
●
● ●●●●
●●● ●
● ●
● ●
●
● ●
● ●●● ●
●
● ●●
●●●●
●●●●
● ● ● ● ●
●● ●● ●●
●● ● ●
●●
●
●
● ●
● ● ● ●●
●
●
●● ●●
● ● ● ●
●
●
● ● ● ● ●● ● ●
●● ● ●
●●
●●
●● ● ●
● ●●●●●● ●
● ● ●●●
●
●
●●
● ● ●●
● ●
● ●● ●
●
● ●●● ●●● ●
● ● ● ●●
●●●●
●
●
●●
● ● ● ●● ●
● ●
●● ●
● ● ●
●
●
● ● ● ● ●●
●
●● ● ●●●
●●
●
●
●● ● ●●
● ● ●
●●
● ●●
●●
●
●
●●
●● ● ●
●●
● ●
●●
●
● ●● ●
●
●
●
●
●
●
●
● ●● ● ●
●
●●
●
●
● ●
●
●
●●
●●
● ●
●●
●● ●●●●
●● ●
●●
●●
● ●
●
● ●
● ● ●● ● ●
●● ●
●
●
●● ● ●●
●
●
●
●●
●
● ●
●
● ● ● ●
●
●
●
●
●
●
●
●
● ●●●●● ● ● ●●●
●
●
●
● ●●
●● ● ●
●●
● ●
●●●
●
●
● ●
●●●
●
●● ● ●
● ● ●●
●●●
●●●
●
● ● ●●●
● ●●
●
●
●
● ●● ●
●
●●
● ● ● ●●
●●
●● ● ●
●
●
● ●● ●
●●
● ●
●● ●
● ●● ●
●● ●
●●● ●
●●● ●●●●● ● ●●● ●● ●
● ●●
● ●
● ●●●
● ●
●
●
●
● ●● ●
●
● ●
●●
●
●
● ●
● ● ● ●●●
●
● ●●●●
● ●●●
● ● ●
●●
● ●●●
●●
● ●● ● ●
●●● ●
●●
● ●
●
●
● ●●
●
● ●● ● ●
●
●●
●●
●
●●
● ● ● ●● ●●
●
● ●●
●
● ●●● ●
●
● ● ●● ●●●
● ● ●●
● ●●
● ●● ● ●●
● ●● ● ●
●●
●
●●●
● ●
●● ●
●
● ● ●
● ●● ● ●●
●
●
●
●● ● ● ●
● ●● ●●●● ●
● ● ●●● ●● ●●●●
● ● ●
●● ●
●
●
●
●
● ●●
● ●●
●●●● ●●●
●
● ●
●
●
●
●
●●● ●●
●●
●● ●
●●●●●
●
●● ●
●● ●●●
●
●
●
●● ●
●
●● ● ●
● ●●
● ●
●●●
● ●
●●
●
●●
●●
●
●
● ●
●
● ●● ●
● ●●●
●
●●● ●
●
● ● ●●● ●●●●
● ● ●
●
●● ● ●
●● ●
●
●●
●●
●●● ●
● ●
●
●●
●
● ●●
●
●●
●●
●
●
●●
●● ●
●
● ●● ●●
● ●● ●
●
● ●
●
● ●● ●●
● ● ●●
●●
●●●
●
● ●
●
●
●
●●
●
●●
●
●
● ●
●●
●
●● ●
●●●
● ● ●●●
●
●
●●●● ● ● ●● ● ● ● ●●●● ●● ●● ●
●
●
●
● ● ●
● ●
● ●●
●● ●
●
● ●●● ●
● ● ●
● ●● ●●
●● ●
●● ● ● ●
●
● ●●●● ● ●●●●
● ●●●
●
●
●
●
●
●
●
●
●● ● ● ● ● ●● ●
● ●
●
●
●●
● ●
●●●●●
●● ●
●●● ●
● ●
● ●● ● ●
●●●● ●●●● ●●●
● ●●
●
●
●
●
● ●
● ●●● ●●
●●●
●
●
●●
●
●
●
● ●●● ●●
●●
●
● ●
●
●
● ●●●
● ● ●
●● ● ● ●
● ●
● ●●●●
●
●
●●●
●
●● ●
●
●
● ● ●●●
●
●●●●●
●●
●
● ●●● ●●● ●● ● ●●●●
●●
●● ●
●●●
● ●
●●
● ● ●
●●
●
●● ●
● ●●● ●● ● ●● ●●
●
●●
● ●
● ●●
●
●
●● ● ●●● ●
● ●● ●●● ● ● ●
●●
●
●●
●●
●
● ●●●●
●
●●
●
● ●●
●●●
●●● ●
●●
●●●●
● ● ●● ● ●●●● ●
●●●●
●
●● ●
●
●●●
●
●
● ●
●
●
● ●●
● ●
● ●●● ● ● ●
●●
●● ●●
●
●
●●●●● ●
●●●● ●
●● ● ● ●●● ● ● ● ●
●●
●
●
●
● ●●● ● ●
●
● ● ●●● ● ●●●
● ● ● ●● ● ● ●
● ●
●
●
● ●●
●
●●
● ●
●
●
●● ● ●● ●
●● ●
●● ● ●
●●
●
●●
● ●● ●
●●● ● ●
● ● ● ● ●●
●●●● ● ● ● ● ●●●●
● ●●●
●
●
●
●● ●
●
●
● ●●● ●
●●
●
●
●●
●
●●
●
● ● ●
●● ●●● ● ●
●●
●
●● ● ● ●
●● ●
●
●● ●
●●●
●
●●
● ●●●●
●● ● ●●●
●●
●●● ●●
●●●●●●●
● ● ● ●●
●● ● ● ●
●
●● ● ●
●●●
● ●●
● ●●
●
● ●●
●
●●
●
●● ●●
●
● ●●
●
●
● ●●
● ●●
●● ●●
●
● ●●●●
●
●
● ● ●
●
●● ●●●
●● ● ●
● ●●● ● ●●
● ● ●
●
● ●
●
● ● ●
●
● ●●
●●
●
●
●
●● ●
●●
●● ●
●●
●
●
●●
●
●
● ●●
●
● ●●
● ●●
● ● ●●●●
●●
●●●
●
● ●● ●●●
●●
●●●
● ●
●● ● ● ● ●
●
●●●●● ●
●●● ●
●● ●● ●
● ●
●
●
●
●
● ●●
● ●●●●
●
●●
●●
●
●
●
●● ●
●●
● ● ● ●
●
● ●●
●
● ● ●●●● ●●
●
●●
●
●●
●
● ● ●● ● ●
● ●●●
●● ●
● ●
●● ●●
●
● ●●●
● ● ●●
●
● ● ●●
●●
● ●●●
● ●
● ●●
●
●●
●
● ●
●● ●● ● ●●●●●● ●
●●
●
●●
●
● ● ●● ●
● ● ●●
● ●
●●
●●
●
● ●
●
●●
● ●● ● ●● ●●●
●●
●● ● ●●●●
●
● ●
●● ●
●●
● ●
● ●
● ● ● ● ● ● ● ●●
●
●●
●
●●● ●
●●
●●
●●
●
●
●●●
● ● ●
●●
●●
●●
●
● ●●
●●
● ●● ●
●
●
●● ●● ● ●●●●
● ● ●●● ●
●●●●
●
●●
● ● ● ● ● ● ● ●● ●
●●
●
● ● ●● ●
●● ●
●
●
●●
●
●●●● ●
●● ●●●
●
● ●●
●
●
●● ●
●
● ●
● ●● ●● ● ● ●● ● ● ● ●
●●●● ● ●
●
●●
●
●●●
●
●
● ●
●
●
●● ●●
● ● ●●●●
●
●●●
● ●●●
● ● ● ●●
● ●●● ●●
● ●●
●●
●●
● ●
● ● ●
● ●
●●
●
●●
●●●
●● ●
● ● ● ●●●
● ● ●
●●● ● ●●●
●● ●
●●
●
● ●
● ●●
●
●
●●
●
●
●
●● ●
● ● ●
● ● ● ●●
●
●● ● ●
●●●● ●●●
●
●●●
● ●●
●
●●
● ●●
● ●● ● ●● ●
●●●● ●
● ●● ●
●●
●
●
●●● ● ●
●
●
●●
●●
●
● ● ●●● ●
●● ● ● ●
●
●●
● ●
● ●
● ●
● ●●●
●
● ●● ●● ● ●●●● ●
●
● ●
● ●●
●● ●
●●
●●
●●
● ●●●●● ●
●
●
●
●●
●
● ●●
●●
●● ● ●
● ● ●
●
● ●● ●
●
● ● ●
●●●
●● ●
●
●
●
●
●●●
● ●●●●
●
●●
●
●●
●●●
●●● ● ●●● ●
● ● ●●
●●
●
●
●●
● ● ●● ●
● ● ●
●
●● ●●
● ● ●● ● ●●
●
●
●
●●●
●
●●
●● ●●●●● ●●●
●●
● ●
● ●● ● ●
● ●
●
●
●●
● ●● ●● ●
● ●●
●
●
●
●●
●●
● ● ● ● ●●●● ● ●●●
●●
●
●
●
● ●●
●
● ●●
● ● ●●
● ●● ● ●
● ●
●
●
● ● ● ● ●● ●
● ●● ● ●●● ● ●
● ●
●●● ●● ●
● ●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
−6.0
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ● ●
●●●
●
●
●
●
● ●
●
● ●
●
●
●● ●
● ● ●
●
●
●
●
●
●
●
●●
●
●
●
●● ●
●
●
● ●
●
● ●
●
●
og PDeg
●
●
●●
●
●
●
3.0
●
●
●●
●
●
3.5
4.0
n W k n on — CMS 8/9/2016
S a ab
a go hm
o Ma ko p o
pa am
n
n
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
“Sticking” and tuning of PMMH
As well as tuning the θ proposal variance, it is necessary to
tune the number of particles, N, in the particle filter — need
enough to prevent the chain from sticking, but computational
cost roughly linear in N
Number of particles necessary depends on θ, but don’t know θ
a priori
Initialising the sampler is non-trivial, since much of parameter
space is likely to lead to likelihood estimates which are
dominated by noise — how to move around when you don’t
know which way is “up”?!
Without careful tuning and initialisation, burn-in, convergence
and mixing can all be very problematic, making algorithms
painfully slow...
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
General SIR particle filter
At time t, we have (after resampling) an equally weighted
sample from π(xt |y1:t )
At time t + 1, we want a weighted sample from
π(xt+1 |y1:t+1 ), though in fact useful to construct a sample
from π(xt+1 , xt |y1:t+1 ), and then marginalise down as required
Target ∝ π(yt+1 |xt+1 )π(xt+1 |xt )π(xt |y1:t ) and proposal is
f (xt+1 |xt , yt+1 )π(xt |y1:t ), for some f (·), leading to
unnormalised weight
wt+1 = π(yt+1 | xt+1) π(xt+1 | xt) / f(xt+1 | xt, yt+1)
LF choice is f (xt+1 |xt , yt+1 ) = π(xt+1 |xt ), otherwise need to
evaluate the discrete time transition density
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
Weights and RN derivatives
For Markov processes with intractable kernels, make the
target π(xt+1 , xt |y1:t+1 ) and then marginalise down to
π(xt+1 |y1:t+1 ) if required
The proposal path will be of the form
f (xt+1 |xt , yt+1 )π(xt |y1:t ), leading to weight
wt+1 = π(yt+1 | xt+1) π(xt+1 | xt) / f(xt+1 | xt, yt+1)
The expected weight is π(yt+1 |y1:t ), as needed for
pseudo-marginal MCMC
Formally,
wt+1 = π(yt+1 | xt+1) (dP/dQ)(xt+1 | xt),
the RN derivative of the true (unconditioned) diffusion wrt
the proposal process
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
Alternative: approximate Bayesian computation (ABC)
Since π(θ, x, Y) = π(θ)π(x|θ)π(Y|θ, x), it is trivial to
generate samples from π(θ, x, Y) and to marginalise these
down to π(θ, Y)
Exact rejection algorithm: generate (θ⋆, Y⋆) from π(θ, Y) and
keep provided that Y = Y⋆, otherwise reject and try again
This gives exact realisations from π(θ|Y), but in practice the
acceptance rate will be very small (or zero)
ABC: Define a metric on the sample space, ρ(·, ·), and accept
(θ⋆, Y⋆) if ρ(Y, Y⋆) < ε
This gives exact realisations from π(θ|ρ(Y, Y⋆) < ε), which
tends to the true posterior as ε −→ 0
Still problematic if there is a large discrepancy between the
prior and posterior...
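For concreteness, a minimal Scala sketch of the rejection step just described (not from the slides): priorSample, simulate, rho, the observed data y0 and the tolerance eps are hypothetical placeholders for the prior sampler, the forward simulator, the metric, the data and the threshold.
// ABC rejection sketch: draw (theta, Y*) from the prior predictive until rho(y0, Y*) < eps
def abcReject[P, D](priorSample: () => P,   // theta ~ pi(theta)
                    simulate: P => D,       // Y* ~ pi(Y | theta)
                    rho: (D, D) => Double,  // metric on data space
                    y0: D, eps: Double): P =
  Iterator.continually {
    val theta = priorSample()
    (theta, simulate(theta))
  }.collectFirst { case (theta, ys) if rho(y0, ys) < eps => theta }.get

// a sample of size n from the ABC posterior pi(theta | rho(Y, Y*) < eps)
def abcSample[P, D](n: Int, priorSample: () => P, simulate: P => D,
                    rho: (D, D) => Double, y0: D, eps: Double): Vector[P] =
  Vector.fill(n)(abcReject(priorSample, simulate, rho, y0, eps))
As the slide notes, if eps is small relative to the prior/posterior mismatch, the inner loop may take essentially forever to accept.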
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
Summary statistics
The choice of metric ρ(·, ·) is very important to the overall
efficiency and performance of ABC methods
Using a naive Euclidean distance on the raw data
ρ(Y, Y ? ) = kY − Y ? k is likely to perform poorly in practice —
even with a perfect choice of parameters, it is extremely
unlikely that you will “hit” the data
Ideally, we would use a vector of sufficient statistics, s(Y) of
the likelihood model associated with the process, to
summarise the important aspects of the data relevant to
parameter inference, and then define a metric on s(·)
In practice, for complex models we don’t know the sufficient
statistics (and they probably don’t exist), but we nevertheless
form a vector of summary statistics, which we hope capture
the important aspects of the process and ignore the irrelevant
noise in each realisation
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
Summary statistics for POMP models
For time series data, obvious summary statistics for each
univariate component include the sample mean,
(log-)variance, and the lag 1 and 2 auto-correlations
For multivariate time series, the (matrix of)
cross-correlation(s) is also useful
Together these summary statistics capture much of the
essential dynamic structure of multivariate time series
For our auto-reg network example, this reduces the dimension
of the statistic we are trying to match from 50 × 2 = 100 to
2 × 4 + 1 = 9 — not only a big reduction, but we are now
trying to match statistics that are relatively robust to noise in
the data
The individual components of s(·) are on different scales, so
need to re-weight (eg. by standardising) before applying a
simple metric such as Euclidean distance
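An illustrative sketch (assumed helper code, not from the talk) of the per-component summaries and a standardised Euclidean distance; the scaling vector sds would in practice be estimated from pilot simulations.
def mean(x: Vector[Double]): Double = x.sum / x.length

// sample autocorrelation at a given lag
def autocorr(x: Vector[Double], lag: Int): Double = {
  val m = mean(x)
  val denom = x.map(xi => (xi - m) * (xi - m)).sum
  (x.drop(lag) zip x.dropRight(lag)).map { case (a, b) => (a - m) * (b - m) }.sum / denom
}

// summaries for one univariate component: mean, log-variance, lag-1 and lag-2 autocorrelation
// (the cross-correlations between components would be appended to this vector)
def summaries(x: Vector[Double]): Vector[Double] = {
  val m = mean(x)
  val logVar = math.log(x.map(xi => (xi - m) * (xi - m)).sum / (x.length - 1))
  Vector(m, logVar, autocorr(x, 1), autocorr(x, 2))
}

// standardised Euclidean distance between summary vectors; sds gives the per-component scales
def dist(s1: Vector[Double], s2: Vector[Double], sds: Vector[Double]): Double =
  math.sqrt(s1.indices.map(i => math.pow((s1(i) - s2(i)) / sds(i), 2)).sum)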
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
Issues with simple rejection ABC
There are two main problems with naive rejection sampling
based ABC:
The first relates to the dimension of the data, and this is
(largely) dealt with by carefully choosing and weighting
appropriate summary statistics
The second relates to the dimension of the parameter space...
If the dimension of the parameter space is large, the posterior
distribution is likely to have almost all of its mass
concentrated in a tiny part of the space covered by the prior,
so the chances of hitting on good parameters when sampling
from the prior will be very small
Might be better to gradually “zoom in” on promising parts of
the parameter space over a series of iterations...
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
ABC–SMC
Interest in a Bayesian posterior distribution
π(θ|x) ∝ π(θ)f (x|θ)
where f (x|θ) is intractable
Observed data x0
Sequence of approximations
πt (θ) = π(θ|ρ(x, x0 ) < εt ),
where ∞ = ε0 > ε1 > · · · > εn > 0 and ρ(·, ·) is a suitable
metric on data space
π0 is the prior, and for sufficiently small εn , hopefully πn not
too far from the posterior, π(θ|x0 )
Progressively reduce tolerances to improve agreement between
successive distributions and hopefully improve acceptance
rates
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
ABC–SMC algorithm
Suppose we have a large (possibly weighted) sample of size N
from πt−1(θ), {θ1, . . . , θN}, with normalised weights w̃^i_{t−1}
Pick θi⋆ ∼ πt−1(θ) (weighted particles)
Perturb θi⋆⋆ ∼ K(θi⋆, θi⋆⋆)
Simulate x⋆ ∼ f(x⋆ | θ⋆⋆) from the intractable likelihood model
Accept only if ρ(x⋆, x0) < εt, otherwise reject, go back to the
start and pick another θi⋆
Keep θ⋆⋆ as the ith member of the new sample, and compute its
weight as
w^i_t = π(θ⋆⋆) / Σ^N_{j=1} w̃^j_{t−1} K(θj, θ⋆⋆)
Once a sample of size N is obtained, normalise weights and
increment t
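A rough Scala sketch of one sweep of this algorithm (hypothetical names throughout: perturb, kernelDensity, priorDensity and distToData stand in for the kernel sampler, the kernel density, the prior density, and a function that simulates x⋆ ∼ f(·|θ⋆⋆) and returns ρ(x0, x⋆)); this is an illustration, not the talk's implementation.
import scala.util.Random

// pick one particle from a normalised weighted sample
def weightedPick[P](particles: Vector[(P, Double)]): P = {
  val u = Random.nextDouble()
  particles.foldLeft((Option.empty[P], 0.0)) { case ((found, acc), (p, w)) =>
    val cum = acc + w
    (found.orElse(if (u <= cum) Some(p) else None), cum)
  }._1.getOrElse(particles.last._1)
}

// one ABC-SMC sweep: returns a new normalised weighted sample targeting pi_t
def abcSmcStep[P](prev: Vector[(P, Double)],       // normalised weighted sample from pi_{t-1}
                  perturb: P => P,                  // draw theta** ~ K(theta*, .)
                  kernelDensity: (P, P) => Double,  // evaluate K(theta_j, theta**)
                  priorDensity: P => Double,
                  distToData: P => Double,          // simulate x* ~ f(.|theta**), return rho(x0, x*)
                  eps: Double): Vector[(P, Double)] = {
  val accepted = Vector.fill(prev.length) {
    Iterator.continually(perturb(weightedPick(prev))).find(distToData(_) < eps).get
  }
  val w = accepted map { th =>
    priorDensity(th) / prev.map { case (thj, wj) => wj * kernelDensity(thj, th) }.sum
  }
  val sw = w.sum
  accepted zip (w map (_ / sw))
}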
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
ABC–SMC weight derivation
Why does the ABC-SMC algorithm work? In particular, where
does the weight expression come from?
Need to view proposal on θ⋆⋆ and x⋆ jointly, but with θ⋆
integrated out:
∝ ∫Θ πt−1(θ⋆) K(θ⋆, θ⋆⋆) f(x⋆ | θ⋆⋆) I[ρ(x0, x⋆) < εt] dθ⋆
Similarly, the target is
∝ π(θ⋆⋆) f(x⋆ | θ⋆⋆) I[ρ(x0, x⋆) < εt]
Leads to weight
w = π(θ⋆⋆) / ∫Θ πt−1(θ⋆) K(θ⋆, θ⋆⋆) dθ⋆
with the obvious particle approximation ∫Θ πt−1(θ⋆) K(θ⋆, θ⋆⋆) dθ⋆ ≈ Σ^N_{j=1} w̃^j_{t−1} K(θj, θ⋆⋆),
leading to the desired weight
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
Pros and cons of ABC(–SMC)
All likelihood-free methods have a tendency to be very
computationally intensive and somewhat inefficient
ABC is very general, and can be applied to arbitrary settings
(eg. not just POMP models)
ABC methods parallelise very well, and hence can be useful
for getting reasonably good approximations to the true
posterior relatively quickly if suitable hardware is available
ABC is becoming increasingly popular outside of statistics,
where the idea of “moment matching” is familiar and intuitive
ABC usually results in a distribution significantly
over-dispersed relative to the true posterior
The tuning parameters can affect the ABC posterior
It’s hard to know how well you are doing when working “blind”
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
Pros and cons of pMCMC
Most obvious application is to POMP models — less general
than ABC
It targets the “exact” posterior distribution, irrespective of the
choices of tuning constants!
In practice, for finite length runs, the pMCMC output tends to
be slightly under-dispersed relative to the true posterior
(“missing the tails”)
Parallelises fine over multiple cores on a single machine, but
less well over a cluster
Although the theory underpinning pMCMC is non-trivial,
implementing likelihood-free PMMH is straightforward, and
has the advantage that it targets the “exact” posterior
distribution
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
ABC–SMC and PMMH
PMMH algorithms require careful tuning, and are not trivial
to parallelise
Multiple parallel chains benefit from being initialised at
samples from the posterior, in order to minimise burn-in
Tuning the number of particles to use in the particle filter is
best done by averaging over a sample from the posterior
ABC–SMC can be used to generate a sample from an
approximation to the posterior, which is good enough for
tuning and initialisation of PMMH chains
ABC algorithms parallelise well, so this strategy is well suited
to taking advantage of parallel hardware
ABC–SMC algorithms also require tuning, but reasonable
default choices work mostly OK for POMP models
Multiple independent tuned PMMH chains can then target
the exact posterior
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Overview of Bayesian methodology for stochastic dynamic models
POMP Models
Summary
Summary
For conducting Bayesian inference for complex simulation
models, “likelihood–free” methods are very attractive
There are many likelihood–free algorithms, some of which are
“exact” — pMCMC algorithms being a notable example
Likelihood-free algorithms can sometimes be very inefficient
pMCMC not the only option worth considering — ABC–SMC
methods, and SMC² are also worth trying; also iterated
filtering for a ML solution
Hybrid procedures which parallelise well and take advantage of
the particular strengths of different algorithms are worth
exploring
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Problems of parallelisation in imperative languages
Functional programming
Distributed parallel computation
Problems with parallelising code
Most problems arise from concurrent processes needing to
access and modify some data — shared mutable state
Synchronisation (threads standing idle waiting for other tasks
to complete)
Deadlocks (two or more competing actions are each waiting for
the other to finish, and thus neither ever does)
Race conditions (unpredictable behaviour depending on timing
of unpredictable events - eg. simultaneous updating of data or
a resource)
A major issue is that we are using programming languages and
models of computation from the dawn of the computing age
— think about how much computing hardware has changed in
the last 40 years compared to the languages that most people
use for scientific computing...
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Problems of parallelisation in imperative languages
Functional programming
Distributed parallel computation
Functional approaches to concurrency and parallelisation
Functional languages emphasise immutable state and
referentially transparent functions
Immutable state, and referentially transparent (side-effect
free) declarative workflow patterns are widely used for systems
which really need to scale (leads to naturally parallel code)
Shared mutable state is the enemy of concurrency and
parallelism (synchronisation, locks, waits, deadlocks, race
conditions, ...) — by avoiding mutable state, code becomes
easy to parallelise
The recent resurgence of functional programming and
functional programming languages is partly driven by the
realisation that functional programming provides a natural
way to develop algorithms which can exploit multi-core
parallel and distributed architectures, and efficiently scale
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Problems of parallelisation in imperative languages
Functional programming
Distributed parallel computation
Category theory
Dummies guide:
A “collection” (or parametrised “container” type) together
with a “map” function (defined in a sensible way) represents a
functor
If the collection additionally supports a (sensible) “apply”
operation, it is an applicative
If the collection additionally supports a (sensible) “flattening”
operation, it is a monad (required for composition)
For a “reduce” operation on a collection to parallelise cleanly,
the type of the collection together with the reduction
operation must define a monoid (must be an associative
operation, so that reductions can proceed in multiple threads
in parallel)
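A tiny illustration of the monoid requirement above (a sketch, not from the slides): summation is associative with identity 0, so the same reduce can be evaluated sequentially or split across threads by a parallel collection.
import scala.util.Random

val xs = Vector.fill(1000000)(Random.nextGaussian())

// (Double, +, 0.0) forms a monoid, so this reduction parallelises safely
val serialSum   = xs.reduce(_ + _)
val parallelSum = xs.par.reduce(_ + _)  // same answer (up to floating-point reordering)

// a non-associative operation such as (_ - _) would NOT give a well-defined parallel reduce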
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Problems of parallelisation in imperative languages
Functional programming
Distributed parallel computation
Data structures and parallelism
Scala has an extensive “Collections framework”
(http://docs.scala-lang.org/overviews/
collections/overview.html), providing a large range of
data structures for almost any task (Array, List, Vector, Map,
Range, Stream, Queue, Stack, ...)
Most collections available as either a mutable or immutable
version — idiomatic Scala code favours immutable collections
Most collections have an associated parallel version, providing
concurrency and parallelism “for free” (examples later)
As the (immutable) collections are monads, they are often
referred to as monadic collections
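For example (an assumed sketch, ahead of the examples promised above), switching an immutable Vector of seeds to its parallel counterpart with .par parallelises a map over independent forward simulations with no other change to the code.
import scala.util.Random

// stand-in for an expensive, independent model forward-simulation
def expensiveSimulation(seed: Long): Double = {
  val rng = new Random(seed)
  (1 to 100000).map(_ => rng.nextGaussian()).sum
}

val seeds = (1L to 64L).toVector
val serialRuns   = seeds.map(expensiveSimulation)      // single-threaded
val parallelRuns = seeds.par.map(expensiveSimulation)  // uses all available cores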
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Problems of parallelisation in imperative languages
Functional programming
Distributed parallel computation
Spark
Spark is a scalable analytics library, including some ML (from
Berkeley AMP Lab)
It is rapidly replacing Hadoop and MapReduce as the standard
library for big data processing and analytics
It is written in Scala, but also has APIs for Java and Python
(and experimental API for R)
Only the Scala API leads to both concise elegant code and
good performance
Central to the approach is the notion of a resilient distributed
dataset (RDD) — a monadic collection — hooked up to
Spark via a connector
Lazy, functional, immutable data workflows are what makes it all
work — that’s why it’s implemented in a language like Scala
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Problems of parallelisation in imperative languages
Functional programming
Distributed parallel computation
Spark example
Example of a logistic regression log-likelihood gradient
computation taken from Spark MLLib:
val gradient = points.map { p =>
  p.x * (1 / (1 + exp(-p.y * (w.dot(p.x)))) - 1) * p.y
}.reduce(_ + _)
Note that the RDD points could be terabytes big and
distributed across a large cluster of nodes, yet the code is
written as for any other immutable collection
The code itself is agnostic about the type of the collection
and whether it is parallel or distributed — this is a separate
concern
The map and reduce operations naturally parallelise
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Problems of parallelisation in imperative languages
Functional programming
Distributed parallel computation
Spark combinators
Variety of lazy, functional combinators for RDDs which can be
chained together and optimised prior to execution
Transformations: map, filter, flatMap, sample, union,
intersection, groupByKey, ...
Actions: reduce, collect, count, take, takeSample, foreach, ...
Higher-level tools include Spark SQL for SQL and structured
data processing, MLlib for machine learning, GraphX for
graph processing, and Spark Streaming
Intelligent scheduling of operations to maximise data locality
and minimise communication overhead
Spark RDDs can be used to distribute computation as well as
data — not only useful for “big data” problems — good for
“big models” too...
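A hypothetical sketch of such a chain (assuming a SparkContext sc is already in scope): the two transformations only build a lazy execution plan, and nothing runs on the cluster until an action forces it.
val rdd = sc.parallelize(1 to 10000000)    // distribute a local collection as an RDD

val pipeline = rdd
  .map(i => i.toDouble / 10000000)         // transformation: recorded lazily
  .filter(x => x > 0.5)                    // transformation: recorded lazily

val total   = pipeline.reduce(_ + _)       // action: triggers the distributed computation
val howMany = pipeline.count()             // another action over the same lazy pipeline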
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Problems of parallelisation in imperative languages
Functional programming
Distributed parallel computation
Example: Scalable particle filter
First define a typeclass describing the required properties of
collections, independently of whether the actual concrete
implementation to be used is serial, parallel or distributed
trait GenericColl[C[_]] {
  def map[A, B](ca: C[A])(f: A => B): C[B]
  def reduce[A](ca: C[A])(f: (A, A) => A): A
  def flatMap[A, B, D[B] <: GenTraversable[B]](ca: C[A])(f: A => D[B]): C[B]
  def zip[A, B](ca: C[A])(cb: C[B]): C[(A, B)]
  def length[A](ca: C[A]): Int
}
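A plain serial instance of this typeclass for Vector might then look roughly as follows (a sketch, not from the talk); a parallel (ParVector) or distributed (RDD) instance would supply the same five operations, and the particle filter code would run unchanged against any of them. The talk's full implementation presumably also provides the implicit syntax needed to call these operations infix on a C[A], as the update code below does; that plumbing is omitted here.
import scala.collection.GenTraversable

implicit val vectorColl: GenericColl[Vector] = new GenericColl[Vector] {
  def map[A, B](ca: Vector[A])(f: A => B): Vector[B] = ca map f
  def reduce[A](ca: Vector[A])(f: (A, A) => A): A = ca reduce f
  def flatMap[A, B, D[B] <: GenTraversable[B]](ca: Vector[A])(f: A => D[B]): Vector[B] =
    ca flatMap f
  def zip[A, B](ca: Vector[A])(cb: Vector[B]): Vector[(A, B)] = ca zip cb
  def length[A](ca: Vector[A]): Int = ca.length
}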
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Problems of parallelisation in imperative languages
Functional programming
Distributed parallel computation
Example: Scalable particle filter
Code for one step of a particle filter, for any generic collection of
particles
def update[S: State, O: Observation, C[_]: GenericColl](
    dataLik: (S, O) => LogLik, stepFun: S => S
  )(x: C[S], o: O): (LogLik, C[S]) = {
  val xp = x map (stepFun(_))
  val lw = xp map (dataLik(_, o))
  val max = lw reduce (math.max(_, _))
  val rw = lw map (lwi => math.exp(lwi - max))
  val srw = rw reduce (_ + _)
  val l = rw.length
  val z = rw zip xp
  val rx = z flatMap (p => Vector.fill(
    Poisson(p._1 * l / srw).draw)(p._2))
  (max + math.log(srw / l), rx)
}
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Problems of parallelisation in imperative languages
Functional programming
Distributed parallel computation
Example: Scalable particle filter
A particle filter is then just a fold over a sequence of observations
def pFilter[S: State, O: Observation, C[_]: GenericColl,
    D[O] <: GenTraversable[O]](
    x0: C[S], data: D[O], dataLik: (S, O) => LogLik,
    stepFun: S => S
  ): (LogLik, C[S]) = {
  val updater = update[S, O, C](dataLik, stepFun)
  data.foldLeft((0.0, x0))((prev, o) => {
    val next = updater(prev._2, o)
    (prev._1 + next._1, next._2)
  })
}

def pfMll[S: State, P: Parameter, O: Observation,
    C[_]: GenericColl, D[O] <: GenTraversable[O]](
    simX0: P => C[S], stepFun: P => S => S,
    dataLik: P => (S, O) => LogLik, data: D[O]
  ): (P => LogLik) = (th: P) => pFilter(simX0(th),
    data, dataLik(th), stepFun(th))._1
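To connect back to PMMH: the estimated marginal log-likelihood returned by pfMll can be plugged straight into a Metropolis–Hastings loop. The sketch below is illustrative only: logPrior and a symmetric random-walk proposal propose are assumed to be supplied, and LogLik is taken to be an alias for Double.
import scala.util.Random

// One PMMH chain as a lazy stream of (parameter, estimated log-likelihood) states.
// mll is the function returned by pfMll; a symmetric proposal is assumed, so its
// density cancels in the acceptance ratio.
def pmmh[P](mll: P => Double, logPrior: P => Double,
            propose: P => P, init: P): Stream[(P, Double)] = {
  def kernel(state: (P, Double)): (P, Double) = {
    val (theta, ll) = state
    val thetaProp = propose(theta)
    val llProp = mll(thetaProp)
    val logA = llProp + logPrior(thetaProp) - ll - logPrior(theta)
    if (math.log(Random.nextDouble()) < logA) (thetaProp, llProp) else (theta, ll)
  }
  Stream.iterate((init, mll(init)))(kernel)
}

// e.g. pmmh(pfMll(simX0, stepFun, dataLik, data), logPrior, propose, theta0).drop(1000).take(10000)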
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Problems of parallelisation in imperative languages
Functional programming
Distributed parallel computation
Functional MCMC and particle algorithms
MCMC and particle algorithms correspond to simple, well
understood patterns of iteration and computation:
A step of an SMC algorithm can be accomplished with a few
monadic operations on arbitrary monadic collections
An SMC algorithm is then just a functional fold over a data
set
An MCMC algorithm is an unfolding of an initial state and
transition kernel into a lazy functional stream
The MCMC algorithm can be run and results summarised by
folding the MCMC stream
Algorithms can be described at a level of abstraction above
low-level details of exactly how operations should be
scaled/parallelised/distributed — others have already thought
about good ways to do this
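For instance (hypothetical names, a sketch): the unfolding is just Stream.iterate over a transition kernel, and a summary such as a posterior mean is a fold over that stream, so the chain never needs to be held in memory all at once.
// the chain as an unfold of an initial state through a transition kernel
def mcmcStream[S](init: S, kernel: S => S): Stream[S] = Stream.iterate(init)(kernel)

// summarise by folding: running mean of a real-valued functional f of the state
def posteriorMean[S](chain: Stream[S], f: S => Double, n: Int): Double =
  chain.take(n).foldLeft(0.0)((acc, s) => acc + f(s)) / n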
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Problems of parallelisation in imperative languages
Functional programming
Distributed parallel computation
Summary
Convincing arguments can be made that strongly typed
functional programming languages scale better than
conventional imperative (O–O) languages
Concurrency and parallelism are difficult to manage in
imperative languages due to shared mutable state
Since functional languages avoid mutable state, writing
concurrent and parallel code in functional languages is simple,
natural and elegant
Understanding the role of immutability and referential
transparency in functional programming will change how you
think about computation itself
Category theory provides the “design patterns” for functional
programming
Darren Wilkinson — ICMS, 8/9/2016
Scalable algorithms for Markov process parameter inference
Introduction
Bayesian inference
Functional languages for concurrency and parallelism
Problems of parallelisation in imperative languages
Functional programming
Distributed parallel computation
References
Golightly, A. and D. J. Wilkinson (2011) Bayesian parameter inference for
stochastic biochemical network models using particle MCMC. Interface Focus,
1(6):807–820.
Golightly, A. and Wilkinson, D. J. (2015) Bayesian inference for Markov jump
processes with informative observations, Statistical Applications in Genetics and
Molecular Biology, 14(2):169–188.
Owen, J., Wilkinson, D. J., Gillespie, C. S. (2015) Scalable inference for Markov
processes with intractable likelihoods, Statistics and Computing, 25(1):145–156.
Owen, J., Wilkinson, D. J., Gillespie, C. S. (2015) Likelihood free inference for
Markov processes: a comparison, Statistical Applications in Genetics and
Molecular Biology, 14(2):189–209.
Wilkinson, D. J. (2011) Stochastic Modelling for Systems Biology, second
edition. Chapman & Hall/CRC Press.
@darrenjw
Darren Wilkinson — ICMS, 8/9/2016
darrenjw.wordpress.com
Scalable algorithms for Markov process parameter inference