A latent factor model with a mixture of sparse and dense factors to model gene expression data with confounding effects


Authors: Chuan Gao, Christopher D Brown, Barbara E Engelhardt

Chuan Gao 1, Christopher D Brown 2, Barbara E Engelhardt 1,3,*

1 Institute for Genome Sciences & Policy, Duke University, Durham, NC, USA
2 Department of Genetics, University of Pennsylvania, Philadelphia, PA, USA
3 Department of Biostatistics & Bioinformatics and Department of Statistical Science, Duke University, Durham, NC, USA
* E-mail: barbara.engelhardt@duke.edu

Abstract

One important problem in genome science is to determine sets of co-regulated genes based on measurements of gene expression levels across samples, where the quantification of expression levels includes substantial technical and biological noise. To address this problem, we developed a Bayesian sparse latent factor model that uses a three parameter beta prior to flexibly model shrinkage in the loading matrix. By applying three layers of shrinkage to the loading matrix (global, factor-specific, and element-wise), this model has non-parametric properties in that it estimates the appropriate number of factors from the data. We added a two-component mixture to model each factor loading as being generated from either a sparse or a dense mixture component; this allows dense factors that capture confounding noise, and sparse factors that capture local gene interactions. We developed two statistics to quantify the stability of the recovered matrices for both sparse and dense matrices. We tested our model on simulated data and found that we successfully recovered the true latent structure as compared to related models. We applied our model to a large gene expression study and found that we recovered known covariates and small groups of co-regulated genes.
We validated these gene subsets by testing for associations between genotype data and these latent factors, and we found a substantial number of biologically important genetic regulators for the recovered gene subsets.

1 Introduction

Fast evolving experimental techniques for assaying genomic data have enabled the generation of large-scale gene expression and genotype data at an unprecedented pace [1, 2]. Studies to find genetic variants that regulate gene expression levels (called expression quantitative trait loci, or eQTLs) are now possible [3, 4]. However, due to the complicated nature of experimental assays to quantify cellular traits, substantial technical noise and biological covariates may confound measurements of gene expression levels. These confounding effects include batch effects [5–9], latent population structure among the samples [10–12], and biological covariates, including age, sex, or body mass index (BMI). The most replicable and numerous eQTL associations that have been identified in humans are those for which the single nucleotide polymorphism (SNP) is within the cis region of, or local to, the associated gene [13, 14]. In practice, eQTL analyses are conducted by testing each genetic variant for an additive association with only the genes in cis, or local, which helps to alleviate some of the burden imposed by multiple testing [15, 16]. The biological reality is that genes cannot manifest their function alone; instead, genes tend to work together to achieve biological functions (Figure 1A) [17–19]. Furthermore, a SNP that regulates a gene in cis that, in turn, drives the expression levels of other genes, such as a transcription factor, may appear to co-regulate a subnetwork of genes (possibly in trans; Figure 1C,D).
Methods that identify small, co-regulated groups of genes provide important information to a downstream eQTL analysis, enabling genetic variants that regulate multiple genes (pleiotropic eQTLs) to be identified (Figure 1B,D). The most effective method to control for confounding effects in gene expression assays, in order to have power to identify eQTLs, remains an open question. Confounding effects are often controlled by estimating principal components (PCs) of the gene expression matrix and removing the effects of the initial PCs before downstream analysis on the normalized residuals [14, 20]; the downside of this two-step procedure is that it is possible that some of the sparse signal is removed in the first step [7, 21]. We address this problem by developing a Bayesian latent factor model to identify a large number of sparse gene clusters, where individual signals are perturbed by unobserved confounding noise. In this latent factor model, small clusters of co-regulated genes are captured by a large number of sparse factors. To jointly model and implicitly control for confounding noise, our model includes a two-component mixture that allows each factor loading to be regularized by either a sparsity-inducing prior or an alternative prior that does not induce sparsity, where the probability of a factor loading being sparse or dense is estimated from the data. Latent factor models, and sparse latent factor models in particular, are a common and effective statistical methodology for identifying interpretable, low-dimensional structure within a high-dimensional matrix, and have frequently been used to identify latent structure in gene expression data [22–25].
This approach assumes that the gene expression levels for each gene can be described by a linear combination of latent factors, and that the random noise in this matrix is approximately normal; thus each sample is modeled as being drawn from a multivariate normal distribution with a diagonal covariance matrix across genes, where the mean parameter is a linear combination of latent factors with a normal prior, and the variance term is estimated for each feature separately. Latent factor models assume that the total variation within the matrix can be partitioned into covariation among genes and variation specific to genes. This implies that a set of genes with correlated gene expression levels will contribute substantially to (have a substantial loading on) a single factor, because this co-variability will contribute to the overall variability in the matrix. In the setting of gene expression data, sparsity has often been imposed on the loading matrix to facilitate this clustering interpretation: genes with zero contribution to a factor are not included in the associated gene cluster [26]. In this work, we develop a flexible Bayesian sparse latent factor model, and we extend this sparse factor model to capture both sparse and dense latent factors by including a two-component mixture of priors on the loading matrix. We use the flexible three parameter beta (TPB) prior to induce local (element-specific), factor-specific, and global shrinkage within the loading matrix [27]. We then add a two-component mixture on the parameters of the factor-level three parameter beta prior to jointly model sparse and dense latent structure.
While this model draws upon ideas in our previous work in sparse factor analysis [28, 29], the main contributions of this work are that i) we adapt the Bayesian two groups regularization framework for regression [30, 31] to latent factor models in a natural way to create a flexible sparse latent factor model with desirable non-parametric and computational properties, and ii) we take advantage of this flexibility by jointly modeling sparse and dense factors. We believe that this sparse latent factor model will have broad utility in Bayesian statistics. A general difficulty when working with latent factor models is that, in the basic model, the factors and loadings are only identifiable up to orthogonal rotation, scaling, and label switching [32]. We would like to develop metrics with which to compare both sparse and dense matrices in order to evaluate convergence in parameter estimates and to quantify the similarity of the recovered matrices and the underlying structure. These metrics must be robust to these invariances to be useful in this setting. While sparsity in the loading matrix facilitates rotational identifiability and enables more direct comparisons across fitted sparse latent factors, dense factors are not as trivially comparable because of this rotation invariance. In order to address these issues of comparison, we developed two statistics to quantify the stability across estimated factors and factor loading vectors that are sparse (contain zeros) and dense (do not contain zeros). Both statistics are invariant to label switching and scale. In addition, the dense matrix stability statistic is rotation invariant. This paper is organized as follows. Section 2 provides a general background to sparse factor analysis to motivate our formulation of a Bayesian sparse latent factor model.
Section 3 specifies our factor model with the TPB prior and the equivalent model in terms of a simple hierarchical gamma prior. Section 3.2 extends the model to include a mixture of sparse and dense factors. The parameters are estimated using an approximate EM algorithm outlined in Section 4 and Appendix B. We motivate and describe our stability statistics in Section 5. To evaluate the performance of our model, we simulated data and compared our model to related methods based on these simulations (Section 6.1). In Section 6.4, we applied this model to real gene expression data on 480 samples and 8,718 genes, revealing interesting patterns in gene expression and confounding factors. Using these factors, we identify relevant genetic associations for the subsets of co-regulated genes.

Figure 1. Small gene network modules and their role in identifying pleiotropic eQTLs. Panel A: A large gene network shown as multiple nodes (genes) connected by edges (estimated using partial correlation between gene expression levels), with small sub-networks highlighted. Panel B: Each column of this matrix represents a gene cluster, with black elements denoting included genes in this latent factor. The colored columns correspond to the colored subsets in Panel A. Panel C: A directed network, including a cis regulatory genetic variant (G) that regulates a gene in cis ('cis') and two trans regulatory genetic variants. Panel D: The x-axis is the three states of the single SNP, and the y-axis is the gene expression level for that transcription factor across samples. The slope of the line connecting the means for each simulated gene is the effect size of the SNP on the transcription levels of that gene.
2 Bayesian sparsity and latent factor models

Factor analysis has been used in a variety of settings to extract useful low-dimensional features from high-dimensional data [22–25, 33]. We begin with a basic factor analysis model [34, 35],

    Y = XΛ + ε,

with Y ∈ ℝ^{n×p}, X ∈ ℝ^{n×K}, Λ ∈ ℝ^{K×p}, and ε ∈ ℝ^{n×p}, ε_{i,j} ∼ N(0, ψ_j), where n and p correspond respectively to the number of samples and the number of genes and, in practice, n ≪ p. To ensure conjugacy, the loading matrix Λ and the latent factors X have normal priors. This basic factor analysis model has a number of drawbacks: the latent factors and corresponding loadings are unidentifiable with respect to orthogonal rotation and scaling, and it is difficult to select the dimension of the latent factors, which is fixed a priori. One solution that addresses rotational invariance is to induce sparsity in the loading matrix, which allows for identifiability in the estimated matrices when the latent space is sufficiently sparse [28].

There are currently a number of ways to regularize the latent parameter space. Sparse principal component analyses (PCA) have been described [36, 37], related to latent factor models through a probabilistic PCA framework [28, 38, 39]; for example, sparse principal components analysis (SPCA) uses an ℓ1 penalty to induce sparsity on the PCs [36, 37]. We choose to work in the Bayesian context with latent factor models, and consider a sparsity-inducing prior on the factor loading matrix Λ. This sparsity-inducing prior should have substantial mass around zero to provide strong shrinkage near zero, and also have heavy tails to allow signals to escape strong shrinkage [30, 40]. In the context of sparse regression, there have been a number of proposed solutions, including the Student's t-distribution, the horseshoe prior, the normal-gamma prior, and the Laplace prior [40–43].
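As a concrete illustration of the basic model, the sketch below (ours, with arbitrary sizes and noise levels, not from the paper) simulates Y = XΛ + ε with gene-specific residual variances ψ_j:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, K = 200, 500, 10                  # samples, genes, latent factors (illustrative)

X = rng.normal(size=(n, K))             # latent factors, X_i ~ N(0, I_K)
Lam = rng.normal(size=(K, p))           # loading matrix with a normal prior
psi = rng.uniform(0.5, 2.0, size=p)     # per-gene residual variances psi_j
E = rng.normal(size=(n, p)) * np.sqrt(psi)   # eps_ij ~ N(0, psi_j)
Y = X @ Lam + E                         # observed expression matrix

print(Y.shape)   # (200, 500)
```

Note that the marginal covariance of each row of Y is Λ^T Λ + Ψ, which is why correlated gene sets load together on a single factor.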
Sparse factor analysis models have taken advantage of some of these sparsity-inducing priors [28, 44, 45]. In particular, a number of sparse factor models for use in biological applications have included some form of the Student's t-distribution, also known as automatic relevance determination (ARD) [41, 46], as a prior on the variance terms of the elements of the factor loading matrix [28, 44, 47]. The sparse Bayesian infinite factor model (SBIF) [45] introduces increasingly stronger shrinkage across the loading vectors using a multiplicative gamma prior. The Infinite Sparse Factor Analysis model (ISFA) [48, 49] extends the Indian Buffet Process to select the number of latent factors in the sparse loading matrix. In these two models, as the proportion of variance explained (PVE) by the factors decreases, the proportion of zeros in the factor loadings, in theory, increases, enabling a finite number of factors to be recovered from a model with an infinite number of underlying factors. Two flaws in this construction are that i) sparsity and PVE may not be well correlated in the latent space we are modeling, and ii) PVE may not be a monotone decreasing function. In this sparse factor analysis context, most approaches to inducing sparsity have applied shrinkage through a single parameter (generally, the variance of the factor loading matrix elements) on all loading parameters, which may sacrifice small signals to achieve high levels of sparsity. This behavior has been labeled the one group solution to inducing sparsity, because it effectively considers both signal and noise in a single group and regularizes them the same way [31]. In contrast, the two groups solution models noise and signal differently, strongly shrinking noise to zero but allowing signals to escape extreme shrinkage [30].
The canonical two groups solution in the Bayesian context is the so-called 'spike-and-slab' prior, which induces sparsity using a two-component mixture model including a point mass at zero and a normal distribution [50, 51]. The components that are noise are effectively removed from the model through the point mass at zero, while the signals are regularized using the normal distribution but remain in the model; this approach additionally allows an explicit posterior distribution on the inclusion probability of each component [26]. In the factor model framework, a spike-and-slab prior can be put on each element of the loading matrix, as in the Bayesian factor regression model (BFRM) [26]. Unfortunately, there is no closed-form solution for the parameter estimates because of the mixture component, and so MCMC is most generally used to estimate the parameters [26]. Because the parameter space for m components includes 2^m configurations, this is computationally intractable for large matrices [52]. These continuous sparsity-inducing priors all have the property that they impose strong shrinkage around zero but have sub-exponential tails, which allow signals to escape shrinkage. Because of these properties, these types of priors have been described as the 'one-group answer to the original two-groups question' [30]. In this work, we use a three parameter beta (TPB) distribution [27] to encourage sparsity in the elements of the factor loading matrix by shrinking their variance term. TPB(a, b, φ) is a generalized form of the beta distribution, with the third parameter φ further controlling the shape of the density. It has been shown that a linear transformation of the beta distribution, producing the inverse beta distribution or the horseshoe prior, has desirable shrinkage properties in sparse regression [40].
A linear transformation of the TPB distribution can be used to mimic the inverse beta distribution, with the inverse beta variable scaled by φ. The TPB distribution can also replicate other distributions, including the Strawderman-Berger prior [27]. The TPB produces a TPB-normal (TPBN) distribution when coupled with the normal distribution, where, for a = 1 and φ = 1, this is equivalent to the normal-exponential-gamma distribution (NEG; Table 1) [27]. The TPB is thus appealing as a prior because it can recapitulate the sparsity-inducing properties of continuous one group priors, including the horseshoe, but it is also flexible enough to recapitulate other types of priors, including some that do not induce sparsity (Table 1).

Table 1. Effect of different parameter settings for TPB(a, b, φ) on the shrinkage imposed by this prior.

                          φ = 1                 φ < 1       φ > 1
    a = b = 1/2           horseshoe             strong      weak
    a = 1, b = 1/2        Strawderman-Berger
    TPBN for a = 1        NEG
    a ↑ and b ↓           weak                  variable    weak
    a ↓ and b ↑           strong                strong      variable

We build a sparse factor model using this sparsity-inducing prior following recent work in Bayesian regression [30]. In the regression context, a two groups model is achieved by setting the variance term for the regression coefficients to a scale mixture of normals:

    β_j | λ_j, τ ∼ N(0, τ² λ_j²)    (1)
    λ_j ∼ π(λ_j)    (2)
    (τ², φ²) ∼ π(τ, φ),    (3)

where π, with a one group prior, is on the local variance component λ_j, and the same distribution is on the global variance component τ. This simple model exhibits two groups behavior, given the proper distributions for τ and φ, because τ effectively shrinks all of the regression coefficients toward zero, and then λ_j, which is allowed to be very large through a heavy-tailed distribution, rescues individual signals by scaling up the very small global shrinkage parameter τ [30].
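This two-groups behavior is easy to see in simulation. The sketch below (ours, not from the paper) draws coefficients from the scale mixture in Equations (1)–(3) with the horseshoe choice λ_j ∼ C⁺(0, 1) and a small fixed global scale τ: the bulk of coefficients is crushed toward zero, while a few heavy-tailed λ_j let large signals through.

```python
import numpy as np

rng = np.random.default_rng(6)
m, tau = 10_000, 0.01                        # number of coefficients; small global scale

lam = np.abs(rng.standard_cauchy(size=m))    # local scales, lambda_j ~ C+(0, 1)
beta = rng.normal(scale=tau * lam)           # beta_j ~ N(0, tau^2 * lambda_j^2)

print(np.median(np.abs(beta)))               # tiny: the bulk is strongly shrunk
print(np.abs(beta).max())                    # large: a few signals escape shrinkage
```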
To adapt this approach to the setting of latent factor models, we added an additional layer of shrinkage to each individual factor, which maintains the global-local-type model selection from the regression context but allows factor-specific behavior. This creates, in effect, a three groups model, where signal and noise are modeled in a factor-specific way. In particular, a global parameter heavily shrinks all signals through the loading matrix toward zero, a factor-specific parameter rescues specific factors from global shrinkage, and a local parameter enables within-factor sparsity by shrinking individual elements of a factor. Each of the three layers serves a critical role: global regularization creates a non-parametric effect of removing factors from the model that are not necessary, factor-specific regularization identifies factors that will be included in the model, and local regularization enables sparsity, or model selection, within those selected factors. We impose regularization at all three levels of the loading matrix using the TPB prior because it is continuous and flexible. Recent work has produced a strong result in the Bayesian sparse factor model setting that, using specific local-global shrinkage priors, one obtains the minimax optimal rate of posterior concentration up to a log factor; this work is the first asymptotic justification for global-local approaches to Bayesian sparse factor analysis [53]. Although we use a different heavy-tailed local prior, this work motivates our general approach to Bayesian sparse factor analysis. Our sparse latent factor model has a straightforward posterior distribution for which point estimates of the parameters are computed using expectation maximization (EM), making it computationally tractable via the careful application of this continuous distribution.
Particularly in the dictionary learning setting of identifying an overcomplete, or K > n, number of factors that may individually contribute minimally to the variation in the response matrix, this approach to inducing sparsity in latent factor models is statistically and computationally well motivated [54].

3 Bayesian sparse factor model via TPB

We define a Bayesian factor analysis model in the following way:

    Y = XΛ + ε    (4)
    X_i ∼ N(0, I_K)    (5)
    ε_j ∼ N(0, ψ_j),    (6)

where Y ∈ ℝ^{n×p} is the matrix of observed variables, X ∈ ℝ^{n×K} is the factor matrix with K factors, Λ ∈ ℝ^{K×p} is the loading matrix, and ε ∈ ℝ^{n×p} is the residual error matrix. We assume Ψ = diag(ψ_1, ..., ψ_p) is diagonal (but the diagonal elements are not necessarily the same). In this model, the covariance among the p features in Y is captured in Λ^T Λ. For the latent factors in X, we follow the usual convention by giving each element a standard normal prior, where I_K is the K × K identity matrix.

To induce sparsity in the factor loading matrix Λ, we use the three parameter beta distribution parameterized to have a sparsity-inducing effect [27]. The three parameter beta distribution has the following form:

    f(x; a, b, φ) = [Γ(a + b) / (Γ(a) Γ(b))] φ^b x^{b−1} (1 − x)^{a−1} {1 + (φ − 1) x}^{−(a+b)},    (7)

for x ∈ (0, 1), a > 0, b > 0, and φ > 0. We put the TPB prior on the variance of each element Λ_{k,j} of the loading matrix Λ, creating the following hierarchical structure:

    ϱ ∼ TPB(e, f, ν)    (8)
    ζ_k ∼ TPB(c, d, 1/ϱ − 1)    (9)
    ϕ_{k,j} ∼ TPB(a, b, 1/ζ_k − 1)    (10)
    Λ_{k,j} ∼ N(0, 1/ϕ_{k,j} − 1).    (11)

This specification provides three layers of shrinkage on the sparse loading matrix: ϕ_{k,j} provides local shrinkage for each element by shrinking the variance term of the normal prior; ζ_k controls the shrinkage specific to each factor k; ϱ shrinks all elements of the matrix globally.
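The TPB density in Equation (7) is straightforward to evaluate. The sketch below (ours) implements it and checks the special case φ = 1, where TPB(a, b, 1) reduces to a Beta(b, a) distribution:

```python
import math

def tpb_pdf(x, a, b, phi):
    """Density of the three parameter beta distribution TPB(a, b, phi) on (0, 1)."""
    if not 0.0 < x < 1.0:
        return 0.0
    const = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return const * phi**b * x**(b - 1) * (1 - x)**(a - 1) * (1 + (phi - 1) * x)**(-(a + b))

# For phi = 1, TPB(a, b, 1) is Beta(b, a): compare with the Beta(3, 2) density at x = 0.3
beta32 = 0.3**2 * (1 - 0.3) / (math.gamma(3) * math.gamma(2) / math.gamma(5))
print(abs(tpb_pdf(0.3, a=2, b=3, phi=1) - beta32) < 1e-12)   # True
```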
This model captures different shrinkage scenarios depending on the TPB parameters a, b, c, d, e, f at each level (Figure 2). We estimate the third term of the factor-specific and local TPB priors, ϱ and ζ_k, from the data. By tuning the parameters a, b, c, d, e, f and ν, we apply more or less shrinkage on the sparse loading matrix Λ. In practice, we set a = b = c = d = e = f = 0.5 to recapitulate the horseshoe prior at all three levels.

Figure 2. The pdf of different parameterizations of the three parameter beta distribution. The combination of a color and a line type identifies alternative parameterizations and associated probability density functions (pdf; y-axis) on x ∈ (0, 1) (x-axis). Curves shown: a = 0.5, b = 1; a = b = 0.5 (horseshoe when φ = 1); a = 1, b = 0.5 (Strawderman-Berger when φ = 1); and a = b = 5, each at φ = 1 and φ = 10.

3.1 Equivalent model via gamma priors

We transform the parameter ϕ to θ = 1/ϕ − 1, and we find that the following relationship holds [27]:

    ϕ ∼ TPB(a, b, ν) ⇔ θ/ν ∼ Be′(a, b) ⇔ θ ∼ Ga(a, δ) and δ ∼ Ga(b, ν),    (12)

where Be′(a, b) and Ga indicate an inverse beta and a gamma distribution, respectively. For concreteness, we define the inverse beta distribution as follows:

    f(x; α, β) = x^{α−1} (1 + x)^{−α−β} / B(α, β),    (13)

where B(·, ·) is the beta function. We apply this transformation to Equations 8, 9, 10, and 11, specifically, to the variance terms θ_{k,j} = 1/ϕ_{k,j} − 1 and φ_k = 1/ζ_k − 1. It can be shown that θ_{k,j}/φ_k ∼ Be′(a, b) [27]; the same relationship holds for the other TPB variables. This relationship implies the following hierarchical structure
for the loading matrix Λ:

    γ ∼ Ga(f, ν)    (14)
    η ∼ Ga(e, γ)    (15)
    τ_k ∼ Ga(d, η)    (16)
    φ_k ∼ Ga(c, τ_k)    (17)
    δ_{k,j} ∼ Ga(b, φ_k)    (18)
    θ_{k,j} ∼ Ga(a, δ_{k,j})    (19)
    Λ_{k,j} ∼ N(0, θ_{k,j}),    (20)

where the parameter η controls the global shrinkage, φ_k controls the factor-specific shrinkage, and θ_{k,j} controls the local shrinkage for each element of the factor loading matrix Λ.

3.2 Mixture of sparse and dense factors

We will define a sparse factor as a factor associated with a loading vector Λ_k that contains one or more zeros (or minimal contribution from some number of features); we similarly define a dense factor as a factor associated with a loading vector Λ_k that contains no zeros (or contributions from all features). This formulation of the model (Equation 20) makes it suitable for generating sparse factors and, simultaneously, eliminating unnecessary factors. If we removed the local sparse components, and instead let each element of the loading matrix be generated from the factor-level variance term directly, Λ_{k,j} ∼ N(0, φ_k), the model would generate dense factors and simultaneously eliminate unused factors. Although there are other possible ways to model dense factors in this framework, we have found that this approach is both computationally tractable and numerically stable. Using this approach, we added a mixture model to the prior on Λ in order to jointly model both sparse and dense factors. In particular, we mix over generating each θ_{k,j} parameter from the gamma prior, to encourage sparsity within a factor loading vector, and setting θ_{k,j}, j ∈ {1, ..., p}, to the factor-specific parameter φ_k, to encourage dense factor loadings:

    θ_{k,j} ∼ π Ga(a, δ_{k,j}) + (1 − π) δ(φ_k),    (21)

where δ(·) is the Dirac delta function, which sets θ_{k,j} = φ_k for all j ∈ {1, ..., p}. Let Z ∈ {0, 1}^K be a latent vector that indicates whether each factor is a sparse or a dense component.
These indicator variables have a Bernoulli distribution with parameter π, which we further assume is generated according to a beta distribution with parameters α and β. Therefore, we may view the gene expression data as being generated from the following model:

    π | α, β ∼ Beta(α, β)    (22)
    Z_k | π ∼ Bern(π), k ∈ {1, ..., K}    (23)
    Λ_{k,j} | Z_k ∼ p(Λ_{k,j} | θ_{k,j}, δ_{k,j}, φ_k) if Z_k = 1; p(Λ_{k,j} | φ_k) if Z_k = 0    (24)
    X_{i,k} ∼ N(0, 1)    (25)
    Y_{i,j} | Λ_{·,j}, X_{i,·}, ψ_j ∼ N(∑_{k=1}^K X_{i,k} Λ_{k,j}, ψ_j).    (26)

4 Approximate inference via EM

We present a fast expectation maximization (EM) algorithm for parameter estimation in this model; we also derived a Gibbs sampler (Appendix A). In the Expectation step of the EM algorithm, we take expectations of the latent factors X and latent variables Z; this is simple because X and Z are conditionally independent of each other with respect to Λ. We use maximum a posteriori (MAP) estimates for the parameters in the M-step, as in the original paper on EM [55] (see Appendix B for a complete description).
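Before turning to inference, the generative model in Equations (22)–(26) can be sketched directly. The code below is ours and purely illustrative (dimensions and the factor-variance hyperparameters are arbitrary choices, not the paper's); it draws each factor's loadings either through the element-wise gamma hierarchy (sparse) or from the shared factor variance φ_k (dense):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, K = 50, 100, 8
a, b = 0.5, 0.5                          # horseshoe-like local shrinkage
alpha, beta = 1.0, 1.0                   # uniform prior on the mixture weight

pi = rng.beta(alpha, beta)               # pi | alpha, beta ~ Beta(alpha, beta)
Z = rng.binomial(1, pi, size=K)          # Z_k = 1: sparse factor, Z_k = 0: dense
phi = rng.gamma(shape=1.0, scale=1.0, size=K)   # factor variances (illustrative prior)

Lam = np.empty((K, p))
for k in range(K):
    if Z[k] == 1:                        # sparse: element-wise gamma hierarchy
        delta = rng.gamma(shape=b, scale=1.0 / phi[k], size=p)   # delta_kj ~ Ga(b, phi_k)
        theta = rng.gamma(shape=a, size=p) / delta               # theta_kj ~ Ga(a, delta_kj)
    else:                                # dense: shared factor-level variance
        theta = np.full(p, phi[k])
    Lam[k] = rng.normal(scale=np.sqrt(theta))                    # Lambda_kj ~ N(0, theta_kj)

X = rng.normal(size=(n, K))              # X_ik ~ N(0, 1)
psi = rng.uniform(0.5, 2.0, size=p)      # residual variances psi_j
Y = X @ Lam + rng.normal(size=(n, p)) * np.sqrt(psi)
print(Y.shape)   # (50, 100)
```

Note how the heavy-tailed element-wise variances produce rows with a few large loadings among many near-zero ones, while dense rows have loadings of comparable magnitude.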
The posterior probability is written as follows:

    p(Λ, X, Z, Θ | Y) ∝ p(Y | Λ, X, Θ, Z) p(X | Θ) p(Λ | Θ, Z) p(Z | Θ) p(Θ)    (27)
        ∝ [∏_{i=1}^n N(Y_i | X_i Λ, Ψ) N(X_i | 0, I_K)] × [∏_{k=1}^K ∏_{j=1}^p N(Λ_{k,j} | 0, φ_k)^{1(Z_k=0)}]
        × [∏_{k=1}^K ∏_{j=1}^p {N(Λ_{k,j} | 0, θ_{k,j}) Ga(θ_{k,j} | a, δ_{k,j}) Ga(δ_{k,j} | b, φ_k)}^{1(Z_k=1)}]
        × [∏_{k=1}^K Bern(Z_k | π)] [∏_{k=1}^K Ga(φ_k | c, τ_k) Ga(τ_k | d, η)]
        × Ga(η | e, γ) Ga(γ | f, ν) Beta(π | α, β).

Key elements of EM include: 1) the posterior of Λ_{k,j} is a normal distribution, with its mode being a function of a weighted sum of the sparse and dense components; 2) the posteriors of θ_{k,j} and φ_k are generalized inverse Gaussian (GIG) distributions, with MAP estimates of their modes available in closed form as solutions to a quadratic function; however, θ_{k,j} is only associated with the sparse components, whereas φ_k is a function of both sparse and dense components; 3) the parameters δ_{k,j} and τ_k have gamma distributions, for which the MAP estimates have closed-form solutions because of conjugacy. For the parameters φ_k and η, we used their MLE estimates when the MAP estimates are identically zero, which is the case for the horseshoe parameterization of the TPB prior (a = b = 0.5).

5 Stability statistics

Factor models suffer from unidentifiability: in the general model, the likelihood is invariant to orthogonal rotation and scaling of the factors and loadings, and the factors and loadings may be jointly permuted without affecting the likelihood, called the label switching problem. Because of these invariances, it is difficult to compare the results from fitted factor models, specifically Λ and X. However, it is important to be able to compare these fitted matrices because we would like to, for example, quantify how well simulated data are recapitulated or evaluate how sensitive the EM algorithm is to random starting points.
In the sparse matrix setting, by imposing significant sparsity on the loading matrix, we eliminate rotational invariance for the most part. We therefore construct a stability measure to compare two sparse matrices that is invariant to scale and label switching. In the dense matrix setting, we develop a stability measure that quantifies the similarity between two matrices based on their underlying basis, which is invariant to rotation, scaling, label switching, and even a varying number of recovered factors.

5.1 Stability statistic for sparse factors

We propose the following stability measurement for two sparse matrices. Let K_1 and K_2 be the numbers of rows of two sparse matrices Λ_1 and Λ_2, and let Σ ∈ [0, 1]^{K_1×K_2} denote the matrix of absolute pairwise Pearson's correlations between the rows (factor loading vectors) of the two fitted sparse matrices Λ̂_1 ∈ ℝ^{K_1×p} and Λ̂_2 ∈ ℝ^{K_2×p}. We consider the following statistic:

    r_s = (1/(2K_1)) ∑_{l=1}^{K_1} { max_t Σ_{l,t} − [∑_{t=1}^{K_2} 1(Σ_{l,t} > Σ̄_{l,·}) Σ_{l,t}] / (K_2 − 1) }
        + (1/(2K_2)) ∑_{t=1}^{K_2} { max_l Σ_{l,t} − [∑_{l=1}^{K_1} 1(Σ_{l,t} > Σ̄_{·,t}) Σ_{l,t}] / (K_1 − 1) },    (28)

where Σ̄_{l,·} and Σ̄_{·,t} denote the means of the l-th row and the t-th column. The idea behind this metric is as follows: given two sparse matrices that are perfect matches despite label switching, there should be exactly one Σ_{l,t} = 1 in each row and column, and the rest should be closer to zero (although, because we do not enforce orthogonal factor loadings, probably not exactly zero). The stability measure r_s should reward this scenario, but penalize the comparisons when there are zero, or more than one, Σ_{l,t} ≈ 1 in a row or column (factor splitting).
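A possible implementation of r_s is sketched below (ours; Equation (28) leaves some details open, and this reading excludes each row's best match from its penalty term):

```python
import numpy as np

def sparse_stability(L1, L2):
    """Stability between two sparse loading matrices (rows are factors);
    invariant to scale and label switching, near 1 for well-matched matrices."""
    K1 = L1.shape[0]
    C = np.abs(np.corrcoef(L1, L2)[:K1, K1:])   # K1 x K2 absolute correlations

    def score(M):
        s = 0.0
        for row in M:
            mx = row.max()
            extra = row[(row > row.mean()) & (row < mx)]   # above-average non-best matches
            s += mx - extra.sum() / (M.shape[1] - 1)
        return s / (2 * M.shape[0])

    return score(C) + score(C.T)

rng = np.random.default_rng(3)
L = rng.normal(size=(10, 500)) * (rng.random((10, 500)) < 0.05)   # sparse factors
rs_perm = sparse_stability(L, L[rng.permutation(10)])   # row-permuted copy: near 1
rs_rand = sparse_stability(L, rng.normal(size=(10, 500)))   # unrelated matrix: low
print(round(rs_perm, 2), round(rs_rand, 2))
```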
Conversely, we do not want to penalize small correlations among factors, as correlations may exist, so we only penalize correlations that are greater than the mean correlation value for that factor, which may be smaller than the correlation between matching factors (with correlation near one) and larger than the correlation between non-matching factors (with correlation closer to zero).

5.2 Stability for dense factors

We built a stability measure to quantify the similarity of two dense matrices based on estimates of the covariance of the features of a matrix M, namely M^T M. Although M itself is unidentifiable up to an orthogonal rotation, the form M^T M is identifiable, so we compare two dense matrices using their respective covariance matrices. The problem of comparing two covariance matrices Σ_1 = M_1^T M_1 and Σ_2 = M_2^T M_2 has been well studied [56]. For example, a test statistic that is a function of the determinants of the two covariance matrices will quantify the difference between the two [56, 57]. A determinant-based approach was rejected, though, because in our model p ≫ K, so the p × p covariance matrices are singular and therefore have determinants of zero. To address this, a simple squared trace, Tr[(Σ_1 − Σ_2)²], was recently proposed to measure the distance between two dense matrices [57]. This metric is rotation invariant, invariant to label switching, and allows singular matrices; to make it scale invariant, we first standardize each row of the original matrices, replacing M_{i,·} with (M_{i,·} − M̄_i)/s_i, where M̄_i = (1/p) ∑_{j=1}^p M_{i,j} and s_i² = (1/(p−1)) ∑_{j=1}^p (M_{i,j} − M̄_i)². Given two scaled dense matrices M_1 ∈ ℝ^{K_1×p} and M_2 ∈ ℝ^{K_2×p}, we compare them by using the squared trace:

    r_d = (1/p²) Tr[(M_1^T M_1 − M_2^T M_2)²],    (29)

which is proportional to the distance between the two matrices, with smaller values representing greater similarity.
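A sketch of r_d (ours), with the row standardization applied first; the check below confirms that the statistic is numerically zero under an orthogonal rotation of the factors, as the identity (QM)^T(QM) = M^T M implies:

```python
import numpy as np

def scale_rows(M):
    """Standardize each row to mean zero and unit sample variance (Sec. 5.2 scaling)."""
    Mc = M - M.mean(axis=1, keepdims=True)
    return Mc / Mc.std(axis=1, ddof=1, keepdims=True)

def dense_stability(M1, M2):
    """r_d = (1/p^2) Tr[(M1^T M1 - M2^T M2)^2]; smaller means more similar (Eq. 29)."""
    p = M1.shape[1]
    D = M1.T @ M1 - M2.T @ M2
    return np.trace(D @ D) / p**2

rng = np.random.default_rng(4)
M = scale_rows(rng.normal(size=(5, 200)))
Q, _ = np.linalg.qr(rng.normal(size=(5, 5)))     # a random orthogonal rotation
rd_rot = dense_stability(M, Q @ M)               # rotated copy: statistic ~ 0
rd_other = dense_stability(M, scale_rows(rng.normal(size=(5, 200))))  # unrelated: larger
print(rd_rot, rd_other)
```

Because K_1 and K_2 only enter through the p × p matrices M^T M, the statistic also tolerates a differing number of recovered factors, as the text notes.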
6 Results

6.1 Simulated data

To test the performance of this model, we simulated two types of gene expression measurements. First, we simulated ten data sets with only sparse components, in consideration of the models in this comparison that do not handle confounders explicitly (Sim 1). Second, we simulated ten data sets with sparse components plus dense confounders (Sim 2). The two types of simulated data were generated from the following model:

\[
Y = X\Lambda + F\Omega + \epsilon, \tag{30}
\]

where Λ and Ω correspond to the sparse and dense loading matrices, and X and F correspond to the sparse and dense factors, respectively. To generate Λ ∈ ℝ^{K × p}, for each row of Λ we sampled the number of genes in a single sparse factor from Unif[10, 20], assigned values drawn from N(0, 1) at random to the included genes, and set the loadings for the excluded genes to zero. The indices of the included genes were randomly sampled from the p total genes; some genes may appear in multiple rows, and thus correlations among the factors are possible. We sampled each row of X as X_i ∼ N(0, I_K). We simulated the dense loading matrix with entries Ω_{k,j} ∼ N(0, 1) for each dense factor k and gene j, and the error terms ε_{i,j} ∼ N(0, 1). For Sim 1, we simulated ten sparse factors; for Sim 2, we added five dense factors. For both simulations, we used these simulated factors to generate a gene expression matrix with dimensions n = 200 and p = 500 (Figure 3). The same simulation scheme was replicated ten times to generate ten data sets for both Sim 1 and Sim 2.

Figure 3. Correlation patterns within the simulated data. The absolute value of the Pearson correlation coefficient between all pairs of genes is shown as a heatmap, where the rows and columns of each matrix correspond to genes. Panel A: Y^T Y; Panel B: Λ^T Λ; Panel C: Ω^T Ω.
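The Sim 2 generative scheme above can be sketched as follows; this is a minimal NumPy illustration (variable names are ours), and dropping the dense term recovers Sim 1:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, K_sparse, K_dense = 200, 500, 10, 5

# Sparse loadings: each factor loads on 10-20 genes chosen at random,
# with N(0, 1) loadings; all other entries are zero.
Lam = np.zeros((K_sparse, p))
for k in range(K_sparse):
    idx = rng.choice(p, size=rng.integers(10, 21), replace=False)
    Lam[k, idx] = rng.normal(size=idx.size)

X = rng.normal(size=(n, K_sparse))      # sparse factors, rows ~ N(0, I_K)
Omega = rng.normal(size=(K_dense, p))   # dense loadings
F = rng.normal(size=(n, K_dense))       # dense (confounding) factors
eps = rng.normal(size=(n, p))           # i.i.d. N(0, 1) noise

Y = X @ Lam + F @ Omega + eps           # Sim 2; omit F @ Omega for Sim 1
```

Because the gene indices are sampled with replacement across rows, two factors may share genes, producing the correlations among factors mentioned above.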
6.2 Related methods for comparison

We validated our model on these simulations and compared the results of six related methods: KSVD [54], BFRM [26], SPCA [37], SBIF [45], our model after controlling for PCs in the Y matrix (SFAmix2), and our model with Y left as given (SFAmix), in the following way. We ran KSVD assuming one element in each linear combination, which best recapitulated the sparsity in the simulations for both Sim 1 and Sim 2; the default number of iterations was used, and we gave the method the correct number of factors. We also ran the method on Sim 1 setting K = 20. We ran BFRM and SPCA with the correct factor numbers, using default values for the other parameters, and we also ran BFRM and SPCA on Sim 1 setting K = 20. For SBIF, maximum a posteriori (MAP) estimates of X and Λ were used as the final point estimates. This method selects the factor number nonparametrically; however, we seeded the method with the correct number of factors, and for Sim 1 we also seeded SBIF with K = 20. For SFAmix2, we ran SFAmix as below, but controlled for confounders in matrix Y before applying our model, using the residuals from a linear model fitted with the original matrix Y and the first five principal components (PCs) of Y^T Y. For SFAmix, we initialized the program with 50 factors, and we set the parameter values to a = b = c = d = 0.5 and ν = 1 to recapitulate the horseshoe prior. We set α = β = 1 for a uniform prior distribution on the mixture proportions. We assessed convergence by checking changes in the number of non-zero elements l = Σ_{k=1}^{K} ||Λ_k||_0 at each iteration, and stopped when l was stable for 20 iterations. We also ran SFAmix using the correct number of factors on Sim 1.
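The stopping rule described above — track l across iterations and stop once it is unchanged for 20 consecutive iterations — can be sketched as follows (function names and the numerical tolerance are ours):

```python
import numpy as np

def count_nonzero_loadings(Lam, tol=1e-10):
    """l = sum_k ||Lambda_k||_0, counting entries whose magnitude exceeds
    a small numerical tolerance (the exact tolerance is our choice here)."""
    return int((np.abs(Lam) > tol).sum())

def has_converged(history, window=20):
    """True once the most recent `window` recorded values of l are identical."""
    return len(history) >= window and len(set(history[-window:])) == 1
```

In an EM loop, one would append `count_nonzero_loadings(Lam)` to `history` after each iteration and break when `has_converged(history)` returns true.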
Since a number of methods in this comparison did not recover matrices with substantial sparsity, we post-processed the results for these methods to determine the sparse and dense loadings. We chose a cutoff t, different for each method, so that, for a factor loading k, we thresholded the vector elements to count the number of non-zero features in that factor: l_k = Σ_{j=1}^{p} 1{|Λ_{k,j}| > t}. We determined this cutoff based on factor loading histograms, resolving ambiguous cutoff levels in favor of the correct number of sparse and dense factors. We then set elements for which |Λ_{k,j}| ≤ t to zero. For SFAmix, we used the posterior probabilities of the Z_k variables to determine whether a factor was sparse or dense (with a naive cutoff of 0.5). We found for SFAmix that the threshold for removing a feature from a factor was t < 10^{-10}, requiring minimal post-processing to determine the gene clusters.

6.3 Comparison between six methods on simulated data

We compared our mixture factor analysis model, SFAmix, and our model with a two-stage approach, SFAmix2, to KSVD, BFRM, SPCA, and SBIF. We evaluated the performance of each method based on the stability statistics between the true simulated and the recovered latent spaces, for both sparse and dense loading and factor matrices. We ran each of the five methods on the ten data sets in Sim 1, and we compared each recovered sparse factor loading matrix Λ̂ with the true loading matrix Λ (Figure 4). When the correct factor number was known for all methods other than SFAmix, all methods were able to recover the sparse factor loadings well, each producing an average stability measure r_s > 0.75 over the ten simulations.
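The thresholding post-processing described above — zeroing entries with |Λ_{k,j}| ≤ t and counting the surviving features l_k in each factor — might be implemented as follows (a minimal sketch; the function name is ours):

```python
import numpy as np

def threshold_loadings(Lam, t):
    """Zero out entries with |Lambda_kj| <= t; return the thresholded
    loading matrix and the per-factor counts l_k of surviving features."""
    thresholded = np.where(np.abs(Lam) > t, Lam, 0.0)
    l_k = (thresholded != 0).sum(axis=1)   # cluster size for each factor
    return thresholded, l_k
```

The counts l_k can then be compared against a histogram-derived cutoff to label each factor as sparse or dense.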
When the factor number was unknown (SFAmix was given K = 50 and all other methods were given K = 20, for a simulated K = 15), SFAmix recovered the sparse loading matrix equally well, followed by SBIF, while the remaining methods performed substantially worse. This suggests a benefit of the nonparametric behavior of SFAmix and SBIF, which both estimated the number of factors effectively when the underlying factor number was unknown a priori. For Sim 2, we found that SFAmix recovered the sparse loadings well, followed by BFRM, SFAmix2, SBIF, KSVD, and SPCA (Figure 4B). Indeed, SFAmix was able to recover both the sparse and dense loading matrices without knowing the number or proportion of sparse and dense factors beforehand (Figure 4C). The amount of post-processing required for BFRM may have artificially inflated the quality of those results relative to SBIF in particular. BFRM and SBIF allow variability in the shrinkage applied across factors; thus, they recover matrices with confounding factors better than KSVD and SPCA, which impose equal shrinkage across factors (Figure 4B). This difference is reflected in the dense stability measure, where SFAmix and SFAmix2 had the smallest average distance between the recovered and the true covariance matrices, followed by BFRM, SBIF, KSVD, and SPCA (Figure 4C). We used the dense stability metric to compare the recovered factors corresponding to the dense loadings to the original dense factors, and we found an identical ranking of methods in terms of factor recovery, but with substantially greater variance across the different data sets in Sim 2 (Figure 4D). The results for Sim 2 suggest that estimating the sparse and dense components jointly offers benefits over the two-stage method (SFAmix2), which, even given the correct factor numbers, performs worse than the joint model in recovering the sparse components (Figure 4B,D).
We further investigated the recovered gene clusters in the sparse loadings for both Sim 1 and Sim 2. We found that, for Sim 1, SFAmix and SPCA recapitulated the level of sparsity in the simulated loadings; in particular, the average numbers of non-zero components in a sparse loading (l_k) for SFAmix, KSVD, SPCA, BFRM, and SBIF were 10, 50, 23, 495, and 500, respectively, where the simulated average cluster size was 15 (Figure 5). For Sim 2, we found that the sparsity levels were recovered well by SFAmix, and also by BFRM and SBIF when the numbers of sparse and dense loadings were approximated correctly (Figure 5). KSVD and SPCA do not approximate the sparse clusters well in the presence of dense factors. SFAmix recovers the sparse latent structure well relative to the other methods in the presence of confounding factors, with minimal post-processing.

Figure 4. Stability measures for sparse and dense matrices over ten simulations. Panel A: Average r_s for sparse loadings on Sim 1, for both known and unknown numbers of factors. Panel B: Average r_s for sparse loadings on Sim 2. Panel C: Average r_d for dense loadings on Sim 2. Panel D: Average r_d for dense factors on Sim 2.

6.4 Gene expression study

An RNA microarray measures transcription levels for tens of thousands of genes from an RNA sample rapidly and at low cost. Biologists have shown that genes are not transcribed into mRNA as independent units, but instead as correlated components of biological networks with various biochemical roles [58, 59]. As a result, genes that share similar biological roles may have correlated expression levels across samples because, for example, they may be regulated for a similar cellular purpose by a common transcription factor that is expressed at different levels across samples. Identifying these correlated sets of genes from high-dimensional gene expression measurements is a fundamental biological problem [18, 60, 61] with many downstream applications.

6.4.1 Latent factors recovered from the gene expression data

We applied our method to expression levels of 8,718 genes measured in a sample of 480 human immortalized lymphoblastoid cell lines (LCLs) [14]. The data were processed according to previous work [14]; however, neither known covariates nor PCs were controlled for before quantile normalization. We also removed genes with probes on the gene expression array that aligned to multiple regions of the genome, using a BLAST analysis and the human reference genome hg19. In this experiment, the number of correlated sets of genes may be large relative to the number of genes in the gene expression matrix (and, certainly, relative to the number of samples) if we indeed identify small clusters of co-regulated genes. We set K = 4000 and ran EM from ten starting points with a = b = c = d = 0.5, ν = 1, and α = β = 1. We recovered approximately 350 factors across the different random starting values, approximately 25–30 of which were dense factors.

Figure 5. Absolute value of the correlation between genes based on the recovered factor loadings across five methods.
The top and bottom panels show Λ^T Λ for the recovered sparse loading matrices Λ for Sim 1 and Sim 2, respectively, across the five methods; the x- and y-axes index the genes included in those factors. Correlations between genes in a single factor are partitioned by black lines.

We present results from the run that produced the most factors. For this run, we found a total of 399 factors, of which 32 were dense (Figure 6A,B). We found that 98% of the sparse factors contained fewer than 50 genes, and 81% contained fewer than 10 genes (Figure 6D). To quantify gene correlation patterns within each factor, we calculated the correlation matrix using the gene expression values in the residual matrix Y − FΩ for each gene included in each sparse factor (Figure 6A,B). We found that our model recovered factors containing groups of strongly correlated genes, even when the correlation was confounded by the structure of the dense factors in the original matrix Y. A further look at the proportion of variance explained (PVE; Figure 6C) shows that dense factors individually explain as much as 13% of the total variance in the gene expression matrix. The sparse factors individually explain as much as ≈1.3% of the total variance, which is more than some of the dense factors explain. One feature of our joint mixture model is that sparse factors may capture substantial PVE, instead of this variance being controlled for through PCs in a two-stage approach (SFAmix2) or sparsity being implicitly controlled via a decreasing prior on the PVE (SBIF). Furthermore, we found that the recovered dense factors correlated well with several known biological and technical covariates, including batch effects, which are known to cause a substantial amount of variation in gene expression levels [62] (Figure 7).

6.4.2 Ontology term enrichment validates recovered gene clusters

Genes that have correlated transcription levels often have similar molecular functions [58, 59].
To validate the gene clusters recovered by SFAmix, we applied the Gene Ontology enrichment analysis tool, DAVID [63], to the genes in each sparse factor. Using FDR ≤ 0.2, we found that 145 sparse factors were enriched for 1,917 different biological functions (Supplemental Table S1). For example, a sparse factor with 39 genes, including CMPK2, DDX60, and SP110, was enriched for the GO terms response to virus (FDR ≤ 3.11 × 10^{-13}) and antiviral defense (FDR ≤ 3.26 × 10^{-8}). The substantial enrichment of GO terms in the recovered sparse factors suggests that the induced gene clusters recovered by this model are biologically meaningful. Furthermore, these specific GO terms indicate that these samples have mounted a coordinated cellular response to virus, which, as we discuss later, reflects the immortalization process for the specific type of cells in this study [64].

Figure 6. Gene correlation patterns, proportion of variance explained, and the number of genes in each sparse factor. Panel A: Absolute value of Pearson's correlation between genes associated with sparse factors, shown as a heatmap for factors with average correlation > 0.4 and cluster size < 50. Genes in a single factor are partitioned by black lines. Panel B: The same heatmap for 50 factors selected at random with average correlation < 0.4 and cluster size < 50. Panel C: Percentage of variance explained by both dense and sparse factors (ordered). Panel D: Histogram of the number of genes in each sparse factor.

6.4.3 eQTL analysis finds pleiotropic eQTLs

One downstream application of identifying subsets of correlated genes is to find genetic variants that are associated with transcription levels of the recovered subsets of possibly co-regulated genes [65–68]. To further validate the recovered gene clusters, we performed eQTL association mapping on the sparse factors to identify genetic variants that regulate the corresponding small gene clusters (pleiotropic eQTLs). For this experiment, we projected each recovered factor to the quantiles of a standard normal distribution across the samples; we then tested for associations between each of these normalized latent factors and ≈2.6 million genetic variants (genotyped in the same individuals) using univariate Bayesian tests [70].

Figure 7. Correlations between the recovered dense factors and known covariates. The known covariates (exposure batch, cell count, RNA batch, plate ID, gender, age, BMI, and smoking status) are plotted along the x-axis and the recovered dense factors are shown on the y-axis; colors represent the levels of Pearson's correlation coefficient between each covariate and the recovered factors.
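The projection of each factor onto standard-normal quantiles used in the eQTL mapping above can be sketched as follows (a minimal rank-based illustration using only the standard library and NumPy; the function name and the rank offset are ours):

```python
import numpy as np
from statistics import NormalDist

def quantile_normalize(x):
    """Project values onto the quantiles of a standard normal:
    rank each sample, then map rank/(n+1) through the normal inverse CDF
    (the n+1 denominator keeps the quantiles strictly inside (0, 1))."""
    ranks = np.argsort(np.argsort(x)) + 1            # ranks 1..n
    nd = NormalDist()
    return np.array([nd.inv_cdf(r / (len(x) + 1)) for r in ranks])
```

The transform preserves the ordering of samples while forcing the factor values to follow a standard normal shape, which is convenient for downstream univariate association tests.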
We also ran the same association tests on the permuted normalized latent factors to compute the false discovery rate (FDR) for specific log10 BF values. We identified 7,154 associated genetic variants (FDR ≤ 0.2; log10 BF ≥ 3.70), and 5,568 associated genetic variants at a more strict FDR (FDR ≤ 0.05; log10 BF ≥ 4.37); all identified eQTLs are presented in Supplemental Table S2. We found that 257 out of 367 of our sparse factors (70%) had at least one eQTL (FDR ≤ 0.2). We define cis associations as variants located within 1 Mb of the transcription start site (TSS) or the transcription end site (TES) of any gene in the factor. We found 5,318 cis associations (76% of the 7,154 total associations), recapitulating previous studies showing many more significant cis-eQTLs than trans-eQTLs [13, 14]. If we consider only the most significantly associated eQTL for each factor, 95 out of 257 factors with eQTLs (37%) are in cis; however, this proportion goes up to 60% (86 out of 143) at an FDR of 0.05 (log10 BF ≥ 4.37), and to 84% (72 out of 86) at an FDR of 0.01 (log10 BF ≥ 5.17), suggesting that the cis associations represent stronger genetic effects than the trans associations [72]. All associations with log10 BF ≥ 30 have a cluster size of three or fewer genes; generally, the eQTL is a short distance from the cis gene's transcribed region (Figure 8A). Less significant associations (log10 BF ≤ 30) show more variability in distance to the closest gene and in cluster size (Figure 8A). As the number of genes in cis to the eQTL for a cluster increased, the association significance also tended to increase (Figure 8B). These associations suggest that this type of factor model can be used to capture small groups of genes that appear to be co-regulated by pleiotropic genetic loci.

For comparison, we found 119 genetic variant associations with the dense factors (FDR ≤ 0.20), and 22 associated genetic variants at a stricter FDR (FDR ≤ 0.05). This proportion of dense factors with eQTLs is smaller than the genetic associations for the sparse factors, supporting the hypothesis that most of the dense factors are not genetically driven but instead represent biological and experimental confounders. This also suggests that jointly modeling gene clusters and confounding effects does not unintentionally remove genetic signal, although it is possible that a genetic effect constitutes only a small proportion of the variance explained by a dense factor, so those factors would still not appear to be associated with genetic variants. Association mapping identified eQTLs associated with two factors, the first including ten genes (DDX58, GMPR, IFIT2, IFIT3, IFIT5, MOV10, OASL, PARP12, PARP9, XAF1), and the second including four genes (CD55, CR1, CR2, IFNA2). Both eQTLs are unlinked with (i.e., in trans to) all of the genes included in the factors. Both of these factors are enriched for GO terms related to the interferon response, or the response to the invasion of host cells by pathogens including viruses and tumor cells; the first factor is enriched for interferon-induced 56K protein (FDR ≤ 3.45 × 10^{-4}), and the second factor is enriched for the Sushi4 domain (FDR ≤ 5.42 × 10^{-3}), which is activated in response to specific viruses including Epstein-Barr. Both of these factors are relevant to the cell type in this study, lymphoblastoid cell lines, which have been immortalized using the Epstein-Barr virus, and it appears that we are able to observe the response that these cells have mounted against the viral pathogen. For the first factor, the trans-eQTL is located within a K-lysine acetyltransferase (KAT8, also known as MYST1), which is in our gene expression data but not included in this factor, and is a known interferon effector gene [73].
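The permutation-based FDR calibration used for the eQTL thresholds above can be sketched as follows; this is one common empirical estimator, not necessarily the exact procedure used here, and the function name is ours:

```python
import numpy as np

def empirical_fdr(real_scores, perm_scores, threshold):
    """Empirical FDR at a log10 BF threshold: the fraction of permuted-test
    scores passing the threshold, scaled to the number of real tests,
    divided by the number of real scores passing."""
    observed = int((real_scores >= threshold).sum())
    expected_false = (perm_scores >= threshold).mean() * real_scores.size
    return min(1.0, expected_false / max(observed, 1))
```

Sweeping the threshold over the observed log10 BF values and selecting the smallest threshold with estimated FDR below a target (e.g., 0.2 or 0.05) yields the cutoffs reported above.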
The eQTL for the second factor is similarly located within the TRAPPC9 gene, which is not in our gene expression data set; TRAPPC9 is in the NF-κB pathway and is activated during viral stress of host cells. We also performed a univariate Bayesian test for association between the genes within each factor and the SNPs associated with these factors, for SNP-factor associations with an FDR ≤ 0.20. We found that, by jointly testing for association with the clustered genes, we identified associations with greater significance than by testing the genes separately (Figure 9).

7 Conclusions

We developed a model for sparse factor analysis using a three parameter beta prior to induce shrinkage in the loading matrix at three levels of the hierarchy: global, factor-specific, and element-wise. We found that this model has favorable properties for estimating possibly high-dimensional latent spaces, including accurate recovery of sparse signals and a non-parametric property of removing unused factors. We extended this model to explicitly include dense factors by adding a two-component mixture model within this hierarchy. We developed two simple metrics for stability across sparse and dense matrices that are invariant to scale, label switching, and (for dense matrices) orthogonal rotation. We validated our model on simulated data, and showed that our model recapitulated both sparse and dense factors with high accuracy relative to current state-of-the-art methods. We applied our model to a large gene expression data set and found biologically meaningful clusters of genes. The recovered dense factors correlate well with known biological and technical covariates.
We used the sparse factors to identify genetic variants that are associated with transcriptional regulation of the genes within the individual sparse factors, and our results suggest that our sparse gene clusters capture genes that are co-regulated by genetic variants, and that our method is useful for identifying pleiotropic eQTLs.

Figure 8. Distance of the associations, cluster size, and log10 BF for the top associations above an FDR of 0.2. Panel A: The x-axis shows the distance between the SNPs and their associated factors (the smallest distance to the TSS or TES of any gene within that factor); the y-axis corresponds to the log10 BF association values; the size of the points corresponds to the size of the gene clusters.
The distance 4 × 10^8 represents SNPs located on different chromosomes from all of the genes in the associated cluster. Panel B: For each factor, the number of cis gene-SNP associations is shown on the x-axis and the number of trans gene-SNP associations on the y-axis; the size of the points corresponds to the log10 BF values.

A Posterior distribution for the parameters

The posterior probability for our model given matrix Y is written as follows:

\begin{align}
p(\Lambda, X, Z, \Theta \mid Y) &\propto p(Y \mid \Lambda, X, \Theta, Z)\, p(X \mid \Theta)\, p(\Lambda \mid \Theta, Z)\, p(Z \mid \Theta)\, p(\Theta) \tag{31} \\
&\propto \left[ \prod_{i=1}^{n} \mathcal{N}(Y_i \mid X_i \Lambda)\, \mathcal{N}(X_i \mid 0, I_K) \right]
  \left[ \prod_{k=1}^{K} \prod_{j=1}^{p} \mathcal{N}(\Lambda_{k,j} \mid \phi_k)^{1_{Z_k = 0}} \right] \nonumber \\
&\quad \times \left[ \prod_{k=1}^{K} \prod_{j=1}^{p} \left\{ \mathcal{N}(\Lambda_{k,j} \mid \theta_{k,j})\, \mathrm{Ga}(\theta_{k,j} \mid a, \delta_{k,j})\, \mathrm{Ga}(\delta_{k,j} \mid b, \phi_k) \right\}^{1_{Z_k = 1}} \right] \nonumber \\
&\quad \times \left[ \prod_{k=1}^{K} \mathrm{Bern}(Z_k \mid \pi) \right]
  \left[ \prod_{k=1}^{K} \mathrm{Ga}(\phi_k \mid c, \tau_k)\, \mathrm{Ga}(\tau_k \mid d, \eta) \right] \nonumber \\
&\quad \times \mathrm{Ga}(\eta \mid e, \gamma)\, \mathrm{Ga}(\gamma \mid f, \nu)\, \mathrm{Beta}(\pi \mid \alpha, \beta). \nonumber
\end{align}
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● ● 
Figure 9. The log10 BF for SNP-factor associations and univariate SNP-gene associations for the genes within each factor. The x-axis shows the log10 BF for the SNP-factor associations, and the y-axis corresponds to the log10 BF of SNP-gene associations within factors; points are colored by association type (cis or trans). The black line represents perfect correlation. Only SNP-factor associations with an FDR ≤ 0.2 were used.

The conditional probability for $X_i$ has the following form:
\[
X_i \mid Y_i, \Theta \propto \exp\left\{ \sum_{i=1}^n \left[ -\frac{1}{2} (Y_i - X_i \Lambda)^T \Psi^{-1} (Y_i - X_i \Lambda) \right] \right\} \exp\left\{ \sum_{i=1}^n \left[ -\frac{1}{2} X_i^T X_i \right] \right\} \tag{32}
\]
\[
\propto \exp\left\{ \sum_{i=1}^n \left[ -\frac{1}{2} (X_i - \mu_{x_i})^T \Sigma_X^{-1} (X_i - \mu_{x_i}) \right] \right\}.
\]
Thus, we have the following conditional probability for $X_i$:
\[
X_i \mid Y_i \sim \mathcal{N}(\mu_{x_i}, \Sigma_X) \tag{33}
\]
where
\[
\mu_{x_i} = Y_i^T \Psi^{-1} \Lambda (\Lambda \Psi^{-1} \Lambda^T + I_K)^{-1} \tag{34}
\]
\[
\Sigma_X = (\Lambda \Psi^{-1} \Lambda^T + I_K)^{-1}. \tag{35}
\]
The conditional probability for $Z_k$ has a Bernoulli distribution:
\[
p(Z_k = 1 \mid \Theta) = \frac{\pi \prod_{j=1}^p \mathcal{N}(\Lambda_{k,j} \mid \theta_{k,j})\, \mathrm{Ga}(\theta_{k,j} \mid a, \delta_{k,j})\, \mathrm{Ga}(\delta_{k,j} \mid b, \phi_k)}{(1-\pi) \prod_{j=1}^p \mathcal{N}(\Lambda_{k,j} \mid \phi_k) + \pi \prod_{j=1}^p \mathcal{N}(\Lambda_{k,j} \mid \theta_{k,j})\, \mathrm{Ga}(\theta_{k,j} \mid a, \delta_{k,j})\, \mathrm{Ga}(\delta_{k,j} \mid b, \phi_k)}. \tag{36}
\]
Let $\rho_k = p(Z_k = 1 \mid \Theta)$; then the conditional probability for $Z_k$ is
\[
Z_k \mid \Theta \sim \mathrm{Bern}(\rho_k). \tag{37}
\]
To derive the conditional probabilities for the parameters generating the matrix $\Lambda$, we note that many of them have a generalized inverse Gaussian distribution, conditional on $Z_k$.

If $Z_k = 1$:
\[
\Lambda_{k,j} \mid Y, X, \theta_{k,j}, \psi_{j,j} \sim \mathcal{N}\!\left( \frac{\frac{1}{\psi_{j,j}} \sum_{i=1}^n \left( y_{i,j} - \sum_{\tilde{k} \neq k} x_{i,\tilde{k}} \Lambda_{\tilde{k},j} \right) x_{i,k}}{\frac{1}{\psi_{j,j}} \sum_{i=1}^n x_{i,k}^2 + \frac{1}{\theta_{k,j}}},\; \left( \frac{1}{\psi_{j,j}} \sum_{i=1}^n x_{i,k}^2 + \frac{1}{\theta_{k,j}} \right)^{-1} \right)
\]
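The Gaussian conditional for the factor scores (eqs. 33–35) translates directly into a Gibbs sampling step. Below is a minimal NumPy sketch, not the authors' implementation: it assumes samples are stored as rows of $Y$, the loading matrix $\Lambda$ is $K \times p$, and $\Psi$ is diagonal, and the function name and signature are illustrative.

```python
import numpy as np

def sample_factor_scores(Y, Lam, Psi_diag, rng):
    """Draw X | Y ~ N(mu_x, Sigma_X) per sample (eqs. 33-35, sketch).

    Y        : (n, p) expression matrix, samples as rows
    Lam      : (K, p) loading matrix
    Psi_diag : (p,) residual variances (diagonal of Psi)
    """
    K = Lam.shape[0]
    # Precision = Lam Psi^{-1} Lam^T + I_K; Sigma_X is shared across samples (eq. 35)
    precision = Lam @ (Lam / Psi_diag).T + np.eye(K)
    Sigma_X = np.linalg.inv(precision)
    # mu_{x_i} = Y_i Psi^{-1} Lam^T Sigma_X, reading eq. (34) with samples as rows
    Mu = (Y / Psi_diag) @ Lam.T @ Sigma_X
    # one joint Gaussian draw per sample via the Cholesky factor of Sigma_X
    L = np.linalg.cholesky(Sigma_X)
    return Mu + rng.standard_normal(Mu.shape) @ L.T
```

Because $\Sigma_X$ does not depend on $i$, it is inverted once and reused for all $n$ samples, which keeps this step at one $K \times K$ inversion per sweep.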
\[
\theta_{k,j} \mid \Lambda_{k,j}, \delta_{k,j} \sim \mathrm{GIG}\!\left( a - \tfrac{1}{2},\; 2\delta_{k,j},\; \Lambda_{k,j}^2 \right) \tag{39}
\]
\[
\delta_{k,j} \mid \theta_{k,j}, \phi_k \sim \mathrm{Ga}(a + b,\; \theta_{k,j} + \phi_k) \tag{40}
\]
\[
\phi_k \mid \delta_{k,\cdot}, \tau_k \sim \mathrm{Ga}\!\left( pb + c,\; \sum_{j=1}^p \delta_{k,j} + \tau_k \right) \tag{41}
\]
If $Z_k = 0$:
\[
\Lambda_{k,j} \mid Y, X, \phi_k, \psi_{j,j} \sim \mathcal{N}\!\left( \frac{\frac{1}{\psi_{j,j}} \sum_{i=1}^n \left( y_{i,j} - \sum_{\tilde{k} \neq k} x_{i,\tilde{k}} \Lambda_{\tilde{k},j} \right) x_{i,k}}{\frac{1}{\psi_{j,j}} \sum_{i=1}^n x_{i,k}^2 + \frac{1}{\phi_k}},\; \left( \frac{1}{\psi_{j,j}} \sum_{i=1}^n x_{i,k}^2 + \frac{1}{\phi_k} \right)^{-1} \right) \tag{42}
\]
\[
\phi_k \mid \tau_k, \Lambda_{k,\cdot} \sim \mathrm{GIG}\!\left( c - \tfrac{p}{2},\; 2\tau_k,\; \sum_{j=1}^p \Lambda_{k,j}^2 \right), \tag{43}
\]
where we use $\tilde{k} \neq k$ to denote any factor other than factor $k$.

The following parameters are not specific to the sparse or dense component, and each has a gamma conditional probability because of conjugacy:
\[
\tau_k \mid \phi_k, \eta \sim \mathrm{Ga}(c + d,\; \phi_k + \eta) \tag{44}
\]
\[
\eta \mid \gamma, \tau \sim \mathrm{Ga}\!\left( Kd + e,\; \gamma + \sum_{k=1}^K \tau_k \right) \tag{45}
\]
\[
\gamma \mid \eta, \nu \sim \mathrm{Ga}(e + f,\; \eta + \nu). \tag{46}
\]
The mixing proportion $\pi$ has a beta conditional probability:
\[
\pi \mid \alpha, \beta, Z \sim \mathrm{Beta}\!\left( \alpha + \sum_{k=1}^K \mathbb{1}_{Z_k = 1},\; \beta + K - \sum_{k=1}^K \mathbb{1}_{Z_k = 1} \right), \tag{47}
\]
where $\mathbb{1}$ is the indicator function. Finally, the variance of the error term has an inverse gamma conditional probability:
\[
\psi_{j,j} \mid Y, X, \Lambda \sim \mathrm{IG}\!\left( \frac{n}{2} - 1,\; \frac{\sum_{i=1}^n \left( y_{i,j} - \sum_{k=1}^K x_{i,k} \Lambda_{k,j} \right)^2}{2} \right). \tag{48}
\]

B Expectation maximization algorithm

We describe an expectation maximization algorithm, in which we take the expected values of the latent variables $Z$ and $X$, enabling conjugate gradient methods for point estimates of the parameters in this space.
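Evaluating $\rho_k = p(Z_k = 1 \mid \Theta)$ in eq. (36) directly can underflow, since both numerator and denominator multiply $p$ density values. A standard remedy is to accumulate log densities and normalize with a log-sum-exp guard. The sketch below assumes the paper's $\mathcal{N}(\Lambda_{k,j} \mid \cdot)$ denotes a zero-mean normal with the given variance and that the gamma densities are rate-parameterized; the helper names are ours.

```python
import numpy as np
from math import lgamma

def log_norm(x, var):
    # log N(x | 0, var)
    return -0.5 * (np.log(2 * np.pi * var) + x**2 / var)

def log_gamma_pdf(x, shape, rate):
    # log Ga(x | shape, rate), rate parameterization (assumed)
    return shape * np.log(rate) - lgamma(shape) + (shape - 1) * np.log(x) - rate * x

def sparse_prob(lam_k, theta_k, delta_k, phi_k, a, b, pi):
    """rho_k = p(Z_k = 1 | Theta), eq. (36), computed in log space."""
    log_sparse = (np.log(pi)
                  + np.sum(log_norm(lam_k, theta_k))
                  + np.sum(log_gamma_pdf(theta_k, a, delta_k))
                  + np.sum(log_gamma_pdf(delta_k, b, phi_k)))
    log_dense = np.log(1 - pi) + np.sum(log_norm(lam_k, phi_k))
    m = max(log_sparse, log_dense)  # log-sum-exp guard against underflow
    return np.exp(log_sparse - m) / (np.exp(log_sparse - m) + np.exp(log_dense - m))
```

For realistic $p$ (thousands of genes), the raw products in eq. (36) are far below machine precision, so the log-space form is effectively mandatory in practice.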
To derive the EM updates, we write the auxiliary function, using the expected complete log posterior probability in lieu of the likelihood, $Q(\Theta) = \langle \ell_c(\Theta, \Lambda \mid Z, X, Y) \rangle$, as:
\[
\begin{aligned}
Q(\Theta) \propto{} & \sum_{i=1}^n \sum_{j=1}^p \langle \log p(y_{i,j} \mid \Lambda, X, \Theta, Z) \rangle + \sum_{i=1}^n \sum_{k=1}^K \langle \log p(x_{i,k} \mid \Theta) \rangle \\
& + \sum_{k=1}^K \sum_{j=1}^p \langle p(Z_k \mid \Theta) \log p(\Lambda_{k,j} \mid \Theta, Z_k) \rangle + \log p(\Theta)
\end{aligned} \tag{49}
\]
\[
\begin{aligned}
\propto{} & -\frac{p}{2} \ln |\Psi| - \sum_{i=1}^n \sum_{j=1}^p \frac{\left( y_{i,j} - \sum_{k=1}^K \langle x_{i,k} \rangle \Lambda_{k,j} \right)^2}{2 \psi_{j,j}} - \sum_{i=1}^n \sum_{k=1}^K \frac{\langle x_{i,k}^2 \rangle}{2} \\
& + \sum_{k=1}^K \sum_{j=1}^p (1 - \langle z_k \rangle) \left( -\frac{1}{2} \ln \phi_k - \frac{\Lambda_{k,j}^2}{2 \phi_k} \right) \\
& + \sum_{k=1}^K \sum_{j=1}^p \langle z_k \rangle \left( -\frac{1}{2} \ln \theta_{k,j} - \frac{\Lambda_{k,j}^2}{2 \theta_{k,j}} + a \ln \delta_{k,j} + (a - 1) \ln \theta_{k,j} - \delta_{k,j} \theta_{k,j} \right) \\
& + \sum_{k=1}^K \sum_{j=1}^p \langle z_k \rangle \left\{ b \ln \phi_k + (b - 1) \ln \delta_{k,j} - \phi_k \delta_{k,j} \right\} \\
& + \sum_{k=1}^K \left\{ \langle z_k \rangle \ln \pi + (1 - \langle z_k \rangle) \ln(1 - \pi) \right\} \\
& + \sum_{k=1}^K \left\{ c \ln \tau_k + (c - 1) \ln \phi_k - \tau_k \phi_k + d \ln \eta + (d - 1) \ln \tau_k - \eta \tau_k \right\} \\
& + e \ln \gamma + (e - 1) \ln \eta - \gamma \eta + f \ln \nu + (f - 1) \ln \gamma - \nu \gamma + \alpha \ln \pi + \beta \ln(1 - \pi).
\end{aligned}
\]
First we write out the equations for the three expected sufficient statistics identified above. In Section A, we established that $X_i \mid Y_i$ has a Gaussian distribution; the expected value $\langle X_i \mid Y_i \rangle$ is then computed in the E-step as:
\[
\langle X_i \mid Y_i \rangle = Y_i^T \Psi^{-1} \Lambda (\Lambda \Psi^{-1} \Lambda^T + I_K)^{-1}. \tag{50}
\]
Similarly, $\langle x_{i,k}^2 \rangle$ is computed as follows:
\[
\langle x_{i,k}^2 \rangle = \Sigma_{X_{k,k}} + \langle x_{i,k} \rangle^2 \tag{51}
\]
\[
\langle x_{i,\tilde{k}}\, x_{i,k} \rangle = \langle x_{i,\tilde{k}} \rangle \langle x_{i,k} \rangle + \Sigma_{X_{\tilde{k},k}}. \tag{52}
\]
The expected value of $Z_k \mid \Theta$ was derived in Section A as:
\[
\langle z_k \mid \Theta \rangle = p(Z_k = 1 \mid \Theta) = \frac{\pi \prod_{j=1}^p \mathcal{N}(\Lambda_{k,j} \mid \theta_{k,j})\, \mathrm{Ga}(\theta_{k,j} \mid a, \delta_{k,j})\, \mathrm{Ga}(\delta_{k,j} \mid b, \phi_k)}{(1 - \pi) \prod_{j=1}^p \mathcal{N}(\Lambda_{k,j} \mid \phi_k) + \pi \prod_{j=1}^p \mathcal{N}(\Lambda_{k,j} \mid \theta_{k,j})\, \mathrm{Ga}(\theta_{k,j} \mid a, \delta_{k,j})\, \mathrm{Ga}(\delta_{k,j} \mid b, \phi_k)}. \tag{53}
\]
The parameter updates are computed in the M-step. We obtain the MAP estimates $\hat{\Theta} = \arg\max_{\Theta} Q(\Theta)$; specifically, we solve $\partial Q(\Theta) / \partial \Theta = 0$ to find the closed form of the MAP estimates.
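The E-step statistics of eqs. (50)–(52) can be computed in a few matrix operations. This is a minimal NumPy sketch under the same assumed conventions as above (samples as rows of $Y$, $\Lambda$ as $K \times p$, diagonal $\Psi$); the function name is illustrative.

```python
import numpy as np

def e_step(Y, Lam, Psi_diag):
    """E-step sufficient statistics (eqs. 50-52, sketch).

    Returns the posterior means <X> (n, K), the shared posterior
    covariance Sigma_X (K, K), and the second moments <x_{i,k}^2> (n, K).
    """
    K = Lam.shape[0]
    # Sigma_X = (Lam Psi^{-1} Lam^T + I_K)^{-1}, identical for all samples
    Sigma_X = np.linalg.inv(Lam @ (Lam / Psi_diag).T + np.eye(K))
    # eq. (50): posterior mean of each X_i, stacked as rows
    EX = (Y / Psi_diag) @ Lam.T @ Sigma_X
    # eq. (51): second moment adds the posterior variance to the squared mean
    EX2 = np.diag(Sigma_X)[None, :] + EX**2
    return EX, Sigma_X, EX2
```

The cross moments of eq. (52) are recovered the same way, as `EX[:, kt] * EX[:, k] + Sigma_X[kt, k]` for a factor pair `(kt, k)`.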
The same updates are obtained by finding the mode of the conditional probability of each parameter, as in Appendix A.

Our MAP estimate for $\Lambda_{k,j}$ is a function of the weighted sum of the two variance terms $\theta_{k,j}$ and $\phi_k$:
\[
\hat{\Lambda}_{k,j} = \frac{\frac{1}{\psi_{j,j}} \sum_{i=1}^n \left( y_{i,j} \langle x_{i,k} \rangle - \sum_{\tilde{k} \neq k} \langle x_{i,\tilde{k}}\, x_{i,k} \rangle \Lambda_{\tilde{k},j} \right)}{\frac{1}{\psi_{j,j}} \sum_{i=1}^n \langle x_{i,k}^2 \rangle + \frac{\langle z_k \rangle}{\theta_{k,j}} + \frac{1 - \langle z_k \rangle}{\phi_k}}, \tag{54}
\]
where $\langle x_{i,k}^2 \rangle$ is calculated in the E-step, $\Sigma_{X_{k,k}}$ was derived in Appendix A as the $(k,k)$th element of the $\Sigma_X$ matrix, and $\Sigma_{X_{\tilde{k},k}}$ as the $(\tilde{k},k)$th element.

As shown in Appendix A, $\theta_{k,j}$ has a generalized inverse Gaussian conditional probability, and its MAP estimate can be obtained either by directly taking the mode of this distribution or by solving a quadratic formula. We obtain the following form for the parameter update:
\[
\hat{\theta}_{k,j} = \frac{2a - 3 + \sqrt{(2a - 3)^2 + 8 \Lambda_{k,j}^2 \delta_{k,j}}}{4 \delta_{k,j}}. \tag{55}
\]
The estimate for $\hat{\delta}_{k,j}$ is trivially obtained as:
\[
\hat{\delta}_{k,j} = \frac{a + b - 1}{\theta_{k,j} + \phi_k}. \tag{56}
\]
The estimate for $\phi_k$ is also based on a generalized inverse Gaussian. Unlike $\theta_{k,j}$, $\phi_k$ generates both sparse and dense factors, so the estimate is a function of a weighted sum of parameters from both components:
\[
\hat{\phi}_k = \frac{h + \sqrt{h^2 + \chi \omega}}{\chi}, \tag{57}
\]
where
\[
h = pb \langle z_k \rangle + c - 1 - \frac{p}{2}(1 - \langle z_k \rangle) \tag{58}
\]
\[
\chi = 2 \left( \langle z_k \rangle \sum_{j=1}^p \delta_{k,j} + \tau_k \right) \tag{59}
\]
\[
\omega = \sum_{j=1}^p \Lambda_{k,j}^2. \tag{60}
\]
The following parameters have updates similar to $\delta_{k,j}$, which take natural forms because of conjugacy:
\[
\hat{\tau}_k = \frac{c + d - 1}{\phi_k + \eta} \tag{61}
\]
\[
\hat{\eta} = \frac{Kd + e - 1}{\gamma + \sum_{k=1}^K \tau_k} \tag{62}
\]
\[
\hat{\gamma} = \frac{e + f - 1}{\eta + \nu}. \tag{63}
\]
The prior on the indicator variable for sparse and dense components, $\pi$, has a beta distribution, and its geometric mean is:
\[
\langle \ln \pi \rangle = \psi\!\left( \sum_{k=1}^K \langle z_k \rangle + \alpha \right) - \psi(K + \alpha + \beta), \tag{64}
\]
where $\psi$ is the digamma function. The variance of the error term has the following update:
\[
\hat{\psi}_{j,j} = \frac{\sum_{i=1}^n \left( y_{i,j} - \sum_{k=1}^K \langle x_{i,k} \rangle \Lambda_{k,j} \right)^2}{n}.
\]
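The quadratic-root updates of eqs. (55)–(56) are one-liners in code. A small sketch with illustrative function names; eq. (55) is the positive root of the quadratic $\delta \theta^2 - (a - \tfrac{3}{2})\theta - \tfrac{\Lambda^2}{2} = 0$ obtained from $\partial Q / \partial \theta = 0$, which the test below verifies numerically.

```python
import numpy as np

def update_theta(lam_kj, delta_kj, a):
    """MAP update for theta_{k,j}, eq. (55): the positive root of the
    quadratic from dQ/dtheta = 0 (equivalently, the GIG mode)."""
    return (2 * a - 3 + np.sqrt((2 * a - 3) ** 2
                                + 8 * lam_kj**2 * delta_kj)) / (4 * delta_kj)

def update_delta(theta_kj, phi_k, a, b):
    """MAP update for delta_{k,j}, eq. (56): the gamma posterior mode."""
    return (a + b - 1) / (theta_kj + phi_k)
```

Since the discriminant $(2a-3)^2 + 8\Lambda_{k,j}^2 \delta_{k,j}$ is non-negative whenever $\delta_{k,j} > 0$, the update is always real and positive, so no special-casing is needed in the inner loop.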
Acknowledgements

The authors would like to thank Sayan Mukherjee and David Dunson for helpful conversations. All data are publicly available: the gene expression data were acquired through GEO (GSE36868), and the genotype data were acquired through dbGaP (accession phs000481) and generated by the Krauss Lab at the Children's Hospital Oakland Research Institute. This work was supported in part by U19 HL069757: Pharmacogenomics and Risk of Cardiovascular Disease. We acknowledge the PARC investigators and research team, supported by NHLBI, for collection of data from the Cholesterol and Pharmacogenetics clinical trial.
