Mixtures : estimation and applications / edited by Kerrie L. Mengersen, Christian P. Robert, D. M. Titterington.

Bibliographic Details
Group Author: Mengersen, Kerrie L.; Robert, Christian P., 1961-; Titterington, D. M.
Published: Wiley, 2011.
Literature type: Electronic eBook
Language: English
Series: Wiley series in probability and statistics
Online Access: http://onlinelibrary.wiley.com/book/10.1002/9781119995678
Summary: This book uses the EM (expectation maximization) algorithm to simultaneously estimate the missing data and unknown parameter(s) associated with a data set. The parameters describe the component distributions of the mixture; the distributions may be continuous or discrete. The editors provide a complete account of the applications, mathematical structure and statistical analysis of finite mixture distributions, along with MCMC computational methods and a range of detailed discussions covering applications of the methods, in chapters written by leading experts on the subject.
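As a brief illustration of the technique the summary describes (a sketch of the general method, not code from the book), the following Python fragment runs EM for a two-component univariate Gaussian mixture; the function and variable names (em_two_gaussians, w, mu, sigma, r) are hypothetical.

```python
import numpy as np

def norm_pdf(x, m, s):
    # Gaussian density, written out to keep the sketch dependency-free.
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def em_two_gaussians(x, n_iter=100):
    """EM for a two-component univariate Gaussian mixture (illustrative only)."""
    # Crude initialisation from the sample.
    w = 0.5                               # weight of component 0
    mu = np.quantile(x, [0.25, 0.75])     # component means
    sigma = np.array([x.std(), x.std()])  # component standard deviations
    for _ in range(n_iter):
        # E-step: posterior probability that each observation came from
        # component 1 -- the "missing data" referred to in the summary.
        d0 = w * norm_pdf(x, mu[0], sigma[0])
        d1 = (1.0 - w) * norm_pdf(x, mu[1], sigma[1])
        r = d1 / (d0 + d1)
        # M-step: responsibility-weighted maximum-likelihood updates.
        w = 1.0 - r.mean()
        mu = np.array([np.average(x, weights=1.0 - r),
                       np.average(x, weights=r)])
        sigma = np.sqrt(np.array([np.average((x - mu[0]) ** 2, weights=1.0 - r),
                                  np.average((x - mu[1]) ** 2, weights=r)]))
    return w, mu, sigma

# Example on synthetic data drawn from a known two-component mixture.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])
print(em_two_gaussians(data))
```

Each iteration alternates the E-step (computing responsibilities given current parameters) with the M-step (re-estimating the mixing weight, means and standard deviations given the responsibilities), which is the scheme developed in the opening chapter of the contents below.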
Carrier Form: 1 online resource (xviii, 311 p.) : ill.
Bibliography: Includes bibliographical references and index.
ISBN: 9781119995678 (electronic bk.)
1119995671 (electronic bk.)
9781119995685 (electronic bk.)
111999568X (electronic bk.)
Access: Due to publisher license, access is restricted to authorised GRAIL clients only. Please contact GRAIL staff.
Index Number: QA273
CLC: O211.3
Contents: The EM algorithm, variational approximations and expectation propagation for mixtures /
Preamble --
The EM algorithm --
Introduction to the algorithm --
The E-step and the M-step for the mixing weights --
The M-step for mixtures of univariate Gaussian distributions --
M-step for mixtures of regular exponential family distributions formulated in terms of the natural parameters --
Application to other mixtures --
EM as a double expectation --
Variational approximations --
Introduction to variational approximations --
Application of variational Bayes to mixture problems --
Application to other mixture problems --
Recursive variational approximations --
Asymptotic results --
Expectation-propagation --
Introduction --
Overview of the recursive approach to be adopted --
Finite Gaussian mixtures with an unknown mean parameter --
Mixture of two known distributions --
Discussion --
Acknowledgements --
References --
Online expectation maximisation /
Model and assumptions --
The EM algorithm and the limiting EM recursion --
The batch EM algorithm --
The limiting EM recursion --
Limitations of batch EM for long data records --
Online expectation maximisation --
The algorithm --
Convergence properties --
Application to finite mixtures --
Use for batch maximum-likelihood estimation --
The limiting distribution of the EM test of the order of a finite mixture /
The method and theory of the EM test --
The definition of the EM test statistic --
The limiting distribution of the EM test statistic --
Proofs --
Comparing Wald and likelihood regions applied to locally identifiable mixture models /
Background on likelihood confidence regions --
Likelihood regions --
Profile likelihood regions --
Alternative methods --
Background on simulation and visualisation of the likelihood regions --
Modal simulation method --
Illustrative example --
Comparison between the likelihood regions and the Wald regions --
Volume/volume error of the confidence regions --
Differences in univariate intervals via worst case analysis --
Illustrative example (revisited) --
Application to a finite mixture model --
Nonidentifiabilities and likelihood regions for the mixture parameters --
Mixture likelihood region simulation and visualisation --
Adequacy of using the Wald confidence region --
Data analysis --
Mixture of experts modelling with social science applications /
Motivating examples --
Voting blocs --
Social and organisational structure --
Mixture models --
Mixture of experts models --
A mixture of experts model for ranked preference data --
Examining the clustering structure --
A mixture of experts latent position cluster model --
Modelling conditional densities using finite smooth mixtures /
The model and prior --
Smooth mixtures --
The component models --
The prior --
Inference methodology --
The general MCMC scheme --
Updating β and I using variable-dimension finite-step Newton proposals --
Model comparison --
Applications --
A small simulation study --
LIDAR data --
Electricity expenditure data --
Conclusions --
Appendix: Implementation details for the gamma and log-normal models --
Nonparametric mixed membership modelling using the IBP compound Dirichlet process /
Mixed membership models --
Latent Dirichlet allocation --
Nonparametric mixed membership models --
Motivation --
Decorrelating prevalence and proportion --
Indian buffet process --
The IBP compound Dirichlet process --
An application of the ICD: focused topic models --
Inference --
Related models --
Empirical studies --
Discovering nonbinary hierarchical structures with Bayesian rose trees /
Prior work --
Rose trees, partitions and mixtures --
Avoiding needless cascades --
Cluster models --
Greedy construction of Bayesian rose tree mixtures --
Prediction --
Hyperparameter optimisation --
Bayesian hierarchical clustering, Dirichlet process models and product partition models --
Mixture models and product partition models --
PCluster and Bayesian hierarchical clustering --
Results --
Optimality of tree structure --
Hierarchy likelihoods --
Partially observed data --
Psychological hierarchies --
Hierarchies of Gaussian process experts --
Mixtures of factor analysers for the analysis of high-dimensional data /
Single-factor analysis model --
Mixtures of factor analysers --
Mixtures of common factor analysers (MCFA) --
Some related approaches --
Fitting of factor-analytic models --
Choice of the number of factors q --
Example --
Low-dimensional plots via MCFA approach --
Multivariate t-factor analysers --
Appendix --
Dealing with label switching under model uncertainty /
Labelling through clustering in the point-process representation --
The point-process representation of a finite mixture model --
Identification through clustering in the point-process representation --
Identifying mixtures when the number of components is unknown --
The role of Dirichlet priors in overfitting mixtures --
The meaning of K for overfitting mixtures --
The point-process representation of overfitting mixtures --
Examples --
Overfitting heterogeneity of component-specific parameters --
Overfitting heterogeneity --
Using shrinkage priors on the component-specific location parameters --
Concluding remarks --
Exact Bayesian analysis of mixtures /
Formal derivation of the posterior distribution --
Locally conjugate priors --
True posterior distributions --
Poisson mixture --
Multinomial mixtures --
Normal mixtures --
Manifold MCMC for mixtures /
Markov chain Monte Carlo methods --
Metropolis-Hastings --
Gibbs sampling --
Manifold Metropolis adjusted Langevin algorithm --
Manifold Hamiltonian Monte Carlo --
Finite Gaussian mixture models --
Gibbs sampler for mixtures of univariate Gaussians --
Manifold MCMC for mixtures of univariate Gaussians --
Metric tensor --
An illustrative example --
Experiments --
How many components in a finite mixture? /
The galaxy data --
The normal mixture model --
Bayesian analyses --
Escobar and West --
Phillips and Smith --
Roeder and Wasserman --
Richardson and Green --
Stephens --
Posterior distributions for K (for flat prior) --
Conclusions from the Bayesian analyses --
Posterior distributions of the model deviances --
Asymptotic distributions --
Posterior deviances for the galaxy data --
Bayesian mixture models: a blood-free dissection of a sheep /
Hierarchical normal mixture --
Altering dimensions of the mixture model --
Bayesian mixture model incorporating spatial information --
Volume calculation --
References.