Mixtures : estimation and applications /

Bibliographic Details
Group Author: Mengersen, Kerrie L.; Robert, Christian P., 1961-; Titterington, D. M.
Published: Wiley, 2011.
Literature type: Electronic eBook
Language: English
Series: Wiley series in probability and statistics
Subjects:
Online Access: http://onlinelibrary.wiley.com/book/10.1002/9781119995678
Summary: This book uses the EM (expectation maximization) algorithm to simultaneously estimate the missing data and the unknown parameter(s) associated with a data set. The parameters describe the component distributions of the mixture; the distributions may be continuous or discrete. The editors provide a complete account of the applications, mathematical structure and statistical analysis of finite mixture distributions, along with MCMC computational methods and a range of detailed discussions covering the applications of the methods, and the book features chapters from the leading experts on the subject.
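The estimation scheme the summary describes can be illustrated with a minimal sketch (not code from the book): EM for a two-component univariate Gaussian mixture in Python, where the component labels play the role of the missing data. The function name, initialisation, and synthetic data below are all illustrative assumptions.

    import numpy as np

    def em_gaussian_mixture(x, n_iter=100):
        # EM for a two-component univariate Gaussian mixture.
        # The "missing data" are the component labels: the E-step computes
        # their posterior probabilities (responsibilities), and the M-step
        # re-estimates the mixing weight, means and variances from them.
        w = 0.5                                    # mixing weight of component 1
        mu = np.array([x.min(), x.max()], float)   # crude initial means
        var = np.array([x.var(), x.var()], float)  # crude initial variances
        for _ in range(n_iter):
            # E-step: responsibility of component 1 for each observation
            d0 = np.exp(-0.5 * (x - mu[0])**2 / var[0]) / np.sqrt(2 * np.pi * var[0])
            d1 = np.exp(-0.5 * (x - mu[1])**2 / var[1]) / np.sqrt(2 * np.pi * var[1])
            r1 = w * d1 / ((1 - w) * d0 + w * d1)
            r0 = 1.0 - r1
            # M-step: weighted maximum-likelihood updates of all parameters
            w = r1.mean()
            mu = np.array([(r0 * x).sum() / r0.sum(), (r1 * x).sum() / r1.sum()])
            var = np.array([(r0 * (x - mu[0])**2).sum() / r0.sum(),
                            (r1 * (x - mu[1])**2).sum() / r1.sum()])
        return w, mu, var

    # Synthetic data from a known two-component mixture, then recover it
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 700)])
    w, mu, var = em_gaussian_mixture(x)
    print(f"weight: {w:.2f}, means: {mu.round(2)}, variances: {var.round(2)}")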
Carrier Form: 1 online resource (xviii, 311 p.) : ill.
Bibliography: Includes bibliographical references and index.
ISBN: 9781119995678 (electronic bk.)
1119995671 (electronic bk.)
9781119995685 (electronic bk.)
111999568X (electronic bk.)
Access: Due to publisher license, access is restricted to authorised GRAIL clients only. Please contact GRAIL staff.
Index Number: QA273
CLC: O211.3
Contents: The EM algorithm, variational approximations and expectation propagation for mixtures /
Preamble --
The EM algorithm --
Introduction to the algorithm --
The E-step and the M-step for the mixing weights --
The M-step for mixtures of univariate Gaussian distributions --
M-step for mixtures of regular exponential family distributions formulated in terms of the natural parameters --
Application to other mixtures --
EM as a double expectation --
Variational approximations --
Finite Gaussian mixtures with an unknown mean parameter --
Mixture of two known distributions --
Discussion --
Acknowledgements --
References --
Online expectation maximisation /
Introduction --
Model and assumptions --
The EM algorithm and the limiting EM recursion --
The batch EM algorithm --
The limiting EM recursion --
Limitations of batch EM for long data records --
Online expectation maximisation --
The algorithm --
Convergence properties --
Application to finite mixtures --
Comparing Wald and likelihood regions applied to locally identifiable mixture models /
Background on likelihood confidence regions --
Likelihood regions --
Profile likelihood regions --
Alternative methods --
Background on simulation and visualisation of the likelihood regions --
Modal simulation method --
Illustrative example --
Comparison between the likelihood regions and the Wald regions --
Volume/volume error of the confidence regions --
Data analysis --
Mixture of experts modelling with social science applications /
Motivating examples --
Voting blocs --
Social and organisational structure --
Mixture models --
Mixture of experts models --
A mixture of experts model for ranked preference data --
Examining the clustering structure --
A mixture of experts latent position cluster model --
Modelling conditional densities using finite smoothed mixtures /
LIDAR data --
Electricity expenditure data --
Conclusions --
Appendix: Implementation details for the gamma and log-normal models --
Nonparametric mixed membership modelling using the IBP compound Dirichlet process /
Mixed membership models --
Latent Dirichlet allocation --
Nonparametric mixed membership models --
Motivation --
Decorrelating prevalence and proportion --
Indian buffet process --
The IBP compound Dirichlet process --
Discovering nonbinary hierarchical structures with Bayesian rose trees /
Greedy construction of Bayesian rose tree mixtures --
Prediction --
Hyperparameter optimisation --
Bayesian hierarchical clustering, Dirichlet process models and product partition models --
Mixture models and product partition models --
PCluster and Bayesian hierarchical clustering --
Results --
Optimality of tree structure --
Hierarchy likelihoods --
Partially observed data --
Psychological hierarchies --
Hierarchies of Gaussian process experts --
Mixtures of factor analysers for the analysis of high-dimensional data /
Multivariate t-factor analysers --
Appendix --
Dealing with label switching under model uncertainty /
Labelling through clustering in the point-process representation --
The point-process representation of a finite mixture model --
Identification through clustering in the point-process representation --
Identifying mixtures when the number of components is unknown --
The role of Dirichlet priors in overfitting mixtures --
The meaning of K for overfitting mixtures --
Exact Bayesian analysis of mixtures /
Formal derivation of the posterior distribution --
Locally conjugate priors --
True posterior distributions --
Poisson mixture --
Multinomial mixtures --
Normal mixtures --
Manifold MCMC for mixtures /
Markov chain Monte Carlo methods --
Metropolis-Hastings --
Gibbs sampling --
Manifold Metropolis adjusted Langevin algorithm --
Manifold Hamiltonian Monte Carlo --
Finite Gaussian mixture models --
How many components in a finite mixture? /
Bayesian analyses --
Escobar and West --
Phillips and Smith --
Roeder and Wasserman --
Richardson and Green --
Stephens --
Posterior distributions for K (for flat prior) --
Conclusions from the Bayesian analyses --
Posterior distributions of the model deviances --
Asymptotic distributions --
Posterior deviances for the galaxy data --
Bayesian mixture models: a blood-free dissection of a sheep /
Mixt