Stable Poisson-Kingman species sampling priors generated by general ordered size biased generalized gamma mixing distributions
- 👤 Speaker: Prof. Lancelot James (HKUST)
- 📅 Date & Time: Thursday 08 May 2014, 11:00 - 12:30
- 📍 Venue: Engineering Department, CBL Room BE-438
Abstract
Discrete random distribution functions play a central role in Bayesian nonparametrics, statistical machine learning, and the fields broadly described as employing combinatorial stochastic processes. Arguably the most popular models are the Dirichlet process and its two-parameter extension derived from a stable subordinator; the latter is also known as the Pitman-Yor process. Perhaps the third most popular model, and one being used more frequently, is the class obtained by normalizing a generalized gamma subordinator. The law of a generalized gamma random variable is defined by exponentially tilting a stable density. As we shall describe, all of these models can be treated in a unified way within the Poisson-Kingman framework applied to a stable subordinator: one conditions on the total mass of a normalized stable process and then mixes over a new distribution of the total mass. The focus of this talk is on new classes of models that encompass the special cases mentioned above. These classes are defined by mixing over random variables based on the expectation of a generalized gamma variable raised to an arbitrary real-valued power.
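To make the exponential tilting mentioned above concrete, here is a minimal sketch (not from the talk) that draws a generalized gamma variable by naive rejection from a positive stable proposal, using the standard Chambers-Mallows-Stuck/Kanter representation of a one-sided stable law. The function names and the rejection step are illustrative choices, not the speaker's method.

```python
import numpy as np

def positive_stable(alpha, rng):
    """One positive alpha-stable draw (Laplace transform exp(-t**alpha)),
    via the Chambers-Mallows-Stuck / Kanter representation."""
    u = rng.uniform(0.0, np.pi)
    e = rng.exponential()
    return (np.sin(alpha * u) / np.sin(u) ** (1 / alpha)
            * (np.sin((1 - alpha) * u) / e) ** ((1 - alpha) / alpha))

def tilted_stable(alpha, b, rng):
    """Generalized gamma draw: density proportional to exp(-b*x) * f_alpha(x).
    Naive rejection from the stable proposal; the acceptance rate is
    exp(-b**alpha), so this is only practical for modest tilts b."""
    while True:
        s = positive_stable(alpha, rng)
        if rng.uniform() < np.exp(-b * s):
            return s

rng = np.random.default_rng(0)
draws = np.array([tilted_stable(0.5, 1.0, rng) for _ in range(2000)])
# Theoretical mean for alpha = 0.5, b = 1 is alpha * b**(alpha - 1) = 0.5.
```

Efficient samplers for strongly tilted stable laws exist (e.g. Devroye's double rejection method); the loop above is only meant to exhibit the tilting itself.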
Our results include explicit stick-breaking representations, derived from a generalized residual allocation scheme, for this entire class; representations in terms of normalized subordinators; and a posterior analysis. What is quite interesting is that these results can be seen to arise from mappings that project random variables in a lower-ordered class to higher ones; furthermore, this necessitates randomization of a generalized gamma parameter. If time permits, we shall describe how our analysis leads to transparent results related to recent work on species richness estimators.
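For the classical two-parameter special case (the Pitman-Yor process), the stick-breaking / residual allocation scheme takes a well-known explicit form, which may help fix ideas before the talk's generalization. A minimal sketch, assuming truncation at a fixed number of sticks; the function name and parameterization are illustrative:

```python
import numpy as np

def pitman_yor_sticks(alpha, theta, n_sticks, rng=None):
    """Truncated stick-breaking weights for the Pitman-Yor process.

    Draws V_k ~ Beta(1 - alpha, theta + k * alpha), k = 1..n_sticks,
    and sets p_k = V_k * prod_{j<k} (1 - V_j), i.e. each stick takes a
    beta fraction of the mass left over by its predecessors.
    Requires 0 <= alpha < 1 and theta > -alpha; alpha = 0 recovers the
    Dirichlet process.
    """
    if not (0 <= alpha < 1 and theta > -alpha):
        raise ValueError("need 0 <= alpha < 1 and theta > -alpha")
    rng = rng or np.random.default_rng()
    ks = np.arange(1, n_sticks + 1)
    v = rng.beta(1 - alpha, theta + ks * alpha)
    # Mass remaining before stick k: prod_{j<k} (1 - V_j).
    remaining = np.concatenate(([1.0], np.cumprod(1 - v)[:-1]))
    return v * remaining

weights = pitman_yor_sticks(alpha=0.5, theta=1.0, n_sticks=100,
                            rng=np.random.default_rng(0))
```

The truncated weights sum to less than one, with the deficit being the mass beyond the last stick. The talk's contribution concerns the analogous representations for the broader ordered size-biased generalized gamma classes, where the beta stick laws above are replaced.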
Series: This talk is part of the Machine Learning @ CUED series.