Introduction
Gaussian linear modelling cannot address current signal processing demands. In
modern contexts, such as Independent Component Analysis (ICA), progress has been
made specifically by imposing non-Gaussian and/or non-linear assumptions. Hence,
standard Wiener and Kalman theories no longer enjoy their traditional hegemony in
the field, nor do the computational engines that implement them. In their
place, diverse principles have been explored, leading to a consequent diversity in the
implied computational algorithms. The traditional on-line and data-intensive preoccupations
of signal processing continue to demand that these algorithms be tractable.
Increasingly, full probability modelling (the so-called Bayesian approach)—or
partial probability modelling using the likelihood function—is the pathway for design
of these algorithms. However, the results are often intractable, and so the area
of distributional approximation is of increasing relevance in signal processing. The
Expectation-Maximization (EM) algorithm and Laplace approximation, for example,
are standard approaches to handling difficult models, but these approximations
(certainty equivalence, and Gaussian, respectively) are often too drastic to handle
the high-dimensional, multi-modal and/or strongly correlated problems that are encountered.
Since the 1990s, stochastic simulation methods have come to dominate
Bayesian signal processing. Markov Chain Monte Carlo (MCMC) sampling, and related
methods, are appreciated for their ability to simulate possibly high-dimensional
distributions to arbitrary levels of accuracy. More recently, the particle filtering approach
has addressed on-line stochastic simulation. Nevertheless, the wider acceptability
of these methods—and, to some extent, Bayesian signal processing itself—
has been undermined by the large computational demands they typically make.
The Variational Bayes (VB) method of distributional approximation originates—
as does the MCMC method—in statistical physics, in the area known as Mean Field
Theory. Its method of approximation is easy to understand: conditional independence
is enforced as a functional constraint in the approximating distribution, and
the best such approximation is found by minimization of a Kullback-Leibler divergence
(KLD). The exact—but intractable—multivariate distribution is therefore factorized
into a product of tractable marginal distributions, the so-called VB-marginals.
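As a toy illustration of this factorization (ours, not the book's), consider approximating a correlated bivariate Gaussian by a product of two independent Gaussians. The coordinate-ascent updates below are the standard mean-field result: the VB means converge to the exact means, while the VB variances (the reciprocal diagonal of the precision matrix) understate the true marginal variances.

```python
import numpy as np

# Mean-field VB approximation of a correlated bivariate Gaussian
# p(x1, x2) = N(mu, Sigma) by a factorized q(x1) q(x2), found by
# coordinate-ascent minimization of the KLD between q and p.
mu = np.array([0.0, 0.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 1.0]])
Lam = np.linalg.inv(Sigma)          # precision matrix of the exact distribution

# Each VB-marginal is Gaussian with
#   var_i  = 1 / Lam[i, i]
#   mean_i = mu[i] - (Lam[i, j] / Lam[i, i]) * (m[j] - mu[j])
m = np.array([1.0, -1.0])           # arbitrary initialization
for _ in range(50):
    m[0] = mu[0] - (Lam[0, 1] / Lam[0, 0]) * (m[1] - mu[1])
    m[1] = mu[1] - (Lam[1, 0] / Lam[1, 1]) * (m[0] - mu[0])

v = 1.0 / np.diag(Lam)              # VB-marginal variances

# VB means match the exact means; VB variances understate diag(Sigma).
print(m)                            # -> close to [0, 0]
print(v, np.diag(Sigma))
```

The variance understatement visible here is the best-known weakness of the mean-field approximation: enforcing independence discards the correlation structure of the exact distribution.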
This straightforward proposal for approximating a distribution enjoys certain optimality properties. What is of more pragmatic concern to the signal processing community,
however, is that the VB-approximation conveniently addresses the following
key tasks:
1. The inference is focused (or, more formally, marginalized) onto selected subsets
of parameters of interest in the model: this one-shot (i.e. off-line) use of the VB
method can replace numerically intensive marginalization strategies based, for
example, on stochastic sampling.
2. Parameter inferences can be arranged to have an invariant functional form
when updated in the light of incoming data: this leads to feasible on-line
tracking algorithms involving the update of fixed- and finite-dimensional statistics.
In the language of the Bayesian, conjugacy can be achieved under the
VB-approximation. There is no reliance on propagating certainty equivalents,
stochastically-generated particles, etc.
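Point 2 can be sketched with the simplest conjugate case (our illustration, not taken from the book): on-line inference of a Gaussian mean with known noise variance. The posterior keeps an invariant Gaussian form, so tracking reduces to updating two fixed-dimensional statistics, with no particles or certainty equivalents propagated.

```python
import numpy as np

# On-line conjugate update: Gaussian prior x Gaussian likelihood stays
# Gaussian, so only the mean m and variance v are propagated per sample.
rng = np.random.default_rng(0)
true_mean, noise_var = 3.0, 0.5

m, v = 0.0, 10.0                    # prior on the unknown mean: N(m, v)
for y in true_mean + np.sqrt(noise_var) * rng.standard_normal(200):
    v_new = 1.0 / (1.0 / v + 1.0 / noise_var)   # posterior variance
    m = v_new * (m / v + y / noise_var)          # posterior mean
    v = v_new

print(m, v)   # posterior mean near 3.0, variance shrinking with data
```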
Unusually for a modern Bayesian approach, then, no stochastic sampling is required
for the VB method. In its place, the shaping parameters of the VB-marginals are
found by iterating a set of implicit equations to convergence. This Iterative Variational
Bayes (IVB) algorithm enjoys a decisive advantage over the EM algorithm
whose computational flow is similar: by design, the VB method yields distributions
in place of the point estimates emerging from the EM algorithm. Hence, in common
with all Bayesian approaches, the VB method provides, for example, measures of
uncertainty for any point estimates of interest, inferences of model order/rank, etc.
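To make the IVB flow concrete, here is a sketch using the standard textbook example of a Gaussian with unknown mean and precision (our illustration; the parameter names mN, lamN, aN, bN are ours). The shaping parameters of the two VB-marginals, q(mu) Gaussian and q(tau) Gamma, form a set of coupled implicit equations that are iterated to convergence.

```python
import numpy as np

# IVB for x_i ~ N(mu, 1/tau) with conjugate priors
# mu ~ N(mu0, 1/(lam0*tau)) and tau ~ Gamma(a0, b0).
# VB-marginals: q(mu) = N(mN, 1/lamN), q(tau) = Gamma(aN, bN).
rng = np.random.default_rng(1)
x = rng.normal(5.0, 2.0, size=100)
N, xbar = len(x), x.mean()

mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3
E_tau = 1.0                               # initial guess for E[tau]
for _ in range(100):
    # Update q(mu), given the current E[tau]
    lamN = (lam0 + N) * E_tau
    mN = (lam0 * mu0 + N * xbar) / (lam0 + N)
    # Update q(tau), given the current moments of q(mu)
    aN = a0 + (N + 1) / 2.0
    E_mu, E_mu2 = mN, mN**2 + 1.0 / lamN
    bN = b0 + 0.5 * (np.sum(x**2) - 2 * E_mu * np.sum(x) + N * E_mu2
                     + lam0 * (E_mu2 - 2 * mu0 * E_mu + mu0**2))
    E_tau = aN / bN

# Unlike EM, the result is a pair of distributions: e.g. q(mu) carries a
# variance 1/lamN quantifying uncertainty in the mean estimate mN.
print(mN, 1.0 / lamN, aN / bN)
```

Note the contrast with EM: the iteration has a similar computational flow, but what converges here are distributional shaping parameters, so measures of uncertainty come for free.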
The machine learning community has led the way in exploiting the VB method
in model-based inference, notably in inference for graphical models. It is timely,
however, to examine the VB method in the context of signal processing where, to
date, little work has been reported. In this book, at all times, we are concerned with
the way in which the VB method can lead to the design of tractable computational
schemes for tasks such as (i) dimensionality reduction, (ii) factor analysis for medical
imagery, (iii) on-line filtering of outliers and other non-Gaussian noise processes, (iv)
tracking of non-stationary processes, etc. Our aim in presenting these VB algorithms
is not just to reveal new flows-of-control for these problems, but—perhaps more
significantly—to understand the strengths and weaknesses of the VB-approximation
in model-based signal processing. In this way, we hope to loosen the Bayesian signal
processing community's current dependence on stochastic
sampling methods. Without doubt, the ability to model complex problems to arbitrary
levels of accuracy will ensure that stochastic sampling methods—such as MCMC—
will remain the gold standard for distributional approximation. Notwithstanding
this, our purpose here is to show that the VB method of approximation can yield
highly effective Bayesian inference algorithms at low computational cost. In showing
this, we hope that Bayesian methods might become accessible to a much broader
constituency than has been achieved to date.