萍聚社区-德国热线-德国实用信息网

Views: 1986 | Replies: 7

[Other] Looking for textbooks on Hilbert space and Reproducing kernel Hilbert space

Posted on 2009-12-14 02:02
Last edited by orionsnow on 2009-12-14 01:45

Looking for textbooks on Hilbert spaces and reproducing kernel Hilbert spaces.

My current computations involve RKHS (reproducing kernel Hilbert spaces).

I have forgotten all my Hilbert space theory, so the plan is to review that concept first and then move on to the next one.

There is a video lecture online, but my English is honestly not good enough to follow what the speaker is saying; if anyone understands it, please give me some pointers.

http://videolectures.net/smls09_yuan_lwmrk/

Learning with Many Reproducing Kernel Hilbert Spaces
author: Ming Yuan, H. Milton Stewart School of Industrial and Systems Engineering, Georgia Institute of Technology
Description

In this talk, we consider the problem of learning a target function that belongs to the linear span of a large number of reproducing kernel Hilbert spaces. Such a problem arises naturally in many practical situations, with ANOVA, the additive model, and multiple kernel learning as the best-known and most important examples. We investigate approaches based on l1-type complexity regularization and the nonnegative garrote, respectively. We show that the computation of both procedures can be done efficiently and that the nonnegative garrote can be more favorable at times. We also study their theoretical properties from both the variable-selection and estimation perspectives. We establish several probabilistic inequalities providing bounds on the excess risk and L2-error that depend on the sparsity of the problem. Part of the talk is based on joint work with Vladimir Koltchinskii.

http://videolectures.net/mcslw04_vovk_llcrk/

On-line learning competitive with reproducing kernel Hilbert spaces
author: Vladimir Vovk, University of London
Description

In this talk I will describe a new technique for designing competitive on-line prediction algorithms and proving loss bounds for them. The goal of such algorithms is to perform almost as well as the best decision rules in a wide benchmark class, with no assumptions made about the way the observations are generated. However, standard algorithms in this area can only deal with finite-dimensional (often countable) benchmark classes. The new technique gives similar results for decision rules ranging over infinite-dimensional function spaces. It is based on a recent game-theoretic approach to the foundations of probability and, more specifically, on recent results about defensive forecasting. Given the probabilities produced by a defensive forecasting algorithm, which are known to be well calibrated and to have good resolution in the long run, the expected loss minimization principle is used to find a suitable prediction.
The information and opinions posted by users are not the information and opinions of DOLC GmbH.
OP | Posted on 2009-12-14 02:06
What is RKHS called in Chinese? Reading it in English really makes my head spin; I would like to find a Chinese-language book.

Let X be an arbitrary set and H a Hilbert space of complex-valued functions on X. We say that H is a reproducing kernel Hilbert space if every linear map of the form

    L_{x} : f \mapsto f(x)

from H to the complex numbers is continuous for any x in X. By the Riesz representation theorem, this implies that for every x in X there exists a unique element Kx of H with the property that:

    f(x) = \langle f,\ K_x \rangle \quad \forall f \in H \quad (*).

The function Kx is the representer of the point-evaluation functional at the point x.

Since H is a space of functions, the element Kx is itself a function and can therefore be evaluated at every point. We define the function K: X \times X \to \mathbb{C} by

    K(x,y) \ \stackrel{\mathrm{def}}{=}\ \overline{K_x(y)}.

This function is called the reproducing kernel for the Hilbert space H and it is determined entirely by H because the Riesz representation theorem guarantees, for every x in X, that the element Kx satisfying (*) is unique.
Examples

For example, when X is finite and H consists of all complex-valued functions on X, then an element of H can be represented as an array of complex numbers. If the usual inner product is used, then Kx is the function whose value is 1 at x and 0 everywhere else, and K(x,y) can be thought of as an identity matrix since K(x,y)=1 when x=y and K(x,y)=0 otherwise. In this case, H is isomorphic to \mathbb{C}^n.
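This finite example is easy to check numerically. The sketch below (with purely illustrative data) represents H = C^3 as vectors with the usual inner product, takes K_x to be the indicator function of x (so the kernel matrix is the identity), and verifies the reproducing property f(x) = ⟨f, K_x⟩:

```python
import numpy as np

# Finite example: X = {0, 1, 2}, H = all complex-valued functions on X,
# represented as vectors in C^3 with inner product <f, g> = sum_i f(i) * conj(g(i)).
n = 3
f = np.array([2.0 + 1j, -1.0, 0.5j])  # an arbitrary element of H (illustrative)

# K_x is the indicator function of x, so the kernel matrix is the identity.
K = np.eye(n)

# Reproducing property: f(x) = <f, K_x> for every x in X.
# np.vdot conjugates its first argument, so <f, K_x> = np.vdot(K[x], f) here
# (K is real, so conjugating it changes nothing).
for x in range(n):
    assert np.isclose(np.vdot(K[x], f), f[x])
```

The same check works for any finite X: the kernel matrix is the Gram matrix of the point evaluators, and with the standard inner product it is the identity.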

A more sophisticated example is the Hardy space H2(D), the space of square-integrable holomorphic functions on the unit disc. So here X = D, the unit disc. It can be shown that the reproducing kernel for H2(D) is

    K(x,y)=\frac{1}{\pi}\frac{1}{(1-x\overline{y})^2}.

This kernel is an example of a Bergman kernel, named for Stefan Bergman.
Properties
The reproducing property

It is clear from the discussion above that

    K(x,y) \;=\; \overline{K_x(y)} \;=\; \langle K_y,K_x\rangle.

In particular,

    K(x,x) \;=\; \langle K_x, K_x \rangle \;\geq\; 0, \quad \forall x\in X.

Note that

    K_x \;=\; 0 \quad \text{ if and only if } \quad f(x) = 0 \quad \forall \; f\in H.

Orthonormal sequences

If \textstyle \left\{ \phi_{k}\right\} _{k=1}^{\infty} is an orthonormal sequence such that the closure of its span is equal to H, then

    K\left( x,y\right) =\sum_{k=1}^{\infty}\phi_{k}\left( x\right) \overline{\phi _{k}\left( y\right)}.
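As a quick sanity check of this expansion, the sketch below (hypothetical random data) takes H = C^n with an arbitrary orthonormal basis obtained from a QR decomposition and forms K(x, y) = Σ_k φ_k(x) conj(φ_k(y)). Since the φ_k span all of H, the kernel must reduce to the identity, as in the finite example earlier:

```python
import numpy as np

# Finite-dimensional check of K(x, y) = sum_k phi_k(x) * conj(phi_k(y)).
# H = C^n with the usual inner product; the columns of a unitary Q form an
# orthonormal basis, i.e. phi_k(x) = Q[x, k]. The basis here is arbitrary
# (built from random data purely for illustration).
rng = np.random.default_rng(0)
n = 5
Q, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))

# Kernel from the expansion: K[x, y] = sum_k Q[x, k] * conj(Q[y, k]) = (Q Q^H)[x, y].
K = Q @ Q.conj().T

# The phi_k span all of H = C^n, so the kernel is the identity matrix.
assert np.allclose(K, np.eye(n))
```

With an orthonormal sequence spanning only a proper closed subspace, the same sum instead gives the orthogonal projection onto that subspace.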
Posted on 2009-12-15 01:27
My two cents: I suggest you first read something about functional differentiation and then look at RKHS. That will be much easier.

RKHS is a well-studied topic in machine learning, especially in kernel methods, so it is easy to find good tutorials and lecture notes. I am not sure about the situation in Chinese. Maybe someone else has a better idea?
Posted on 2009-12-15 02:05
Hilbert spaces come from functional analysis; the classic textbook is Yosida's, which is truly a classic. Chinese-language treatments tend to be fairly shallow; the one I read, published by Shandong University, was not very deep. Whether Yosida covers RKHS I don't know; I only read the Hilbert space part. Yosida's book works well as a reference: the concepts and theorems are all there. My professor also recommended two German functional analysis texts, by Alt and by Werner, but they are written even more chaotically.
OP | Posted on 2009-12-15 12:57
My two cents: I suggest you first read something about functional differentiation and then check the RKHS. That will be much easier.

RKHS is a well discussed topic in machine learning, especially i ...
roywwcheng, posted 2009-12-15 00:27

Is there a simpler tutorial? If the pronunciation is standard and the pace is slow, I should be able to follow.

For example, in the second link above the speaker posted his slides, so you can listen while reading along.
Posted on 2009-12-15 18:49
Last edited by roywwcheng on 2009-12-15 17:52
Is there a simpler tutorial? If the pronunciation is standard and the pace is slow, I should be able to follow.

For example, in the second link above the speaker posted his slides, so you can listen while reading along.
orionsnow, posted 2009-12-15 11:57


Hi!

If you ran into RKHS while working on a learning-related topic, I suggest Schölkopf and Smola's Learning with Kernels. It is quite detailed; if you don't want the details, just skip the proofs. Google Books link: http://bit.ly/6omUoQ

Reading only the slides is, I think, hard to follow, since they are all theorems and analysis.
OP | Posted on 2009-12-19 23:06
Hilbert spaces come from functional analysis; the classic textbook is Yosida's, which is truly a classic. Chinese-language treatments tend to be fairly shallow; the one I read, published by Shandong University, was not very deep. Whether Yosida covers RKHS I don't know; I only read ...
qiqi84902059, posted 2009-12-15 01:05


Books on Hilbert spaces are no problem; I can just reread my old notes. I have studied the subject twice, once in Chinese and once in English; it has simply been so long that I have forgotten some of it.
What I need most now is a book on RKHS, ideally in Chinese and aimed at non-mathematicians, so that it reads quickly.
OP | Posted on 2009-12-19 23:09
Last edited by orionsnow on 2009-12-19 22:46
Hi!

If you ran into RKHS while working on a learning-related topic, I suggest Schölkopf and Smola's Learning with Kernels. It is quite detailed; if you don't want the details, just skip the proofs. Google Books link: http://b ...
roywwcheng, posted 2009-12-15 17:49


The book you linked is very good; I'm reading it now. I also searched as you suggested. I have Bishop's 2006 machine learning book, which mentions KM, and I'm reading that too. If there are any Chinese books or web pages, please recommend those as well.
I don't think I ran into RKHS while doing machine learning; in fact I'm no longer sure which field my research belongs to. It should be regression analysis of high-throughput data, related to prediction.

The basic formula combines a mixed model with an RKHS term g(x):

y = bU + mZ + g(x) + e

where U and Z are the fixed and random effects,

and g is a non-linear function that describes the interactions between markers.
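This is not necessarily the poster's actual procedure, but one common way to estimate a nonparametric RKHS term like g is kernel ridge regression. The sketch below uses a Gaussian kernel on toy data; the covariates, bandwidth, and regularization weight lam are all illustrative assumptions:

```python
import numpy as np

# A minimal sketch (illustrative, not the thread's actual method): estimating the
# nonparametric term g in a model like y = g(x) + e via kernel ridge regression
# in an RKHS with a Gaussian kernel.
def gaussian_kernel(A, B, bandwidth=1.0):
    # Squared distances between all pairs of rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth ** 2))

rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, size=(80, 1))              # toy covariates (e.g. markers)
y = np.sin(2 * x[:, 0]) + 0.1 * rng.normal(size=80)

lam = 0.1                                          # regularization weight (assumed)
K = gaussian_kernel(x, x)
# Representer theorem: the minimizer is g = sum_i alpha_i K(., x_i), so fitting
# reduces to one n-by-n linear system.
alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)

g_hat = K @ alpha                                  # fitted values of g at the data
```

By the representer theorem, the RKHS minimizer lies in the span of K(·, x_i), which is why solving a single n-by-n linear system suffices; in a full mixed model the fixed and random effects would be estimated jointly with alpha.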

On reading strategy:

My experience is the opposite of yours. I find that videos and lectures usually present the outline, the thread of ideas, and proof strategies, while the theorems and detailed proofs usually appear in the reference books the lecturer assigns. That may depend on the field.

Either way, the proofs still matter: once you are fluent with the methods, if you want to improve further you have to work through the proofs.