Bayesian Learning in Reproducing Kernel Hilbert Spaces
Ralf Herbrich
Thore Graepel
Colin Campbell
Published by
Leiter der Fachbibliothek Informatik, Sekretariat FR 5-4, 1999
URL
http://books.google.com.hk/books?id=VtjRXwAACAAJ&hl=&source=gbs_api
Notes
Abstract: "Support Vector Machines find the hypothesis that corresponds to the centre of the largest hypersphere that can be placed inside version space, i.e. the space of all consistent hypotheses given a training set. The boundaries of version space touched by this hypersphere define the support vectors. An even more promising approach is to construct the hypothesis using the whole of version space. This is achieved by the Bayes point: the midpoint of the region of intersection of all hyperplanes bisecting version space into two volumes of equal magnitude. It is known that the centre of mass of version space approximates the Bayes point [31]. The centre of mass is estimated by averaging over the trajectory of a billiard in version space. We derive bounds on the generalisation error of Bayesian classifiers in terms of the volume ratio of version space and parameter space. This ratio serves as an effective VC dimension and greatly influences generalisation. We present experimental results indicating that Bayes Point Machines consistently outperform Support Vector Machines. Moreover, we show theoretically and experimentally how Bayes Point Machines can easily be extended to admit training errors."
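
The abstract describes approximating the Bayes point by the centre of mass of version space, which the paper estimates by averaging over a billiard trajectory. As a rough illustration only, the sketch below approximates the same quantity for a linear classifier through the origin using naive rejection sampling of consistent unit weight vectors; this is an assumed stand-in for the billiard algorithm, and the function name, toy data, and sample counts are illustrative choices, not from the paper.

```python
import numpy as np

def bayes_point_rejection(X, y, n_samples=500, rng=None):
    """Approximate the Bayes point (centre of mass of version space)
    for a linear classifier through the origin.

    Naive rejection sampling: draw random unit weight vectors, keep
    those that classify every training example correctly, and average
    them.  This is a toy substitute for the billiard-trajectory
    estimate described in the abstract and only works when version
    space occupies a non-negligible fraction of the unit sphere.
    """
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    consistent = []
    while len(consistent) < n_samples:
        w = rng.standard_normal(d)
        w /= np.linalg.norm(w)           # random point on the unit sphere
        if np.all(y * (X @ w) > 0):      # w lies inside version space
            consistent.append(w)
    w_bp = np.mean(consistent, axis=0)   # centre-of-mass estimate
    return w_bp / np.linalg.norm(w_bp)

# Toy usage: a linearly separable 2-D training set (hypothetical data)
X = np.array([[2.0, 1.0], [1.0, 2.0], [-1.5, -1.0], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
w = bayes_point_rejection(X, y)
print("approximate Bayes point weights:", w)
```

Rejection sampling scales poorly as version space shrinks with more training data, which is why the paper uses a billiard dynamic confined to version space instead; the sketch is only meant to make the "centre of mass of consistent hypotheses" idea concrete.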