Bounds on the Sample Complexity of Bayesian Learning Using Information Theory and the VC Dimension
David Haussler
Published
University of California, Santa Cruz, Computer Research Laboratory, 1991
URL
http://books.google.com.hk/books?id=3L8oAQAAIAAJ&hl=&source=gbs_api
Notes
Abstract: "In this paper we study a Bayesian or average-case model of concept learning with a twofold goal: to provide more precise characterizations of learning curve (sample complexity) behavior that depend on properties of both the prior distribution over concepts and the sequence of instances seen by the learner, and to smmoothly unite in a common framework the popular statistical physics and VC dimension theories of learning curves. To achieve this, we undertake a systematic investigation and comparison of two fundamental quantities in learning and information theory: the probability of an incorrect prediction for an optimal learning algorithm, and the Shannon information gain. This study leads to a new understanding of the sample complexity of learning in several existing models."