An Anytime Approach to Connectionist Theory Refinement
David William Opitz
Other titles
Refining the Topologies of Knowledge-based Neural Networks
Publisher
University of Wisconsin--Madison, 1995
URL
http://books.google.com.hk/books?id=dlfQAAAAMAAJ&hl=&source=gbs_api
Notes
Abstract: "Many scientific and industrial problems can be better understood by learning from samples of the task at hand. For this reason, the machine learning and statistics communities devote considerable research effort to generating inductive-learning algorithms that try to learn the true 'concept' of a task from a set of its examples. Oftentimes, however, one has additional resources readily available, but largely unused, that can improve the concept that these learning algorithms generate. These resources include available computer cycles, as well as prior knowledge describing what is currently known about the domain. Effective utilization of available computer time is important since for most domains an expert is willing to wait for weeks, or even months, if a learning system can produce an improved concept. Using prior knowledge is important since it can contain information not present in the current set of training examples. In this thesis, I present three 'anytime' approaches to connectionist theory refinement. Briefly, these approaches start by translating a set of rules describing what is currently known about the domain into a neural network, thus generating a knowledge-based neural network (KNN). My approaches then utilize available computer time to improve this KNN by continually refining its weights and topology. My first method, TopGen, searches for good 'local' refinements to the KNN topology. It does this by adding nodes to the KNN in a manner analogous to symbolically adding rules and conjuncts to an incorrect rule base. My next approach, REGENT, uses genetic algorithms to find better 'global' changes to this topology. REGENT proceeds by using (a) the domain-specific rules to help create the initial population of KNNs and (b) crossover and mutation operators specifically designed for KNNs. My final algorithm, ADDEMUP, searches for an 'ensemble' of KNNs that work together to produce an effective composite prediction.
ADDEMUP works by using genetic algorithms to continually create new networks, keeping the set of networks that are as accurate as possible while disagreeing with each other as much as possible. Empirical results show that these algorithms successfully achieve each of their respective goals."
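The accuracy-plus-diversity criterion that the abstract attributes to ADDEMUP can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the function names, the greedy selection loop (the thesis uses genetic search over networks, not greedy selection over fixed predictions), and the `diversity_weight` parameter are hypothetical, not Opitz's implementation. The sketch only shows the core idea of rewarding ensembles whose members are individually accurate yet disagree with one another.

```python
def accuracy(preds, labels):
    """Fraction of predictions that match the true labels."""
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

def disagreement(preds_a, preds_b):
    """Fraction of examples on which two members predict differently."""
    return sum(a != b for a, b in zip(preds_a, preds_b)) / len(preds_a)

def ensemble_fitness(member_preds, labels, diversity_weight=0.5):
    """Score an ensemble: mean member accuracy plus weighted mean
    pairwise disagreement (the accuracy-plus-diversity idea)."""
    acc = sum(accuracy(p, labels) for p in member_preds) / len(member_preds)
    if len(member_preds) < 2:
        return acc
    pairs = [(i, j) for i in range(len(member_preds))
             for j in range(i + 1, len(member_preds))]
    div = sum(disagreement(member_preds[i], member_preds[j])
              for i, j in pairs) / len(pairs)
    return acc + diversity_weight * div

def greedy_select(candidates, labels, size, diversity_weight=0.5):
    """Greedily grow an ensemble, each step adding the candidate
    whose inclusion yields the highest ensemble fitness."""
    chosen, pool = [], list(candidates)
    while pool and len(chosen) < size:
        best = max(pool, key=lambda c: ensemble_fitness(
            chosen + [c], labels, diversity_weight))
        chosen.append(best)
        pool.remove(best)
    return chosen
```

With a strong diversity weight, a slightly less accurate member that disagrees with the current ensemble can be preferred over an exact duplicate of an existing member, which is the trade-off the abstract describes.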