Neural Computation in Hopfield Networks and Boltzmann Machines
James P. Coughlin
Robert H. Baran
Publisher
University of Delaware Press, 1995
Subjects
Computers / Artificial Intelligence / General
Computers / Machine Theory
Computers / Data Science / Neural Networks
ISBN
0874134641
9780874134643
URL
http://books.google.com.hk/books?id=XcQFO-gxB0QC&hl=&source=gbs_api
Notes
"One hundred years ago, the fundamental building block of the central nervous system, the neuron, was discovered. This study focuses on the existing mathematical models of neurons and their interactions, the simulation of which has been one of the biggest challenges facing modern science." "More than fifty years ago, W. S. McCulloch and W. Pitts devised their model for the neuron, John von Neumann seemed to sense the possibilities for the development of intelligent systems, and Frank Rosenblatt came up with a functioning network of neurons. Despite these advances, the subject had begun to fade as a major research area until John Hopfield arrived on the scene. Drawing an analogy between neural networks and the Ising spin models of ferromagnetism, Hopfield was able to introduce a "computational energy" that would decline toward stable minima under the operation of the system of neurodynamics devised by Roy Glauber." "Like a switch, a neuron is said to be either "on" or "off." The state of the neuron is determined by the states of the other neurons and the connections between them, and the connections are assumed to be reciprocal - that is, neuron number one influences neuron number two exactly as strongly as neuron number two influences neuron number one. According to the Glauber dynamics, the states of the neurons are updated in a random serial way until an equilibrium is reached. An energy function can be associated with each state, and equilibrium corresponds to a minimum of this energy. It follows from Hopfield's assumption of reciprocity that an equilibrium will always be reached." "D. H. Ackley, G. E. Hinton, and T. J. Sejnowski modified the Hopfield network by introducing the simulated annealing algorithm to search out the deepest minima. This is accomplished by - loosely speaking - shaking the machine. The violence of the shaking is controlled by a parameter called temperature, producing the Boltzmann machine - a name designed to emphasize the connection to the statistical physics of Ising spin models." "The Boltzmann machine reduces to the Hopfield model in the special case where the temperature goes to zero. The resulting network, under the Glauber dynamics, produces a homogeneous, irreducible, aperiodic Markov chain as it wanders through state space. The entire theory of Markov chains becomes applicable to the Boltzmann machine." "With ten chapters, five appendices, a list of references, and an index, this study should serve as an introduction to the field of neural networks and its application, and is suitable for an introductory graduate course or an advanced undergraduate course."--BOOK JACKET.Title Summary field provided by Blackwell North America, Inc. All Rights Reserved