C++ Neural Networks and Fuzzy Logic: Constructing a Neural Network
Author(s): Valluru B. Rao
Publisher: IDG Books Worldwide, Inc.
ISBN: 1558515526
Publication Date: 06/01/95










Autoassociative Network
The Hopfield network just shown associates an input pattern with itself in recall, which makes it an autoassociative network. The patterns used for determining the proper weight matrix are also the ones that are autoassociatively recalled; these patterns are called the exemplars. A pattern other than an exemplar may or may not be recalled by the network. Of course, when you present the pattern 0 0 0 0, it is stable, even though it is not an exemplar.
Orthogonal Bit Patterns
You may be wondering how many patterns the network with four nodes is able to recall. Let us first consider how many different bit patterns are orthogonal to a given bit pattern. This question really refers to bit patterns in which at least one bit is equal to 1. A little reflection tells us that if two bit patterns are to be orthogonal, they cannot both have 1's in the same position, since the dot product has to be 0. In other words, a bitwise logical AND of the two bit patterns must result in 0. This suggests the following. If a pattern P has k (less than 4) bit positions with 0 (and so 4 - k bit positions with 1), and if pattern Q is to be orthogonal to P, then Q can have 0 or 1 in each of those k positions, but it must have 0 in the remaining 4 - k positions. Since there are two choices for each of the k positions, there are 2^k possible patterns orthogonal to P. This count of 2^k includes the pattern of all zeroes, so there are really 2^k - 1 nonzero patterns orthogonal to P. Some of these 2^k - 1 patterns are not orthogonal to each other. As an example, P can be the pattern 0 1 0 0, which has k = 3 positions with 0. There are 2^3 - 1 = 7 nonzero patterns orthogonal to 0 1 0 0. Among these are the patterns 1 0 1 0 and 1 0 0 1, which are not orthogonal to each other, since their dot product is 1, not 0.
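To make this counting argument concrete, the orthogonality test can be coded directly as a zero dot product (equivalently, a bitwise AND that leaves no common 1s). The following is a minimal sketch, not one of the book's listings; the function name orthogonal is our own. It enumerates all fifteen nonzero four-bit patterns and prints the seven that are orthogonal to 0 1 0 0:

#include <cstddef>
#include <iostream>
#include <vector>

// Two bit patterns are orthogonal when their dot product is zero,
// i.e., they never both have a 1 in the same position.
bool orthogonal(const std::vector<int>& p, const std::vector<int>& q)
{
    int dot = 0;
    for (std::size_t i = 0; i < p.size(); ++i)
        dot += p[i] * q[i];
    return dot == 0;
}

int main()
{
    std::vector<int> P = {0, 1, 0, 0};   // k = 3 positions with 0
    int count = 0;
    // Enumerate all 2^4 - 1 = 15 nonzero four-bit patterns.
    for (int m = 1; m < 16; ++m) {
        std::vector<int> Q = {(m >> 3) & 1, (m >> 2) & 1,
                              (m >> 1) & 1, m & 1};
        if (orthogonal(P, Q)) {
            ++count;
            std::cout << Q[0] << " " << Q[1] << " "
                      << Q[2] << " " << Q[3] << "\n";
        }
    }
    std::cout << count << " nonzero patterns orthogonal to P\n";  // prints 7
    return 0;
}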
Network Nodes and Input Patterns
Since our network has four neurons in it, it also has four nodes in the directed graph that represents the network. These nodes are laterally connected: connections are established from node to node, and they are called lateral because the nodes are all in the same layer. We started with the patterns A = (1, 0, 1, 0) and B = (0, 1, 0, 1) as the exemplars. If we take any other nonzero pattern that is orthogonal to A, it will have a 1 in a position where B also has a 1, so the new pattern will not be orthogonal to B. Therefore, an orthogonal set of patterns that contains both A and B can have only those two as its elements. If you remove B from the set, you can get (at most) two other patterns to join A in an orthogonal set: (0, 1, 0, 0) and (0, 0, 0, 1).
If you follow the procedure described earlier to get the correlation matrix, you will get the following weight matrix:


         0  -1   3  -1
W  =    -1   0  -1  -1
         3  -1   0  -1
        -1  -1  -1   0
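The text does not restate the construction here, but one procedure that reproduces this matrix exactly is to sum the outer products of the bipolar versions (0 mapped to -1, 1 mapped to +1) of the three exemplars A, (0, 1, 0, 0), and (0, 0, 0, 1), with the diagonal held at zero so there are no self-connections. A minimal sketch under that assumption, not the book's code:

#include <iostream>

const int N = 4;

int main()
{
    // The three mutually orthogonal exemplars from the text.
    int patterns[3][N] = {{1, 0, 1, 0},
                          {0, 1, 0, 0},
                          {0, 0, 0, 1}};
    int W[N][N] = {0};

    // Sum outer products of the bipolar (0 -> -1, 1 -> +1) patterns,
    // keeping the diagonal at zero (no self-connections).
    for (int p = 0; p < 3; ++p)
        for (int i = 0; i < N; ++i)
            for (int j = 0; j < N; ++j)
                if (i != j)
                    W[i][j] += (2 * patterns[p][i] - 1)
                             * (2 * patterns[p][j] - 1);

    // Prints the weight matrix shown above.
    for (int i = 0; i < N; ++i) {
        for (int j = 0; j < N; ++j)
            std::cout << W[i][j] << "\t";
        std::cout << "\n";
    }
    return 0;
}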


With this matrix, pattern A is recalled, but presenting either (0, 1, 0, 0) or (0, 0, 0, 1) yields the zero pattern (0, 0, 0, 0). Once the zero pattern is obtained, its recall is stable: the network returns it unchanged.
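These recall results are easy to verify with one synchronous update in which each node computes the weighted sum of the inputs and outputs 1 only when that sum is strictly positive. (The strict threshold at zero is inferred from the stated results; with it, a zero activation gives output 0, which is what makes the zero pattern stable.) A minimal sketch, not the Hopfield class developed elsewhere in the chapter:

#include <iostream>

const int N = 4;

// The weight matrix W from the text.
int W[N][N] = {{ 0, -1,  3, -1},
               {-1,  0, -1, -1},
               { 3, -1,  0, -1},
               {-1, -1, -1,  0}};

// One synchronous update: each node's activation is the weighted sum
// of the inputs; output 1 only for a strictly positive activation.
void recall(const int in[N], int out[N])
{
    for (int i = 0; i < N; ++i) {
        int act = 0;
        for (int j = 0; j < N; ++j)
            act += W[i][j] * in[j];
        out[i] = (act > 0) ? 1 : 0;
    }
}

int main()
{
    int inputs[3][N] = {{1, 0, 1, 0},   // exemplar A: recalled as itself
                        {0, 1, 0, 0},   // goes to the zero pattern
                        {0, 0, 0, 1}};  // goes to the zero pattern
    for (int p = 0; p < 3; ++p) {
        int out[N];
        recall(inputs[p], out);
        for (int i = 0; i < N; ++i)
            std::cout << out[i] << " ";
        std::cout << "\n";
    }
    return 0;
}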

Second Example for C++ Implementation
Recall the cash register game from the show The Price is Right, used as one of the examples in Chapter 1. This example led to the description of the Perceptron neural network. We will now resume our discussion of the Perceptron model and follow up with its C++ implementation. Keep the cash register game example in mind as you read the following C++ implementation of the Perceptron model. Also note that the input signals in this example are not necessarily binary; they may be real numbers, because the prices of the items the contestant has to choose are real numbers (dollars and cents). A Perceptron has one layer of input neurons and one layer of output neurons. Each input-layer neuron is connected to each neuron in the output layer.
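Before turning to the full listing, the essential computation can be sketched in a few lines: each output neuron forms a weighted sum of the real-valued inputs and fires when that sum exceeds its threshold. The class and member names below are placeholders for illustration, not the names used in the implementation that follows:

#include <cstddef>
#include <vector>

// Minimal single-output Perceptron sketch: real-valued inputs,
// one weight per input, and a hard threshold on the weighted sum.
// (SimplePerceptron is a placeholder name, not the book's class.)
class SimplePerceptron {
public:
    SimplePerceptron(const std::vector<double>& weights, double threshold)
        : weights_(weights), threshold_(threshold) {}

    // Returns 1 if the weighted sum of the inputs exceeds the threshold.
    int output(const std::vector<double>& inputs) const
    {
        double sum = 0.0;
        for (std::size_t i = 0; i < weights_.size(); ++i)
            sum += weights_[i] * inputs[i];  // inputs may be prices, etc.
        return (sum > threshold_) ? 1 : 0;
    }

private:
    std::vector<double> weights_;
    double threshold_;
};

With several output neurons, each one simply keeps its own weight vector and threshold; the fully connected layout means every input feeds every output neuron's weighted sum.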





