
Midsem Examination        Artificial Neuro-Fuzzy Theory AT74.05        March 7, 2008

Time: 10:00-12:00 h.        Open Book        Marks: 100

Attempt all questions.

Q.1  The relation between the input, x, and the output, y, of a system recorded from an experiment is

x   -1   0   1
y    5   1   3

Apply the LMS algorithm to determine the parameters when the relation is modeled by

(a) \(y = a_0\)
(b) \(y = a_0 + a_1 x\)
(c) \(y = a_0 + a_1 x + a_2 x^2\)

Assume that all the data are presented with equal probability. Determine the sum of squared errors for each model.
      (30)

Solution 

\[
F(\mathbf{x}) = E[t^2] - 2\mathbf{x}^T E[t\mathbf{z}] + \mathbf{x}^T E[\mathbf{z}\mathbf{z}^T]\,\mathbf{x} = c - 2\mathbf{x}^T\mathbf{h} + \mathbf{x}^T\mathbf{R}\,\mathbf{x} \tag{1}
\]

\[
c = E[t^2] = \tfrac{1}{3}\left(5^2 + 1^2 + 3^2\right) = \tfrac{35}{3} \tag{2}
\]

In model (a), 

\[
\mathbf{h} = E[t\mathbf{z}] = \tfrac{1}{3}\big(5[1] + 1[1] + 3[1]\big) = \tfrac{9}{3} = 3 \tag{3}
\]

\[
\mathbf{R} = E[\mathbf{z}\mathbf{z}^T] = \tfrac{1}{3}\big([1][1] + [1][1] + [1][1]\big) = 1 \tag{4}
\]

\[
a_0 = \mathbf{R}^{-1}\mathbf{h} = [1]^{-1}[3] = 3 \tag{5}
\]

\[
y = 3 \tag{6}
\]

\[
\sum e^2 = \sum_q (t_q - a)^2 = (5-3)^2 + (1-3)^2 + (3-3)^2 = 8 \tag{7}
\]

In model (b), 

\[
\mathbf{h} = \tfrac{1}{3}\left(5\begin{bmatrix}1\\-1\end{bmatrix} + 1\begin{bmatrix}1\\0\end{bmatrix} + 3\begin{bmatrix}1\\1\end{bmatrix}\right) = \tfrac{1}{3}\begin{bmatrix}9\\-2\end{bmatrix} \tag{8}
\]


\[
\mathbf{R} = \tfrac{1}{3}\left(\begin{bmatrix}1\\-1\end{bmatrix}\begin{bmatrix}1&-1\end{bmatrix} + \begin{bmatrix}1\\0\end{bmatrix}\begin{bmatrix}1&0\end{bmatrix} + \begin{bmatrix}1\\1\end{bmatrix}\begin{bmatrix}1&1\end{bmatrix}\right) = \tfrac{1}{3}\begin{bmatrix}3&0\\0&2\end{bmatrix} \tag{9}
\]

\[
\begin{bmatrix}a_0\\a_1\end{bmatrix} = \mathbf{R}^{-1}\mathbf{h} = \begin{bmatrix}1&0\\0&\tfrac{2}{3}\end{bmatrix}^{-1}\tfrac{1}{3}\begin{bmatrix}9\\-2\end{bmatrix} = \begin{bmatrix}3\\-1\end{bmatrix} \tag{10}
\]

\[
y = 3 - x \tag{11}
\]

\[
\sum e^2 = (5-4)^2 + (1-3)^2 + (3-2)^2 = 6 \tag{12}
\]

In model (c), 

\[
\mathbf{h} = \tfrac{1}{3}\left(5\begin{bmatrix}1\\-1\\1\end{bmatrix} + 1\begin{bmatrix}1\\0\\0\end{bmatrix} + 3\begin{bmatrix}1\\1\\1\end{bmatrix}\right) = \tfrac{1}{3}\begin{bmatrix}9\\-2\\8\end{bmatrix} \tag{13}
\]

\[
\mathbf{R} = \tfrac{1}{3}\left(\begin{bmatrix}1\\-1\\1\end{bmatrix}\begin{bmatrix}1&-1&1\end{bmatrix} + \begin{bmatrix}1\\0\\0\end{bmatrix}\begin{bmatrix}1&0&0\end{bmatrix} + \begin{bmatrix}1\\1\\1\end{bmatrix}\begin{bmatrix}1&1&1\end{bmatrix}\right) = \tfrac{1}{3}\begin{bmatrix}3&0&2\\0&2&0\\2&0&2\end{bmatrix} \tag{14}
\]

\[
\begin{bmatrix}a_0\\a_1\\a_2\end{bmatrix} = \mathbf{R}^{-1}\mathbf{h} = \left(\tfrac{1}{3}\begin{bmatrix}3&0&2\\0&2&0\\2&0&2\end{bmatrix}\right)^{-1}\tfrac{1}{3}\begin{bmatrix}9\\-2\\8\end{bmatrix} = \begin{bmatrix}1\\-1\\3\end{bmatrix} \tag{15}
\]

\[
y = 1 - x + 3x^2 \tag{16}
\]

\[
\sum e^2 = (5-5)^2 + (1-1)^2 + (3-3)^2 = 0 \tag{17}
\]
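As a numeric cross-check of results (5)-(7), (10)-(12), and (15)-(17), the short NumPy sketch below (an illustrative helper, not part of the original solution; all names are ours) rebuilds R and h for each model, solves \(\mathbf{a} = \mathbf{R}^{-1}\mathbf{h}\), and reports the sum of squared errors.

```python
# Minimal check of the three least-squares fits: for each model build
# R = E[z z^T] and h = E[t z], solve a = R^{-1} h, and report the SSE.
import numpy as np

x = np.array([-1.0, 0.0, 1.0])
t = np.array([5.0, 1.0, 3.0])

# Feature vectors z for models (a), (b), (c).
models = {
    "(a) y = a0":                lambda x: np.stack([np.ones_like(x)], axis=1),
    "(b) y = a0 + a1 x":         lambda x: np.stack([np.ones_like(x), x], axis=1),
    "(c) y = a0 + a1 x + a2 x^2":
        lambda x: np.stack([np.ones_like(x), x, x**2], axis=1),
}

for name, phi in models.items():
    Z = phi(x)                      # one row z^T per data point
    R = Z.T @ Z / len(x)            # E[z z^T], equal presentation probability
    h = Z.T @ t / len(x)            # E[t z]
    a = np.linalg.solve(R, h)       # a = R^{-1} h
    sse = np.sum((t - Z @ a) ** 2)  # sum of squared errors
    print(name, "a =", a, "SSE =", round(sse, 4))
# Expected: a = [3], SSE = 8;  a = [3, -1], SSE = 6;  a = [1, -1, 3], SSE = 0.
```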

 

 

 

 

 

 

 

 

 

 


Q.2  A 1-2-1 network is used to model the relation between the input and output of the system in Q.1.
Train the network with the momentum backpropagation (MOBP) algorithm in batch mode for one round.
Apply the sum of squared errors as the performance index. Use the following initial weights and biases:

\[
w^1_{11} = -0.1,\quad w^1_{21} = 0.2,\quad b^1_1 = -0.3,\quad b^1_2 = 0.1,\quad w^2_{11} = -0.2,\quad w^2_{12} = 0.3,\quad b^2 = 0
\]

Assume that the log-sigmoid function is applied in the first layer, the linear function is applied in the second layer, and \(\alpha = 1,\ \gamma = 0.5\).

 

 

 

 

 

 

 

 

      (30) 

Solution 

 

 

 

 

 

 

 
 
 
 

 

 

 

 

 

Determine the derivatives of the transfer functions,
\[
f^1(n) = \frac{1}{1+e^{-n}},\qquad \dot{f}^1(n) = (1-a^1)(a^1),\qquad \dot{f}^2(n) = 1 \tag{1}
\]

Present (-1, 5), 

\[
\mathbf{n}^1 = \mathbf{W}^1 p + \mathbf{b}^1 = \begin{bmatrix}-0.1\\0.2\end{bmatrix}(-1) + \begin{bmatrix}-0.3\\0.1\end{bmatrix} = \begin{bmatrix}-0.2\\-0.1\end{bmatrix},\qquad
\mathbf{a}^1 = \begin{bmatrix}\dfrac{1}{1+e^{0.2}}\\[2ex]\dfrac{1}{1+e^{0.1}}\end{bmatrix} = \begin{bmatrix}0.4502\\0.4750\end{bmatrix} \tag{2}
\]

\[
n^2 = \mathbf{W}^2\mathbf{a}^1 + b^2 = \begin{bmatrix}-0.2 & 0.3\end{bmatrix}\begin{bmatrix}0.4502\\0.4750\end{bmatrix} + 0 = 0.0525,\qquad a^2 = n^2 = 0.0525,\qquad e = t - a^2 = 5 - 0.0525 = 4.9475 \tag{3}
\]

\[
\tilde{S}^2 = \dot{f}^2(n^2) = 1,\qquad
\tilde{\mathbf{S}}^1 = \dot{\mathbf{F}}^1(\mathbf{n}^1)\,(\mathbf{W}^2)^T\tilde{S}^2 = \begin{bmatrix}0.4502(1-0.4502) & 0\\0 & 0.4750(1-0.4750)\end{bmatrix}\begin{bmatrix}-0.2\\0.3\end{bmatrix}(1) = \begin{bmatrix}-0.0495\\0.0748\end{bmatrix} \tag{4}
\]

[Network diagram: the input p feeds two logsig neurons with weights \(w^1_{11}, w^1_{21}\) and biases \(b^1_1, b^1_2\); their outputs feed one purelin output neuron with weights \(w^2_{11}, w^2_{12}\) and bias \(b^2\).]

\[
\begin{bmatrix}\Delta w^1_{11}(1)\\ \Delta w^1_{21}(1)\\ \Delta b^1_1(1)\\ \Delta b^1_2(1)\\ \Delta w^2_{11}(1)\\ \Delta w^2_{12}(1)\\ \Delta b^2(1)\end{bmatrix}
= \gamma\begin{bmatrix}\Delta w^1_{11}(0)\\ \Delta w^1_{21}(0)\\ \Delta b^1_1(0)\\ \Delta b^1_2(0)\\ \Delta w^2_{11}(0)\\ \Delta w^2_{12}(0)\\ \Delta b^2(0)\end{bmatrix}
+ (1-\gamma)\,\alpha\begin{bmatrix}\tilde{S}^1_1\,p\\ \tilde{S}^1_2\,p\\ \tilde{S}^1_1\\ \tilde{S}^1_2\\ \tilde{S}^2 a^1_1\\ \tilde{S}^2 a^1_2\\ \tilde{S}^2\end{bmatrix}
= 0.5\begin{bmatrix}0\\0\\0\\0\\0\\0\\0\end{bmatrix}
+ 0.5\begin{bmatrix}(-0.0495)(-1)\\ (0.0748)(-1)\\ -0.0495\\ 0.0748\\ (1)(0.4502)\\ (1)(0.4750)\\ 1\end{bmatrix}
= \begin{bmatrix}0.0248\\ -0.0374\\ -0.0248\\ 0.0374\\ 0.2251\\ 0.2375\\ 0.5000\end{bmatrix} \tag{5}
\]

Present (0, 1), 

\[
\mathbf{n}^1 = \begin{bmatrix}-0.1\\0.2\end{bmatrix}(0) + \begin{bmatrix}-0.3\\0.1\end{bmatrix} = \begin{bmatrix}-0.3\\0.1\end{bmatrix},\qquad
\mathbf{a}^1 = \begin{bmatrix}\dfrac{1}{1+e^{0.3}}\\[2ex]\dfrac{1}{1+e^{-0.1}}\end{bmatrix} = \begin{bmatrix}0.4256\\0.5250\end{bmatrix} \tag{6}
\]

\[
n^2 = \begin{bmatrix}-0.2 & 0.3\end{bmatrix}\begin{bmatrix}0.4256\\0.5250\end{bmatrix} + 0 = 0.0724,\qquad a^2 = 0.0724,\qquad e = 1 - 0.0724 = 0.9276 \tag{7}
\]

\[
\tilde{S}^2 = 1,\qquad
\tilde{\mathbf{S}}^1 = \begin{bmatrix}0.4256(1-0.4256) & 0\\0 & 0.5250(1-0.5250)\end{bmatrix}\begin{bmatrix}-0.2\\0.3\end{bmatrix}(1) = \begin{bmatrix}-0.0489\\0.0748\end{bmatrix} \tag{8}
\]

\[
\begin{bmatrix}\Delta w^1_{11}(1)\\ \Delta w^1_{21}(1)\\ \Delta b^1_1(1)\\ \Delta b^1_2(1)\\ \Delta w^2_{11}(1)\\ \Delta w^2_{12}(1)\\ \Delta b^2(1)\end{bmatrix}
= \gamma\,\mathbf{0} + (1-\gamma)\,\alpha\begin{bmatrix}(-0.0489)(0)\\ (0.0748)(0)\\ -0.0489\\ 0.0748\\ (1)(0.4256)\\ (1)(0.5250)\\ 1\end{bmatrix}
= \begin{bmatrix}0\\ 0\\ -0.0245\\ 0.0374\\ 0.2128\\ 0.2625\\ 0.5000\end{bmatrix} \tag{9}
\]

Present (1, 3), 

\[
\mathbf{n}^1 = \begin{bmatrix}-0.1\\0.2\end{bmatrix}(1) + \begin{bmatrix}-0.3\\0.1\end{bmatrix} = \begin{bmatrix}-0.4\\0.3\end{bmatrix},\qquad
\mathbf{a}^1 = \begin{bmatrix}\dfrac{1}{1+e^{0.4}}\\[2ex]\dfrac{1}{1+e^{-0.3}}\end{bmatrix} = \begin{bmatrix}0.4013\\0.5744\end{bmatrix} \tag{10}
\]

\[
n^2 = \begin{bmatrix}-0.2 & 0.3\end{bmatrix}\begin{bmatrix}0.4013\\0.5744\end{bmatrix} + 0 = 0.0921,\qquad a^2 = 0.0921,\qquad e = 3 - 0.0921 = 2.9079 \tag{11}
\]

\[
\tilde{S}^2 = 1,\qquad
\tilde{\mathbf{S}}^1 = \begin{bmatrix}0.4013(1-0.4013) & 0\\0 & 0.5744(1-0.5744)\end{bmatrix}\begin{bmatrix}-0.2\\0.3\end{bmatrix}(1) = \begin{bmatrix}-0.0481\\0.0733\end{bmatrix} \tag{12}
\]


\[
\begin{bmatrix}\Delta w^1_{11}(1)\\ \Delta w^1_{21}(1)\\ \Delta b^1_1(1)\\ \Delta b^1_2(1)\\ \Delta w^2_{11}(1)\\ \Delta w^2_{12}(1)\\ \Delta b^2(1)\end{bmatrix}
= \gamma\,\mathbf{0} + (1-\gamma)\,\alpha\begin{bmatrix}(-0.0481)(1)\\ (0.0733)(1)\\ -0.0481\\ 0.0733\\ (1)(0.4013)\\ (1)(0.5744)\\ 1\end{bmatrix}
= \begin{bmatrix}-0.0240\\ 0.0367\\ -0.0240\\ 0.0367\\ 0.2007\\ 0.2872\\ 0.5000\end{bmatrix} \tag{13}
\]

Determine the sum of squared errors,

\[
F(\mathbf{x}) = \sum e^2 = 4.9475^2 + 0.9276^2 + 2.9079^2 = 33.7941 \tag{14}
\]

In batch mode the weights are updated once with the sum of the three per-pattern increments,
\[
\begin{bmatrix}w^1_{11}\\ w^1_{21}\\ b^1_1\\ b^1_2\\ w^2_{11}\\ w^2_{12}\\ b^2\end{bmatrix}(1)
= \begin{bmatrix}-0.1\\0.2\\-0.3\\0.1\\-0.2\\0.3\\0\end{bmatrix}
+ \begin{bmatrix}0.0248\\-0.0374\\-0.0248\\0.0374\\0.2251\\0.2375\\0.5000\end{bmatrix}
+ \begin{bmatrix}0\\0\\-0.0245\\0.0374\\0.2128\\0.2625\\0.5000\end{bmatrix}
+ \begin{bmatrix}-0.0240\\0.0367\\-0.0240\\0.0367\\0.2007\\0.2872\\0.5000\end{bmatrix}
= \begin{bmatrix}-0.0992\\0.1993\\-0.3733\\0.2115\\0.4386\\1.0872\\1.5000\end{bmatrix} \tag{15}
\]
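The whole batch round of Eqs. (5)-(15) condenses into one loop. The sketch below (an assumed helper, not part of the exam) accumulates the per-pattern increments with the momentum term \(\gamma\,\Delta x(0) = 0\) and applies them once. Because it keeps full precision instead of the four-decimal roundings above, the last digit of some entries differs slightly from Eq. (15).

```python
# One batch round of MOBP using the modified sensitivities of Eq. (4).
import numpy as np

logsig = lambda n: 1.0 / (1.0 + np.exp(-n))
alpha, gamma = 1.0, 0.5

# x = [w1_11, w1_21, b1_1, b1_2, w2_11, w2_12, b2]
x = np.array([-0.1, 0.2, -0.3, 0.1, -0.2, 0.3, 0.0])
data = [(-1.0, 5.0), (0.0, 1.0), (1.0, 3.0)]

delta_prev = np.zeros(7)        # Delta x(0) = 0 on the first round
delta_sum = np.zeros(7)
for p, tgt in data:
    n1 = x[0:2] * p + x[2:4]
    a1 = logsig(n1)
    S2 = 1.0                                # purelin derivative
    S1 = a1 * (1.0 - a1) * x[4:6] * S2      # F1'(n1) (W2)^T S2
    incr = np.array([S1[0]*p, S1[1]*p, S1[0], S1[1],
                     S2*a1[0], S2*a1[1], S2])
    delta_sum += gamma * delta_prev + (1 - gamma) * alpha * incr
x += delta_sum
print(np.round(x, 4))
# ~ [-0.0993  0.1993 -0.3732  0.2115  0.4385  1.0872  1.5],
# matching Eq. (15) up to rounding of the intermediate values.
```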

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 


Q.3  A linear associator network with the pseudoinverse rule is used as a fuzzy controller for a mobile robot. Three sensors, used to detect obstacles, are mounted at the left, middle, and right of the robot. Each sensor reading varies between -1 and 1: a reading of 1 indicates that an obstacle definitely exists, and -1 that there is no obstacle. The outputs of the network are the commands sent to the two motors at the left and right wheels. A motor command of 1 means move forward at full speed, 0 means do not move, and -1 means move backward at full speed. Assume that the following rules are desired.

\[
\mathbf{p} = \begin{bmatrix}\text{left sensor}\\ \text{middle sensor}\\ \text{right sensor}\end{bmatrix},\qquad
\mathbf{t} = \begin{bmatrix}\text{left motor}\\ \text{right motor}\end{bmatrix}
\]
\[
\mathbf{p}_1 = \begin{bmatrix}1\\-1\\-1\end{bmatrix},\ \mathbf{t}_1 = \begin{bmatrix}1\\0\end{bmatrix};\qquad
\mathbf{p}_2 = \begin{bmatrix}-1\\-1\\1\end{bmatrix},\ \mathbf{t}_2 = \begin{bmatrix}0\\1\end{bmatrix};\qquad
\mathbf{p}_3 = \begin{bmatrix}-1\\1\\-1\end{bmatrix},\ \mathbf{t}_3 = \begin{bmatrix}0\\0\end{bmatrix}
\]
 

Find the weight matrix from the pseudoinverse rule. Determine the output when the sensor reading is
(a) \([1,\ 1,\ -1]^T\), (b) \([-1,\ 1,\ 1]^T\), (c) \([1,\ -1,\ 1]^T\), (d) \([1,\ 1,\ 1]^T\), and (e) \([-1,\ -1,\ -1]^T\).
      (20)

Solution 

By the pseudoinverse rule,
\[
\mathbf{W} = \mathbf{T}\mathbf{P}^{+} \tag{1}
\]

where
\[
\mathbf{P}^{+} = (\mathbf{P}^T\mathbf{P})^{-1}\mathbf{P}^T \tag{2}
\]

and
\[
\mathbf{T} = \begin{bmatrix}1&0&0\\0&1&0\end{bmatrix},\qquad
\mathbf{P} = \begin{bmatrix}1&-1&-1\\-1&-1&1\\-1&1&-1\end{bmatrix} \tag{3}
\]

\[
\mathbf{P}^T\mathbf{P} = \begin{bmatrix}3&-1&-1\\-1&3&-1\\-1&-1&3\end{bmatrix} \tag{4}
\]


 

\[
(\mathbf{P}^T\mathbf{P})^{-1} = \begin{bmatrix}0.5&0.25&0.25\\0.25&0.5&0.25\\0.25&0.25&0.5\end{bmatrix} \tag{5}
\]

\[
\mathbf{P}^{+} = (\mathbf{P}^T\mathbf{P})^{-1}\mathbf{P}^T = \begin{bmatrix}0&-0.5&-0.5\\-0.5&-0.5&0\\-0.5&0&-0.5\end{bmatrix} \tag{6}
\]

\[
\mathbf{W} = \mathbf{T}\mathbf{P}^{+} = \begin{bmatrix}0&-0.5&-0.5\\-0.5&-0.5&0\end{bmatrix} \tag{7}
\]
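A quick way to validate the hand computation of Eqs. (2)-(7) is to compare it against NumPy's built-in pseudoinverse. The snippet below is an assumed check, not part of the original solution.

```python
# Check that P+ = (P^T P)^{-1} P^T and W = T P+ match numpy.
import numpy as np

P = np.array([[ 1, -1, -1],
              [-1, -1,  1],
              [-1,  1, -1]], dtype=float)   # columns are p1, p2, p3
T = np.array([[1, 0, 0],
              [0, 1, 0]], dtype=float)      # columns are t1, t2, t3

P_plus = np.linalg.inv(P.T @ P) @ P.T
assert np.allclose(P_plus, np.linalg.pinv(P))  # agrees with the built-in pinv
W = T @ P_plus
print(W)   # [[ 0.  -0.5 -0.5]
           #  [-0.5 -0.5  0. ]]
```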

(a) When the input is \(\mathbf{p} = [1,\ 1,\ -1]^T\),
\[
\mathbf{a} = \mathbf{W}\mathbf{p} = \begin{bmatrix}0&-0.5&-0.5\\-0.5&-0.5&0\end{bmatrix}\begin{bmatrix}1\\1\\-1\end{bmatrix} = \begin{bmatrix}0\\-1\end{bmatrix} \tag{8}
\]

(b) When the input is \(\mathbf{p} = [-1,\ 1,\ 1]^T\),
\[
\mathbf{a} = \mathbf{W}\mathbf{p} = \begin{bmatrix}0&-0.5&-0.5\\-0.5&-0.5&0\end{bmatrix}\begin{bmatrix}-1\\1\\1\end{bmatrix} = \begin{bmatrix}-1\\0\end{bmatrix} \tag{9}
\]

(c) When the input is \(\mathbf{p} = [1,\ -1,\ 1]^T\),
\[
\mathbf{a} = \mathbf{W}\mathbf{p} = \begin{bmatrix}0&-0.5&-0.5\\-0.5&-0.5&0\end{bmatrix}\begin{bmatrix}1\\-1\\1\end{bmatrix} = \begin{bmatrix}0\\0\end{bmatrix} \tag{10}
\]

(d) When the input is \(\mathbf{p} = [1,\ 1,\ 1]^T\),
\[
\mathbf{a} = \mathbf{W}\mathbf{p} = \begin{bmatrix}0&-0.5&-0.5\\-0.5&-0.5&0\end{bmatrix}\begin{bmatrix}1\\1\\1\end{bmatrix} = \begin{bmatrix}-1\\-1\end{bmatrix} \tag{11}
\]

(e) When the input is \(\mathbf{p} = [-1,\ -1,\ -1]^T\),
\[
\mathbf{a} = \mathbf{W}\mathbf{p} = \begin{bmatrix}0&-0.5&-0.5\\-0.5&-0.5&0\end{bmatrix}\begin{bmatrix}-1\\-1\\-1\end{bmatrix} = \begin{bmatrix}1\\1\end{bmatrix} \tag{12}
\]
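Each of the five responses (8)-(12) is a single matrix-vector product; the assumed continuation below applies the weight matrix of Eq. (7) to all five readings at once.

```python
# Apply W from Eq. (7) to the five test sensor readings (a)-(e).
import numpy as np

W = np.array([[ 0.0, -0.5, -0.5],
              [-0.5, -0.5,  0.0]])
readings = {                      # [left, middle, right] sensor values
    "(a)": [ 1,  1, -1],
    "(b)": [-1,  1,  1],
    "(c)": [ 1, -1,  1],
    "(d)": [ 1,  1,  1],
    "(e)": [-1, -1, -1],
}
for k, p in readings.items():
    print(k, W @ np.array(p, dtype=float))
# (a) [ 0. -1.]  (b) [-1.  0.]  (c) [0. 0.]  (d) [-1. -1.]  (e) [1. 1.]
```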

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 


Q.4  Design an LVQ network which can recognize classes A and B based on the vector spaces shown below.

[Figure: vector space in the x-y plane (x from 0 to 25, y from 0 to 15) showing two class-A regions, around (5, 10) and (20, 5), and two class-B regions, around (10, 5) and (15, 10).]

      (20)

 

Solution 

An LVQ network can be used. In the first layer of the LVQ network, the prototype vectors are chosen as

\[
\mathbf{w}_1 = \begin{bmatrix}5\\10\end{bmatrix} \tag{1}
\]

\[
\mathbf{w}_2 = \begin{bmatrix}10\\5\end{bmatrix} \tag{2}
\]

\[
\mathbf{w}_3 = \begin{bmatrix}15\\10\end{bmatrix} \tag{3}
\]

\[
\mathbf{w}_4 = \begin{bmatrix}20\\5\end{bmatrix} \tag{4}
\]

The weight matrix of the first layer is thus 

\[
\mathbf{W}^1 = \begin{bmatrix}5&10\\10&5\\15&10\\20&5\end{bmatrix} \tag{5}
\]


The weight matrix of the second layer combines the subclasses into classes: subclasses 1 and 4 are recognized as class A (output 1), and subclasses 2 and 3 as class B (output 2).

\[
\mathbf{W}^2 = \begin{bmatrix}1&0&0&1\\0&1&1&0\end{bmatrix} \tag{6}
\]
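For completeness, a minimal recall sketch (assumed, not part of the exam) of the designed LVQ network: the first (competitive) layer picks the prototype in W¹ nearest to the input, and W² maps the winning subclass to class A or class B.

```python
# Recall through the two-layer LVQ network designed above.
import numpy as np

W1 = np.array([[ 5, 10],
               [10,  5],
               [15, 10],
               [20,  5]], dtype=float)       # subclass prototypes (rows)
W2 = np.array([[1, 0, 0, 1],                 # subclasses 1 and 4 -> class A
               [0, 1, 1, 0]], dtype=float)   # subclasses 2 and 3 -> class B

def classify(p):
    d = np.linalg.norm(W1 - p, axis=1)       # distance to each prototype
    a1 = np.zeros(4); a1[np.argmin(d)] = 1   # winner-take-all first layer
    a2 = W2 @ a1                             # one-hot class indicator
    return "A" if a2[0] == 1 else "B"

print(classify(np.array([6.0, 11.0])))   # near (5, 10)  -> A
print(classify(np.array([14.0, 9.0])))   # near (15, 10) -> B
```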