-Artificial Neural Network- Hopfield Neural Network (HNN). 朝陽科技大學 資訊管理系 (Chaoyang University of Technology, Department of Information Management), Prof. 李麗華


1 -Artificial Neural Network- Hopfield Neural Network (HNN)

2 Associative Memory (AM) - 1
Def: Associative memory (AM) is any device that associates a set of predefined output patterns with specific input patterns.
Two types of AM:
– Auto-associative memory: converts a corrupted input pattern into the stored pattern it most resembles.
– Hetero-associative memory: produces the output pattern that was stored for the most similar input pattern.

3 Associative Memory (AM) - 2
Models: the associative mapping of an input vector X into an output vector V:
(x1, x2, x3, …, xn) → (v1, v2, v3, …, vm)
EX: Hopfield Neural Network (HNN)
EX: Bidirectional Associative Memory (BAM)

4 Introduction
The Hopfield Neural Network (HNN) was proposed by Hopfield in 1982.
HNN is an auto-associative memory network.
It is a one-layer, fully connected network of units X1, X2, …, Xn (the slide shows the network diagram).

5 HNN Architecture
Input: Xi ∈ {-1, +1}
Output: the same nodes as the input (since it is a single-layer network)
Transfer function:
  Xi_new = +1 if net_i > 0; Xi if net_i = 0; -1 if net_i < 0
  (here Xi means the previous value of Xi)
Weights: an n×n matrix Wij, with Wii = 0 (no self-connections)
Connections: every node is connected to every other node.
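The transfer function above can be sketched in Python (a minimal sketch; the helper name `update_unit` and the unit index `j` are illustrative, not from the slides):

```python
import numpy as np

def update_unit(x, W, j):
    """Apply the HNN transfer function to unit j.

    x: current bipolar state vector (+1/-1 entries)
    W: trained n x n weight matrix
    """
    net_j = W[j] @ x            # net input to unit j
    if net_j > 0:
        return 1
    elif net_j < 0:
        return -1
    else:
        return x[j]             # net_j == 0: keep the previous value of X_j
```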

6 HNN Learning Process
Learning process:
a. Set up the network, i.e., design the input nodes and connections.
b. Calculate and derive the weight matrix.
c. Store the weight matrix.
The learning process is done when the weight matrix is derived. We obtain an n×n weight matrix, W(n×n).
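Step b can be sketched with the Hebbian outer-product rule (an assumption; the slides do not show the weight formula here, though a later slide does state Wii = 0):

```python
import numpy as np

def learn_weights(patterns):
    """Derive the n x n weight matrix from bipolar (+1/-1) training
    patterns using the Hebbian outer-product rule (assumed)."""
    patterns = np.asarray(patterns, dtype=float)
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)      # accumulate X_i * X_j for each pattern
    np.fill_diagonal(W, 0.0)     # no self-connections: W_ii = 0
    return W
```

The resulting matrix is symmetric with a zero diagonal, which is what the Liapunov-function argument later in the deck relies on.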

7 HNN Recall Process
Recall:
a. Read the n×n weight matrix W(n×n).
b. Input the test pattern X for recalling.
c. Compute the new input (i.e., the output):
   net_j = Σi Wij Xi (or net = W‧X)
   Xj_new = +1 if net_j > 0; Xj_old if net_j = 0; -1 if net_j < 0
d. Repeat step c until the network converges (i.e., the net value no longer changes, or the error is very small).
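Steps a-d can be combined into one recall loop (a synchronous-update sketch; the slides do not specify synchronous vs. asynchronous updating):

```python
import numpy as np

def recall(W, x, max_iters=100):
    """Repeatedly apply X_new = f(W . X) until the state stops changing."""
    x = np.asarray(x).copy()
    for _ in range(max_iters):
        net = W @ x
        # +1 if net > 0, -1 if net < 0, previous value if net == 0
        x_new = np.where(net > 0, 1, np.where(net < 0, -1, x))
        if np.array_equal(x_new, x):   # converged: state no longer changes
            break
        x = x_new
    return x
```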

8 Example: Use HNN to memorize patterns (1)
Use HNN to memorize the following patterns. Let a green cell be represented by "1" and a white cell by "-1". The input data are as shown in the table: four training patterns X1-X4, each a six-pixel pattern over X1…X6. (Pattern table and figures not preserved in this transcript.)

9 Example: Use HNN to memorize patterns (2)
Compute the weight matrix from the training patterns, with Wii = 0. (Worked weight-matrix computation not preserved in this transcript.)

10 Example: Use HNN to memorize patterns (3)
Recall: present a test pattern and update until the network converges. The pattern is recalled as: (figure not preserved in this transcript).
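Because the slide's pattern table did not survive, here is the same kind of experiment end to end with two hypothetical six-pixel patterns (green = +1, white = -1; the patterns themselves are stand-ins, not the slide's data):

```python
import numpy as np

# Hypothetical training patterns standing in for the slide's table.
patterns = np.array([
    [ 1,  1,  1, -1, -1, -1],
    [-1, -1, -1,  1,  1,  1],
])

# Learning: Hebbian outer products, zero diagonal (W_ii = 0).
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0.0)

# Recall: the first pattern with one pixel corrupted.
x = np.array([1, 1, -1, -1, -1, -1])
for _ in range(10):
    net = W @ x
    x = np.where(net > 0, 1, np.where(net < 0, -1, x))

print(x)   # the corrupted pixel is repaired: [ 1  1  1 -1 -1 -1]
```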

11 Lyapunov or Liapunov (1/5)
Aleksandr Mikhailovich Lyapunov (1857-1918) was a Russian mathematician and physicist. His work in the fields of differential equations, potential theory, the stability of systems, and probability theory is very important.
His methods, today called Lyapunov methods, which he developed in 1899, make it possible to determine the stability of sets of ordinary differential equations.

12 Liapunov function (1/5)
This is an energy function that has been utilized to construct the method of HNN:
E = -1/2 ΣiΣj Wij Xi Xj
where Wij = Wji and Wii = 0.
In order for the network to converge, we need ΔE ≤ 0 on every update, so that E keeps decreasing until it settles at a constant value.
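A quick numeric check of the energy function (the slide's exact expression is garbled in this transcript, so this assumes the common zero-threshold form E = -1/2 ΣΣ Wij Xi Xj; the pattern is a hypothetical example):

```python
import numpy as np

def energy(W, x):
    """Liapunov/Hopfield energy E = -1/2 * sum_ij W_ij X_i X_j."""
    x = np.asarray(x, dtype=float)
    return -0.5 * x @ W @ x

# Weight matrix storing the pattern p = [1, 1, 1, -1, -1, -1].
p = np.array([1, 1, 1, -1, -1, -1], dtype=float)
W = np.outer(p, p)
np.fill_diagonal(W, 0.0)

corrupted = np.array([1, 1, -1, -1, -1, -1], dtype=float)
# A stored pattern sits at lower energy than a corrupted version of it.
print(energy(W, p) < energy(W, corrupted))   # True
```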

13 Liapunov function (2/5)
To prove that the Liapunov function can be used in HNN, we let
E = -1/2 ΣiΣj Wij Xi Xj
and examine the change ΔE when a single unit Xj is updated.
pf: Since Wij = Wji and Wii = 0, the terms of E involving Xj sum to -Xj Σi Wij Xi, so
ΔE = E_new - E_old = -(Xj_new - Xj_old) Σi Wij Xi = -ΔXj ‧ net_j

14 Liapunov function (3/5)
(The derivation continues on this slide; its equations are not preserved in this transcript.)

15 Liapunov function (4/5)
Discussion of ΔE = -ΔXj ‧ net_j:
Case 1: when net_j > 0, the unit is updated to Xj_new = +1, so ΔXj ≥ 0 and ΔE = -ΔXj ‧ net_j ≤ 0.
Case 2: when net_j < 0, the unit is updated to Xj_new = -1, so ΔXj ≤ 0 and ΔE = -ΔXj ‧ net_j ≤ 0.
In both cases ΔE ≤ 0, with ΔE < 0 whenever the state actually changes.

16 Liapunov function (5/5)
Liapunov → auto-associative: let the network start from any input state. (1) Every update gives ΔE ≤ 0. (2) E is bounded below. (3) Therefore the energy keeps decreasing until the state stops changing, at which point E has reached a (local) minimum. The stored patterns lie at the minima of E, so the network converges to the stored pattern most similar to the input.

