Artificial Neural Network (ANN): Self-Organizing Map (SOM)
朝陽科技大學 資訊管理系 李麗華 教授
Outline
Review: Types of ANN
Introduction
SOM Applications
Network Structure
Concept of Neighborhood
Learning Process and Reuse Process
Computation Process: Training
SOM Exercises
Review: Types of ANN
Learning types:
Supervised learning: uses a training set and the expected outputs for network learning. Ex: Perceptron, BPN, PNN, LVQ, CPN.
Unsupervised learning: the network learns and adjusts based only on the input patterns (data); there is no expected output for fine-tuning the network. Ex: SOM, ART.
Associative-memory learning: the network uses the weight matrix to memorize all training patterns; the output is recalled based on the learned patterns. Ex: Hopfield, Bidirectional Associative Memory (BAM), Hopfield-Tank.
Optimization application: the network is designed to find an optimal or global solution. Ex: ANN, HTN.
Introduction
Proposed by Kohonen in the early 1980s.
SOM is an unsupervised, two-layer network that can organize a topological map from a random starting point. SOM is also called Kohonen's self-organizing feature map. The resulting map shows the natural relationships among the patterns given to the network, which makes it well suited to clustering analysis.
SOM Applications
Ex: handwriting pattern classification
Ex: feature extraction
Ex: simulating topological features
Ex: web-usage pattern discovery
Ex: voice pattern simulation
Ex: finding clusters before applying data-mining techniques
Network Structure
One input layer.
One competitive layer, which is usually a 2-D grid of nodes Yjk, connected to the inputs Xi through weights Wijk.
Input layer: f(x) = x (inputs pass through unchanged).
Output layer: competitive layer with a topological map relationship.
Weights: randomly assigned.
Concept of Neighborhood
Center: the winning node C is the center.
Distance: r is the grid distance from a node N to the center C; R is the radius of the neighborhood.
R factor (neighborhood coefficient): RFjk = f(rjk, R) = e^(-rjk/R)
when rjk = 0, e^(-rjk/R) = 1
when rjk → ∞, e^(-rjk/R) → 0
when rjk = R, e^(-rjk/R) = e^(-1) ≈ 0.368
(*) The longer the distance, the smaller the neighborhood factor.
R factor adjustment: Rn = R_rate × Rn-1, with R_rate < 1.0
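As a quick sketch, the neighborhood factor and the radius decay above can be written in Python; R = 2.0 and R_rate = 0.9 are illustrative values, not prescribed by the slides:

```python
import math

def neighborhood_factor(r: float, R: float) -> float:
    """RF_jk = e^(-r_jk / R): shrinks as the grid distance r grows."""
    return math.exp(-r / R)

print(neighborhood_factor(0.0, 2.0))             # 1.0   (r = 0)
print(round(neighborhood_factor(2.0, 2.0), 3))   # 0.368 (r = R)
print(round(neighborhood_factor(50.0, 2.0), 3))  # 0.0   (r >> R)

# Radius decay between iterations: R_n = R_rate * R_{n-1}, R_rate < 1
R, R_rate = 2.0, 0.9    # R_rate = 0.9 is an assumed value
R = R_rate * R          # R shrinks each pass, narrowing the neighborhood
```

Shrinking R over time lets the map order itself globally first, then fine-tune each node locally.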
Learning Process
1. Set up the network.
2. Randomly assign the weights W.
3. Set the coordinate values of the output-layer nodes Y(j,k).
4. Present an input vector X.
5. Find the winning node.
6. Update the weights by △W using the R factor.
7. ηn = η_rate × ηn-1; Rn = R_rate × Rn-1.
8. Repeat steps 4 to 7 until convergence.
(*) After the network is trained, the weight matrix is obtained.
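The eight steps above can be sketched as a minimal training loop in Python with NumPy; the grid size, decay rates, and epoch count are illustrative assumptions:

```python
import numpy as np

def train_som(X, grid_shape=(3, 3), eta=1.0, R=2.0,
              eta_rate=0.9, R_rate=0.9, epochs=20, seed=0):
    """Minimal SOM training loop following steps 1-8 above.
    Grid size, decay rates, and epoch count are illustrative choices."""
    rng = np.random.default_rng(seed)
    J, K = grid_shape
    # Steps 1-3: build the grid, random weights, node coordinates
    W = rng.uniform(-1.0, 1.0, size=(J, K, X.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(J), np.arange(K),
                                  indexing="ij"), axis=-1)
    for _ in range(epochs):
        for x in X:                                         # step 4
            net = ((x - W) ** 2).sum(axis=-1)               # step 5: winner
            j, k = np.unravel_index(net.argmin(), net.shape)
            r = np.linalg.norm(coords - coords[j, k], axis=-1)
            W += eta * (x - W) * np.exp(-r / R)[..., None]  # step 6
        eta *= eta_rate                                     # step 7
        R *= R_rate
    return W                                                # trained weights
```

With η = 1, each winning node's weights jump exactly onto the current input, matching the hand computation in the exercises later in the deck.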
Reuse the Network
1. Set up the SOM network.
2. Read the weight matrix.
3. Set the coordinate values of the output layer Y(j,k).
4. Read an input vector.
5. Find the winning node Yjk.
6. Output the clustering result of Y.
Computation Process: Training
1. Set up the network. Input vector: X1~Xi; output (2-D) map: Yjk.
2. Compute the winning node (j*, k*):
net_jk = Σi (Xi − Wijk)², and the winner is the node with the minimum net_jk.
3. Yjk = 1 if j = j* and k = k*; Yjk = 0 otherwise.
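A minimal sketch of steps 2 and 3 (winner search and the one-hot output Y), using hand-made, hypothetical weight values:

```python
import numpy as np

def forward(W, x):
    """net_jk = sum_i (X_i - W_ijk)^2; Y is 1 at the winner (j*, k*), else 0."""
    net = ((np.asarray(x) - W) ** 2).sum(axis=-1)
    winner = np.unravel_index(net.argmin(), net.shape)
    Y = np.zeros(net.shape)
    Y[winner] = 1.0
    return net, Y, winner

# Hand-made 2x2 grid of 2-D weights (hypothetical values):
W = np.array([[[-0.2, -0.8], [0.2, -0.4]],
              [[ 0.3,  0.6], [0.7,  0.1]]])
net, Y, winner = forward(W, [-0.9, -0.8])
print(winner)   # (0, 0): the closest stored prototype to the input
```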
Computation Process (cont.)
4. Update weights: △Wijk = η·(Xi − Wijk)·RFjk, where Wijk ← Wijk + △Wijk.
5. Repeat steps 2 to 4 for all input vectors.
6. ηn = η_rate × ηn-1; Rn = R_rate × Rn-1.
7. Repeat until convergence.
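One update step can be checked numerically; the 3×3 grid, zero initial weights, and the assumed winner (2, 2) are illustrative values only:

```python
import numpy as np

# One update step on a 3x3 grid of 2-D weights.
eta, R = 1.0, 2.0
W = np.zeros((3, 3, 2))               # weights W_ijk (assumed zero here)
x = np.array([-0.9, -0.8])            # current input pattern
winner = np.array([2, 2])             # assumed winning node (j*, k*)

coords = np.stack(np.meshgrid(np.arange(3), np.arange(3),
                              indexing="ij"), axis=-1)
r = np.linalg.norm(coords - winner, axis=-1)   # grid distance r_jk to winner
RF = np.exp(-r / R)                            # neighborhood factor RF_jk
dW = eta * (x - W) * RF[..., None]             # dW_ijk = eta (X_i - W_ijk) RF_jk
W = W + dW                                     # W_ijk <- W_ijk + dW_ijk

print(W[2, 2])   # the winner moves all the way to x (eta = 1, RF = 1 there)
```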
SOM Exercise-1
Consider a 2-dimensional clustering problem. The vector space includes 5 different clusters, each with 2 samples, as shown below:

#    X1     X2
1   -0.9   -0.8
2    0.6
3    0.9
4    0.7   -0.4
5   -0.2    0.2
6   -0.7   -0.6
7    0.8
8
9
10   0.1

(scatter plot of the ten samples on the X1-X2 plane over [-1, 1])
SOM Exercise-2
Sol: Set up a 2×9 SOM network (2 inputs, 3×3 output grid) and randomly assign the weights:

Wi00  -0.2  -0.8
Wi01   0.2  -0.4
Wi02   0.3   0.6
Wi10
Wi11  -0.3
Wi12  -0.6
Wi20   0.7
Wi21   0.8
Wi22

(plot of the initial weight positions on the x1-x2 plane over [-1, 1])
SOM Exercise-3
Let R = 2.0 and η = 1.0, and present the first pattern [-0.9, -0.8]. Compute net_jk = Σi (Xi − Wijk)² for every output node:

net00 = (-0.9 - (-0.2))² + (-0.8 - (-0.8))² = 0.49
net01 = (-0.9 - 0.2)² + (-0.8 - (-0.4))² = 1.37
net02 = 2.32
net10 = 1.71
net11 = 1.36
net12 = 0.45
net20 = 1.04
net21 = 2.93
net22 = 0.05  (minimum)
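The first two net values can be verified directly from the table entries given for Wi00 and Wi01:

```python
# Check the two nets whose weights appear fully in the exercise table:
x = (-0.9, -0.8)
w00 = (-0.2, -0.8)   # W_i00 from the table
w01 = ( 0.2, -0.4)   # W_i01 from the table

net00 = (x[0] - w00[0]) ** 2 + (x[1] - w00[1]) ** 2
net01 = (x[0] - w01[0]) ** 2 + (x[1] - w01[1]) ** 2
print(round(net00, 2), round(net01, 2))   # 0.49 1.37
```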
SOM Exercise-4
Therefore net22 is the winning node, i.e., j* = 2, k* = 2. Update the weights using the grid distance rjk from each node to the winner (2, 2):

r00 = √((0-2)² + (0-2)²) ≈ 2.83   r01 = √5 ≈ 2.24   r02 = 2.0   …   r22 = 0

RF00 = e^(-2.83/2) = 0.243
RF01 = e^(-2.24/2) ≈ 0.327
⋮
RF22 = e^(0) = 1
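The neighborhood factors follow from RFjk = e^(-rjk/R) with Euclidean distances on the output grid (an assumption, but one consistent with the slide's RF00 = 0.243):

```python
import math

R = 2.0
winner = (2, 2)
r00 = math.dist((0, 0), winner)   # sqrt(8) ≈ 2.83
r01 = math.dist((0, 1), winner)   # sqrt(5) ≈ 2.24
r02 = math.dist((0, 2), winner)   # 2.0
r22 = 0.0
print(round(math.exp(-r00 / R), 3))   # 0.243  -> RF00
print(round(math.exp(-r01 / R), 3))   # 0.327  -> RF01
print(round(math.exp(-r02 / R), 3))   # 0.368  -> RF02
print(math.exp(-r22 / R))             # 1.0    -> RF22
```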
SOM Exercise-5
Update weights (continued). For each j, k we update the weight as follows:

△Wi00 = η·(X1 − Wi00)·RF00 = 1.0 × (-0.9 - (-0.2)) × 0.243 = -0.17
△Wi01 = η·(X1 − Wi01)·RF01
△Wi02 = η·(X1 − Wi02)·RF02
⋮
△Wijk = η·(Xi − Wijk)·RFjk, then Wijk ← Wijk + △Wijk, where rjk is the grid distance from node (j, k) to the winner.

(plot of the weight positions after the first update)
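The first weight change can be reproduced numerically from the values already computed:

```python
eta, RF00 = 1.0, 0.243    # RF00 from r00 = sqrt(8), R = 2
x1, w100 = -0.9, -0.2     # first input component and W_100 from the table
dW100 = eta * (x1 - w100) * RF00
print(round(dW100, 2))    # -0.17
```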
Input Vectors and Initial Weights
jk node numbering: 1→Y00, 2→Y01, 3→Y02; 4→Y10, 5→Y11, 6→Y12; 7→Y20, 8→Y21, 9→Y22
(plots of the input vectors and the initial weights)
Y22 wins, and the weights are updated toward the first input vector.
(plot of the weights after the first update; node numbering as above)
SOM Supplementary Videos
Touch-Based SOM: http://vimeo.com/7714780
Demo program simulating 3-D color with a 2-D SOM
Touch-Based SOM
Self-organizing topological map learning
Solving the Travelling Salesman Problem
Matlab SOM video tutorials
(video topics: image recognition, 3D graphics, 3D color bar chart, traditional grid map, video collection)