10-6 CONTROL CHARTS FOR MONITORING VARIABILITY
Alt (1985) gives a nice introduction to the problem and presents two useful procedures. The first procedure is a direct extension of the univariate control chart. The statistic plotted on the control chart for the $i$th sample is
$$W_i = -pn + pn\ln n - n\ln\!\left(\frac{|A_i|}{|\Sigma|}\right) + \operatorname{tr}\!\left(\Sigma^{-1}A_i\right),$$
where $A_i = (n-1)S_i$, $S_i$ is the sample covariance matrix of the $i$th sample, and $\Sigma$ is the in-control covariance matrix. The chart signals when $W_i$ exceeds
$$\mathrm{UCL} = \chi^2_{\alpha,\,p(p+1)/2}.$$
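A minimal Python sketch of how this statistic could be computed for one subgroup (the function name and the example values of p and alpha are illustrative, not from the slides):

```python
import numpy as np
from scipy.stats import chi2

def w_statistic(X, Sigma):
    """Alt's W statistic for one subgroup X with n rows (observations) and p columns (variables)."""
    n, p = X.shape
    A = (n - 1) * np.cov(X, rowvar=False)          # A_i = (n - 1) * S_i
    _, logdet_A = np.linalg.slogdet(A)
    _, logdet_Sigma = np.linalg.slogdet(Sigma)
    return (-p * n + p * n * np.log(n)
            - n * (logdet_A - logdet_Sigma)
            + np.trace(np.linalg.solve(Sigma, A)))  # tr(Sigma^-1 A_i)

# The chart signals when W_i exceeds the chi-square upper control limit
p, alpha = 2, 0.05
UCL = chi2.ppf(1 - alpha, p * (p + 1) / 2)
```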
The second approach is based on the sample generalized variance $|S|$. It can be shown that
$$E(|S|) = b_1|\Sigma| \quad\text{and}\quad V(|S|) = b_2|\Sigma|^2,$$
where
$$b_1 = \frac{1}{(n-1)^p}\prod_{i=1}^{p}(n-i)$$
and
$$b_2 = \frac{1}{(n-1)^{2p}}\prod_{i=1}^{p}(n-i)\left[\prod_{j=1}^{p}(n-j+2) - \prod_{j=1}^{p}(n-j)\right].$$
The parameters of the control chart for $|S|$ would be
$$\mathrm{UCL} = |\Sigma|\bigl(b_1 + 3b_2^{1/2}\bigr),\qquad \mathrm{CL} = b_1|\Sigma|,\qquad \mathrm{LCL} = |\Sigma|\bigl(b_1 - 3b_2^{1/2}\bigr),$$
with the LCL set to zero if it is negative. In practice $|\Sigma|$ will be estimated by $|S|/b_1$, where $S$ is a sample covariance matrix obtained while the process is believed to be in control.
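A short sketch of these calculations (the helper name is illustrative; `det_Sigma` stands for the known or estimated value of $|\Sigma|$):

```python
import numpy as np

def gv_chart_limits(n, p, det_Sigma):
    """Three-sigma control limits for the sample generalized variance |S|."""
    i = np.arange(1, p + 1)
    b1 = np.prod(n - i) / (n - 1) ** p
    b2 = np.prod(n - i) * (np.prod(n - i + 2) - np.prod(n - i)) / (n - 1) ** (2 * p)
    cl = b1 * det_Sigma
    ucl = det_Sigma * (b1 + 3 * np.sqrt(b2))
    lcl = max(0.0, det_Sigma * (b1 - 3 * np.sqrt(b2)))   # truncate negative LCL at zero
    return lcl, cl, ucl
```

The plotted statistic for each subgroup would then simply be the determinant of that subgroup's sample covariance matrix, e.g. `np.linalg.det(np.cov(X_i, rowvar=False))`.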
Example 10-1
10-7 LATENT STRUCTURE METHODS
Conventional multivariate control-charting procedures are reasonably effective as long as p is not very large. As p increases, the average run length required for a multivariate control chart to detect a specified shift in the mean of these variables also increases, because the shift is "diluted" in the p-dimensional space of the process variables.
10-7.1 Principal Components Analysis (PCA)
Methods for discovering the subdimensions in which the process moves about are sometimes called latent structure methods. The basic intent of principal components is to find the new set of orthogonal directions that define the maximum variability in the original data; hopefully, this will lead to a description of the process requiring considerably fewer than the original p variables.
X1 is spread over a wider range than X2, so we can infer that the variance of X1 is larger than the variance of X2.
Transform the X1, X2 coordinate axes into Z1, Z2.
Transform the coordinate axes X1, X2, …, Xp into Z1, Z2, …, Zp; that is, find coefficient vectors c1, c2, …, cp such that
$$z_i = c_{i1}x_1 + c_{i2}x_2 + \cdots + c_{ip}x_p,\qquad i = 1, 2, \ldots, p,$$
where the $c_i$ are orthonormal and the resulting $z_i$ are uncorrelated, with $\mathrm{Var}(z_1) \ge \mathrm{Var}(z_2) \ge \cdots \ge \mathrm{Var}(z_p)$.
In the transformed coordinate system, we only care about the axes that explain most of the variation.
By a standard theorem, the desired coordinate-transformation vectors are the eigenvectors of the covariance matrix, and the corresponding eigenvalues are the variances of the transformed variables. Therefore, we sort the eigenvalues from largest to smallest, $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p$, with corresponding eigenvectors $c_1, c_2, \ldots, c_p$. The ratio $\lambda_i/(\lambda_1 + \lambda_2 + \cdots + \lambda_p)$, the proportion of the total variation explained by $z_i$, is used to judge whether $Z_i$ should be included among the variables to monitor. Usually, if the original variables are measured on very different scales, it is best to standardize the original data before performing the transformation.
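A small sketch of this eigen-decomposition step (the data array X and function name are hypothetical; standardization is optional, as noted above):

```python
import numpy as np

def principal_components(X, standardize=False):
    """Eigen-decomposition of the sample covariance matrix of X (rows = observations)."""
    Z = X - X.mean(axis=0)
    if standardize:                                   # recommended when scales differ greatly
        Z = Z / X.std(axis=0, ddof=1)
    S = np.cov(Z, rowvar=False)                       # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(S)              # eigh: for symmetric matrices
    order = np.argsort(eigvals)[::-1]                 # sort eigenvalues from largest to smallest
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    explained = eigvals / eigvals.sum()               # lambda_i / (lambda_1 + ... + lambda_p)
    scores = Z @ eigvecs                              # transformed variables z_1, ..., z_p
    return eigvals, eigvecs, explained, scores
```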
Example:
Matrix of scatter plots of the original data: the first and second variables can be seen to be highly correlated.
Compute the sample covariance matrix of the original data, then find its eigenvalues and the corresponding eigenvectors.
Since the variances of the transformed variables Z1 and Z2 account for about 83% of the total variation, only these first two principal components need to be monitored.
95% confidence ellipse: since every point falls inside the confidence region, this plot can be used as a control chart to monitor whether the process is out of control. Such a plot is called a principal component trajectory plot.
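Because the principal component scores are uncorrelated with variances $\lambda_1$ and $\lambda_2$, a point can be checked against the 95% ellipse by comparing $z_1^2/\lambda_1 + z_2^2/\lambda_2$ with a chi-square(2) percentile (assuming approximately normal scores). A sketch, continuing the hypothetical names from the decomposition above:

```python
from scipy.stats import chi2

def inside_ellipse(scores, lam1, lam2, alpha=0.05):
    """Flag which (z1, z2) score pairs fall inside the 100(1 - alpha)% control ellipse."""
    z1, z2 = scores[:, 0], scores[:, 1]
    return z1 ** 2 / lam1 + z2 ** 2 / lam2 <= chi2.ppf(1 - alpha, df=2)
```

Newly collected observations would be centered with the original (phase I) mean, projected onto the same eigenvectors, and checked with the same function.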
Ten newly collected observations:
Many points fall outside the control limits, so the process is judged to be out of control.
10-7.2 Partial Least Squares (PLS)
PLS classifies the variables into x's (or inputs) and y's (or outputs). The goal is to create a set of weighted averages of the x's and y's that can be used for prediction of the y's or linear combinations of the y's. The procedure maximizes covariance in the same fashion that the principal component directions maximize variance. The most common applications of PLS today are in the chemometrics field, where there are often many variables, both process and response.
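A minimal illustration of this idea using scikit-learn's PLSRegression (the data arrays below are synthetic placeholders, not from the slides; in a chemometrics setting X might hold many process measurements and Y the responses):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))                         # 50 samples, 10 process variables (placeholder)
Y = X[:, :2] @ np.array([[1.0], [0.5]]) + 0.1 * rng.normal(size=(50, 1))

pls = PLSRegression(n_components=2)                   # two latent variables (weighted averages of the x's)
pls.fit(X, Y)
print(pls.x_weights_)                                 # directions chosen to maximize covariance with Y
print(pls.predict(X[:5]))                             # predicting the y's from the x's
```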