Chapter 9 Validation (Prof. Dehan Luo)

Section One: Motivation
Section Two: The Holdout Method
Section Three: Re-sampling Techniques
Section Four: Three-way Data Splits

Section One: Motivation

(1) Validation techniques are motivated by two fundamental problems in pattern recognition: model selection and performance estimation.
(2) Model selection
(a) Almost invariably, pattern recognition techniques have one or more free parameters, for example:

- The number of neighbors in a kNN classification rule
- The network size, learning parameters and weights in MLPs
(b) How do we select the "optimal" parameter(s) or model for a given classification problem?

Motivation (cont.)

(3) Performance estimation
- Once we have chosen a model, how do we estimate its performance?
- Performance is typically measured by the TRUE ERROR RATE, the classifier's error rate on the ENTIRE POPULATION.
(4) If we had access to an unlimited number of examples, these questions would have a straightforward answer:
- Choose the model that provides the lowest error rate on the entire population; that error rate is, of course, the true error rate.
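One compact way to state this (the notation below is added here for clarity; it does not appear on the slide): for a classifier g and the joint distribution D of feature vectors x and class labels y, the true error rate is

    E_true = P_{(x, y) ~ D} [ g(x) ≠ y ]

i.e. the probability of misclassification over the entire population. The validation methods in the rest of the chapter are ways of estimating this quantity from a finite sample.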

Motivation (cont.)

(5) In real applications we only have access to a finite set of examples, usually smaller than we would like.
- One approach is to use the entire training data to select the classifier and estimate the error rate.
- This naive approach has two fundamental problems:
(a) The final model will normally overfit the training data. This problem is more pronounced with models that have a large number of parameters.

Motivation (cont.)

(b) The error rate estimate will be overly optimistic (lower than the true error rate). In fact, it is not uncommon to have 100% correct classification on the training data.
(6) A much better approach is to split the training data into disjoint subsets: the holdout method.

Section Two: The Holdout Method

1. Split the dataset into two groups:
(1) Training set: used to train the classifier.
(2) Test set: used to estimate the error rate of the trained classifier.
2. A typical application of the holdout method is determining a stopping point for back-propagation training: training is stopped when the error on the held-out set stops decreasing (early stopping).
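A minimal sketch of the holdout method in Python (the synthetic data, the 70/30 split ratio, the kNN classifier and the use of scikit-learn are illustrative assumptions, not part of the original slides):

```python
# Holdout sketch: one split, one training run, one error estimate.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))               # 200 synthetic examples, 4 features
y = (X[:, 0] + X[:, 1] > 0).astype(int)     # synthetic class labels

# Training set: used to train the classifier.
# Test set: used to estimate the error rate of the trained classifier.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

clf = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
holdout_error = 1.0 - clf.score(X_test, y_test)   # error rate on the held-out test set
print(f"holdout error estimate: {holdout_error:.3f}")
```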

The Holdout Method (cont.)

3. The holdout method has two basic drawbacks:
(1) In problems where we have a sparse dataset, we may not be able to afford the "luxury" of setting aside a portion of the dataset for testing.
(2) Since it is a single train-and-test experiment, the holdout estimate of the error rate will be misleading if we happen to get an "unfortunate" split.

The Holdout Method (cont.)

4. The limitations of the holdout method can be overcome with a family of re-sampling methods, at the expense of more computation:
(1) Cross-validation
(a) Random subsampling
(b) K-fold cross-validation
(c) Leave-one-out cross-validation

Section Three: Re-sampling Techniques

1. Random subsampling
(1) Random subsampling performs K data splits of the dataset.
(2) Each split randomly selects a (fixed) number of examples without replacement to serve as the test set.

Re-sampling Techniques (cont.)

(3) For each data split we retrain the classifier from scratch with the training examples and estimate the error Ei with the test examples.
[Figure: the total set of examples, with a randomly chosen subset marked as test examples in each split.]

Random Subsampling (cont.)

- The true error estimate is obtained as the average of the separate estimates Ei.
- This estimate is significantly better than the holdout estimate.
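Written out, the averaging is E = (1/K)(E1 + E2 + ... + EK). A sketch under the same illustrative assumptions as the holdout example above (K, the test-set size and the kNN classifier are arbitrary choices for illustration):

```python
# Random subsampling sketch: K independent random splits, retrain from scratch
# each time, then average the K error estimates E_i.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def random_subsampling_error(X, y, K=10, test_size=0.3, seed=0):
    errors = []
    for i in range(K):
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=test_size, random_state=seed + i)      # a new random split
        clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)  # retrain from scratch
        errors.append(1.0 - clf.score(X_te, y_te))                 # E_i for this split
    return float(np.mean(errors))   # E = (1/K) * sum(E_i)
```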

2. K-fold cross-validation
(1) Create a K-fold partition of the dataset.
- For each of the K experiments, use K-1 folds for training and the remaining fold for testing.
[Figure: the dataset partitioned into K folds, with a different fold used as the test set in each experiment.]

K-fold Cross-validation (cont.)

(2) K-fold cross-validation is similar to random subsampling.
- The advantage of K-fold cross-validation is that all the examples in the dataset are eventually used for both training and testing.
(3) As before, the true error is estimated as the average error rate over the K test folds.
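A K-fold cross-validation sketch under the same illustrative assumptions (K = 10 and the kNN classifier are arbitrary; scikit-learn's KFold is one way to generate the partition):

```python
# K-fold cross-validation sketch: every example is used K-1 times for training
# and exactly once for testing; the true error is estimated by the average fold error.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neighbors import KNeighborsClassifier

def kfold_error(X, y, K=10):
    fold_errors = []
    for train_idx, test_idx in KFold(n_splits=K, shuffle=True, random_state=0).split(X):
        clf = KNeighborsClassifier(n_neighbors=5).fit(X[train_idx], y[train_idx])
        fold_errors.append(1.0 - clf.score(X[test_idx], y[test_idx]))
    return float(np.mean(fold_errors))   # average error over the K folds
```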

3. Leave-one-out cross-validation
(1) Leave-one-out is the degenerate case of K-fold cross-validation, where K is chosen as the total number of examples.
[Figure: each experiment holds out a single example for testing and trains on all the others.]

Leave-one-out Cross-validation (cont.)

- For a dataset with N examples, perform N experiments.
- For each experiment, use N-1 examples for training and the remaining example for testing.
(2) As usual, the true error is estimated as the average error rate on the test examples.
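Because leave-one-out is simply K-fold cross-validation with K = N, a sketch only needs a different split generator (scikit-learn's LeaveOneOut is equivalent to KFold with n_splits = N; the kNN classifier is again an illustrative choice):

```python
# Leave-one-out sketch: N experiments, each trained on N-1 examples
# and tested on the single remaining example.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.neighbors import KNeighborsClassifier

def loocv_error(X, y):
    errors = []
    for train_idx, test_idx in LeaveOneOut().split(X):
        clf = KNeighborsClassifier(n_neighbors=5).fit(X[train_idx], y[train_idx])
        errors.append(1.0 - clf.score(X[test_idx], y[test_idx]))  # 0 or 1 per example
    return float(np.mean(errors))   # average over the N single-example tests
```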

4. How many folds are needed?
(1) With a large number of folds:
(a) The bias of the true error rate estimator will be small (the estimator will be very accurate).

How many folds are needed? (cont.)

(1) With a large number of folds:
(b) The variance of the true error rate estimator will be large.
(c) The computational time will be very large as well (many experiments).

How many folds are needed? (cont.)

(2) With a small number of folds:
(a) The number of experiments and, therefore, the computation time are reduced.
(b) The variance of the estimator will be small.
(c) The bias of the estimator will be large (conservative, i.e. higher than the true error rate).

How many folds are needed? (cont.)

(3) In practice, the choice of the number of folds depends on the size of the dataset:
(a) For large datasets, even 3-fold cross-validation will be quite accurate.
(b) For very sparse datasets, we may have to use leave-one-out in order to train on as many examples as possible.
(4) A common choice for K-fold cross-validation is K = 10.

Section Four: Three-way Data Splits

1. Data splits
If model selection and true error estimates are to be computed simultaneously, the data needs to be divided into three disjoint sets:
(1) Training set:
(a) A set of examples used for learning: to fit the parameters of the classifier.

Data Splits (cont.)

(1) Training set (cont.):
(b) In the MLP case, we would use the training set to find the "optimal" weights with the back-propagation rule.

Data Splits (cont.)

(2) Validation set:
(a) A set of examples used to tune the parameters of a classifier.
(b) In the MLP case, we would use the validation set to find the "optimal" number of hidden units or to determine a stopping point for the back-propagation algorithm.
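A sketch of using the validation set to choose the stopping point (early stopping). The network size, the patience value and the use of scikit-learn's MLPClassifier with partial_fit are assumptions made here for illustration; a full implementation would also keep a copy of the best weights seen so far:

```python
# Early-stopping sketch: monitor the validation error after each pass of
# back-propagation and stop when it has not improved for `patience` passes.
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_with_early_stopping(X_train, y_train, X_val, y_val,
                              hidden_units=10, max_epochs=200, patience=10):
    clf = MLPClassifier(hidden_layer_sizes=(hidden_units,), random_state=0)
    classes = np.unique(y_train)
    best_err, best_epoch = np.inf, 0
    for epoch in range(max_epochs):
        clf.partial_fit(X_train, y_train, classes=classes)  # one back-prop pass
        val_err = 1.0 - clf.score(X_val, y_val)              # error on validation set
        if val_err < best_err:
            best_err, best_epoch = val_err, epoch
        elif epoch - best_epoch >= patience:                 # no recent improvement
            break
    return clf, best_err
```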

Data Splits (cont.)

(3) Test set:
(a) A set of examples used only to assess the performance of a fully trained classifier.
(b) In the MLP case, we would use the test set to estimate the error rate after we have chosen the final model (MLP size and actual weights).

Data Splits (cont.)

(3) Test set (cont.):
(c) After assessing the final model with the test set, YOU MUST NOT further tune the model.

2. Why separate test and validation sets?
(1) The error rate estimate of the final model on validation data will be biased (smaller than the true error rate), since the validation set is used to select the final model.
(2) After assessing the final model with the test set, YOU MUST NOT tune the model any further.

3. Procedure outline
1. Divide the available data into training, validation and test sets.
2. Select an architecture and training parameters.
3. Train the model using the training set.
4. Evaluate the model using the validation set.
5. Repeat steps 2 through 4 using different architectures and training parameters.
6. Select the best model and train it using data from the training and validation sets.
7. Assess this final model using the test set.
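A sketch of this whole procedure under illustrative assumptions (the synthetic data, the 50/25/25 split, and sweeping the number of kNN neighbors as the stand-in for "architectures and training parameters" are choices made here, not prescribed by the slides):

```python
# Three-way data split sketch: select the model on the validation set,
# retrain the winner on training + validation data, assess it once on the test set.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Step 1: divide the available data into training, validation and test sets.
X_tmp, X_test, y_tmp, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X_tmp, y_tmp, test_size=1/3, random_state=0)

# Steps 2-5: train each candidate on the training set, evaluate it on the validation set.
candidates = [1, 3, 5, 7, 9]     # stand-ins for different architectures / parameters
val_errors = {}
for k in candidates:
    clf = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    val_errors[k] = 1.0 - clf.score(X_val, y_val)

# Step 6: select the best model and retrain it on training + validation data.
best_k = min(val_errors, key=val_errors.get)
final_clf = KNeighborsClassifier(n_neighbors=best_k).fit(
    np.vstack([X_train, X_val]), np.concatenate([y_train, y_val]))

# Step 7: assess the final model once on the test set; do not tune it further.
test_error = 1.0 - final_clf.score(X_test, y_test)
print(f"selected k = {best_k}, test error estimate: {test_error:.3f}")
```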

4. Three-way data split (picture)
[Figure: diagram of the three-way data split procedure; the image is not included in this transcript.]