Deep Metric Learning: Improved Deep Metric Learning with Multi-class N-pair Loss Objective (Kihyuk Sohn, NIPS 2016) and Deep Relative Distance Learning: Tell the Difference Between Similar Vehicles (Liu H, Tian Y, Yang Y, et al., CVPR 2016)


1 Deep Metric Learning
Improved Deep Metric Learning with Multi-class N-pair Loss Objective. Kihyuk Sohn. NIPS 2016.
Deep Relative Distance Learning: Tell the Difference Between Similar Vehicles. Liu H, Tian Y, Yang Y, et al. CVPR 2016.

2 Outline
Improved Deep Metric Learning with Multi-class N-pair Loss Objective
  Distance Metric Learning
  Deep Metric Learning with Multiple Negative Examples
  Experimental Results
Deep Relative Distance Learning: Tell the Difference Between Similar Vehicles
  Deep Relative Distance Learning

3 Distance Metric Learning
Contrastive loss; Triplet loss. The triplet loss only requires that the anchor's distance to the negative example exceed its distance to the positive example by at least the margin m.
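The loss formulas were lost with the slide images; their standard forms (a reconstruction, with f denoting the embedding and m the margin) are:

```latex
% Contrastive loss for a pair (x_i, x_j), with y = 1 if same class, 0 otherwise:
L_{\text{contrastive}} = y\,\|f_i - f_j\|_2^2
  + (1 - y)\,\max\!\big(0,\; m - \|f_i - f_j\|_2\big)^2

% Triplet loss for anchor x, positive x^+, negative x^-:
L_{\text{triplet}} = \max\!\big(0,\; \|f - f^+\|_2^2 - \|f - f^-\|_2^2 + m\big)
```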

4 hard negative data mining
An obvious way to improve the vanilla triplet loss is to select negative examples that violate the triplet constraint. However, hard negative data mining can be expensive when deep metric learning involves a large number of output classes.

5 Distance Metric Learning
Triplet loss. However, during one update the triplet loss compares an example with only one negative example, ignoring the negative examples from the rest of the classes.

6 Learning to identify from multiple negative examples
When N = 2:

7 Learning to identify from multiple negative examples
When N > 2, the (L+1)-tuplet loss coupled with a single example per negative class can be written as follows; its denominator is the partition function of the likelihood P(y = y+).
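The formula itself did not survive the transcript; from the surrounding description, the (L+1)-tuplet loss for an embedding f with positive f^+ and one example f_i per negative class can be reconstructed as:

```latex
\mathcal{L}_{(L+1)\text{-tuplet}}
  = -\log \frac{\exp\!\big(f^{\top} f^{+}\big)}
               {\exp\!\big(f^{\top} f^{+}\big) + \sum_{i=1}^{L} \exp\!\big(f^{\top} f_i\big)}
  = \log\Big(1 + \sum_{i=1}^{L} \exp\!\big(f^{\top} f_i - f^{\top} f^{+}\big)\Big)
```

The denominator of the softmax form is exactly the partition function of P(y = y+) mentioned above.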

8 Deep Metric Learning with Multiple Negative Examples
[Figure: triplet loss vs. (N+1)-tuplet loss]

9 N-pair loss for efficient deep metric learning
The number of examples to evaluate per batch grows quadratically with the batch size M and the number of negatives N, so it again becomes impractical to scale training for very deep convolutional networks. We therefore introduce an efficient batch construction that avoids this excessive computational burden: take N pairs of examples from N different classes, where each pair contributes the positive example for its own class and serves as a negative example for all other pairs.

10 N-pair loss for efficient deep metric learning
multi-class N-pair loss (N-pair-mc), and, as a triplet-loss-style baseline, the one-vs-one N-pair loss (N-pair-ovo), which applies a separate logistic comparison to each negative pair.
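As a concrete illustration, here is a minimal numpy sketch of the two losses (the function names and the plain dot-product similarity are my own choices; the paper applies these objectives to L2-regularized deep embeddings):

```python
import numpy as np

def n_pair_mc_loss(anchors, positives):
    """Multi-class N-pair loss: row i of `positives` is the positive for
    anchor i and acts as a negative for every other anchor.

    L = (1/N) * sum_i log(1 + sum_{j != i} exp(f_i.f_j+ - f_i.f_i+)),
    i.e. softmax cross-entropy with targets on the diagonal.
    """
    logits = anchors @ positives.T                       # (N, N) similarities
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_softmax = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_softmax))

def n_pair_ovo_loss(anchors, positives):
    """One-vs-one N-pair loss: a separate logistic (triplet-style) term
    for every (anchor, negative) pair, averaged over all such pairs."""
    logits = anchors @ positives.T
    diff = logits - np.diag(logits)[:, None]             # f_i.f_j+ - f_i.f_i+
    off_diag = ~np.eye(len(anchors), dtype=bool)
    return np.mean(np.log1p(np.exp(diff[off_diag])))
```

With well-separated embeddings (each anchor matching only its own positive) both losses approach zero, and they grow as negatives become similar to the anchor.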

11 N-pair loss for efficient deep metric learning
Ideally, we would like the loss function to incorporate examples from every class at once, but this is usually unattainable for large-scale deep metric learning due to the memory bottleneck of neural-network-based embedding.

12 Hard negative class mining
1. Evaluate embedding vectors: randomly choose a large number C of output classes; for each class, pass a few (one or two) randomly chosen examples through the network to extract their embedding vectors.
2. Select negative classes: select one class at random from the C classes of step 1; then greedily add the new class that most violates the triplet constraint w.r.t. the already-selected classes, until N classes are reached. Ties are broken randomly.
3. Finalize the N-pair: draw two examples from each class selected in step 2.
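Step 2 can be sketched as a greedy selection over per-class embeddings. This is a simplified sketch: `select_negative_classes` is a hypothetical helper, and the dot-product "violation" score stands in for the paper's exact triplet-violation criterion:

```python
import numpy as np

def select_negative_classes(class_embs, n_classes, rng=None):
    """Greedily pick `n_classes` hard negative classes (illustrative sketch).

    class_embs: (C, d) array, one representative embedding per candidate class.
    Returns a list of `n_classes` distinct class indices.
    """
    rng = rng or np.random.default_rng()
    C = len(class_embs)
    selected = [int(rng.integers(C))]          # start from one random class
    while len(selected) < n_classes:
        remaining = [c for c in range(C) if c not in selected]
        # Higher similarity to an already-selected class = harder negative,
        # i.e. a stronger violation of the triplet constraint.
        scores = [max(class_embs[c] @ class_embs[s] for s in selected)
                  for c in remaining]
        best = max(scores)
        ties = [c for c, sc in zip(remaining, scores) if sc == best]
        selected.append(int(rng.choice(ties))) # break ties randomly (step 2)
    return selected
```

Step 3 then simply draws two examples from each selected class to form the N-pair batch.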

13 Experimental Results Fine-grained visual object recognition and verification

14 Experimental Results Distance metric learning for unseen object recognition
[21] Song H O, Xiang Y, Jegelka S, et al. Deep metric learning via lifted structured feature embedding. arXiv preprint, 2015.

15 Experimental Results Face verification and identification
We train our networks on the WebFace database, which is composed of 494,414 images from 10,575 identities, and evaluate the quality of embedding networks trained with different metric learning objectives on the Labeled Faces in the Wild (LFW) database.

16 Experimental Results

17 Deep Relative Distance Learning: Tell the Difference Between Similar Vehicles
Liu H, Tian Y, Yang Y, et al. CVPR 2016.

18 Vehicle re-identification task
Vehicle re-identification is the problem of identifying the same vehicle across different surveillance camera views, i.e., finding among the candidate vehicles the one that matches the probe vehicle.

19 Contribution
We present a new vehicle re-identification dataset named "VehicleID"; the dataset includes over 200,000 images of about 26,000 vehicles. We propose an end-to-end framework, DRDL, suited for both vehicle retrieval and vehicle re-identification tasks. The basic idea of DRDL is to minimize the distance between images of the same vehicle while maximizing the distance to other vehicles.

20 Framework Framework of our model for vehicle re-identification

21 Triplet Loss
Triplet loss function. Some special cases in which the triplet loss may judge falsely when processing randomly selected triplet units. On the left of the figure, the distance between the blue "Anchor" point and the same-label red "Positive" point is greater than the distance between the Anchor and the "Negative", so the loss function can easily learn from this triplet. The right side of the figure is different: the triplet loss is 0, because the distance between the blue Anchor and the same-label red Positive is already smaller than the distance between the Anchor and the differently-labeled green "Negative"; the network therefore ignores this triplet during backpropagation. Moreover, since backpropagating the triplet loss effectively pulls same-label points (Anchor and Positive) ever closer together and pushes different-label points (Anchor and Negative) ever farther apart, the loss is quite sensitive to the choice of Anchor point. A poorly chosen Anchor can severely disturb training, making the network converge slowly and requiring many correct triplets to compensate.
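The zero-loss case above can be checked numerically. This is a toy sketch with a hypothetical margin; the points are illustrative, not from the paper:

```python
import numpy as np

def triplet_loss(anchor, pos, neg, m=1.0):
    """Standard triplet loss with squared Euclidean distances."""
    d_ap = np.sum((anchor - pos) ** 2)
    d_an = np.sum((anchor - neg) ** 2)
    return max(0.0, d_ap - d_an + m)

a, p = np.zeros(2), np.array([0.1, 0.0])
# Easy triplet: the negative is already far away, the loss is 0,
# so backpropagation ignores this triplet, as described above.
n_far = np.array([5.0, 0.0])
# Hard triplet: the negative is closer to the anchor than the positive,
# so the loss is positive and the network learns from it.
n_near = np.array([0.05, 0.0])
```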

22 Coupled Clusters Loss
The positive set and the negative set. It is assumed that samples belonging to the same identity should lie around a common center point in the d-dimensional Euclidean space.

23 Coupled Clusters Loss Estimate the center point as the mean value of all positive samples The relative distance relationship is reflected as

24 Coupled Clusters Loss The coupled clusters loss:
where the negative reference is the nearest negative sample to the center point.
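The center estimate, the relative-distance constraint (Eq. 5), and the loss itself were images in the original slides; following the surrounding description they can be reconstructed as (the symbols are my reconstruction):

```latex
c^{p} = \frac{1}{N_p} \sum_{i=1}^{N_p} f\!\left(x_i^{p}\right)

\left\|f\!\left(x_i^{p}\right) - c^{p}\right\|_2^{2} + m
  < \left\|f\!\left(x_{*}^{n}\right) - c^{p}\right\|_2^{2}
  \qquad \text{(Eq. 5)}

\mathcal{L}_{\text{ccl}}
  = \sum_{i=1}^{N_p} \tfrac{1}{2}
    \max\!\Big(0,\; \left\|f\!\left(x_i^{p}\right) - c^{p}\right\|_2^{2} + m
                 - \left\|f\!\left(x_{*}^{n}\right) - c^{p}\right\|_2^{2}\Big)
```

where x_*^n denotes the negative sample nearest to the center c^p.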

25 Coupled Clusters Loss The advantages of the coupled clusters loss:
Distances are measured between samples and a cluster center, which determines both the distances obtained and the direction in which samples will be moved: all positive samples that are not close enough to the center are guaranteed to move closer. Selecting the nearest negative sample further prevents the relative distance relationship Eq. (5) from being satisfied too easily, compared with a randomly selected negative reference.
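A minimal numpy sketch of this loss (the function name and exact hinge form are my own reconstruction from the description, not the paper's code):

```python
import numpy as np

def coupled_clusters_loss(pos, neg, m=1.0):
    """Coupled clusters loss over one positive set and one negative set.

    pos: (Np, d) embeddings of the same identity.
    neg: (Nn, d) embeddings of other identities.
    """
    center = pos.mean(axis=0)                       # mean of positive samples
    d_pos = np.sum((pos - center) ** 2, axis=1)     # each positive -> center
    d_neg = np.sum((neg - center) ** 2, axis=1)
    nearest_neg = d_neg.min()                       # nearest negative sample
    # Hinge: every positive should be closer to the center than the
    # nearest negative is, by at least the margin m.
    return np.maximum(0.0, d_pos + m - nearest_neg).mean()
```

When the positives cluster tightly and the nearest negative is far from their center, the loss vanishes; a negative close to the center makes it positive.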

26 Mixed Difference Network Structure
There is a small but quite important difference between identifying a specific vehicle and a specific person: two vehicles on the road may have the same visual appearance if they belong to the same vehicle model, and only some special markers can then distinguish them. When measuring the distance between two vehicles we therefore consider two aspects: first, whether they belong to the same vehicle model, and second, whether they are the same vehicle.

27 Mixed Difference Network Structure
A single-branch network architecture cannot simultaneously extract both the vehicle-model information and the instance difference between two candidates of the same vehicle model.

28 VehicleID Dataset
The "VehicleID" dataset contains images of vehicles in total. Since this dataset is much larger than previous person re-identification test data, it is further divided into small, medium, and large test subsets.

29 Experiments Vehicle Model Verification
We first perform the vehicle model verification task on the "CompCars" dataset, following the face verification pipeline, to give an overview of our proposed method. Given two vehicle images, we need to verify whether they belong to the same vehicle model. Note that neither our method nor the standard triplet-loss network is designed for vehicle model verification or classification, so we do not really expect our model to achieve results comparable with other mature solutions.

30 Experiments Vehicle Retrieval

31 Experiments Vehicle Re-identification

32 Experiments Vehicle Re-identification

33 THANK YOU

