Unsupervised Feature Learning via Non-Parametric Instance Discrimination
jinyeah 2022. 5. 9. 20:28

Inspiration
- The top-5 classification error is significantly lower than the top-1 error
- The second-highest responding class in the softmax output for an image is more likely to be visually correlated with it

Apparent similarity is learned not from semantic annotations, but from the visual data themselves
Instance-level discrimination
- An image is distinctive in its own right, and each image can differ significantly from other images in the same semantic category
- Major challenge - treating each instance as its own class means the number of classes equals the size of the entire training set
Method
1. Non-Parametric Softmax Classifier
- {Wj}: the weight vector for class j, {Vj}: the embedded feature of image j
- use {Vj} instead of {Wj}, so an instance is classified by comparing feature embeddings directly (see the formulation below)
- {Vj} for all the images are needed --> maintain a feature memory bank for storing them
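With the class weights replaced by instance features, the paper's non-parametric softmax gives the probability that a feature v is recognized as the i-th instance as (up to notation):

```latex
P(i \mid v) = \frac{\exp(v_i^\top v / \tau)}{\sum_{j=1}^{n} \exp(v_j^\top v / \tau)}
```

where n is the number of training images, v_i is read from the memory bank, and the temperature τ (0.07 in the paper) controls the concentration of the distribution.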
2. Noise-Contrastive Estimation (NCE)
- Computing the non-parametric softmax for every instance is costly, since the denominator sums over the entire training set
- cast the multi-class classification into a set of binary classification problems (positive/negative)
- discriminate between data samples and noise samples (see the formulation below)
- data samples (positive samples): the stored embedded feature of image i in the memory bank
- noise samples (negative samples): embedded features randomly sampled from the memory bank for other images
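Concretely, with m noise samples per data sample drawn from a uniform noise distribution P_n(i) = 1/n, the posterior that instance i with feature v comes from the data distribution, and the training objective, are:

```latex
h(i, v) = \frac{P(i \mid v)}{P(i \mid v) + m \, P_n(i)},
\qquad
J(\theta) = -\,\mathbb{E}_{P_d}\big[\log h(i, v)\big]
            \;-\; m\,\mathbb{E}_{P_n}\big[\log\big(1 - h(i, v')\big)\big]
```

where v' is the embedded feature of a randomly sampled noise image. Minimizing J(θ) only touches m negatives per update instead of summing over all n instances.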
3. [Test] Weighted k-Nearest Neighbor Classifier
- compute the embedded feature of the test image
- compare it against the embeddings of all the images in the memory bank using cosine similarity
- the top k nearest neighbors are then used to make the prediction via weighted voting (k=200), as in the sketch below
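A minimal NumPy sketch of this weighted voting, assuming L2-normalized features and the paper's exp(s/τ) weighting; the names (memory_bank, labels, etc.) are mine, not from the paper's code:

```python
import numpy as np

def weighted_knn_predict(test_feat, memory_bank, labels, num_classes,
                         k=200, tau=0.07):
    """Predict a label for one test feature via weighted k-NN voting.

    test_feat:   (D,)   L2-normalized embedding of the test image
    memory_bank: (N, D) L2-normalized embeddings of all training images
    labels:      (N,)   class labels of the training images
    """
    sims = memory_bank @ test_feat          # cosine similarity (unit-norm features)
    topk = np.argsort(sims)[-k:]            # indices of the k nearest neighbors
    weights = np.exp(sims[topk] / tau)      # similarity-based vote weights
    scores = np.zeros(num_classes)
    for idx, w in zip(topk, weights):
        scores[labels[idx]] += w            # accumulate weighted votes per class
    return int(np.argmax(scores))
```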
Experiment
1. Setting
- backbone network (AlexNet, VGG16, ResNet18, ResNet50) with linear classification (SVM) for evaluation
- Dataset: ImageNet ILSVRC
2. Evaluate Performance
- Perform linear SVM on the intermediate features from conv1 to conv5
- Perform weighted kNN on the output features (see the sketch below)
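The linear-probe setting amounts to freezing the backbone, extracting features at a given layer, and training a linear classifier on top. A rough scikit-learn sketch with placeholder data standing in for real extracted features (array shapes and names are hypothetical):

```python
import numpy as np
from sklearn.svm import LinearSVC

# Placeholder arrays standing in for conv-layer activations of a frozen
# backbone (in practice: extract, average-pool, and flatten real features).
rng = np.random.default_rng(0)
train_feats = rng.normal(size=(1000, 256))
train_labels = rng.integers(0, 10, size=1000)
test_feats = rng.normal(size=(200, 256))
test_labels = rng.integers(0, 10, size=200)

clf = LinearSVC(C=1.0, max_iter=10000)      # linear probe on frozen features
clf.fit(train_feats, train_labels)
print("top-1 accuracy:", clf.score(test_feats, test_labels))
```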
Reference
- https://velog.io/@dongdori/InfoNCE-Metric-Learning
- https://www.youtube.com/watch?v=wyBzB9iRveI