Paper Review (2)

[Paper] Distilling the Knowledge in a Neural Network: an early Knowledge Distillation paper (2014 NIPS workshop) and a model compression technique for bringing computation to edge devices, where the goal is to have a small, compact model mimic the performance of a cumbersome model. Supervised learning (labels available), response-based knowledge (the teacher's soft targets) + offline distillation (pre-trained teacher..
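A minimal sketch of this setup (response-based knowledge from a frozen, pre-trained teacher, i.e. offline distillation), assuming PyTorch; the `teacher`, `student`, temperature `T`, and weight `alpha` below are illustrative, not from the post:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Response-based KD: match the teacher's softened outputs plus the usual hard-label loss."""
    # Soft targets: KL divergence between temperature-scaled distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude stays comparable across temperatures
    # Hard targets: standard supervised cross-entropy with the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Offline distillation: the teacher is pre-trained and kept frozen.
# teacher.eval()
# with torch.no_grad():
#     teacher_logits = teacher(x)
# loss = distillation_loss(student(x), teacher_logits, y)
```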

Inspiration: the top-5 classification error is significantly lower than the top-1 error, and the second-highest responding class in the softmax output for an image is more likely to be visually correlated. Apparent similarity is learned not from semantic annotations but from the visual data themselves. Instance-level discrimination: an image is distinctive in its own right, and each could differ significantly from o..
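A minimal sketch of the non-parametric softmax behind instance-level discrimination, assuming PyTorch and a memory bank of L2-normalized instance features; the names `memory_bank`, `indices`, and the temperature `tau` are illustrative, and the original paper further approximates this full softmax with noise-contrastive estimation:

```python
import torch
import torch.nn.functional as F

def nonparametric_softmax_loss(features, indices, memory_bank, tau=0.07):
    """Instance-level discrimination: every image is its own class, and the
    class 'weights' are the stored instance embeddings themselves."""
    v = F.normalize(features, dim=1)        # (B, D) current batch embeddings
    logits = v @ memory_bank.t() / tau      # (B, N) similarity to all N instances
    # The positive class for each sample is its own index in the dataset.
    return F.cross_entropy(logits, indices)

# memory_bank: (N, D) tensor of L2-normalized features, typically refreshed
# with a momentum update after each step, e.g.:
# memory_bank[indices] = F.normalize(
#     0.5 * memory_bank[indices] + 0.5 * v.detach(), dim=1)
```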