Posts tagged: temperature (1)
Let's Run Jinyeah

[Paper] Distilling the Knowledge in a Neural Network — an early Knowledge Distillation paper (2014 NIPS workshop). A model compression technique for bringing computation to edge devices, where the goal is for a small, compact model to mimic the performance of a cumbersome model. Supervised learning (labels available); response-based knowledge (the teacher's soft targets) + offline distillation (pre-trained Teacher..
Paper Review/Knowledge Distillation
2022. 5. 10. 13:06
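
As a rough illustration of the response-based, soft-target objective the excerpt describes, here is a minimal PyTorch-style sketch of the distillation loss from Hinton et al. The function name `distillation_loss` and the values `T=4.0` and `alpha=0.9` are illustrative assumptions, not taken from the post.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # T and alpha are illustrative choices, not values from the post.
    # Soft-target term: KL divergence between the temperature-scaled
    # teacher and student distributions (response-based knowledge).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescales gradients, as in Hinton et al.
    # Hard-target term: ordinary cross-entropy with the true labels
    # (possible because this is the supervised, labels-available setting).
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

With T > 1 the softmax is softened, so the teacher's relative probabilities over wrong classes carry more of the signal the student learns from.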