- 3d medical image
- Knowledge Distillation
- rest-api
- noise contrastive estimation
- thresholding
- pulloff
- Inorder Traversal
- shadowing
- straightup
- REINFORCE
- loss functions
- clip intensity values
- Actor-Critic
- resample
- MRI
- Policy Gradient
- objective functions for machine learning
- checkitout
- sidleup
- model-free control
- Excel
- data structures
- scowl
- normalization
- remove outliers
- domain adaptation
- freebooze
- fastapi
- non-parametric softmax
- sample rows
Post list: Paper Review (2)
Let's Run Jinyeah
Distilling the Knowledge in a Neural Network
[Paper] Distilling the Knowledge in a Neural Network: an early Knowledge Distillation paper (2014 NIPS workshop) and a model compression technique for bringing computation to edge devices, where the goal is to have a small, compact model mimic the performance of the cumbersome model. Supervised learning (labels available); response-based knowledge (the Teacher's soft targets) + offline distillation (pre-trained Teacher..
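The excerpt describes response-based distillation using the teacher's softened outputs. A minimal sketch of such a loss, assuming PyTorch; the function name and the temperature/weighting values below are my own illustration, not code from the post:

```python
# Response-based knowledge distillation loss (after Hinton et al., 2014):
# KL divergence to the teacher's temperature-softened outputs,
# combined with the usual hard-label cross-entropy on the student.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soft targets: teacher and student distributions softened by temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    # KL term scaled by T^2 so gradient magnitudes stay comparable to the hard term.
    soft_loss = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)
    # Standard cross-entropy on the student's unsoftened logits.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# Toy usage: random logits for an 8-sample batch over 10 classes.
if __name__ == "__main__":
    student = torch.randn(8, 10)
    teacher = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    print(distillation_loss(student, teacher, labels))
```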
Unsupervised Feature Learning via Non-Parametric Instance Discrimination
Inspiration: the top-5 classification error is significantly lower than the top-1 error, and the second-highest responding class in the softmax output for an image is more likely to be visually correlated. Apparent similarity is learned not from semantic annotations but from the visual data themselves. Instance-level discrimination: an image is distinctive in its own right, and each could differ significantly from o..
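The excerpt refers to the non-parametric softmax behind instance-level discrimination: every image is its own class, and L2-normalized features stored in a memory bank replace the class weight vectors (the paper approximates the sum over all instances with noise-contrastive estimation). A minimal sketch assuming PyTorch, with function and variable names and toy sizes of my own choosing:

```python
# Non-parametric softmax loss for instance discrimination (after Wu et al., 2018):
# logits are similarities between batch features and the memory bank of all
# instance embeddings, divided by a temperature tau; each image should match
# its own memory-bank slot.
import torch
import torch.nn.functional as F

def non_parametric_softmax_loss(features, indices, memory_bank, tau=0.07):
    # features:    (B, D) L2-normalized embeddings of the current batch
    # indices:     (B,)   instance ids of those images (their "class" labels)
    # memory_bank: (N, D) L2-normalized embeddings of all N training images
    logits = features @ memory_bank.t() / tau   # (B, N) similarity to every instance
    return F.cross_entropy(logits, indices)     # instance i should score highest on slot i

# Toy usage: a memory bank of 100 instances with 16-dim features.
if __name__ == "__main__":
    N, D, B = 100, 16, 4
    memory_bank = F.normalize(torch.randn(N, D), dim=1)
    feats = F.normalize(torch.randn(B, D), dim=1)
    idx = torch.randint(0, N, (B,))
    print(non_parametric_softmax_loss(feats, idx, memory_bank))
```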
