Posts tagged "loss functions" (1)
To improve the performance of a deep learning model, the goal is to minimize or maximize an objective function. For regression and classification problems, the objective is to minimize the difference between predictions and ground truths, so the objective function is also called a loss function. Regression Loss Functions: Squared Error Loss, Absolute Error Loss, Huber..
Deep Learning/Theory
2022. 5. 10. 13:14
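Below is a minimal NumPy sketch of the regression losses named in the excerpt (squared error, absolute error, and Huber). The function names and the Huber `delta` threshold are illustrative choices, not taken from the post itself.

```python
import numpy as np

def squared_error_loss(y_true, y_pred):
    # Mean squared error: penalizes residuals quadratically.
    return np.mean((y_true - y_pred) ** 2)

def absolute_error_loss(y_true, y_pred):
    # Mean absolute error: penalizes residuals linearly, more robust to outliers.
    return np.mean(np.abs(y_true - y_pred))

def huber_loss(y_true, y_pred, delta=1.0):
    # Huber loss: quadratic for small residuals, linear once |residual| > delta.
    r = y_true - y_pred
    quadratic = 0.5 * r ** 2
    linear = delta * (np.abs(r) - 0.5 * delta)
    return np.mean(np.where(np.abs(r) <= delta, quadratic, linear))

# Example usage
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])
print(squared_error_loss(y_true, y_pred))   # 0.375
print(absolute_error_loss(y_true, y_pred))  # 0.5
print(huber_loss(y_true, y_pred))           # 0.1875
```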