Let's Run Jinyeah

To improve the performance of a deep learning model, the goal is to minimize or maximize an objective function. For regression and classification problems, the objective is to minimize the difference between the predictions and the ground truths, which is why the objective function is also called a loss function.

Regression Loss Functions
- Squared Error Loss
- Absolute Error Loss
- Huber..
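A minimal NumPy sketch of the regression losses named above (the function names and the `delta=1.0` default are my own choices for illustration, not from the post):

```python
import numpy as np

def squared_error_loss(y_true, y_pred):
    # Mean squared error: penalizes large residuals quadratically.
    return np.mean((y_true - y_pred) ** 2)

def absolute_error_loss(y_true, y_pred):
    # Mean absolute error: penalizes residuals linearly, more robust to outliers.
    return np.mean(np.abs(y_true - y_pred))

def huber_loss(y_true, y_pred, delta=1.0):
    # Huber loss: quadratic for small residuals, linear once |residual| > delta.
    residual = y_true - y_pred
    squared = 0.5 * residual ** 2
    linear = delta * (np.abs(residual) - 0.5 * delta)
    return np.mean(np.where(np.abs(residual) <= delta, squared, linear))

# Hypothetical example: the last target is an outlier,
# so the absolute and Huber losses grow more slowly than the squared loss.
y_true = np.array([1.0, 2.0, 3.0, 10.0])
y_pred = np.array([1.1, 1.9, 3.2, 4.0])
print(squared_error_loss(y_true, y_pred))
print(absolute_error_loss(y_true, y_pred))
print(huber_loss(y_true, y_pred, delta=1.0))
```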
Deep Learning/Theory
2022. 5. 10. 13:14