Normalization
What is the normalization formula used for?
Normalization is useful in statistics for creating a common scale to compare data sets with very different values.
Deep Learning view?
- Training stability: mitigates the gradient vanishing/exploding problem
- Shorter training time: allows a larger learning rate
- Better performance: helps escape local optima faster
Min-Max Normalization
Method
Normalization formula to [0, 1]: x_normalized = (x - x_min) / (x_max - x_min)
- if x == x_min, x_normalized = 0
- if x == x_max, x_normalized = 1
Normalization formula for a custom range [a, b]: x_normalized = a + ((x - x_min) * (b - a)) / (x_max - x_min)
- if x == x_min, x_normalized = a
- if x == x_max, x_normalized = a + (b - a) = b
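As a quick illustration of both formulas, here is a minimal NumPy sketch; the function name and sample values are my own, not from the original post.

import numpy as np

def min_max_normalize(x, a=0.0, b=1.0):
    # Rescale x to [a, b]; the defaults give plain [0, 1] min-max normalization.
    x = np.asarray(x, dtype=float)
    x_min, x_max = x.min(), x.max()
    return a + (x - x_min) * (b - a) / (x_max - x_min)

values = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
print(min_max_normalize(values))         # [0.   0.25 0.5  0.75 1.  ]
print(min_max_normalize(values, -1, 1))  # [-1.  -0.5  0.   0.5  1. ]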
Summary
- guaranteed to rescale every feature to lie between 0 and 1
- a drawback of min-max scaling is that it is highly influenced by the minimum and maximum values in the data, so outliers bias the result
- outliers can compress all the inliers into a narrow range (see the sketch after this list)
- In the two-feature example from the referenced article, normalizing fixed the squishing problem on the y-axis, but the x-axis is still problematic: if we compare points, the y-axis dominates, since it can differ by 1 while the x-axis can only differ by 0.4.
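To see the outlier problem concretely, here is a tiny sketch (the values are made up for illustration):

import numpy as np

# One extreme value (1000) drags x_max up, so after min-max normalization
# every inlier is squeezed into a narrow band near 0.
x = np.array([1.0, 2.0, 3.0, 4.0, 1000.0])
print((x - x.min()) / (x.max() - x.min()))  # [0.    0.001 0.002 0.003 1.   ]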
Standardization
Method
scales the features so that they have μ=0 and σ=1
- μ : mean value of the feature
- σ : standard deviation of the feature
Z-Score Normalization
Normalization formula: x_normalized = (x - μ) / σ
- if x equals the mean of the feature, x_normalized = 0
- if x is below the mean of the feature, x_normalized < 0
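A matching NumPy sketch of the z-score formula (the function name and sample values are mine):

import numpy as np

def standardize(x):
    # Z-score: subtract the feature mean, divide by its standard deviation.
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

values = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
z = standardize(values)
print(z)                  # [-1.414 -0.707  0.     0.707  1.414]
print(z.mean(), z.std())  # ~0.0 and 1.0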
Summary
- avoids the outlier issue min-max scaling has: one extreme value cannot squeeze the rest of the data into a narrow range
- the magnitude of the resulting negative and positive values is determined by the feature's standard deviation (with a large standard deviation, the normalized values sit closer to 0)
PyTorch Normalization (Standardization)
- assigns a mean and std value per channel
import torchvision

# ToTensor converts a PIL image to a float tensor scaled to [0, 1];
# Normalize then applies (x - mean) / std independently to each channel.
transform = torchvision.transforms.Compose([
    torchvision.transforms.ToTensor(),
    torchvision.transforms.Normalize(
        mean=[mean_1, mean_2, mean_3],  # per-channel means (placeholders)
        std=[std_1, std_2, std_3],      # per-channel stds (placeholders)
    ),
])
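The per-channel statistics are normally computed from the training data itself. Below is a minimal sketch of one way to do that; the CIFAR-10 dataset here is my own example, not from the original post.

import torch
import torchvision

# Load all training images as [0, 1] tensors in a single batch
# (fine for a small dataset like CIFAR-10; stream in batches for larger ones).
dataset = torchvision.datasets.CIFAR10(
    root="./data", train=True, download=True,
    transform=torchvision.transforms.ToTensor(),
)
loader = torch.utils.data.DataLoader(dataset, batch_size=len(dataset))
images, _ = next(iter(loader))     # shape: (50000, 3, 32, 32)
mean = images.mean(dim=(0, 2, 3))  # one mean per channel
std = images.std(dim=(0, 2, 3))    # one std per channel
print(mean, std)

These values would then replace the mean_1..mean_3 and std_1..std_3 placeholders above.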
Reference
- https://www.codecademy.com/article/normalization
- https://www.indeed.com/career-advice/career-development/normalization-formula