김태오


ML

Feature Scaling

ystc1247 2023. 4. 23. 01:13

Feature Scaling: a preprocessing step in machine learning that involves transforming the input features to a common scale, typically to improve the performance of machine learning algorithms.

It is important to keep features on a similar scale.

ex ) x1 has range 0 < x1 < 2000, x2 has range 0 < x2 < 5;

dividing x1 by 2000 and x2 by 5 puts both features on the scale 0 < x1, x2 < 1.

This results in faster convergence of gradient descent.
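The division-by-range step above can be sketched as follows; the feature matrix here is hypothetical sample data, not from the original post:

```python
import numpy as np

# Hypothetical samples: column 0 is x1 in (0, 2000), column 1 is x2 in (0, 5)
X = np.array([[1500.0, 3.0],
              [ 800.0, 1.0],
              [2000.0, 5.0]])

# Divide each feature by its maximum so both end up on the scale [0, 1]
X_scaled = X / X.max(axis=0)

print(X_scaled)
```

After scaling, both columns lie in the same [0, 1] range, so no single feature dominates the gradient updates.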

** Do not normalize the intercept feature x0 = 1.

** There is no need for feature normalization when using the normal equation, since it solves for the parameters directly rather than iterating.
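A quick sketch of why the normal equation θ = (XᵀX)⁻¹Xᵀy is insensitive to scaling: solving it on raw and on scaled features yields the same predictions, because scaling only rescales the coefficients. The data below is synthetic, chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.uniform(0, 2000, n)   # large-range feature
x2 = rng.uniform(0, 5, n)      # small-range feature
y = 3.0 + 0.01 * x1 + 2.0 * x2

# Design matrices: x0 = 1 intercept column is left unscaled in both
X_raw = np.column_stack([np.ones(n), x1, x2])
X_scl = np.column_stack([np.ones(n), x1 / 2000, x2 / 5])

# Normal equation, solved as a linear system for numerical stability
theta_raw = np.linalg.solve(X_raw.T @ X_raw, X_raw.T @ y)
theta_scl = np.linalg.solve(X_scl.T @ X_scl, X_scl.T @ y)

# Predictions agree even though the coefficients differ by the scale factors
print(np.allclose(X_raw @ theta_raw, X_scl @ theta_scl))
```

Gradient descent, by contrast, is iterative, and mismatched feature scales distort its update steps, which is why scaling matters there.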

When the features are adjusted to a similar scale, the contours of the cost function become more even, so gradient descent takes a more direct path to the minimum instead of oscillating along the elongated direction.
