MSE (Mean Squared Error)
- Error Metric: MSE is a commonly used error metric in regression problems.
- Calculation: MSE is the average of the squared differences between the predicted values and the true values (see the sketch after this list).
- Objective: Minimizing MSE improves the model's accuracy by reducing the squared errors.
- Sensitivity: MSE is sensitive to outliers, because squaring the errors amplifies the impact of larger deviations.
- Non-negative: MSE is always non-negative, and a perfect prediction has an MSE of 0.
- Interpretation: Lower MSE values indicate better model performance, but the scale of the MSE depends on the scale of the target variable.

*Squared refers to the mathematical operation of raising a value to the power of 2.
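
To make the calculation concrete, here is a minimal sketch that computes MSE with NumPy; the sample values and the names y_true / y_pred are hypothetical and only for illustration.

```python
import numpy as np

def mse(y_true, y_pred):
    # MSE: average of the squared differences between true and predicted values
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean((y_true - y_pred) ** 2)

# Illustrative values only
y_true = [3.0, -0.5, 2.0, 7.0]
y_pred = [2.5, 0.0, 2.0, 8.0]
print(mse(y_true, y_pred))  # (0.25 + 0.25 + 0.0 + 1.0) / 4 = 0.375
```

In formula form this is MSE = (1/n) Σᵢ (yᵢ − ŷᵢ)², which is exactly the average computed above.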
MSE Cost
- MSE Cost: A cost function used to evaluate the performance of a machine learning model in regression problems.
- Calculation: MSE is the average of the squared differences between predicted values (y_pred) and true values (y_true).
- Minimization: The goal of training is to minimize the MSE cost so that the model makes more accurate predictions (a gradient-descent sketch follows this list).
- Interpretation: Lower MSE cost values indicate better model performance; however, the scale of the MSE depends on the scale of the target variable.
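
As an illustration of the minimization step, the sketch below fits a simple linear model y_pred = w * x + b by gradient descent on the MSE cost. The toy data, learning rate, and iteration count are assumptions made up for this example, not values from any particular library or dataset.

```python
import numpy as np

# Hypothetical toy data: y is roughly 2*x + 1 with a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=100)

# Linear model y_pred = w * x + b, trained by gradient descent on the MSE cost.
w, b = 0.0, 0.0
lr = 0.01  # learning rate (illustrative value)

for _ in range(2000):
    y_pred = w * x + b
    error = y_pred - y
    # Gradients of MSE = mean(error**2) with respect to w and b
    grad_w = 2.0 * np.mean(error * x)
    grad_b = 2.0 * np.mean(error)
    w -= lr * grad_w
    b -= lr * grad_b

print(f"w = {w:.3f}, b = {b:.3f}, MSE = {np.mean((w * x + b - y) ** 2):.4f}")
```

Each update moves w and b in the direction that reduces the average squared error, which is what "minimizing the MSE cost" means in practice.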