
This is a simple comparison table between classification and regression, a distinction that comes up constantly in ML study. To sum it up, classification predicts categorical labels, while regression predicts continuous values in machine learning tasks.
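
As a quick illustration (my own sketch, not from the original post), the scikit-learn snippet below fits a classifier and a regressor on the same made-up single-feature data: the classifier returns a discrete label, the regressor a continuous number.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

# Toy single-feature data (made-up values for illustration).
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])

# Classification: the target is a categorical label (0 = non-spam, 1 = spam).
y_labels = np.array([0, 0, 0, 1, 1, 1])
clf = LogisticRegression().fit(X, y_labels)
print(clf.predict([[2.5]]))      # -> a discrete class, e.g. [0]

# Regression: the target is a continuous value (e.g. a price).
y_values = np.array([10.0, 19.0, 31.0, 42.0, 50.0, 61.0])
reg = LinearRegression().fit(X, y_values)
print(reg.predict([[2.5]]))      # -> a continuous number, roughly 25
```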

Examples: Email -> Spam/Non-Spam, Price -> Low/High, Tumor -> Malignant/Benign. Threshold: a value used to make a binary decision based on a continuous score. It is commonly used in binary classification problems, where the model outputs a probability score between 0 and 1 and the threshold determines which of the two classes the input belongs to. ..
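
A minimal sketch of thresholding, assuming hypothetical probability scores from some binary classifier:

```python
import numpy as np

# Hypothetical probability scores produced by some binary classifier.
scores = np.array([0.05, 0.40, 0.51, 0.87, 0.99])

threshold = 0.5
labels = (scores >= threshold).astype(int)   # 1 = positive class (e.g. spam)
print(labels)                                # [0 0 1 1 1]

# A higher threshold makes the classifier stricter about predicting the
# positive class: fewer false positives, but more false negatives.
print((scores >= 0.9).astype(int))           # [0 0 0 0 1]
```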

Learning Rate: a hyperparameter that controls how much the model adjusts its parameters in response to the error gradient during training. If the learning rate is too low, optimization takes too long; if it is too high, the model has difficulty converging.
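
A small sketch of how the learning rate affects convergence, using a made-up quadratic cost J(w) = (w - 3)^2 rather than any real model:

```python
# Minimize J(w) = (w - 3)^2 with plain gradient descent and compare learning rates.
def gradient(w):
    return 2 * (w - 3)            # dJ/dw

def descend(learning_rate, steps=25, w=0.0):
    for _ in range(steps):
        w -= learning_rate * gradient(w)   # update scaled by the learning rate
    return w

print(descend(0.01))   # too low: still far from the optimum w = 3 after 25 steps
print(descend(0.1))    # reasonable: close to 3
print(descend(1.1))    # too high: the steps overshoot and the value diverges
```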

Feature Scaling: a preprocessing step that transforms the input features to a common scale, typically to improve the performance of machine learning algorithms. It is important to keep features on a similar scale, ex) x1 is in range 0 ..
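
A short sketch of two common scaling schemes (min-max scaling and standardization) on made-up two-feature data:

```python
import numpy as np

# Two features on very different scales, e.g. x1 = size in m^2, x2 = number of
# rooms (hypothetical values).
X = np.array([[300.0, 2.0],
              [500.0, 3.0],
              [1000.0, 4.0],
              [2000.0, 5.0]])

# Min-max scaling: map each feature into the [0, 1] range.
X_minmax = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# Standardization (z-score): zero mean and unit variance per feature.
X_standard = (X - X.mean(axis=0)) / X.std(axis=0)

print(X_minmax)
print(X_standard)
```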

Polynomial: a mathematical expression consisting of variables (usually represented by x), coefficients, and exponents (powers) combined using the arithmetic operations of addition, subtraction, and multiplication. Polynomial Regression: a type of regression analysis in which the relationship between the independent variable (x) and the dependent variable (y) is modeled as an..
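
A possible sketch of polynomial regression with scikit-learn, on synthetic quadratic data invented for the example: the input is expanded into polynomial features and a plain linear model is fitted on top.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Toy data following a roughly quadratic curve (made-up coefficients and noise).
rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 20).reshape(-1, 1)
y = 1.5 * x.ravel() ** 2 - 2.0 * x.ravel() + 0.5 + rng.normal(0, 0.3, 20)

# Expand x into [x, x^2], then fit an ordinary linear model on those features.
model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                      LinearRegression())
model.fit(x, y)

print(model.predict([[2.0]]))   # prediction from the curved (quadratic) fit
```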

Gradient Descent and the Normal Equation are two popular techniques used in machine learning for finding the optimal parameters of a model. Gradient Descent is an iterative optimization algorithm that tries to minimize the cost function by adjusting the model parameters in the direction of steepest descent. It starts with an initial guess of the model parameters and iteratively ..
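
A rough NumPy sketch comparing the two on synthetic linear data (the data, learning rate, and iteration count are all assumptions for the example): the normal equation gives the closed-form solution directly, while gradient descent approaches it iteratively.

```python
import numpy as np

# Synthetic linear data: y = 4 + 3x + noise (made-up parameters).
rng = np.random.default_rng(42)
X = np.c_[np.ones(100), rng.uniform(0, 10, 100)]   # design matrix with a bias column
y = X @ np.array([4.0, 3.0]) + rng.normal(0, 1, 100)

# Normal equation: solve the closed-form system X^T X theta = X^T y.
theta_normal = np.linalg.solve(X.T @ X, X.T @ y)

# Batch gradient descent: repeatedly step against the gradient of the MSE cost.
theta_gd = np.zeros(2)
learning_rate = 0.01
for _ in range(5000):
    grad = (2 / len(y)) * X.T @ (X @ theta_gd - y)
    theta_gd -= learning_rate * grad

print(theta_normal)   # both estimates should land close to [4.0, 3.0]
print(theta_gd)
```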