김태오


Extending Linear Regression for Binary Classification

ystc1247 2023. 4. 23. 01:32

Examples of such an extension:

Email -> Spam/Non-Spam

Price -> Low/High

Tumor -> Malignant/Benign

 

Threshold 

- A value used to make a binary decision based on a continuous value.
- It is commonly used in binary classification problems, where the output of the model is a probability score between 0 and 1, and the threshold is used to determine whether the input belongs to one of two classes.
- The threshold value can be adjusted to control the trade-off between precision and recall, which are two important metrics for evaluating classification models.
- The choice of an appropriate threshold depends on the specific problem and the costs associated with false positives and false negatives.
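As a minimal sketch of the idea above (the probability scores and labels are made-up values for illustration), applying a threshold turns continuous scores into class decisions, and moving the threshold shifts the precision/recall trade-off:

```python
import numpy as np

# Hypothetical probability scores from a model, with true labels.
probs = np.array([0.1, 0.4, 0.35, 0.8, 0.65, 0.9])
labels = np.array([0, 0, 1, 1, 1, 1])

def classify(probs, threshold=0.5):
    """Turn continuous probability scores into binary decisions."""
    return (probs >= threshold).astype(int)

# A lower threshold catches more positives (higher recall)
# at the cost of more false positives (lower precision).
print(classify(probs, threshold=0.5))  # [0 0 0 1 1 1]
print(classify(probs, threshold=0.3))  # [0 1 1 1 1 1]
```

Here the lower threshold recovers the positive at 0.35 that the default threshold misses, illustrating why the right cut-off depends on the cost of each error type.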

 

Activation Function

- An activation function is a mathematical function that is applied to the output of a neural network layer to introduce nonlinearity into the model.
- It is used to determine the output of each neuron in the layer, based on the weighted sum of the inputs and the bias term.
- There are several activation functions commonly used in machine learning, including the sigmoid function, the ReLU function, and the tanh function, each with its own advantages and disadvantages.
- The choice of an appropriate activation function depends on the specific problem and the architecture of the neural network being used.

Various types of activation functions

** A Binary Step Function is the main example of a hard threshold, whereas a Sigmoid Function is an example of a soft threshold.
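The hard/soft distinction can be sketched side by side, assuming the standard definitions of the step function and the sigmoid $\sigma(z) = 1/(1+e^{-z})$:

```python
import numpy as np

def binary_step(z):
    # Hard threshold: output jumps abruptly from 0 to 1 at z = 0.
    return np.where(z >= 0, 1, 0)

def sigmoid(z):
    # Soft threshold: smooth transition, passing through 0.5 at z = 0.
    return 1 / (1 + np.exp(-z))

z = np.array([-2.0, 0.0, 2.0])
print(binary_step(z))  # [0 1 1]
print(sigmoid(z))      # approx [0.119, 0.5, 0.881]
```

The sigmoid's smooth output is what lets it be interpreted as a probability and, unlike the step function, it is differentiable, which matters for gradient-based training.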

** The MSE cost is not used here, since we are dealing with classification, not regression. Instead, we use CE (Cross-Entropy) loss, which I will cover in a later post.
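For a quick preview of the CE loss mentioned above (the predictions here are made-up values, and this is just a sketch of the standard binary cross-entropy formula, not the full treatment), note how it penalizes uncertain predictions more than confident correct ones:

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Clip predictions away from 0 and 1 to avoid log(0).
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred)
                    + (1 - y_true) * np.log(1 - y_pred))

y_true = np.array([1, 0, 1])
confident = np.array([0.9, 0.1, 0.8])  # close to the true labels
poor = np.array([0.4, 0.6, 0.3])       # hedging / wrong side

print(binary_cross_entropy(y_true, confident))  # small loss
print(binary_cross_entropy(y_true, poor))       # larger loss
```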
