Why use numbers on a log scale (e.g., logspace()) as one of the params when tuning a Logistic Regression model?
The model learns by checking how far its answer is from the optimum. If the right answer is 0.8 and the model's answer is 0.7, the error is 0.1 — important, the model tries to fix it. If the right answer is 0.0003 and the model's answer is 0.0000, the error is only 0.0003 — not important, doesn't seem to need fixing. This skews the learning toward ignoring small values. A log scale forces it to pay attention to the ratio between the right and the wrong answer rather than the difference. It's almost always a good choice (and more intuitive for a person).
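A minimal sketch of what this looks like in practice — assuming scikit-learn's LogisticRegression with its C regularization parameter and numpy's logspace for the grid (the question doesn't name a library or parameter, so this is just one common setup):

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)

# Log-spaced grid: 1e-4, 1e-3, ..., 1e4 — each step is the same *ratio* (x10),
# unlike np.linspace, where each step is the same *difference* and almost all
# points would land near the large end of the range.
param_grid = {"C": np.logspace(-4, 4, 9)}

search = GridSearchCV(LogisticRegression(max_iter=5000), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_)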