Focal Loss: designed to address class imbalance
by down-weighting easy examples, even when they vastly outnumber the hard ones.
https://doi.org/10.48550/arXiv.1708.02002
Definition
Basically, Focal Loss is an advanced version of
alpha-balanced Cross Entropy Loss:

FL(p_t) = -α_t · (1 - p_t)^γ · log(p_t)

Where:
- p_t refers to the predicted probability of the corresponding category (class).
  For Binary Classification, p_t is usually calculated with sigmoid;
  for Multi-Class Classification, p_t is usually calculated with softmax.
- α_t eases the class imbalance problem. α = 0.25 by default.
- γ controls how sharply easy and hard samples are separated,
  mainly by decreasing the loss of easy samples. γ = 2 by default.
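The definition above can be sketched in a few lines of NumPy. This is a minimal illustration, not the official Keras implementation; the function name and the `eps` clipping constant are my own choices, while the defaults α = 0.25 and γ = 2 follow the paper.

```python
import numpy as np

def binary_focal_loss(y_true, y_pred, alpha=0.25, gamma=2.0, eps=1e-7):
    """Per-sample binary focal loss: FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t).

    y_true: array of 0/1 labels; y_pred: sigmoid probabilities for the positive class.
    """
    # Clip to avoid log(0)
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    # p_t is the probability the model assigns to the TRUE class
    p_t = np.where(y_true == 1, y_pred, 1.0 - y_pred)
    # alpha_t balances positive vs. negative classes
    alpha_t = np.where(y_true == 1, alpha, 1.0 - alpha)
    # (1 - p_t)^gamma down-weights easy samples (p_t close to 1)
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)
```

Note that with γ = 0 the modulating factor vanishes and this reduces to plain alpha-balanced cross entropy.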
Background
In real-life scenarios, datasets often contain an imbalance between
easy samples and hard samples, and the resulting imbalance between
their losses can significantly hurt model performance: models
tend to prefer learning the easy ones and usually ignore the hard ones.
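To make the easy/hard imbalance concrete, here is a small back-of-the-envelope check. The counts and probabilities (1000 easy samples at p_t = 0.95, 10 hard ones at p_t = 0.3) are hypothetical numbers chosen for illustration; α is omitted to isolate the effect of γ.

```python
import math

gamma = 2.0
easy_n, easy_p = 1000, 0.95  # many well-classified samples
hard_n, hard_p = 10, 0.3     # few poorly-classified samples

# Total plain cross-entropy loss contributed by each group
ce_easy = easy_n * -math.log(easy_p)
ce_hard = hard_n * -math.log(hard_p)

# Same totals with the focal modulating factor (1 - p_t)^gamma applied
fl_easy = easy_n * (1 - easy_p) ** gamma * -math.log(easy_p)
fl_hard = hard_n * (1 - hard_p) ** gamma * -math.log(hard_p)

print(ce_easy / ce_hard)  # > 1: easy samples dominate plain CE
print(fl_easy / fl_hard)  # < 1: focal loss shifts the weight to hard samples
```

Even though each easy sample contributes little, under plain cross entropy their sheer number lets them dominate the total loss; the (1 - p_t)^γ factor reverses that.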
Sadly, we're planning to deprecate focal loss within
our project... We're more focused on ensemble learning, including
Random Forest, Gradient Boosting Decision Trees, and XGBoost classifiers,
in the future, and we're not expecting a great outcome from Multi-Layer
Perceptrons.
However, what I've learned about Keras backends and loss
implementations is still a very fruitful experience for me ;)