From Geometric Mean to KL Divergence to Noisy Labels

If you have ever participated in a machine learning competition (like those held by Kaggle), you may be familiar with the geometric mean as a simple method for combining the outputs of multiple models. In this post, I will review the geometric mean and its relation to the Kullback–Leibler (KL) divergence. Based on this relation, …
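
As a rough sketch of the kind of geometric-mean ensembling the post refers to (the prediction arrays below are made up purely for illustration):

```python
import numpy as np

# Hypothetical predicted class probabilities from three models,
# each row a sample, each column a class (made-up numbers).
preds = [
    np.array([[0.7, 0.3], [0.2, 0.8]]),
    np.array([[0.6, 0.4], [0.3, 0.7]]),
    np.array([[0.8, 0.2], [0.1, 0.9]]),
]

# Geometric mean = exp of the average log-probability;
# a small epsilon guards against log(0).
eps = 1e-12
ensemble = np.exp(np.mean([np.log(p + eps) for p in preds], axis=0))

# Renormalize so each row is a valid distribution again.
ensemble /= ensemble.sum(axis=1, keepdims=True)
print(ensemble)
```

One relation worth noting: the normalized geometric mean is exactly the distribution q that minimizes the sum of KL divergences \(\sum_i \mathrm{KL}(q \,\|\, p_i)\) to the individual model predictions, which is presumably the kind of connection the post goes on to develop.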

Variance: Order out of Chaos

In the previous post, I wrote about expectation, a great tool that helps us capture the notion of the “center” of a random variable. In this post, I will write about variance, another great tool that represents the “spread” of a random variable. Consider a random variable with two probability density functions (PDFs), shown in …
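
For reference, the standard definitions the post builds on, with the expectation \(\mu = \mathbb{E}[X]\) as the “center” and the variance as the “spread” (for a continuous random variable with PDF \(f\)):

```latex
\operatorname{Var}(X)
  = \mathbb{E}\!\left[(X - \mu)^{2}\right]
  = \int_{-\infty}^{\infty} (x - \mu)^{2} f(x)\,dx
  = \mathbb{E}\!\left[X^{2}\right] - \mu^{2}
```

The last equality (expanding the square and using linearity of expectation) is the form most often used in practice, since it needs only the first two moments.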

What am I doing here?

Machine learning (ML), with all its hype and promise, is an extremely interesting area of research. Studying it in its most fundamental form requires a strong background in mathematics, statistics, computer science, and engineering. But when it is applied to real-world problems, it further requires knowledge of different application domains such as medicine, biology, chemistry, …
