Jensen-Shannon (JS) distance is a symmetric measure of the difference between two probability distributions, defined as the square root of the Jensen-Shannon divergence. It is built on the Kullback-Leibler (KL) divergence but, unlike KL, does not depend on which distribution is treated as the reference and which as the comparison. JS distance is also bounded between 0 and 1 (when using base-2 logarithms), which makes it easier to interpret and to compare across different features.
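As an illustration, the JS distance between a reference sample and a current sample of a numeric feature can be computed by binning both samples on shared edges and comparing the resulting histograms. This is a minimal sketch assuming SciPy and NumPy are available; the array names, sample sizes, and bin count are illustrative choices, not part of any particular drift-monitoring API.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(0)
reference = rng.normal(loc=0.0, scale=1.0, size=100_000)  # baseline sample
current = rng.normal(loc=0.2, scale=1.1, size=100_000)    # shifted, wider sample

# Discretize both samples on shared bin edges so the histograms are comparable.
bins = np.histogram_bin_edges(np.concatenate([reference, current]), bins=50)
p, _ = np.histogram(reference, bins=bins, density=True)
q, _ = np.histogram(current, bins=bins, density=True)

# jensenshannon returns the JS distance (square root of the JS divergence);
# with base=2 it is bounded in [0, 1].
js_distance = jensenshannon(p, q, base=2)
print(f"JS distance: {js_distance:.4f}")
```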
The Kolmogorov-Smirnov (KS) test is a non-parametric test that compares the empirical cumulative distribution functions (CDFs) of two samples. It computes the maximum difference between the CDFs and compares it to a critical value to decide whether to reject the hypothesis that the samples come from the same distribution. The KS test is sensitive to the shape, location, and scale of the distributions, but it can be overly sensitive on large datasets, where tiny changes produce statistically significant differences that may not be practically meaningful.
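For comparison, the two-sample KS test is available in SciPy as ks_2samp. The sketch below uses the same kind of synthetic feature samples; the 0.05 significance level and the tiny location shift are assumptions chosen to show how easily the test fires on large samples.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
reference = rng.normal(loc=0.0, scale=1.0, size=100_000)
current = rng.normal(loc=0.02, scale=1.0, size=100_000)  # negligible shift

# ks_2samp returns the maximum distance between the two empirical CDFs and the
# p-value under the null hypothesis that both samples share one distribution.
statistic, p_value = ks_2samp(reference, current)
print(f"KS statistic: {statistic:.4f}, p-value: {p_value:.4g}")

# With samples this large, even a negligible shift can yield p < 0.05.
drift_detected = p_value < 0.05
```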
JS distance is more robust than the KS test for numeric feature drift detection because it handles large datasets without overreacting to minor fluctuations, and it captures changes in the shape and spread of the distributions, not just their location. JS distance also does not require manually choosing a significance level and critical value the way the KS test does, and it generalizes to multivariate cases, whereas the KS test is limited to univariate comparisons.
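To make the robustness claim concrete, the following sketch (under the same assumptions as above) applies both measures to a very small location shift in a large sample: the KS test typically reports a statistically significant p-value, while the JS distance stays close to zero.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import ks_2samp


def js_distance(a: np.ndarray, b: np.ndarray, n_bins: int = 50) -> float:
    """JS distance between two numeric samples via shared-bin histograms."""
    edges = np.histogram_bin_edges(np.concatenate([a, b]), bins=n_bins)
    p, _ = np.histogram(a, bins=edges, density=True)
    q, _ = np.histogram(b, bins=edges, density=True)
    return jensenshannon(p, q, base=2)


rng = np.random.default_rng(1)
reference = rng.normal(0.0, 1.0, size=500_000)
current = rng.normal(0.01, 1.0, size=500_000)  # practically negligible shift

print("JS distance:", round(js_distance(reference, current), 4))  # stays near 0
print("KS p-value:", ks_2samp(reference, current).pvalue)         # often < 0.05
```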
References:
Which test is the best? We compared 5 methods to detect data drift on large datasets
The Jensen-Shannon Divergence: A Measure of Distance Between Probability Distributions
Kullback–Leibler vs Kolmogorov-Smirnov distance