The Łukaszyk–Karmowski metric (LK-metric) defines a distance between two random variables or random vectors. Despite its name, the LK-metric is not a metric, as it does not satisfy the identity of indiscernibles axiom: its value for two identical random variables is greater than zero, provided they are not both degenerate.
The LK-metric[1] between two continuous random variables X and Y having a joint probability density function (PDF) F(x, y) is defined as

    D(X, Y) = ∫∫ |x − y| F(x, y) dx dy.
If X and Y are independent, then

    D_fg(X, Y) = ∫∫ |x − y| f(x) g(y) dx dy,
where f(x) and g(y) are the PDFs of X and Y, and the subscripts of D denote their types. For example, if X and Y have normal PDFs with the same standard deviation σ but different means μx and μy, then

    D_NN(X, Y) = μxy + (2σ/√π) exp(−μxy²/(4σ²)) − μxy erfc(μxy/(2σ)),
[Figure: LK-metric between two random variables having normal PDFs and the same standard deviations σ = {0, 0.2, 0.4, 0.6, 0.8, 1}.]
where μxy = |μx − μy|. For discrete X and Y, the LK-metric takes the form

    D(X, Y) = Σx Σy |x − y| P(X = x, Y = y),
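Both forms are straightforward to evaluate numerically. Below is an illustrative sketch (function names are not from the cited works): it checks the equal-σ normal closed form against a Monte Carlo estimate of E|X − Y|, and evaluates the discrete double sum for PMFs given as value → probability dictionaries.

```python
import math
import random

def lk_normal(mu_x, mu_y, sigma):
    """Closed-form LK-metric for two normal PDFs sharing standard deviation sigma."""
    m = abs(mu_x - mu_y)
    if sigma == 0.0:  # degenerate case: ordinary distance between the means
        return m
    return (m
            + 2.0 * sigma / math.sqrt(math.pi) * math.exp(-m * m / (4.0 * sigma * sigma))
            - m * math.erfc(m / (2.0 * sigma)))

def lk_monte_carlo(mu_x, mu_y, sigma, n=200_000, seed=42):
    """Monte Carlo estimate of E|X - Y| for independent normal X and Y."""
    rng = random.Random(seed)
    return sum(abs(rng.gauss(mu_x, sigma) - rng.gauss(mu_y, sigma))
               for _ in range(n)) / n

def lk_discrete(pmf_x, pmf_y):
    """LK-metric double sum for independent discrete variables ({value: probability})."""
    return sum(abs(x - y) * p_x * p_y
               for x, p_x in pmf_x.items()
               for y, p_y in pmf_y.items())

exact = lk_normal(0.0, 1.0, 0.5)         # closed form
approx = lk_monte_carlo(0.0, 1.0, 0.5)   # sampling estimate of the same quantity
coin = {0: 0.5, 1: 0.5}
self_distance = lk_discrete(coin, coin)  # 0.5: non-zero even for identical coins
```

For μx = μy the closed form collapses to 2σ/√π, the non-zero self-distance that violates the identity of indiscernibles.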
For random vectors X and Y, the LK-metric becomes

    D(X, Y) = ∫∫ d(x, y) F(x, y) dx dy,
where d(x, y) is a metric function, such as the Euclidean metric, and F(x, y) is the joint PDF of the vectors. If X and Y are mutually and internally independent, so that the joint PDF factors into products of the component PDFs, a simplified form of the LK-metric can also be defined as

    D∗(X, Y) = ∫∫ d(x, y) f(x) g(y) dx dy,   with f(x) = ∏i fi(xi), g(y) = ∏j gj(yj).
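No simple closed form is generally available for vectors, but the expectation E[d(X, Y)] can be estimated by sampling. A minimal Monte Carlo sketch for mutually and internally independent normal vectors with isotropic component noise and the Euclidean metric (illustrative names, not from the cited works):

```python
import math
import random

def lk_vector_mc(mean_x, mean_y, sigma, n=100_000, seed=7):
    """Monte Carlo LK-metric E[||X - Y||] for independent normal random vectors
    whose components are independent with common standard deviation sigma."""
    rng = random.Random(seed)
    dim = len(mean_x)
    total = 0.0
    for _ in range(n):
        total += math.sqrt(sum(
            (rng.gauss(mean_x[i], sigma) - rng.gauss(mean_y[i], sigma)) ** 2
            for i in range(dim)))
    return total / n

d = lk_vector_mc((0.0, 0.0), (3.0, 4.0), 0.5)
```

By Jensen's inequality the result always exceeds the Euclidean distance between the mean vectors (here 5), again reflecting the built-in non-zero distance.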
If X and Y are degenerate, almost surely constant random variables with Dirac delta (or, in the discrete case, one-point) PDFs, then the LK-metric reduces to the metric between their mean values:

    D(X, Y) = |μx − μy|,
and obviously

    D(X, X) = 0.
However, in any other case,

    D(X, X) > 0.
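This dichotomy is easy to see in the discrete case. In the sketch below, `lk_discrete` is an illustrative helper evaluating the double sum for independent PMFs given as value → probability dictionaries:

```python
def lk_discrete(pmf_x, pmf_y):
    """LK-metric double sum for independent discrete variables ({value: probability})."""
    return sum(abs(x - y) * p_x * p_y
               for x, p_x in pmf_x.items()
               for y, p_y in pmf_y.items())

# One-point (degenerate) variables: the LK-metric behaves as a true metric.
degenerate_self = lk_discrete({2.0: 1.0}, {2.0: 1.0})  # 0.0

# Any spread at all makes the self-distance strictly positive.
noisy = {1.9: 0.5, 2.1: 0.5}
noisy_self = lk_discrete(noisy, noisy)                 # 0.1 > 0
```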
The LK-metric satisfies all the remaining metric axioms: it is symmetric by definition, and it satisfies the triangle inequality

    D(X, Z) ≤ D(X, Y) + D(Y, Z).
Thus

    ∫∫ |x − z| F(x, z) dx dz ≤ ∫∫ |x − y| F(x, y) dx dy + ∫∫ |y − z| F(y, z) dy dz,

since

    |x − z| ≤ |x − y| + |y − z|

holds for all real x, y, and z.
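The triangle inequality can be spot-checked numerically with the equal-σ normal closed form (an illustrative sketch, assuming that closed form for E|X − Y|):

```python
import math

def lk_normal(mu_x, mu_y, sigma):
    """Closed-form LK-metric for two normal PDFs sharing standard deviation sigma."""
    m = abs(mu_x - mu_y)
    if sigma == 0.0:
        return m
    return (m
            + 2.0 * sigma / math.sqrt(math.pi) * math.exp(-m * m / (4.0 * sigma * sigma))
            - m * math.erfc(m / (2.0 * sigma)))

# Verify D(X, Z) <= D(X, Y) + D(Y, Z) over a grid of means.
mus = [0.0, 0.7, 1.5, 3.0]
sigma = 0.8
triangle_holds = all(
    lk_normal(a, c, sigma) <= lk_normal(a, b, sigma) + lk_normal(b, c, sigma) + 1e-12
    for a in mus for b in mus for c in mus)
```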
The LK-metric is not the only distance function that does not satisfy the identity of indiscernibles axiom[2]. For example, the partial metric[3] also allows an object to have a non-zero distance from itself. However, the partial metric satisfies two additional axioms, small self-distances and a modified triangle inequality, which are not satisfied by the LK-metric[4]. Remarkably, the identity of indiscernibles, an ontological principle introduced to philosophy by Gottfried Wilhelm Leibniz around 1686, is also invalidated by the ugly duckling theorem[5], stated in 1969, which asserts that every two objects one perceives are equally similar (or equally dissimilar). Consequently, the identity of indiscernibles is neither a logical nor an empirical principle.
The characteristic non-zero self-distance built into the LK-metric makes it possible to avoid ill-conditioning problems in radial basis function interpolation[6][7] and inverse distance weighting[8][9][10][11], where the interpolation accuracy can be improved by choosing the type of distance metric[12][11], and it leads to a smooth interpolation function[13]. By preventing zero distances arising from parameter uncertainty, the LK-metric can furthermore be used in the analysis of nondeterministic dynamical systems with competing attractors[14]. Since the LK-metric represents the mean of the distances between all outcomes of two uncertain objects, it can also be used in uncertain nearest neighbor classification[15], where the actual value of an uncertain object is modeled by a probability density function[16]. The LK-metric has been successfully applied in various fields of science and technology[17][18][19][20][21][22][23][24][13][25][26][27][28][29][30][7][31][32][33][34][35][10][36][37][38][39][40][41][42][43][44][45][11][46][47][48][49][14][50][51].
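The ill-conditioning point can be illustrated with a small sketch (not code from the cited works): in inverse distance weighting, a weight 1/d(x, xi)^p blows up when the query coincides with a sample location, whereas LK-metric distances, and hence all weights, stay finite.

```python
import math

def lk_normal(mu_x, mu_y, sigma):
    """Closed-form LK-metric for two normal PDFs sharing standard deviation sigma."""
    m = abs(mu_x - mu_y)
    if sigma == 0.0:
        return m
    return (m
            + 2.0 * sigma / math.sqrt(math.pi) * math.exp(-m * m / (4.0 * sigma * sigma))
            - m * math.erfc(m / (2.0 * sigma)))

def idw_lk(query_mu, sample_mus, values, sigma, power=2):
    """Inverse distance weighting with LK-metric distances: no division by zero,
    even when query_mu equals a sample location."""
    weights = [1.0 / lk_normal(query_mu, mu, sigma) ** power for mu in sample_mus]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# Querying exactly at a sample point stays finite and close to that sample's value.
estimate = idw_lk(1.0, [0.0, 1.0, 2.0], [10.0, 20.0, 30.0], sigma=0.1)
```

With ordinary Euclidean distances the same query would divide by zero; here the self-distance 2σ/√π keeps the dominant weight large but bounded.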