| Pub. Date | Feb 2024 |
| --- | --- |
| Product Name | The IUP Journal of Telecommunications |
| Product Type | Article |
| Product Code | IJTC020224 |
| Author Name | Ismail A Mageed |
| Availability | YES |
| Subject/Domain | Arts & Humanities |
| Download Format | PDF Format |
| No. of Pages | 20 |
The paper explores the Kullback-Leibler divergence formalism (KLDF) applied to a stable queue manifold. More importantly, both the service-time probability density function and the cumulative distribution function that make KLDF exact are obtained. The credibility of KLDF is justified through consistency axioms. In other words, the current work provides a cutting-edge unification of information theory with applied probability and divergence theory. Accordingly, this paper adds new knowledge, extending a contemporary information-theoretic link to several other mathematical disciplines. The significance of the Kullback-Leibler divergence (KLD) is highlighted through references to some potential applications of KLD to biometry.
It is common practice in probabilistic inverse approaches (Agranovich and Marchenko, 2020) to treat both the measured data and the unknown model parameters as uncertain. This practice provides a deeper understanding of the uncertainty associated with the measured data and model parameters (Pardo-Iguzquiza and Dowd, 2020; Korbel, 2021; Martini et al., 2021; Naman et al., 2021; Salvatore, 2021; Golan and Foley, 2022; Mageed and Zhang, 2022a; Mageed and Zhang, 2023a; and Mehr et al., 2023). The Kullback-Leibler divergence (KLD) is a measure used to compare two probability distributions (Sciullo et al., 2020; Kouvatsos and Mageed, 2021a; Mageed and Kouvatsos, 2021; Mageed et al., 2022; Mageed and Zhang, 2023b; and Mageed, 2023).
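To make the comparison concrete, a minimal sketch of the KLD between two discrete probability distributions is shown below. The distributions `p` and `q` are illustrative placeholder values, not data from the paper, and the direct-summation implementation is a generic textbook formulation rather than the authors' KLDF machinery.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for two discrete
    probability distributions given as equal-length sequences.
    Assumes q[i] > 0 wherever p[i] > 0; terms with p[i] == 0
    contribute nothing (the 0 * log 0 := 0 convention)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative distributions (hypothetical values, not from the paper).
p = [0.5, 0.25, 0.125, 0.125]
q = [0.4, 0.3, 0.2, 0.1]

print(kl_divergence(p, p))  # a distribution diverges from itself by 0
print(kl_divergence(p, q))  # positive, and asymmetric: D(p||q) != D(q||p)
```

Note that KLD is not a metric: it is non-negative and zero only when the two distributions coincide, but it is not symmetric, which is why the direction of comparison matters in applications such as the queueing and biometric settings referenced above.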
Keywords: Biometrics, Extensive maximum entropy (EME), Queue, Kullback-Leibler divergence formalism (KLDF), Non-extensive maximum extropy (NME), Probability density function (PDF), Server utilization (SU), Short-range interactions