Information Measures for Truncated Random Variables

Moharana, Rajesh (2019) Information Measures for Truncated Random Variables. PhD thesis.

PDF (restricted to repository staff until 20/03/2022), 2559 KB

Abstract

The concept of entropy plays a crucial role in information theory, and many authors have obtained properties of entropy and its various generalizations. The divergence measure was proposed to quantify the inefficiency of using an approximate distribution when the actual distribution is known. In reliability theory, the random observations we obtain are often truncated in nature. The purpose of this thesis is to study the properties of various information measures in the truncated domain. First, we consider measures based on probability density functions: Shannon's entropy, weighted Shannon's entropy, weighted generalized entropy, weighted Kullback-Leibler divergence and weighted generalized divergence. For these, we obtain characterizations, inequalities, bounds, uncertainty orders and the effect of monotone transformations, and we introduce nonparametric classes based on the monotonicity of the uncertainty measures. Since measures based on density functions have some drawbacks, we then consider weighted cumulative residual entropy, weighted cumulative entropy and their generalizations, which are based on the distribution function and the survival function. We obtain several characterizations; the importance of a characterization result is that, under certain conditions, it determines the distribution function uniquely. Because a closed-form expression of a proposed measure is not always easy to obtain, we derive various bounds and inequalities. The effect of monotone transformations is discussed, and new uncertainty orders and classes of lifetime distributions are introduced. In addition, we propose empirical estimators of the weighted extended cumulative residual entropy and the weighted generalized cumulative entropy, and study their large-sample properties.
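In the literature, the weighted cumulative residual entropy of a nonnegative random variable is commonly defined as -∫ x F̄(x) log F̄(x) dx, where F̄ is the survival function. As a rough illustration of the empirical approach mentioned above (a minimal sketch using the standard unweighted definition with the empirical survival function; the thesis's own extended and generalized estimators may differ):

```python
import numpy as np

def empirical_wcre(sample):
    """Plug-in estimate of the weighted cumulative residual entropy,
    -int x * Fbar(x) * log(Fbar(x)) dx, where Fbar is the empirical
    survival function (piecewise constant between order statistics)."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    total = 0.0
    for i in range(n - 1):
        # Empirical survival probability on the interval [x[i], x[i+1]).
        sbar = (n - i - 1) / n
        if sbar > 0:
            # Fbar is constant on the interval, so integrate x dx exactly:
            # int_{x[i]}^{x[i+1]} t dt = (x[i+1]^2 - x[i]^2) / 2.
            total += -sbar * np.log(sbar) * 0.5 * (x[i + 1] ** 2 - x[i] ** 2)
    return total
```

For a standard exponential sample, F̄(x) = e^{-x}, so the target value is ∫ x² e^{-x} dx = 2, and the plug-in estimate should approach this as the sample size grows.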
It is observed that some distributions, such as the power-Pareto and Govindarajulu distributions, do not have closed-form distribution functions even though they have closed-form quantile functions. Motivated by this, we consider a quantile-based inaccuracy measure and its dynamic versions. Various results, including characterizations, the effect of transformations and bounds, are derived, and a few examples are given to illustrate the results.
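As an illustration of the quantile-based setting, the Govindarajulu distribution is usually specified through its quantile function Q(u) = σ((β+1)uᵝ − βuᵝ⁺¹), which cannot be inverted in closed form. A minimal sketch computing Shannon's entropy in its quantile form, H = ∫₀¹ log q(u) du with q(u) = Q′(u) (the function names here are illustrative, and this is the quantile entropy rather than the thesis's inaccuracy measure):

```python
import numpy as np

def govindarajulu_qdf(u, sigma, beta):
    """Quantile density q(u) = Q'(u) for the Govindarajulu law with
    Q(u) = sigma * ((beta + 1) * u**beta - beta * u**(beta + 1))."""
    return sigma * beta * (beta + 1) * u ** (beta - 1) * (1 - u)

def quantile_entropy(qdf, n=200_000, **params):
    """Shannon entropy in quantile form, H = int_0^1 log q(u) du,
    via the midpoint rule (which avoids the endpoints u = 0, 1
    where log q(u) may diverge)."""
    u = (np.arange(n) + 0.5) / n
    return np.log(qdf(u, **params)).mean()
```

For σ = 2, β = 3 the integral has the closed form log(σβ(β+1)) − β = log 24 − 3, which the numerical value reproduces; working on (0, 1) in this way sidesteps the missing closed-form distribution function entirely.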

Item Type: Thesis (PhD)
Uncontrolled Keywords: Entropy; Weighted entropy; Divergence measure; Cumulative residual entropy; Cumulative entropy; Quantile-based inaccuracy measure; Truncated random variable; Characterization; Stochastic order; Order statistic; Record value.
Subjects: Mathematics and Statistics > Analytical Mathematics
Mathematics and Statistics > Applied Mathematics
Mathematics and Statistics > Statistics
Divisions: Sciences > Department of Mathematics
ID Code: 10089
Deposited By: IR Staff BPCL
Deposited On: 19 Mar 2020 15:24
Last Modified: 19 Mar 2020 15:24
Supervisor(s): Kayal, Suchandan
