Abstract

It is logically expected that smaller samples contain less information than larger samples. Entropy is a fundamental measure of the uncertainty of a random variable; it was introduced by Shannon (1948) as a measure of information that quantifies uncertainty. Measuring entropy is an important concept in many areas such as statistics, economics, and physical, chemical, and biological phenomena. Higher entropy corresponds to less information contained in the sample. The goal of the current thesis is to discuss the estimation of two kinds of entropy (Shannon entropy and Rényi entropy) for the inverse Weibull and Lomax distributions under various situations. In most life-testing experiments, it is usual to terminate the test before all items have failed, owing to limited funds and/or time constraints. Accordingly, estimation of Shannon entropy for the inverse Weibull distribution under multiply censored data is discussed using the maximum likelihood method. In some experiments and studies, one can observe or record only those observations that are more extreme than the current extreme value. This type of data is called record values and has been applied in many fields. Accordingly, estimation of entropy for the Lomax distribution based on upper record values is obtained using Bayesian estimation under different loss functions, for both informative and non-informative priors. Dynamic cumulative residual entropy is a recent measure of uncertainty that plays a substantial role in reliability and survival studies. Accordingly, estimation of the dynamic cumulative residual entropy, in the cases of Shannon entropy and Rényi entropy, for the Lomax distribution is discussed using maximum likelihood and Bayesian methods of estimation. Finally, applications of the different methods to real data are provided.
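Both entropies discussed above admit closed forms for the Lomax distribution, which is what makes their estimators tractable. The following is a minimal illustrative sketch, not taken from the thesis: it assumes the common Lomax parameterization f(x) = (α/λ)(1 + x/λ)^(−(α+1)) for x > 0, and the parameter values α = 2, λ = 1.5, β = 2 are arbitrary choices for demonstration. The closed-form Shannon and Rényi entropies are cross-checked against Monte Carlo estimates:

```python
import math
import random

# Assumed Lomax pdf: f(x) = (alpha/lam) * (1 + x/lam)^(-(alpha+1)), x > 0.

def lomax_shannon_entropy(alpha, lam):
    # Closed form: H = ln(lam/alpha) + 1/alpha + 1
    return math.log(lam / alpha) + 1.0 / alpha + 1.0

def lomax_renyi_entropy(alpha, lam, beta):
    # Closed form: H_beta = [beta*ln(alpha) + (1-beta)*ln(lam)
    #                        - ln(beta*(alpha+1) - 1)] / (1 - beta),
    # valid for beta > 0, beta != 1, and beta*(alpha+1) > 1.
    return (beta * math.log(alpha) + (1.0 - beta) * math.log(lam)
            - math.log(beta * (alpha + 1.0) - 1.0)) / (1.0 - beta)

def lomax_sample(alpha, lam, rng):
    # Inverse-CDF sampling: X = lam * (U^(-1/alpha) - 1), U ~ Uniform(0, 1)
    return lam * (rng.random() ** (-1.0 / alpha) - 1.0)

def monte_carlo_entropies(alpha, lam, beta, n=200_000, seed=1):
    # Shannon: H = -E[ln f(X)];  Renyi: H_beta = ln E[f(X)^(beta-1)] / (1-beta)
    rng = random.Random(seed)
    sum_log_f, sum_f_pow = 0.0, 0.0
    for _ in range(n):
        x = lomax_sample(alpha, lam, rng)
        log_f = math.log(alpha / lam) - (alpha + 1.0) * math.log(1.0 + x / lam)
        sum_log_f += log_f
        sum_f_pow += math.exp((beta - 1.0) * log_f)
    h_shannon = -sum_log_f / n
    h_renyi = math.log(sum_f_pow / n) / (1.0 - beta)
    return h_shannon, h_renyi

alpha, lam, beta = 2.0, 1.5, 2.0   # illustrative values only
h_exact = lomax_shannon_entropy(alpha, lam)
r_exact = lomax_renyi_entropy(alpha, lam, beta)
h_mc, r_mc = monte_carlo_entropies(alpha, lam, beta)
```

The inverse-CDF sampler works because the Lomax survival function (1 + x/λ)^(−α) inverts in closed form, so the check needs no external libraries; estimating these entropies from censored or record-value data, as the thesis does, replaces the known (α, λ) here with their maximum likelihood or Bayesian estimates.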