According to E.T. Jaynes and E.P. Wigner, entropy is an anthropomorphic
concept in the sense that to a single physical system there correspond many
thermodynamic systems: the physical system can be examined from many points of
view, each time considering different variables and calculating entropy
differently. In this paper we discuss how this concept may be applied to
information entropy, that is, how Shannon's definition of entropy fits
Jaynes's and Wigner's statement. This is achieved by generalizing Shannon's
notion of information entropy, which is the main contribution of the paper. We
then discuss how entropy under these considerations may be used to compare
password complexity and as a measure of diversity in analyzing the behavior of
genetic algorithms.
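
As a rough illustration of the kind of measure the abstract refers to (not the paper's generalized definition), the following minimal Python sketch computes Shannon entropy over two different "points of view" of the same kind of data: the character distribution of a password, and the distribution of individuals in a hypothetical genetic-algorithm population.

    import math
    from collections import Counter

    def shannon_entropy(items):
        """Shannon entropy (in bits) of the empirical distribution of `items`."""
        counts = Counter(items)
        total = len(items)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # Point of view 1: entropy of the character distribution of a password.
    print(shannon_entropy("correct horse battery staple"))

    # Point of view 2: diversity of a genetic-algorithm population,
    # treating each individual (here, a hypothetical bit string) as a symbol.
    population = ["1010", "1010", "1100", "0111", "1010"]
    print(shannon_entropy(population))

The same entropy formula yields different values depending on which variables of the system are examined, which is the sense in which the measure depends on the observer's choice of description.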