Entropy is an important concept that originated in thermodynamics. It is the subject of the famous Second Law of Thermodynamics, which states that “the entropy of a closed system increases continuously and irrevocably toward a maximum” (Huettner 1976, 102) or “the disorder in the universe always increases” (Framer and Cook 2013, 21). Accordingly, it has been widely regarded as an ideal measure of disorder. Its computation can be theoretically performed according to the Boltzmann equation, which was proposed by the Austrian physicist Ludwig Boltzmann in 1872. In practice, however, the Boltzmann equation involves two problems that are difficult to solve, that is, the definition of the macrostate of a system and the determination of the number of possible microstates corresponding to that macrostate. As noted by the American sociologist Kenneth Bailey, “when the notion of entropy is extended beyond physics, researchers may not be certain how to specify and measure the macrostate/microstate relations” (Bailey 2009, 151). As a result, this entropy (also referred to as Boltzmann entropy and thermodynamic entropy) has remained largely at a conceptual level.

In practice, the widely used entropy was actually proposed by the American mathematician, electrical engineer, and cryptographer Claude Elwood Shannon in 1948, hence the term Shannon entropy. Shannon entropy was proposed to quantify the statistical disorder of telegraph messages in the area of communications. The quantification result was interpreted as the information content of a telegraph message, hence also the term information entropy. This entropy has served as the cornerstone of information theory and was introduced to various fields including chemistry, biology, and geography. It has been widely utilized to quantify the information content of geographic data (or spatial data) in either a vector format (i.e., vector data) or a raster format (i.e., raster data). However, only the statistical information of spatial data can be quantified by using Shannon entropy. The spatial information is ignored by Shannon entropy; for example, a grey image and its corresponding error image share the same Shannon entropy. Therefore, considerable efforts have been made to improve the suitability of Shannon entropy for spatial data, and a number of improved Shannon entropies have been put forward. Rather than further improving Shannon entropy, this study introduces a novel strategy, namely shifting back from Shannon entropy to Boltzmann entropy. There are two advantages of employing Boltzmann entropy. First, as previously mentioned, Boltzmann entropy is the ideal, standard measure of disorder or information. It is theoretically capable of quantifying not only the statistical information but also the spatial information of a data set.
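The two ideas above can be illustrated with a minimal sketch. The toy "image" values, the helper name `shannon_entropy`, and the example microstate count `W` below are illustrative assumptions, not from the source; the sketch only shows that Shannon entropy depends on the value histogram alone, so spatially rearranging the pixels leaves it unchanged, while the Boltzmann equation S = k_B · ln(W) needs the number of microstates W for a macrostate.

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (in bits) of the value histogram of a data set."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A tiny "grey image" as a flat list of pixel values, and a spatially
# scrambled version with the identical histogram (illustrative data).
image = [0, 0, 1, 1, 2, 2, 3, 3]
scrambled = [3, 1, 0, 2, 3, 0, 2, 1]

# Both share the same Shannon entropy: spatial arrangement is ignored.
print(shannon_entropy(image))      # 2.0 bits
print(shannon_entropy(scrambled))  # 2.0 bits

# Boltzmann equation S = k_B * ln(W): W is the number of possible
# microstates for a given macrostate; W = 10**6 is an arbitrary example.
k_B = 1.380649e-23  # Boltzmann constant, J/K
W = 10**6
S = k_B * math.log(W)
print(S)
```

The difficulty noted in the text is visible here: computing `shannon_entropy` needs only the observed frequencies, whereas evaluating the Boltzmann equation first requires defining the macrostate and counting its microstates W.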