A complex world
We live in a world of incredible complexity, whether at the scale of molecules and atoms or of the universe as a whole. Perhaps you don't think deeply about the complexity of the things around you because you are used to what you see every day: everything invented before you were born feels like a natural part of the world, as if the world simply had to be the way it is.
How did something as complex as a human being emerge? How did something as complex as a computer come into being? How did rivers and mountains form? Are these questions related to one another? Many of the complex things around us follow no obvious rules, and many of the relationships among them are not deterministic.
What is information
The word "information" feels at once familiar and strange. Familiar, because we live in an information age and our lives are tied to vast amounts of information of every kind, carried by books, mobile phones, computers and so on. Strange, because it is hard to define exactly what information is, or how to quantify it. For example, how much information is contained in a sentence like "the earth is round"? Is the amount different in ancient times than in modern times?
Everything contains information, and information can be processed and used (processing in this broad sense can be called computing). Seen through this broad lens of information and information processing, the changes we observe are, in effect, computations performed by the universe. One might say that the essence of the world is information plus computation. Many scientists believe that information theory is the best hope for unifying general relativity and quantum mechanics.
Measuring information
Quantifying information is the foundation of the information revolution. Before Shannon published "A Mathematical Theory of Communication", information was an abstract, undefinable thing. One might say, for example, that a company holds a lot of information because it has 100 million emails, without being able to say how much. In physics, chemistry and other fields, various units are used to measure the properties of objects. To make information measurable, Shannon introduced the bit as the unit of information, and the bit has since become a unit of measurement in its own right.
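As a quick illustration of the bit as a unit (a sketch of my own, not taken from Shannon's paper): if a system has N equally likely states, identifying one of them takes log2(N) bits.

```python
import math

# With N equally likely states, identifying one state takes log2(N) bits.
print(math.log2(2))   # a fair coin flip: 1.0 bit
print(math.log2(8))   # one of 8 equally likely messages: 3.0 bits
print(math.log2(26))  # one of 26 equally likely letters: ~4.7 bits
```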
Information entropy
Entropy is a concept from physics. In a thermodynamic system, entropy and energy together describe how the system changes. Entropy can be understood at the molecular level: physical systems are made up of atoms and molecules, and the instantaneous states (positions, velocities) of all the particles describe the overall state of the system. The particles constantly move from one microscopic state to another, and entropy corresponds to the number of microscopic states: the more microscopic states there are, the higher the entropy. In other words, entropy is a measure of uncertainty.
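This relationship between entropy and the count of microscopic states has a standard quantitative form, Boltzmann's entropy formula (noted here for reference):

S = k_B · ln W

where W is the number of microscopic states compatible with the observed macroscopic state and k_B is Boltzmann's constant; more microstates means higher entropy, which is exactly the statement above.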
Borrowing the concept of entropy from physics, Shannon introduced it into the field of communication. Shannon held that a system must have multiple possible states in order to carry information, and the more states it has, the more information it can contain. Probability also has to enter the picture, since under given conditions the probabilities of all possible states must sum to 1; and the more possible states there are, the less likely, on average, any one of them is to occur.
Information is closely related to the number of possible states, their probabilities, the length of the text, and the meaning of the text itself, but from the point of view of communication there is no need to care about the meaning. To measure the information in a text, first determine how many possible states there are and their corresponding probabilities, multiply each state's probability by the logarithm of that probability, sum all the terms, and negate the result. The result is the information entropy, measured in bits when the logarithm is taken to base 2.
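To make the procedure concrete, here is a minimal sketch of my own (the function name shannon_entropy and the choice of individual characters as the states are assumptions for illustration, not something fixed by the text): estimate each state's probability from its relative frequency and compute the negated sum of p·log2(p).

```python
import math
from collections import Counter

def shannon_entropy(text):
    """Estimate the entropy (in bits per character) of a text,
    using character frequencies as the state probabilities."""
    counts = Counter(text)
    total = len(text)
    entropy = 0.0
    for count in counts.values():
        p = count / total            # probability of this state
        entropy -= p * math.log2(p)  # add -p * log2(p) for each state
    return entropy

print(shannon_entropy("the earth is round"))  # a few bits per character
```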
Key points about information entropy
Information entropy can be used to describe a system's capacity to carry information, and also to represent the amount of information carried by something. When it is used for the latter, it does not account for redundancy: two repeated sentences do not carry twice as much information. Likewise, the same two sentences in a different order have the same entropy, yet linguistically their meanings are not the same.
The way to reconcile the everyday meaning of information with information entropy is to regard information entropy as the maximum amount of information a text can contain.
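One rough way to see the redundancy point in practice (a sketch of my own, using general-purpose compression as a stand-in for a proper information measure): repeat a sentence many times and compare the compressed sizes.

```python
import zlib

sentence = b"The earth is round. "
one_copy = zlib.compress(sentence)
hundred_copies = zlib.compress(sentence * 100)

# 100 repetitions are 100 times longer, yet compress to barely more
# than a single copy: repetition adds almost no new information.
print(len(sentence), "->", len(one_copy))
print(len(sentence * 100), "->", len(hundred_copies))
```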