Chapter 0: Fundamentals of probability theory
Chapter 1: Introduction
Communication system model
Basic requirements for a communication system
- Reliability: the message sent by the source should reach the receiving end as accurately as possible, with no distortion or only a limited amount of distortion.
- Efficiency: transmit as many messages as possible in the shortest possible time and with the fewest possible devices.
Information, messages and signals
- Information: an abstract concept that can be described quantitatively. Information, matter and energy are the three elements that make up all systems.
- Message: the carrier of information; a relatively concrete concept, such as language, text, numbers, or images.
- Signal: a physical quantity that represents the message, e.g., the amplitude, frequency, or phase of an electrical signal.
Classification of sources
- Continuous source: a source whose messages are continuously distributed in time and amplitude, such as images and graphs.
- Discrete source: a source whose messages are discretely distributed in time and amplitude, such as characters, numbers, data, and other symbols.
The basic task of information theory
- To design efficient and reliable communication systems.
The limitations of information theory
Shannon information theory applies to information that can be described quantitatively; it is powerless for information that resists quantitative description.
Chapter 2: Entropy and its properties
Self-information I(x_i)
- Properties that self-information should satisfy:
  - Non-negativity.
  - The more likely an event is, the less self-information it carries.
  - An event with probability 1 has self-information 0; an event with probability 0 has infinite self-information.
  - The joint self-information of two independent events equals the sum of their individual self-information.
- The formula and meaning of self-information: I(x_i) = -log p(x_i), the amount of information gained when the event x_i occurs (see the sketch after this list).
- Unit of self-information: determined by the base of the logarithm: bit (base 2), nat (base e), hartley (base 10).
- Self-information example
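As a concrete illustration, here is a minimal Python sketch (not part of the original notes; the function name `self_information` is our own) that computes I(x) = -log2 p(x) and checks the properties listed above:

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """Self-information I(x) = -log_base p(x) of an event with probability p."""
    if p == 0:
        return math.inf  # an impossible event carries infinite self-information
    return -math.log(p, base)

# A certain event carries no information; rarer events carry more.
print(self_information(1.0))    # 0.0
print(self_information(0.5))    # 1.0 bit
print(self_information(0.125))  # 3.0 bits

# Additivity for independent events: I(xy) = I(x) + I(y).
p_x, p_y = 0.5, 0.25
assert math.isclose(self_information(p_x * p_y),
                    self_information(p_x) + self_information(p_y))
```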
Information entropy H(X) of a discrete source
Information entropy is the mathematical expectation of self-information, i.e., the average amount of information per source symbol: H(X) = -sum_i p(x_i) log p(x_i).
- Information entropy example
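The following Python sketch (our illustration, not from the notes) computes H(X) for a few discrete distributions:

```python
import math

def entropy(probs, base: float = 2.0) -> float:
    """H(X) = -sum p(x) log p(x), with the convention 0 * log 0 = 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair binary source attains the maximum of 1 bit/symbol;
# a biased source carries less average information.
print(entropy([0.5, 0.5]))   # 1.0 bit
print(entropy([0.9, 0.1]))   # ~0.469 bits
print(entropy([0.25] * 4))   # 2.0 bits for a uniform 4-symbol source
```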
- Significance of source entropy
- Joint entropy H(X,Y) and conditional entropy H(Y|X)
- Conditional entropy: H(Y|X) = sum over x of p(x) H(Y | X = x)
- Relation between joint and conditional entropy (the chain rule): H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y), verified numerically in the sketch below.
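A short Python sketch (an illustration we added; the joint distribution `p_xy` is a hypothetical toy example) that verifies the chain rule H(X,Y) = H(X) + H(Y|X):

```python
import math

def entropy(probs):
    """H = -sum p log2 p, with 0 * log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginal distribution p(x) and joint entropy H(X, Y).
p_x = {}
for (x, _), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
H_joint = entropy(p_xy.values())

# Conditional entropy H(Y|X) = sum_x p(x) * H(Y | X = x).
H_y_given_x = sum(
    p_x[x] * entropy([p_xy[(x, y)] / p_x[x] for y in (0, 1)])
    for x in (0, 1)
)

# Chain rule: H(X, Y) = H(X) + H(Y|X).
assert math.isclose(H_joint, entropy(p_x.values()) + H_y_given_x)
print(H_joint, entropy(p_x.values()), H_y_given_x)
```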