Information Theory and Coding

YASH PAL, February 10, 2021 (updated November 7, 2024)

Information theory and Coding – In information theory and coding, we primarily study communication systems. It is a mathematical approach to studying the transmission of signals, whether analog or digital, and the coding of information for communication.


Information

Information is an idea or message that is used to transfer thoughts from one source to another. An information source can be electrical, a picture, video, audio, or speech. To pass information from the source to the destination, we need an encoder, a transmitter, a channel, a decoder, and a receiver.

In information theory, three conditions can occur while information is being passed: uncertainty, surprise, and information.

Uncertainty

If the information has not yet been transmitted by the source, there is a condition of uncertainty: the receiver does not know what information will be passed, or what type it will be.

For example, suppose your father is coming home and you asked him to bring a gift. You do not know what gift your father will give you, because the event has not occurred yet. So you are in an uncertain condition.

Surprise

If the information has just been passed to the receiver, there is a condition of surprise. For example, your father has just given you your favorite game as a gift, so you are in a surprise condition.

Information

If the information was passed to the receiver some time ago, there is a condition of having information. If your father gifted you a game some time back, then at the current time it is simply information for you and others.

Remember that these three conditions (uncertainty, surprise, and information) occur at different times, and the probability of the event determines how much information it carries. Information is measured in bits, so the unit of information is the bit.

Mutual information

It is defined as the amount of information transferred when x is transmitted and y is received.

Average mutual information

It is defined as the amount of source information gained per received symbol.
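As an illustration of both definitions, here is a minimal Python sketch that computes the average mutual information I(X;Y) = Σ p(x, y) log2( p(x, y) / (p(x) p(y)) ). The joint distribution is a made-up example, not data from this article:

```python
import math

# Hypothetical joint distribution p(x, y) over a binary source X
# and a binary receiver Y (values are illustrative only).
joint = {
    ("x0", "y0"): 0.4, ("x0", "y1"): 0.1,
    ("x1", "y0"): 0.1, ("x1", "y1"): 0.4,
}

# Marginal distributions p(x) and p(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0) + p
    py[y] = py.get(y, 0) + p

# Average mutual information in bits:
# I(X;Y) = sum over x, y of p(x,y) * log2( p(x,y) / (p(x) p(y)) )
mutual_info = sum(
    p * math.log2(p / (px[x] * py[y]))
    for (x, y), p in joint.items() if p > 0
)
print(f"I(X;Y) = {mutual_info:.4f} bits")  # ≈ 0.2781 bits
```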

Properties of Information

  1. Information is always non-negative.
  2. Information increases as uncertainty increases: the less likely a message, the more information it carries.
  3. If the receiver already knows the message being transmitted, then the information is zero.
  4. If a source m1 is transmitting information I1 and an independent source m2 is transmitting information I2, then the combined information is (I1 + I2), as illustrated in the sketch below.
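These properties follow from the standard definition of self-information, I(m) = log2(1 / p(m)). A minimal Python sketch (the probabilities are made up for illustration) demonstrates properties 3 and 4:

```python
import math

def self_information(p):
    """Self-information in bits of a message with probability p."""
    return math.log2(1 / p)

# A message the receiver is certain of (p = 1) carries zero
# information (property 3).
print(self_information(1.0))          # 0.0

# For independent sources m1 and m2, the joint probability is
# p1 * p2, so the information adds (property 4).
p1, p2 = 0.5, 0.25
print(self_information(p1) + self_information(p2))  # 3.0
print(self_information(p1 * p2))                    # 3.0
```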

Entropy

It can be defined as a measure of the average information content per source symbol. It is also called Shannon's entropy and is denoted by H.

Formula of Entropy

H = -Σi pi logb(pi)

Here

pi = probability of the i-th source message

b = base of the logarithm (b = 2 gives the entropy in bits per symbol).
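The formula can be computed directly; here is a minimal Python sketch (the example distributions are illustrative, not taken from this article):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum(p_i * log_b(p_i)) of a distribution."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin has the maximum entropy for two symbols: 1 bit/symbol.
print(entropy([0.5, 0.5]))   # 1.0

# A biased source is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # ≈ 0.469 bits/symbol
```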

Conditional Entropy

The amount of uncertainty remaining about the channel input after observing the channel output is called conditional entropy.

The formula of conditional entropy

H(X|Y) = Σy p(y) H(X|Y = y)

In simple words, the conditional entropy H(X|Y) is the average remaining uncertainty about X when Y is known.
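Continuing the made-up joint distribution from the mutual-information sketch above, a short Python computation:

```python
import math

# The same illustrative joint distribution p(x, y) as before.
joint = {
    ("x0", "y0"): 0.4, ("x0", "y1"): 0.1,
    ("x1", "y0"): 0.1, ("x1", "y1"): 0.4,
}

# Marginal p(y).
py = {}
for (x, y), p in joint.items():
    py[y] = py.get(y, 0) + p

# H(X|Y) = -sum over x, y of p(x,y) * log2( p(x,y) / p(y) )
cond_entropy = -sum(
    p * math.log2(p / py[y]) for (x, y), p in joint.items() if p > 0
)
print(f"H(X|Y) = {cond_entropy:.4f} bits")  # ≈ 0.7219 bits
```

Here H(X) = 1 bit, and H(X) - H(X|Y) ≈ 0.2781 bits matches the mutual information computed earlier, since I(X;Y) = H(X) - H(X|Y).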

Discrete memoryless source

A source is a discrete memoryless source if its messages come from a discrete (not continuous) set and the value of each message is independent of the previous values.

For example, if we have a source X = {m1, m2, …, mn}, then this can be a discrete memoryless source: the messages are countable, and each value of m does not depend on the previous values.
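A quick Python sketch of the idea (the alphabet and probabilities are made up): each emitted symbol is drawn independently of everything emitted before, which is exactly the memoryless property.

```python
import random

# A discrete memoryless source: a fixed, countable alphabet with
# fixed symbol probabilities (illustrative values).
alphabet = ["m1", "m2", "m3"]
probs    = [0.5, 0.3, 0.2]

# Each draw ignores all previous output.
stream = random.choices(alphabet, weights=probs, k=10)
print(stream)
```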

Source coding

In source coding, when a signal is transmitted it is encoded into a codeword (for example, a Morse code word) that can be decoded by the receiver.

So if the source is a discrete memoryless source with entropy H, then the average codeword length is always greater than or equal to H. That means a source cannot be compressed below H bits per symbol on average (Shannon's source coding theorem).
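Huffman coding is one standard source-coding scheme that approaches this bound (it is not named in this article; this is a sketch with illustrative probabilities):

```python
import heapq

def huffman_code(probs):
    """Build a Huffman code for {symbol: probability};
    returns {symbol: bit-string}."""
    # Each heap entry: (probability, unique tie-breaker, {symbol: code}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        # Prefix the two cheapest subtrees with 0 and 1, then merge.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(code)     # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(avg_len)  # 1.75 bits/symbol
```

Because these probabilities are negative powers of two, the average length 1.75 exactly equals the entropy H; in general the Huffman average length can exceed H by up to one bit per symbol.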

Channel coding

Channel coding is the part of a communication system that improves its reliability. It maps the data sequence into a channel input sequence and inverse-maps the channel output sequence back into an output data sequence. The main purpose of channel coding is to minimize the effect of channel noise on the signal.
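One of the simplest channel codes is the 3-repetition code (an illustration, not a scheme named in this article): each data bit is sent three times, and the decoder takes a majority vote, which corrects any single bit flip per group.

```python
import random

def encode(bits):
    """Repeat each data bit three times (3-repetition channel code)."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority-vote each group of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]

def noisy_channel(bits, flip_prob=0.1):
    """Binary symmetric channel: flips each bit with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

data = [1, 0, 1, 1, 0]
received = noisy_channel(encode(data))
print(decode(received))  # usually recovers [1, 0, 1, 1, 0]
```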
