When we talk about bits, we are referring to the basic unit used to measure data or information in computing and digital communications. A bit represents a logical state with one of two values: 0 or 1. This two-valued system is called binary, and computers use it because a signal with only two levels stays easy to distinguish even when noise or interference is present.

Combining bits multiplies the number of states a system can represent. Two bits can produce four possible values, and 16 bits yield 65,536 (2^16) possible states.
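The doubling described above can be sketched in a few lines of Python; the function name here is just illustrative:

```python
def states(n_bits: int) -> int:
    """Number of distinct values that n_bits can represent."""
    # Each additional bit doubles the count, so n bits give 2**n states.
    return 2 ** n_bits

print(states(1))   # 2
print(states(2))   # 4
print(states(16))  # 65536
```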

Individual bits are conventionally grouped into clusters of 8, each called a byte. A group of 4 bits, half a byte, is called a nibble. It can be easy to confuse a bit with a byte if you don't know what each is for.

8 bits = 1 byte. In units of measurement, an uppercase B stands for bytes, while a lowercase b stands for bits, the smallest unit of data measurement.
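The bit/byte relationship is a simple factor of 8, which a short sketch makes concrete (function names are illustrative):

```python
BITS_PER_BYTE = 8

def bytes_to_bits(n_bytes: float) -> float:
    """1 byte (B) = 8 bits (b)."""
    return n_bytes * BITS_PER_BYTE

def bits_to_bytes(n_bits: float) -> float:
    """Inverse conversion: divide the bit count by 8."""
    return n_bits / BITS_PER_BYTE

print(bytes_to_bits(1))   # 8
print(bits_to_bytes(16))  # 2.0
```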

What will we learn?

  1. History
  2. Functionality
  3. Units


In 1948, Claude Shannon conceptualized and published the bit as a unit of measurement for information. While working at Bell Labs, he published a two-part paper, "A Mathematical Theory of Communication", in the July and October 1948 issues of the Bell System Technical Journal.

These concepts restructured the definition of data in technology. Previously, data was transmitted as pulses and waves in analog signals, which were vulnerable to noise and connection interference; when interference occurred, the transmitted data became inaccurate or erroneous.

Claude Shannon, widely regarded as the father of information theory, proposed that a bit can represent one of two separate states.


Bits of information are transferred through electronic devices as signals, and the speed at which these bits move is measured in bits per second.

Network transfer speeds, such as internet download rates, are typically quoted in bits per second, while storage capacities and file sizes, such as those of a CD or DVD, are usually quoted in bytes. Because a byte is 8 bits, a connection rated in megabits per second fills storage measured in megabytes at roughly one-eighth that number.


The word "bit" is used to refer to the storage size of a device, as well as the amount of information sent over a system.

The bit is the most basic unit of information measurement; metric prefixes are attached to it to represent progressively larger values.

Prefixed Unit - Equivalent in bits

  • Kilobit (Kb) - 1000^1 = 1,000 bits
  • Megabit (Mb) - 1000^2 = 1,000,000 bits
  • Gigabit (Gb) - 1000^3 = 1,000,000,000 bits
  • Terabit (Tb) - 1000^4 = 1,000,000,000,000 bits

However, there are even larger units, formed with the prefixes Peta-, Exa-, Zetta-, and Yotta-.

Data is usually measured in bytes, and the same prefixes apply: 1,000 bytes equals 1 kilobyte, and so on.
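The table above can be captured with a small lookup of decimal (SI) prefixes; the same multipliers work whether the base unit is bits or bytes. This is a minimal sketch, and the names are illustrative:

```python
# Decimal (SI) prefix multipliers: each step is a factor of 1000.
PREFIXES = {"k": 1000**1, "M": 1000**2, "G": 1000**3, "T": 1000**4}

def to_base_units(value: float, prefix: str) -> float:
    """Convert a prefixed quantity (e.g. 2 Gb or 2 GB) to plain bits or bytes."""
    return value * PREFIXES[prefix]

print(to_base_units(1, "k"))  # 1000
print(to_base_units(2, "G"))  # 2000000000
```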

You may be amazed to learn that all of the data on the internet has only recently approached these numbers: as of 2016, Cisco estimated that we had already entered the zettabyte era. These huge units are mostly used in large computing industries to measure data, but everyday users refer more and more often to GB and Mb.

The prefixes used to measure data rates, as in Kbps or Mbps, are the same; what differs is that higher numbers represent a faster rate of transmission. For example, speeds of 100 gigabits per second (100 Gbps) have been reached in highly developed areas, but much of the world still lacks such modern technology and the internet speeds it enables.
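Putting the pieces together, a data rate in bits per second and a file size in bytes determine how long a transfer takes. This is a simplified sketch that ignores protocol overhead, and the function name is illustrative:

```python
def transfer_seconds(file_size_bytes: float, link_speed_bps: float) -> float:
    """Ideal transfer time: file sizes are in bytes, link speeds in bits per second."""
    # Convert the file size to bits before dividing by the bit rate.
    return (file_size_bytes * 8) / link_speed_bps

# A 100-megabyte file over a 100 Mbps link takes about 8 seconds:
print(transfer_seconds(100_000_000, 100_000_000))  # 8.0
```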