Gigabyte (GB)
The gigabyte, symbol GB, equals 1,000,000,000 bytes (10^9) under the decimal SI definition; the term is also used loosely for 1,073,741,824 bytes (2^30, or 1,024 MiB), a quantity more precisely called the gibibyte (GiB). The unit came into common use in the 1980s as personal computers and hard drives grew in capacity. Gigabytes measure larger data volumes such as software, multimedia, and storage devices, and the unit became the standard consumer scale between the megabyte and the terabyte for storage, networking, and cloud computing.
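The gap between the decimal and binary readings is why a drive marketed in gigabytes appears smaller when an operating system reports it in binary units. A minimal sketch in Python, assuming only the two definitions above, illustrates the arithmetic:

```python
# Sketch of the decimal-vs-binary gap; constants follow the definitions above.
GB = 10**9    # gigabyte: SI decimal definition
GIB = 2**30   # gibibyte: binary unit often conflated with "GB"

def gb_to_gib(gigabytes: float) -> float:
    """Convert a size given in decimal gigabytes to binary gibibytes."""
    return gigabytes * GB / GIB

if __name__ == "__main__":
    # A drive sold as 500 GB holds 500 * 10**9 bytes, which an OS
    # reporting in binary units displays as roughly 465.7 GiB.
    print(f"500 GB = {gb_to_gib(500):.1f} GiB")
```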
Bit (b)
The bit, symbol b, is the fundamental unit of digital information. Short for binary digit, the term was coined by John Tukey and first appeared in print in Claude Shannon's 1948 work on information theory. A bit represents either 0 or 1, forming the basis of modern computing and digital communication. Bits measure data transfer rates, storage efficiency, and computational processes. Individually tiny, billions of bits combine to store files, images, and programs, making the bit essential in the digital era.
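To make the combination of bits concrete, the short Python sketch below (an illustrative example, not tied to any particular standard) renders a two-character message as its underlying 0s and 1s and counts the bits, using the usual convention of 8 bits per byte:

```python
# Sketch: show how individual bits combine into bytes of stored data.
def to_bits(data: bytes) -> str:
    """Render each byte of `data` as its 8-bit binary pattern."""
    return " ".join(f"{byte:08b}" for byte in data)

if __name__ == "__main__":
    message = "Hi".encode("ascii")           # 2 bytes = 16 bits
    print(to_bits(message))                  # 01001000 01101001
    print(f"{len(message) * 8} bits total")  # 8 bits per byte
```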